| Column | Type |
|---|---|
| sha | null |
| last_modified | null |
| library_name | stringclasses (154 values) |
| text | stringlengths (1–900k) |
| metadata | stringlengths (2–348k) |
| pipeline_tag | stringclasses (45 values) |
| id | stringlengths (5–122) |
| tags | sequencelengths (1–1.84k) |
| created_at | stringlengths (25–25) |
| arxiv | sequencelengths (0–201) |
| languages | sequencelengths (0–1.83k) |
| tags_str | stringlengths (17–9.34k) |
| text_str | stringlengths (0–389k) |
| text_lists | sequencelengths (0–722) |
| processed_texts | sequencelengths (1–723) |
| tokens_length | sequencelengths (1–723) |
| input_texts | sequencelengths (1–61) |
| embeddings | sequencelengths (768–768) |

**Row 1**

sha: null | last_modified: null | library_name: peft | text:
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
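Since the card leaves this section unfilled, the snippet below is only a minimal sketch of what loading this adapter might look like, assuming the repository id from this row (Raven-Pro/SDtrain) and the base model listed in the metadata (unsloth/mistral-7b-bnb-4bit); it is illustrative, not the author's documented usage.

```python
# Illustrative sketch only: assumes the PEFT adapter in Raven-Pro/SDtrain was
# trained on unsloth/mistral-7b-bnb-4bit (as listed in the row metadata) and
# that peft, transformers, and bitsandbytes (needed for the 4-bit base) are installed.
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

# AutoPeftModelForCausalLM reads the adapter config and pulls in its base model.
model = AutoPeftModelForCausalLM.from_pretrained("Raven-Pro/SDtrain")
tokenizer = AutoTokenizer.from_pretrained("unsloth/mistral-7b-bnb-4bit")

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```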
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
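For orientation, the calculator referenced above essentially multiplies hardware power draw, runtime, and the carbon intensity of the compute region. The sketch below uses made-up placeholder numbers, not measurements for this model.

```python
# Rough illustration of a Lacoste et al. (2019) style estimate; every number
# here is a placeholder, not a measurement for this model.
power_kw = 0.3          # assumed draw of a single ~300 W GPU
hours = 24.0            # assumed total training time
grid_kg_per_kwh = 0.4   # assumed carbon intensity of the compute region

emissions_kg = power_kw * hours * grid_kg_per_kwh
print(f"~{emissions_kg:.1f} kg CO2eq")  # ~2.9 kg CO2eq with these placeholders
```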
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2

metadata: {"library_name": "peft", "base_model": "unsloth/mistral-7b-bnb-4bit"} | pipeline_tag: null | id: Raven-Pro/SDtrain | tags: [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:unsloth/mistral-7b-bnb-4bit",
"region:us"
] | created_at: 2024-02-15T05:01:27+00:00 | arxiv: [
"1910.09700"
] | languages: [] | tags_str: TAGS
#peft #safetensors #arxiv-1910.09700 #base_model-unsloth/mistral-7b-bnb-4bit #region-us
text_str:
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2

text_lists: [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | processed_texts: [
"TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-unsloth/mistral-7b-bnb-4bit #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | tokens_length: [41, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11] | input_texts: [
"passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-unsloth/mistral-7b-bnb-4bit #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | embeddings: [-0.1308, 0.2058, -0.0026, ...] (768-dimensional float vector; remaining 765 values omitted) |

**Row 2**

sha: null | last_modified: null | library_name: transformers | text:
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
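Since this section is also unfilled, the snippet below is only a minimal sketch, assuming the checkpoint id from this row's metadata (aidonuts/enthralling-etchings-132-s1200) loads as a standard Llama-style causal language model; it is illustrative rather than the author's documented usage.

```python
# Illustrative sketch only: assumes the checkpoint id from the row metadata
# (aidonuts/enthralling-etchings-132-s1200) ships a tokenizer and loads as a
# standard causal language model via the text-generation pipeline.
from transformers import pipeline

generator = pipeline("text-generation", model="aidonuts/enthralling-etchings-132-s1200")
result = generator("Hello, how are you?", max_new_tokens=50, do_sample=True)
print(result[0]["generated_text"])
```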
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | aidonuts/enthralling-etchings-132-s1200 | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | created_at: 2024-02-15T05:02:35+00:00 | arxiv: [
"1910.09700"
] | languages: [] | tags_str: TAGS
#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
text_str:
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
text_lists: [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | processed_texts: [
"TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | tokens_length: [60, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4] | input_texts: [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | embeddings: [-0.0465, 0.1662, -0.0054, ...] (768-dimensional float vector; remaining 765 values omitted) |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
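The card leaves this section as a placeholder, so the snippet below is only a minimal sketch of how one might load this repository with the 🤗 Transformers text-generation classes, inferred from the `transformers`, `safetensors`, `mistral`, and `text-generation` tags attached to this entry. The model id is taken from this row's metadata; the prompt and generation settings are illustrative assumptions, not instructions from the model authors.

```python
# Minimal sketch (not from the original card): load the checkpoint with the
# standard Transformers causal-LM classes and generate a short continuation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Raven-Pro/SDmerge"  # repository id from this row's metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```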
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | Raven-Pro/SDmerge | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:05:01+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
56,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05921921506524086,
0.15253323316574097,
-0.004925556480884552,
0.01970141939818859,
0.09812989830970764,
0.008722675032913685,
0.07155127823352814,
0.11091651022434235,
-0.02038503810763359,
0.11541511863470078,
0.03161177039146423,
0.09504877775907516,
0.11244720220565796,
0.1593349277973175,
0.0006018498679623008,
-0.22924894094467163,
0.050943523645401,
-0.12565383315086365,
-0.028005311265587807,
0.1202453151345253,
0.14323006570339203,
-0.10873830318450928,
0.07482945919036865,
-0.03924073651432991,
-0.006830108352005482,
-0.03327549248933792,
-0.06254202127456665,
-0.05196645110845566,
0.05287102237343788,
0.06693000346422195,
0.07382122427225113,
0.0121690658852458,
0.09054198116064072,
-0.27071383595466614,
0.02402324043214321,
0.07869837433099747,
-0.00047617589007131755,
0.07642106711864471,
0.049837369471788406,
-0.08698169887065887,
0.07614438980817795,
-0.060363397002220154,
0.14962489902973175,
0.07956483215093613,
-0.09049813449382782,
-0.19196605682373047,
-0.07841940224170685,
0.10002946108579636,
0.18888257443904877,
0.05783533677458763,
-0.02747977338731289,
0.11718999594449997,
-0.08618196099996567,
0.013946855440735817,
0.06651762872934341,
-0.05830651894211769,
-0.055825375020504,
0.07012750208377838,
0.08251979202032089,
0.08537944406270981,
-0.13050076365470886,
-0.011774240992963314,
0.015172234736382961,
0.00940374843776226,
0.0883294939994812,
0.017624128609895706,
0.13745273649692535,
0.04126768559217453,
-0.1351923644542694,
-0.04287068545818329,
0.09870852530002594,
0.035997726023197174,
-0.04835180938243866,
-0.24833782017230988,
-0.023138362914323807,
-0.039952121675014496,
-0.03223174810409546,
-0.0381147637963295,
0.04236193001270294,
-0.01381280180066824,
0.07635250687599182,
-0.0030598659068346024,
-0.08292017132043839,
-0.042900193482637405,
0.07140932232141495,
0.06195797771215439,
0.025352943688631058,
-0.016651969403028488,
0.0064301020465791225,
0.12258180975914001,
0.11147689074277878,
-0.12772345542907715,
-0.053019966930150986,
-0.06414514780044556,
-0.08524893969297409,
-0.04640465974807739,
0.03045455552637577,
0.03743596002459526,
0.047410931438207626,
0.2386423945426941,
0.0032438088674098253,
0.054757438600063324,
0.046099163591861725,
0.014072372578084469,
0.06632840633392334,
0.10764557868242264,
-0.05884917825460434,
-0.09735266119241714,
-0.030795203521847725,
0.10186740756034851,
0.006704956758767366,
-0.041407015174627304,
-0.05594591051340103,
0.06964502483606339,
0.020676078274846077,
0.1224241703748703,
0.07868597656488419,
0.002938423305749893,
-0.07543925195932388,
-0.06281042098999023,
0.18152743577957153,
-0.1571107804775238,
0.0444292388856411,
0.03200872242450714,
-0.03442244604229927,
-0.009351148270070553,
0.00990392453968525,
0.02681080251932144,
-0.02011663094162941,
0.09737543761730194,
-0.05644093081355095,
-0.033681318163871765,
-0.11296935379505157,
-0.0371013842523098,
0.030811145901679993,
0.01213210541754961,
-0.029025491327047348,
-0.0342867337167263,
-0.0882277637720108,
-0.0636090338230133,
0.09107700735330582,
-0.07191670686006546,
-0.04744245857000351,
-0.017612621188163757,
-0.07794062048196793,
0.022423118352890015,
0.017721612006425858,
0.09050743281841278,
-0.021899394690990448,
0.03913994878530502,
-0.056751471012830734,
0.06101011112332344,
0.11571475863456726,
0.028108863160014153,
-0.058606795966625214,
0.06155762821435928,
-0.2421950101852417,
0.10317995399236679,
-0.07758963108062744,
0.051325954496860504,
-0.1530446857213974,
-0.026070065796375275,
0.03956404700875282,
0.012061306275427341,
-0.008345595560967922,
0.1417774260044098,
-0.2185831218957901,
-0.03138069063425064,
0.1676056981086731,
-0.10102425515651703,
-0.07971794903278351,
0.06269615143537521,
-0.05407082289457321,
0.11134804040193558,
0.04596652463078499,
-0.023191405460238457,
0.05842197686433792,
-0.14511504769325256,
-0.00791724119335413,
-0.04188765957951546,
-0.017894908785820007,
0.16635635495185852,
0.07102048397064209,
-0.06073606386780739,
0.07092984020709991,
0.019934939220547676,
-0.016795052215456963,
-0.04869792237877846,
-0.028511613607406616,
-0.10498060286045074,
0.011810078285634518,
-0.059134796261787415,
0.02167343720793724,
-0.021296551451086998,
-0.09382132440805435,
-0.029188871383666992,
-0.17379464209079742,
-0.0012200147612020373,
0.08734307438135147,
-0.010546354576945305,
-0.02201107330620289,
-0.11164727807044983,
0.008580547757446766,
0.03398929536342621,
0.0007392297266051173,
-0.13708379864692688,
-0.059298936277627945,
0.02737307921051979,
-0.16233380138874054,
0.02912268228828907,
-0.05535917729139328,
0.046022266149520874,
0.040077272802591324,
-0.03548351675271988,
-0.0344831608235836,
0.01168955210596323,
0.011000183410942554,
-0.01812567003071308,
-0.25495970249176025,
-0.017501724883913994,
-0.02502158097922802,
0.17353887856006622,
-0.22721131145954132,
0.04271984100341797,
0.07614967226982117,
0.14550280570983887,
0.0073052942752838135,
-0.034482456743717194,
0.014565827324986458,
-0.07198352366685867,
-0.03167816624045372,
-0.06257235258817673,
-0.010083765722811222,
-0.03872835263609886,
-0.06014038994908333,
0.04782424867153168,
-0.16939696669578552,
-0.03236479312181473,
0.10534932464361191,
0.06398996710777283,
-0.14835967123508453,
-0.030286256223917007,
-0.0393594354391098,
-0.047035153955221176,
-0.06618485599756241,
-0.054856978356838226,
0.12015452980995178,
0.05620792135596275,
0.04745647683739662,
-0.07151947915554047,
-0.07490099221467972,
0.007241961546242237,
-0.019977761432528496,
-0.0163256898522377,
0.09354335069656372,
0.06967450678348541,
-0.12794628739356995,
0.09154868870973587,
0.0982460081577301,
0.08392132818698883,
0.10398648679256439,
-0.015390566550195217,
-0.08757331967353821,
-0.041474130004644394,
0.023933125659823418,
0.014664852991700172,
0.1483616679906845,
-0.016296299174427986,
0.054420776665210724,
0.0360836423933506,
-0.013510678894817829,
0.01076538860797882,
-0.09628108888864517,
0.02706051431596279,
0.02971329540014267,
-0.015405743382871151,
0.03466423228383064,
-0.04367179423570633,
0.019455796107649803,
0.09001301974058151,
0.041830018162727356,
0.0396038182079792,
0.010561688803136349,
-0.04398298263549805,
-0.11032342165708542,
0.17876994609832764,
-0.12373854219913483,
-0.2460412234067917,
-0.13813963532447815,
0.010937176644802094,
0.04738753288984299,
-0.011057097464799881,
0.006951550021767616,
-0.06640941649675369,
-0.1170244961977005,
-0.09733203053474426,
0.01991088129580021,
0.04529648274183273,
-0.07728998363018036,
-0.06572148203849792,
0.06318122148513794,
0.037644270807504654,
-0.13899093866348267,
0.023945696651935577,
0.0469096377491951,
-0.0813174769282341,
-0.0011905812425538898,
0.07709334045648575,
0.06798645853996277,
0.17623907327651978,
0.014159789308905602,
-0.023712651804089546,
0.025652561336755753,
0.21002908051013947,
-0.14298869669437408,
0.1094568595290184,
0.1327279806137085,
-0.08898334950208664,
0.08212688565254211,
0.20222385227680206,
0.0385010726749897,
-0.10506977140903473,
0.03657889738678932,
0.027060477063059807,
-0.02792542427778244,
-0.24959829449653625,
-0.06908850371837616,
0.001758498721756041,
-0.053698375821113586,
0.06916391849517822,
0.08716317266225815,
0.09721273928880692,
0.016790922731161118,
-0.10066783428192139,
-0.0790279284119606,
0.05001477152109146,
0.10897587984800339,
-0.001458899350836873,
-0.014394176192581654,
0.09075857698917389,
-0.02953648567199707,
0.01689162664115429,
0.09213569760322571,
0.0019032615236938,
0.1793205291032791,
0.052213337272405624,
0.17340974509716034,
0.07910763472318649,
0.06269825994968414,
0.021207094192504883,
0.006816241890192032,
0.02095629647374153,
0.01695442944765091,
-0.004212336614727974,
-0.0863528773188591,
-0.0027415938675403595,
0.1203664243221283,
0.050876569002866745,
0.03059028834104538,
0.014285655692219734,
-0.03054206818342209,
0.08466528356075287,
0.177787184715271,
0.001063879462890327,
-0.1876421719789505,
-0.07282958924770355,
0.07934894412755966,
-0.08512143790721893,
-0.10675539821386337,
-0.029639042913913727,
0.040873926132917404,
-0.17292065918445587,
0.01861744187772274,
-0.020119842141866684,
0.10806277394294739,
-0.12885749340057373,
-0.017452897503972054,
0.055447377264499664,
0.06997017562389374,
-0.009931124746799469,
0.06633757054805756,
-0.1625119000673294,
0.1177479475736618,
0.01653103344142437,
0.06594116985797882,
-0.09538834542036057,
0.095417320728302,
-0.006962447427213192,
0.007516060955822468,
0.1403670459985733,
0.010755252093076706,
-0.0641925036907196,
-0.0961010679602623,
-0.10299893468618393,
-0.010606445372104645,
0.1309773176908493,
-0.14660196006298065,
0.08697716891765594,
-0.02743646875023842,
-0.0437387153506279,
0.0037594304885715246,
-0.12246467173099518,
-0.13224415481090546,
-0.18235477805137634,
0.05769521743059158,
-0.13171130418777466,
0.040173836052417755,
-0.1089821308851242,
-0.04585907980799675,
-0.021465247496962547,
0.1977471560239792,
-0.23280778527259827,
-0.06815840303897858,
-0.15394872426986694,
-0.08265888690948486,
0.1454220414161682,
-0.04706942290067673,
0.08337214589118958,
0.000301246385788545,
0.19080647826194763,
0.020952312275767326,
-0.017133628949522972,
0.1067209243774414,
-0.09975022822618484,
-0.20161914825439453,
-0.09120959788560867,
0.15868841111660004,
0.13963958621025085,
0.038726504892110825,
-0.004869744647294283,
0.032236017286777496,
-0.021885421127080917,
-0.12115032970905304,
0.02010788396000862,
0.17255425453186035,
0.08749033510684967,
0.026468761265277863,
-0.028463367372751236,
-0.11846643686294556,
-0.07225121557712555,
-0.03745346516370773,
0.02470988966524601,
0.1813775599002838,
-0.07139390707015991,
0.18551595509052277,
0.14274363219738007,
-0.054879751056432724,
-0.19840270280838013,
0.02148755080997944,
0.04472679644823074,
0.0060237692669034,
0.03174281120300293,
-0.20237314701080322,
0.09144619107246399,
0.0006281035020947456,
-0.05034751072525978,
0.13383205235004425,
-0.18327344954013824,
-0.15106844902038574,
0.061150215566158295,
0.04303572699427605,
-0.19199669361114502,
-0.1237611323595047,
-0.08872545510530472,
-0.046805474907159805,
-0.1568751484155655,
0.1029038056731224,
0.0011325168889015913,
0.007591354660689831,
0.03782656043767929,
0.024313677102327347,
0.012553532607853413,
-0.041947584599256516,
0.19289998710155487,
-0.02507353574037552,
0.034427378326654434,
-0.0793621614575386,
-0.06381990760564804,
0.06411149352788925,
-0.057697590440511703,
0.0750909373164177,
-0.025500034913420677,
0.015388053841888905,
-0.10115842521190643,
-0.047956179827451706,
-0.029484452679753304,
0.01986371912062168,
-0.09421123564243317,
-0.09366033226251602,
-0.04838487133383751,
0.0944879949092865,
0.08926530182361603,
-0.037268105894327164,
-0.033034052699804306,
-0.07874293625354767,
0.04173892363905907,
0.17448031902313232,
0.18235735595226288,
0.045147113502025604,
-0.07717937231063843,
-0.0013610349269583821,
-0.014655699953436852,
0.04845907539129257,
-0.22060799598693848,
0.06062275543808937,
0.045259539037942886,
0.01552091259509325,
0.11744016408920288,
-0.020618194714188576,
-0.1619492471218109,
-0.0666290745139122,
0.06087447330355644,
-0.06730270385742188,
-0.1811886727809906,
0.00352504407055676,
0.0753183513879776,
-0.16591353714466095,
-0.03711319714784622,
0.04232833534479141,
-0.011535273864865303,
-0.04050648957490921,
0.013207654468715191,
0.08094717562198639,
0.0073035703971982,
0.07697968184947968,
0.05389590561389923,
0.09186159074306488,
-0.10275198519229889,
0.07336891442537308,
0.08092255145311356,
-0.08580191433429718,
0.029650582000613213,
0.0956844761967659,
-0.0660475566983223,
-0.03553546592593193,
0.039692267775535583,
0.08463539928197861,
0.025261107832193375,
-0.04666709899902344,
0.003693421371281147,
-0.09922701120376587,
0.05857077240943909,
0.11215036362409592,
0.035282451659440994,
0.011146705597639084,
0.03799959644675255,
0.04474346339702606,
-0.07786709815263748,
0.11944296956062317,
0.024733934551477432,
0.020655835047364235,
-0.04009570553898811,
-0.040743377059698105,
0.03469119220972061,
-0.027051862329244614,
-0.011984582990407944,
-0.035381630063056946,
-0.07329677045345306,
-0.014250458218157291,
-0.16089624166488647,
-0.006425157655030489,
-0.039050452411174774,
0.006492188666015863,
0.0227071400731802,
-0.03757927939295769,
0.008156952448189259,
0.012379756197333336,
-0.06891508400440216,
-0.05483170598745346,
-0.0225595161318779,
0.09499263763427734,
-0.16361327469348907,
0.02182857319712639,
0.08322018384933472,
-0.12078364938497543,
0.09284685552120209,
0.016550488770008087,
0.002410374814644456,
0.028476644307374954,
-0.15792103111743927,
0.04754367470741272,
-0.020290223881602287,
0.012727295979857445,
0.04053649678826332,
-0.2180718630552292,
-0.005482743959873915,
-0.04065772518515587,
-0.055209364742040634,
-0.008002875372767448,
-0.03194994851946831,
-0.11256447434425354,
0.09542836248874664,
0.010766619816422462,
-0.0858173593878746,
-0.029525602236390114,
0.032997291535139084,
0.07880192995071411,
-0.02688010409474373,
0.15163032710552216,
-0.004930328112095594,
0.07543973624706268,
-0.17439891397953033,
-0.02280678227543831,
-0.009784235619008541,
0.02145213820040226,
-0.02418927662074566,
-0.016610441729426384,
0.04521343484520912,
-0.027311841025948524,
0.18978725373744965,
-0.02763848751783371,
0.047156915068626404,
0.06419318169355392,
0.01327395811676979,
-0.016141459345817566,
0.11109550297260284,
0.05755641311407089,
0.024413742125034332,
0.02059282548725605,
0.0006552583072334528,
-0.04046328365802765,
-0.012729931622743607,
-0.18779614567756653,
0.06844497472047806,
0.14769941568374634,
0.09005311876535416,
-0.014767808839678764,
0.06981590390205383,
-0.09979446232318878,
-0.11724765598773956,
0.10648569464683533,
-0.06312347948551178,
-0.011802246794104576,
-0.06541955471038818,
0.14070585370063782,
0.1514706313610077,
-0.1892511397600174,
0.06684626638889313,
-0.06704412400722504,
-0.05669668689370155,
-0.11357752978801727,
-0.1923627108335495,
-0.05791294202208519,
-0.05011613294482231,
-0.018368201330304146,
-0.05373769626021385,
0.06899537891149521,
0.057158127427101135,
0.011277895420789719,
0.008883214555680752,
0.0839093029499054,
-0.009658100083470345,
0.001425864058546722,
0.031231271103024483,
0.06669623404741287,
0.016144385561347008,
-0.0304893609136343,
0.01806715875864029,
-0.003015234600752592,
0.033999331295490265,
0.059489116072654724,
0.036065202206373215,
-0.028380198404192924,
0.013694645836949348,
-0.03632815182209015,
-0.11369726806879044,
0.043240632861852646,
-0.028342511504888535,
-0.07773103564977646,
0.13286112248897552,
0.026473212987184525,
0.005609886720776558,
-0.022322779521346092,
0.2495104819536209,
-0.07400858402252197,
-0.09536818414926529,
-0.1448878049850464,
0.11703428626060486,
-0.04134928435087204,
0.06479805707931519,
0.03765689954161644,
-0.10748469084501266,
0.018750222399830818,
0.12525403499603271,
0.1550474315881729,
-0.04537956044077873,
0.019106155261397362,
0.02858782559633255,
0.004584235139191151,
-0.04013598710298538,
0.05142189934849739,
0.06933367252349854,
0.14214643836021423,
-0.05173535272479057,
0.08858583122491837,
0.0017827433766797185,
-0.10212727636098862,
-0.04129546508193016,
0.11294585466384888,
-0.012940747663378716,
0.016553698107600212,
-0.05866444855928421,
0.1253037303686142,
-0.059382375329732895,
-0.23649652302265167,
0.061238259077072144,
-0.07580125331878662,
-0.14206883311271667,
-0.02515989914536476,
0.0734870657324791,
-0.015550101175904274,
0.026368482038378716,
0.07198820263147354,
-0.07507873326539993,
0.18898127973079681,
0.03871531784534454,
-0.05198408663272858,
-0.05836968496441841,
0.07604995369911194,
-0.117560975253582,
0.2752254605293274,
0.01097069587558508,
0.05294901132583618,
0.10413134098052979,
-0.02049596607685089,
-0.13178466260433197,
0.024117950350046158,
0.09550730884075165,
-0.08813395351171494,
0.04131056368350983,
0.21484604477882385,
-0.005940921604633331,
0.1187596246600151,
0.07743308693170547,
-0.07539036870002747,
0.047102998942136765,
-0.1141449362039566,
-0.0771128386259079,
-0.08687382191419601,
0.09549140185117722,
-0.0675748735666275,
0.14216206967830658,
0.12683449685573578,
-0.054658904671669006,
0.010759806260466576,
-0.02898469939827919,
0.045599378645420074,
0.0063186027109622955,
0.10157246887683868,
0.009957551956176758,
-0.18577666580677032,
0.02454824559390545,
0.017152229323983192,
0.10993915796279907,
-0.1806284487247467,
-0.09123970568180084,
0.04470835253596306,
0.0021878182888031006,
-0.06369121372699738,
0.12484876811504364,
0.057084910571575165,
0.04630184918642044,
-0.044473882764577866,
-0.029204387217760086,
-0.0060947248712182045,
0.1420498490333557,
-0.10524781048297882,
-0.003831128589808941
] |
null | null | transformers | # Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
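As with the previous card, this section is a placeholder; the sketch below shows one plausible way to run this text-generation checkpoint through the high-level `pipeline` API, based on the `transformers`, `mistral`, and `text-generation` tags on this entry. The model id comes from this row's metadata, and the prompt and settings are illustrative assumptions rather than author-provided code.

```python
# Minimal sketch (not part of the original card): use the Transformers
# text-generation pipeline to produce a short completion from this model.
from transformers import pipeline

generator = pipeline("text-generation", model="FelixChao/Capricorn-7B-DPO")

result = generator("Explain what a merged language model is.", max_new_tokens=64)
print(result[0]["generated_text"])
```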
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"license": "apache-2.0"} | text-generation | FelixChao/Capricorn-7B-DPO | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:07:17+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| # Model Card for Model ID
This modelcard aims to be a base template for new models. It has been generated using this raw template.
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
64,
29,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05098743364214897,
0.18119150400161743,
-0.005546347238123417,
0.017277177423238754,
0.09535548090934753,
0.012573855929076672,
0.06850744038820267,
0.10837393999099731,
-0.018621204420924187,
0.10630100220441818,
0.025171682238578796,
0.08657454699277878,
0.11164019256830215,
0.14977188408374786,
-0.0008030658354982734,
-0.23681508004665375,
0.04359348490834236,
-0.1266832947731018,
-0.02974628657102585,
0.11300726234912872,
0.15117916464805603,
-0.09237026423215866,
0.07796194404363632,
-0.026965906843543053,
-0.009547654539346695,
-0.029042350128293037,
-0.059507694095373154,
-0.045686688274145126,
0.045702628791332245,
0.0660756453871727,
0.06690677255392075,
0.00041688629426062107,
0.09292758256196976,
-0.26813268661499023,
0.02008185349404812,
0.0664597749710083,
-0.0012793500209227204,
0.07728859782218933,
0.056240301579236984,
-0.07440850883722305,
0.09802885353565216,
-0.048399534076452255,
0.1422666758298874,
0.08229760825634003,
-0.090943343937397,
-0.18338368833065033,
-0.09160289168357849,
0.09443580359220505,
0.18546488881111145,
0.05044415965676308,
-0.021347247064113617,
0.08999335765838623,
-0.08516385406255722,
0.005957415793091059,
0.05279853194952011,
-0.07463932782411575,
-0.05264570191502571,
0.06564901769161224,
0.07107672840356827,
0.07033055275678635,
-0.12056542932987213,
-0.023800790309906006,
0.008731144480407238,
0.010739918798208237,
0.07595204561948776,
0.021117212250828743,
0.1447419822216034,
0.03368578106164932,
-0.12334440648555756,
-0.042693156749010086,
0.1282794326543808,
0.04175516963005066,
-0.04885222390294075,
-0.24373401701450348,
-0.02595498040318489,
-0.022665152326226234,
-0.03248772770166397,
-0.03886206075549126,
0.0448467992246151,
0.002830381039530039,
0.09096220880746841,
-0.018628129735589027,
-0.07835695892572403,
-0.029315827414393425,
0.06328904628753662,
0.04903733730316162,
0.02423769235610962,
-0.013522015884518623,
0.005825151689350605,
0.1200474426150322,
0.0931648537516594,
-0.12698234617710114,
-0.0494733527302742,
-0.0625515952706337,
-0.07075663655996323,
-0.04515935108065605,
0.03235740587115288,
0.0397205576300621,
0.051469046622514725,
0.2505846917629242,
0.014960609376430511,
0.05136311799287796,
0.04400661960244179,
0.012981995940208435,
0.05845816805958748,
0.1074749305844307,
-0.05520668625831604,
-0.10785319656133652,
-0.02126893773674965,
0.08356046676635742,
0.015483551658689976,
-0.034644994884729385,
-0.05889322608709335,
0.05063844472169876,
0.01071514468640089,
0.12029755860567093,
0.09420150518417358,
-0.0020220226142555475,
-0.07016023993492126,
-0.0628134235739708,
0.19292974472045898,
-0.16400295495986938,
0.0473443940281868,
0.029314545914530754,
-0.0365741103887558,
-0.001238433294929564,
0.007521297782659531,
0.028401484712958336,
-0.0246182419359684,
0.08293787389993668,
-0.05586033686995506,
-0.03929503634572029,
-0.1104612872004509,
-0.02357238531112671,
0.03238487243652344,
0.02302432991564274,
-0.030460160225629807,
-0.0243829432874918,
-0.08628323674201965,
-0.07036904245615005,
0.09920728206634521,
-0.07587142288684845,
-0.06134931370615959,
-0.021133728325366974,
-0.07577592134475708,
0.025946343317627907,
0.02078121155500412,
0.06897282600402832,
-0.01944863609969616,
0.030176213011145592,
-0.052048422396183014,
0.05514887720346451,
0.09825149923563004,
0.033667098730802536,
-0.0565333254635334,
0.06152923405170441,
-0.23782724142074585,
0.10050790756940842,
-0.05885869637131691,
0.05631159618496895,
-0.15306192636489868,
-0.018819015473127365,
0.04125715419650078,
0.0019174626795575023,
-0.011306731961667538,
0.13673482835292816,
-0.21845147013664246,
-0.023373503237962723,
0.16045258939266205,
-0.09470109641551971,
-0.07761828601360321,
0.05749881640076637,
-0.049321115016937256,
0.11392152309417725,
0.04453955963253975,
-0.020814193412661552,
0.06921925395727158,
-0.13386112451553345,
0.008098425343632698,
-0.03911484777927399,
-0.009868915192782879,
0.14855000376701355,
0.07630777359008789,
-0.07961857318878174,
0.06684968620538712,
0.02708585001528263,
-0.036954548209905624,
-0.04370904341340065,
-0.012701238505542278,
-0.11127448827028275,
0.006379514001309872,
-0.057350050657987595,
0.012217618525028229,
-0.029090026393532753,
-0.09010448306798935,
-0.02476555109024048,
-0.16960756480693817,
-0.021764647215604782,
0.0858255922794342,
-0.008199622854590416,
-0.020143786445260048,
-0.11031179130077362,
0.01924167200922966,
0.03676341474056244,
-0.0024156419094651937,
-0.12995371222496033,
-0.05207955837249756,
0.02718118391931057,
-0.1663770228624344,
0.035771261900663376,
-0.054468683898448944,
0.04823897406458855,
0.031382765620946884,
-0.031815558671951294,
-0.02675204910337925,
0.018448038026690483,
0.0020498076919466257,
-0.010357827879488468,
-0.241690531373024,
-0.026199771091341972,
-0.024984603747725487,
0.16617265343666077,
-0.21149884164333344,
0.03924327343702316,
0.06548555940389633,
0.14468514919281006,
0.00908830389380455,
-0.03872833773493767,
0.00325130857527256,
-0.07682507485151291,
-0.02375245839357376,
-0.06197876110672951,
-0.006265480071306229,
-0.029592575505375862,
-0.056297365576028824,
0.0474017933011055,
-0.16155876219272614,
-0.03318116441369057,
0.0943460464477539,
0.06758622825145721,
-0.13294118642807007,
-0.030522173270583153,
-0.03080666810274124,
-0.04256385564804077,
-0.05002477392554283,
-0.05415209010243416,
0.11267362534999847,
0.056675393134355545,
0.04424947872757912,
-0.06770579516887665,
-0.07564353197813034,
-0.0059568556025624275,
-0.024364864453673363,
-0.021629072725772858,
0.0876426175236702,
0.06839006394147873,
-0.11861623078584671,
0.09229876846075058,
0.10877985507249832,
0.07871770113706589,
0.09374723583459854,
-0.02219315804541111,
-0.08298259973526001,
-0.05079648643732071,
0.03239036351442337,
0.0106898108497262,
0.12816645205020905,
-0.011133784428238869,
0.055088043212890625,
0.039088018238544464,
-0.013070518150925636,
0.017940703779459,
-0.09119090437889099,
0.02961762435734272,
0.030785411596298218,
-0.02225067839026451,
0.0391770638525486,
-0.03700961172580719,
0.01944563537836075,
0.08695637434720993,
0.049476608633995056,
0.04310998693108559,
0.011310328729450703,
-0.0507609024643898,
-0.11596490442752838,
0.16487768292427063,
-0.1310916393995285,
-0.22092364728450775,
-0.15299998223781586,
0.012620022520422935,
0.03313823789358139,
-0.011554974131286144,
0.0014600668800994754,
-0.06272313743829727,
-0.11571827530860901,
-0.0922229140996933,
0.01242272648960352,
0.05040787532925606,
-0.09326481074094772,
-0.059908997267484665,
0.05969405174255371,
0.039082664996385574,
-0.1428091824054718,
0.020056407898664474,
0.05564342439174652,
-0.09713013470172882,
-0.019626006484031677,
0.07774446159601212,
0.06338733434677124,
0.17780159413814545,
0.011641085147857666,
-0.023729432374238968,
0.03950480371713638,
0.22507025301456451,
-0.1325329691171646,
0.11577421426773071,
0.14521914720535278,
-0.0826810896396637,
0.08551780879497528,
0.20404234528541565,
0.043351635336875916,
-0.09732220321893692,
0.03335292264819145,
0.01921760104596615,
-0.022651171311736107,
-0.2431226670742035,
-0.07194231450557709,
-0.0063305203802883625,
-0.07651257514953613,
0.07641167938709259,
0.09124799072742462,
0.08904707431793213,
0.01695016771554947,
-0.09704168140888214,
-0.0876987874507904,
0.06060664728283882,
0.10453283786773682,
0.020114712417125702,
-0.011224123649299145,
0.08347021788358688,
-0.034951116889715195,
0.016350839287042618,
0.08870627731084824,
0.014143294654786587,
0.1766888052225113,
0.06362227350473404,
0.1846485137939453,
0.07776474207639694,
0.07361125946044922,
0.01232505775988102,
0.00872134417295456,
0.01642940193414688,
0.02886364236474037,
-0.003145672148093581,
-0.08890394866466522,
-0.008976506069302559,
0.11655743420124054,
0.05968529358506203,
0.023473775014281273,
0.013177607208490372,
-0.045911915600299835,
0.07631321251392365,
0.19093434512615204,
-0.0009153723949566483,
-0.17904233932495117,
-0.06162286922335625,
0.08616357296705246,
-0.08956354111433029,
-0.10062851011753082,
-0.023238040506839752,
0.03308749198913574,
-0.17246176302433014,
0.02651992067694664,
-0.018314840272068977,
0.1141413077712059,
-0.13121241331100464,
-0.022874070331454277,
0.0691872388124466,
0.07217199355363846,
0.003470273455604911,
0.06070614606142044,
-0.1480747014284134,
0.1045488715171814,
0.014614338986575603,
0.07220901548862457,
-0.08818291872739792,
0.10457614064216614,
-0.0062637836672365665,
-0.0010398438898846507,
0.12917250394821167,
0.012704622000455856,
-0.08315494656562805,
-0.07081159949302673,
-0.08896414935588837,
-0.005403646733611822,
0.1177091896533966,
-0.14973147213459015,
0.0827237069606781,
-0.039415400475263596,
-0.04000493884086609,
0.0012276272755116224,
-0.11071229726076126,
-0.1310325264930725,
-0.19558793306350708,
0.05247398838400841,
-0.12856444716453552,
0.04142291471362114,
-0.10432637482881546,
-0.02850436232984066,
-0.01316818967461586,
0.18452438712120056,
-0.23713962733745575,
-0.07338899374008179,
-0.14616690576076508,
-0.10958345979452133,
0.15083934366703033,
-0.05047786235809326,
0.08364177495241165,
-0.009070590138435364,
0.17759136855602264,
0.02192075550556183,
-0.02559496834874153,
0.09046733379364014,
-0.09006041288375854,
-0.18359091877937317,
-0.07509500533342361,
0.1547618806362152,
0.1364850252866745,
0.03137340396642685,
-0.00009236243204213679,
0.03403643146157265,
-0.025934826582670212,
-0.12580923736095428,
0.012445051223039627,
0.17905670404434204,
0.07442319393157959,
0.02061055228114128,
-0.03987543284893036,
-0.11310092359781265,
-0.06931383907794952,
-0.026115750893950462,
0.03206980600953102,
0.18743392825126648,
-0.06983482837677002,
0.18250030279159546,
0.1439635306596756,
-0.06464575976133347,
-0.1876271665096283,
0.01272567454725504,
0.03231727331876755,
0.007271517999470234,
0.03296826779842377,
-0.20840111374855042,
0.09315025806427002,
0.005137091036885977,
-0.05071651563048363,
0.1370120495557785,
-0.17887669801712036,
-0.14646044373512268,
0.07223373651504517,
0.028904680162668228,
-0.19215600192546844,
-0.11724954843521118,
-0.08848059177398682,
-0.05218749865889549,
-0.17890824377536774,
0.0928690955042839,
0.035659998655319214,
0.006328529212623835,
0.029228543862700462,
0.04099976271390915,
0.017570940777659416,
-0.034000616520643234,
0.1953754723072052,
-0.025309110060334206,
0.03348270058631897,
-0.08410539478063583,
-0.0634893923997879,
0.04695089906454086,
-0.057213250547647476,
0.08370446413755417,
-0.029913514852523804,
0.014560804702341557,
-0.10276637226343155,
-0.038615114986896515,
-0.028055282309651375,
0.01990450732409954,
-0.09439650923013687,
-0.08324754238128662,
-0.04971373453736305,
0.09117461740970612,
0.09475960582494736,
-0.034476231783628464,
-0.025337999686598778,
-0.07224450260400772,
0.050935011357069016,
0.17326311767101288,
0.1753956526517868,
0.04247729107737541,
-0.0770951434969902,
0.00020656864217016846,
-0.009295777417719364,
0.04424283653497696,
-0.20921464264392853,
0.06171146035194397,
0.04861802980303764,
0.015397347509860992,
0.11190910637378693,
-0.02175346575677395,
-0.1547207087278366,
-0.06734275072813034,
0.0655275210738182,
-0.05564737319946289,
-0.19119581580162048,
-0.0015418418915942311,
0.05998261272907257,
-0.1725488007068634,
-0.04902401193976402,
0.043246448040008545,
-0.004179911222308874,
-0.04368313401937485,
0.019003113731741905,
0.09280961006879807,
-0.004413523245602846,
0.06393697112798691,
0.05462450534105301,
0.07884405553340912,
-0.09972895681858063,
0.07550083100795746,
0.08475533127784729,
-0.06779135018587112,
0.021509729325771332,
0.09533023089170456,
-0.05755164474248886,
-0.032326262444257736,
0.03258265554904938,
0.07859939336776733,
0.012596184387803078,
-0.041238002479076385,
0.01913696900010109,
-0.09541146457195282,
0.05581831559538841,
0.07808537036180496,
0.03262934461236,
0.011093069799244404,
0.03359393775463104,
0.03784278780221939,
-0.0644177570939064,
0.11518306285142899,
0.02587002143263817,
0.014913653023540974,
-0.037573423236608505,
-0.05676570162177086,
0.021193647757172585,
-0.033675044775009155,
-0.00463699409738183,
-0.038403820246458054,
-0.06861646473407745,
-0.019439343363046646,
-0.16266533732414246,
-0.012191741727292538,
-0.05826016142964363,
0.013241472654044628,
0.03000813163816929,
-0.03695882856845856,
0.004327703267335892,
0.007653922773897648,
-0.07277437299489975,
-0.06814969331026077,
-0.028041290119290352,
0.09671425074338913,
-0.1584416776895523,
0.0319603756070137,
0.08783707767724991,
-0.11725916713476181,
0.08885382115840912,
0.016516920179128647,
-0.00037986721144989133,
0.030176682397723198,
-0.15937243402004242,
0.03337664157152176,
-0.03232354298233986,
0.012355729006230831,
0.04085836187005043,
-0.22666671872138977,
-0.001465921988710761,
-0.030821116641163826,
-0.060066789388656616,
-0.012130483984947205,
-0.020190633833408356,
-0.11788560450077057,
0.09418036788702011,
0.010323213413357735,
-0.09105202555656433,
-0.034208908677101135,
0.03431219235062599,
0.09318374842405319,
-0.02494099922478199,
0.15748997032642365,
-0.004250435158610344,
0.07382263988256454,
-0.16479109227657318,
-0.019081395119428635,
-0.012243317440152168,
0.0292672086507082,
-0.02226962335407734,
-0.014464865438640118,
0.034872230142354965,
-0.020914284512400627,
0.17916569113731384,
-0.027987400069832802,
0.02451128326356411,
0.06518793851137161,
0.02345157414674759,
-0.018241364508867264,
0.10278960317373276,
0.056563593447208405,
0.02294408157467842,
0.018069501966238022,
0.010073256678879261,
-0.038073886185884476,
-0.03105715662240982,
-0.2051125019788742,
0.06024419143795967,
0.14490185678005219,
0.0803639218211174,
-0.013404328376054764,
0.07971688359975815,
-0.09786573052406311,
-0.10358753800392151,
0.12096197158098221,
-0.037024762481451035,
-0.01178735215216875,
-0.06821750849485397,
0.12925268709659576,
0.15778256952762604,
-0.1957709789276123,
0.07497909665107727,
-0.060612305998802185,
-0.046610940247774124,
-0.12327694147825241,
-0.20910800993442535,
-0.05508040264248848,
-0.047508079558610916,
-0.02215556427836418,
-0.05385427549481392,
0.06967004388570786,
0.05055632442235947,
0.0030300726648420095,
-0.002720841206610203,
0.055767983198165894,
-0.022411813959479332,
-0.013799558393657207,
0.028742652386426926,
0.06666351854801178,
0.022012729197740555,
-0.03296925500035286,
0.019523845985531807,
-0.0068005952052772045,
0.041506461799144745,
0.06027986854314804,
0.05017794668674469,
-0.02476835623383522,
0.015370680019259453,
-0.04556899890303612,
-0.10415440052747726,
0.04394756630063057,
-0.02317158132791519,
-0.07887127995491028,
0.14471043646335602,
0.026771942153573036,
0.0015075445408001542,
-0.017098277807235718,
0.24631071090698242,
-0.07347014546394348,
-0.09652724862098694,
-0.15004828572273254,
0.10078779608011246,
-0.042118173092603683,
0.061743881553411484,
0.04199874401092529,
-0.10357179492712021,
0.019095564261078835,
0.11929444968700409,
0.1714634746313095,
-0.04140309989452362,
0.019019711762666702,
0.02337595634162426,
0.0031649600714445114,
-0.036799654364585876,
0.04888617619872093,
0.06792740523815155,
0.1605997532606125,
-0.049703408032655716,
0.10666686296463013,
-0.0016597626963630319,
-0.089536651968956,
-0.04966892674565315,
0.11527734249830246,
-0.025868743658065796,
0.013937955722212791,
-0.0522877499461174,
0.13030187785625458,
-0.059832725673913956,
-0.21440553665161133,
0.05762794613838196,
-0.060670070350170135,
-0.14221197366714478,
-0.022004295140504837,
0.07185368239879608,
-0.016225198283791542,
0.02670050971210003,
0.069224514067173,
-0.07622170448303223,
0.20599152147769928,
0.038055576384067535,
-0.054234836250543594,
-0.06403503566980362,
0.08333315700292587,
-0.10256686806678772,
0.27284157276153564,
0.020483549684286118,
0.043097127228975296,
0.1018582135438919,
-0.011639008298516273,
-0.13639964163303375,
0.01857207715511322,
0.09395286440849304,
-0.08456414937973022,
0.044102296233177185,
0.19214273989200592,
-0.0023759787436574697,
0.12534835934638977,
0.08313769847154617,
-0.0665183737874031,
0.04617252200841904,
-0.09977047890424728,
-0.06693964451551437,
-0.0931321457028389,
0.09399199485778809,
-0.08075552433729172,
0.14307093620300293,
0.1299404501914978,
-0.0568833202123642,
0.009563712403178215,
-0.027316907420754433,
0.04724361002445221,
0.009638158604502678,
0.09812209010124207,
0.008474581874907017,
-0.1704498678445816,
0.019718635827302933,
0.01983269676566124,
0.10463733971118927,
-0.15966567397117615,
-0.09304337203502655,
0.04343968629837036,
0.003504045307636261,
-0.059013694524765015,
0.1283906102180481,
0.05894576758146286,
0.043327439576387405,
-0.041388269513845444,
-0.03907050937414169,
-0.007298397831618786,
0.13337571918964386,
-0.1050582155585289,
-0.002106369473040104
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# codeparrot-ds
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 1.4139
- eval_runtime: 270.7102
- eval_samples_per_second: 694.603
- eval_steps_per_second: 21.71
- epoch: 0.61
- step: 80000
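
The card itself does not include a usage snippet; the following is a minimal, hedged sketch (not part of the auto-generated card) of how a checkpoint like this can be loaded for code generation with the `transformers` pipeline. The repository id is taken from the card's metadata; the prompt and generation arguments are purely illustrative.

```python
# Minimal usage sketch (assumption: the checkpoint is a standard GPT-2-style
# causal LM published under the repository id shown in the card metadata).
from transformers import pipeline

generator = pipeline("text-generation", model="akuzdeuov/codeparrot-ds")

# Illustrative prompt and sampling settings.
completion = generator("def load_dataset(path):", max_new_tokens=40, do_sample=True)
print(completion[0]["generated_text"])
```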
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 1
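
As a rough illustration only (this is not the author's original training script), the listed settings map approximately onto `transformers.TrainingArguments` as sketched below; the output directory name is assumed, and the Adam betas and epsilon shown above are the library defaults.

```python
# Hedged sketch of the reported hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="codeparrot-ds",        # assumed output path, not from the card
    learning_rate=5e-4,                # 0.0005
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=8,     # 32 x 8 = 256 total train batch size
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=1,
)
```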
### Framework versions
- Transformers 4.33.1
- Pytorch 2.0.1+cu117
- Datasets 2.17.0
- Tokenizers 0.13.3
| {"license": "mit", "tags": ["generated_from_trainer"], "base_model": "gpt2", "model-index": [{"name": "codeparrot-ds", "results": []}]} | text-generation | akuzdeuov/codeparrot-ds | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"generated_from_trainer",
"base_model:gpt2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:08:16+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# codeparrot-ds
This model is a fine-tuned version of gpt2 on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 1.4139
- eval_runtime: 270.7102
- eval_samples_per_second: 694.603
- eval_steps_per_second: 21.71
- epoch: 0.61
- step: 80000
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 1
### Framework versions
- Transformers 4.33.1
- Pytorch 2.0.1+cu117
- Datasets 2.17.0
- Tokenizers 0.13.3
| [
"# codeparrot-ds\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 1.4139\n- eval_runtime: 270.7102\n- eval_samples_per_second: 694.603\n- eval_steps_per_second: 21.71\n- epoch: 0.61\n- step: 80000",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 256\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 1",
"### Framework versions\n\n- Transformers 4.33.1\n- Pytorch 2.0.1+cu117\n- Datasets 2.17.0\n- Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# codeparrot-ds\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 1.4139\n- eval_runtime: 270.7102\n- eval_samples_per_second: 694.603\n- eval_steps_per_second: 21.71\n- epoch: 0.61\n- step: 80000",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 256\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 1",
"### Framework versions\n\n- Transformers 4.33.1\n- Pytorch 2.0.1+cu117\n- Datasets 2.17.0\n- Tokenizers 0.13.3"
] | [
67,
97,
6,
12,
8,
3,
128,
33
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #generated_from_trainer #base_model-gpt2 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# codeparrot-ds\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 1.4139\n- eval_runtime: 270.7102\n- eval_samples_per_second: 694.603\n- eval_steps_per_second: 21.71\n- epoch: 0.61\n- step: 80000## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 256\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 1### Framework versions\n\n- Transformers 4.33.1\n- Pytorch 2.0.1+cu117\n- Datasets 2.17.0\n- Tokenizers 0.13.3"
] | [
-0.1139693334698677,
0.17334868013858795,
-0.0046257926151156425,
0.08625609427690506,
0.11920081079006195,
0.03527503088116646,
0.07535536587238312,
0.16165271401405334,
-0.061229508370161057,
0.12753844261169434,
0.09866361320018768,
0.06884173303842545,
0.06662842631340027,
0.14405269920825958,
-0.029405472800135612,
-0.19824910163879395,
0.02738344855606556,
-0.056242819875478745,
-0.029933743178844452,
0.09191184490919113,
0.09282824397087097,
-0.08321480453014374,
0.05888552591204643,
0.017442410811781883,
-0.08629240840673447,
-0.007931861095130444,
-0.03360871970653534,
-0.06746610254049301,
0.08139306306838989,
0.011483022011816502,
0.04551839455962181,
-0.0044261799193918705,
0.08093539625406265,
-0.20373684167861938,
-0.017566118389368057,
0.08641348779201508,
0.046037595719099045,
0.0885997787117958,
0.09513221681118011,
-0.003733256133273244,
0.09878989309072495,
-0.12662650644779205,
0.08556017279624939,
0.03455183655023575,
-0.11319192498922348,
-0.15164582431316376,
-0.09693636000156403,
0.05437823385000229,
0.10356990247964859,
0.10305791348218918,
-0.02613420970737934,
0.13079458475112915,
-0.07178278267383575,
0.058906182646751404,
0.16506098210811615,
-0.2767096757888794,
-0.04493967071175575,
0.024266863241791725,
0.03141584247350693,
0.027224106714129448,
-0.09058105945587158,
-0.017602400854229927,
0.01578313112258911,
0.017652586102485657,
0.08210782706737518,
0.011519288644194603,
0.02676546759903431,
0.0042054299265146255,
-0.10533305257558823,
-0.09125639498233795,
0.12761874496936798,
0.0626254603266716,
-0.055590201169252396,
-0.16539812088012695,
-0.05497268959879875,
-0.14485563337802887,
0.0021494682878255844,
-0.017976529896259308,
0.009869363158941269,
-0.02014739438891411,
-0.06914251297712326,
-0.007294300012290478,
-0.045095156878232956,
-0.04375515878200531,
0.030571285635232925,
0.10529623925685883,
0.04236045852303505,
0.0115289818495512,
0.012030434794723988,
0.11576423794031143,
-0.0034743971191346645,
-0.1323929727077484,
-0.04396321251988411,
-0.0016021208139136434,
-0.11200468987226486,
-0.03363597020506859,
-0.04514050856232643,
-0.009933213703334332,
-0.005491035990417004,
0.183138906955719,
-0.022385822609066963,
0.08470419049263,
0.05421234667301178,
-0.011918849311769009,
-0.023454904556274414,
0.17499521374702454,
-0.036293428391218185,
-0.07763779163360596,
-0.014085760340094566,
0.12285751849412918,
0.014211011119186878,
-0.03353455290198326,
-0.05475730821490288,
-0.006691358517855406,
0.08857829123735428,
0.06163441389799118,
-0.02609921619296074,
0.025384139269590378,
-0.07212120294570923,
-0.01970633491873741,
0.04221704974770546,
-0.13605162501335144,
0.053622372448444366,
0.015294266864657402,
-0.10385636240243912,
-0.05546645075082779,
0.011830201372504234,
-0.010878323577344418,
-0.08463305979967117,
0.061188407242298126,
-0.0696297362446785,
-0.0407826267182827,
-0.0653204470872879,
-0.05435412377119064,
-0.011182446964085102,
-0.07078048586845398,
-0.018279843032360077,
-0.05405600741505623,
-0.16298127174377441,
-0.057286109775304794,
0.04666777700185776,
-0.08881743997335434,
-0.0676177591085434,
-0.04851292446255684,
-0.06566334515810013,
0.062279410660266876,
-0.002114327158778906,
0.1023130863904953,
-0.052185412496328354,
0.05904959514737129,
0.018150899559259415,
0.02790849283337593,
0.12281250208616257,
0.04240810498595238,
-0.09052976965904236,
0.058643192052841187,
-0.08701978623867035,
0.1394995152950287,
-0.058479636907577515,
-0.011963142082095146,
-0.13535061478614807,
-0.07489337027072906,
0.005916110705584288,
-0.02272218093276024,
0.09544342011213303,
0.13056041300296783,
-0.11966501921415329,
-0.03306714817881584,
0.13287855684757233,
-0.02516396902501583,
-0.06969580054283142,
0.08988817781209946,
-0.03513375297188759,
-0.02890867181122303,
0.036501407623291016,
0.12688368558883667,
0.08446679264307022,
-0.0942990705370903,
-0.0320829339325428,
0.01315390132367611,
0.0641474649310112,
0.06644246727228165,
0.08402301371097565,
-0.048885758966207504,
0.057209715247154236,
0.008117546327412128,
-0.06351882964372635,
-0.003936044406145811,
-0.056643422693014145,
-0.08873553574085236,
-0.04996296018362045,
-0.054605185985565186,
0.023031659424304962,
0.015338337048888206,
0.02765612117946148,
-0.06711321324110031,
-0.15922096371650696,
0.002758170710876584,
0.12833163142204285,
-0.069671131670475,
0.005080269183963537,
-0.09579623490571976,
0.057124555110931396,
-0.021172333508729935,
-0.009977100417017937,
-0.18420690298080444,
-0.08177777379751205,
0.052046965807676315,
-0.08970813453197479,
-0.011697476729750633,
-0.006876087747514248,
0.056400950998067856,
0.07199302315711975,
-0.011096327565610409,
-0.05123315006494522,
-0.08426830917596817,
-0.039028484374284744,
-0.09082498401403427,
-0.13813813030719757,
-0.06561946868896484,
-0.016783807426691055,
0.20280300080776215,
-0.20928052067756653,
-0.00738955195993185,
0.0031611521262675524,
0.14223416149616241,
0.011810170486569405,
-0.086420439183712,
0.0018041676376014948,
0.008839531801640987,
-0.019715337082743645,
-0.11766985058784485,
0.02969098836183548,
0.013996281661093235,
-0.10929231345653534,
-0.02248845435678959,
-0.1637803465127945,
-0.027920177206397057,
0.0688060000538826,
0.08966731280088425,
-0.10312461853027344,
-0.010504146106541157,
-0.056394413113594055,
-0.03808999061584473,
-0.07159043103456497,
-0.012596595101058483,
0.21891754865646362,
0.04443126916885376,
0.13787178695201874,
-0.05325331538915634,
-0.07639732956886292,
0.005340408068150282,
0.020214734598994255,
-0.016301680356264114,
0.11910029500722885,
0.0315665528178215,
-0.08729293942451477,
0.0586903840303421,
0.06432119756937027,
-0.010043743997812271,
0.09853474050760269,
-0.049391135573387146,
-0.11195649206638336,
-0.03879338502883911,
0.03383554518222809,
0.023455563932657242,
0.06367885321378708,
-0.07596605271100998,
0.0053622727282345295,
0.054708730429410934,
0.027407802641391754,
0.005025087855756283,
-0.1528961956501007,
0.008239984512329102,
0.05194971337914467,
-0.036084942519664764,
0.00965365581214428,
-0.01201680675148964,
0.013786734081804752,
0.06281716376543045,
0.037864506244659424,
-0.03357468545436859,
0.016369009390473366,
-0.03260262683033943,
-0.07666444033384323,
0.17267273366451263,
-0.09187090396881104,
-0.16249877214431763,
-0.14074546098709106,
0.05640975758433342,
-0.06792295724153519,
-0.00458966800943017,
0.003405754454433918,
-0.06603170186281204,
-0.0803660899400711,
-0.10361989587545395,
-0.02263035997748375,
-0.04466821998357773,
-0.010673492215573788,
0.06455470621585846,
-0.004595641046762466,
0.1037164032459259,
-0.1344902217388153,
-0.0005117678083479404,
0.012480457313358784,
-0.062160465866327286,
0.0008040216052904725,
0.06108607351779938,
0.06294500827789307,
0.12107450515031815,
0.018169816583395004,
0.029224256053566933,
-0.01371259056031704,
0.26258188486099243,
-0.09531214088201523,
-0.03260868042707443,
0.10893379151821136,
0.02533629909157753,
0.077738456428051,
0.08781961351633072,
0.013560234569013119,
-0.09439686685800552,
0.028470458462834358,
0.05718141049146652,
-0.02087084762752056,
-0.2291879504919052,
-0.024535248056054115,
-0.03227904438972473,
-0.049141693860292435,
0.14674004912376404,
0.04684371501207352,
-0.0074178799986839294,
0.05442295968532562,
-0.03236522525548935,
0.055406056344509125,
-0.0034995642490684986,
0.0809587761759758,
0.0690402239561081,
0.07663300633430481,
0.09326941519975662,
-0.009859652258455753,
-0.011905754916369915,
0.0515005923807621,
0.0019249747274443507,
0.21943186223506927,
-0.053079161792993546,
0.1849265694618225,
-0.00023320101900026202,
0.14894253015518188,
-0.032210420817136765,
0.02794405072927475,
0.017017485573887825,
0.01084110513329506,
0.010150935500860214,
-0.07203279435634613,
-0.05995478108525276,
0.041757479310035706,
0.027905438095331192,
0.0379318930208683,
-0.0951349064707756,
0.07594217360019684,
0.03231403976678848,
0.27441293001174927,
0.0938442051410675,
-0.3195393979549408,
-0.06088713929057121,
0.014226148836314678,
-0.033394601196050644,
-0.08962932974100113,
-0.0040435222908854485,
0.08466800302267075,
-0.14755845069885254,
0.049603551626205444,
-0.06503397971391678,
0.0864972397685051,
-0.08412717282772064,
-0.009778464213013649,
0.06292594224214554,
0.11190205812454224,
0.015934176743030548,
0.07776667177677155,
-0.15278410911560059,
0.1754423826932907,
0.007515854202210903,
0.09982997924089432,
-0.06663528829813004,
0.07610848546028137,
0.004740857053548098,
-0.010054949671030045,
0.11462904512882233,
0.008181859739124775,
-0.045619502663612366,
-0.1694602370262146,
-0.10945345461368561,
0.0011472043115645647,
0.10605023801326752,
-0.09530142694711685,
0.09648308157920837,
-0.06206826493144035,
-0.007171430625021458,
0.009801456704735756,
-0.018446210771799088,
-0.09305553138256073,
-0.14703001081943512,
0.04749298095703125,
-0.006582127884030342,
-0.0016847102669999003,
-0.06749124825000763,
-0.07723820209503174,
-0.05175408348441124,
0.20531505346298218,
-0.015646643936634064,
-0.058109309524297714,
-0.14018963277339935,
0.04450735077261925,
0.14807948470115662,
-0.06174604222178459,
0.03550593927502632,
0.00336434505879879,
0.13169263303279877,
0.038977671414613724,
-0.046315472573041916,
0.062025103718042374,
-0.04436786472797394,
-0.18898724019527435,
-0.05556127801537514,
0.1487196534872055,
0.023109078407287598,
0.04238056391477585,
-0.0006430558278225362,
0.04777097702026367,
-0.005291645415127277,
-0.08136559277772903,
0.0499848909676075,
0.04853365197777748,
0.0745735913515091,
0.04802054911851883,
0.014764741063117981,
0.07862334698438644,
-0.05178569629788399,
-0.012381703592836857,
0.10475198924541473,
0.27282702922821045,
-0.07234902679920197,
0.06641661375761032,
0.030772540718317032,
-0.055076584219932556,
-0.1388384997844696,
-0.0024135494604706764,
0.11897511035203934,
0.04852292686700821,
0.03750806301832199,
-0.16933639347553253,
0.07503845542669296,
0.11415820568799973,
-0.022481057792901993,
0.0251801535487175,
-0.3553479015827179,
-0.1375100314617157,
0.06226746737957001,
0.07138078659772873,
-0.014609229750931263,
-0.12989971041679382,
-0.06439362466335297,
-0.042766861617565155,
-0.11962243914604187,
0.05193301662802696,
-0.012649578973650932,
0.12102139741182327,
-0.00841994397342205,
0.046631570905447006,
0.050671640783548355,
-0.04086117073893547,
0.17518140375614166,
0.032781340181827545,
0.0507504902780056,
-0.050385765731334686,
0.04840577393770218,
0.06345320492982864,
-0.09449494630098343,
0.057786375284194946,
-0.06515949219465256,
0.08058716356754303,
-0.19235661625862122,
-0.01986914686858654,
-0.03580651804804802,
0.04794412851333618,
-0.04362507537007332,
-0.055549848824739456,
-0.03814645856618881,
0.03073941357433796,
0.08437176048755646,
-0.021690085530281067,
0.08269592374563217,
0.03436870127916336,
0.038479678332805634,
0.12961122393608093,
0.04679257050156593,
0.03286784887313843,
-0.15975335240364075,
-0.003103708615526557,
0.008800259791314602,
0.0475245900452137,
-0.12038308382034302,
0.033330269157886505,
0.11746320873498917,
0.03307526931166649,
0.1430937945842743,
0.021743973717093468,
-0.07297749817371368,
0.01633463054895401,
0.034175124019384384,
-0.10780429095029831,
-0.13230913877487183,
-0.016445506364107132,
-0.038883648812770844,
-0.1224805936217308,
-0.023983968421816826,
0.127559632062912,
-0.02209928259253502,
-0.014789694920182228,
-0.007099126931279898,
0.02513684146106243,
0.001831045956350863,
0.18085074424743652,
0.0024062765296548605,
0.0859767496585846,
-0.0799730122089386,
0.14453944563865662,
0.08959808200597763,
-0.06798718124628067,
0.0694606602191925,
0.06467106938362122,
-0.0827433243393898,
-0.0011032902402803302,
0.01667386293411255,
0.08407071977853775,
-0.011896553449332714,
-0.012988636270165443,
-0.06684647500514984,
-0.06921055912971497,
0.0739889070391655,
-0.03296758979558945,
0.03123217076063156,
0.022550683468580246,
-0.04606501758098602,
-0.004779246635735035,
-0.11905772984027863,
0.08628770709037781,
0.08589094877243042,
0.04472717270255089,
-0.13774831593036652,
0.12046332657337189,
-0.001350438455119729,
0.009475162252783775,
0.005403182469308376,
-0.012415199540555477,
-0.08516191691160202,
-0.010163119062781334,
-0.10340894013643265,
0.01562814973294735,
-0.036969225853681564,
-0.005641310941427946,
-0.01409099344164133,
-0.015545214526355267,
-0.02430248074233532,
0.04640087112784386,
-0.07159103453159332,
-0.10791655629873276,
-0.0037706352304667234,
0.05841665714979172,
-0.12484962493181229,
0.0007662986754439771,
0.023503568023443222,
-0.13558582961559296,
0.09634058177471161,
0.05670556053519249,
0.027542607858777046,
-0.006150770001113415,
-0.0476963147521019,
0.006932680960744619,
0.011035420000553131,
0.018156543374061584,
0.04616536200046539,
-0.12091395258903503,
0.0110650435090065,
-0.04605795443058014,
-0.007214352022856474,
0.00756817264482379,
0.011031410656869411,
-0.13998311758041382,
-0.04692757874727249,
-0.03624746575951576,
-0.03984927758574486,
-0.07471786439418793,
0.06141746789216995,
0.09396035969257355,
0.024114714935421944,
0.16308480501174927,
-0.05034175515174866,
0.0384715236723423,
-0.20217564702033997,
-0.036906175315380096,
0.007138614542782307,
-0.017521610483527184,
-0.05027356743812561,
-0.02833913080394268,
0.09802038222551346,
-0.060629431158304214,
0.09213598817586899,
-0.016709480434656143,
0.09122437238693237,
0.025505512952804565,
-0.018187617883086205,
0.01749863475561142,
0.02543235570192337,
0.15918004512786865,
0.0862663984298706,
-0.014912134036421776,
0.0797656774520874,
-0.03026607632637024,
0.06429258733987808,
0.09893470257520676,
0.11257097125053406,
0.13245771825313568,
0.037009935826063156,
0.08649670332670212,
0.024149741977453232,
-0.1296478509902954,
-0.17868660390377045,
0.10690252482891083,
-0.06961183249950409,
0.14098358154296875,
-0.05127263441681862,
0.14361970126628876,
0.07386162132024765,
-0.18683649599552155,
0.053558651357889175,
-0.06687649339437485,
-0.10586479306221008,
-0.10495702922344208,
-0.1160309910774231,
-0.07930190116167068,
-0.10815940797328949,
0.04219219833612442,
-0.09248332679271698,
0.07074319571256638,
0.09317132830619812,
0.013663090765476227,
-0.00477010291069746,
0.09963153302669525,
-0.049191687256097794,
-0.014721459709107876,
0.07158038765192032,
0.02162223495543003,
0.006486652884632349,
-0.010885131545364857,
-0.06866517663002014,
0.04238477721810341,
0.000683911144733429,
0.11578218638896942,
-0.036677323281764984,
0.02904031053185463,
0.04535394161939621,
-0.0034151114523410797,
-0.07754891365766525,
0.01901966705918312,
0.0039057086687535048,
0.00797119177877903,
0.03711046278476715,
0.06016576662659645,
0.018360819667577744,
-0.050783537328243256,
0.2689913511276245,
-0.07780370861291885,
-0.035645242780447006,
-0.15105058252811432,
0.150018572807312,
0.051253918558359146,
0.00723132211714983,
0.07198925316333771,
-0.12920111417770386,
-0.020017892122268677,
0.11494274437427521,
0.1037096232175827,
-0.05993145704269409,
-0.036004502326250076,
0.0006905248737893999,
-0.020533487200737,
-0.05378669127821922,
0.10329904407262802,
0.06901231408119202,
0.020243829116225243,
-0.046133920550346375,
0.031183212995529175,
-0.005160457920283079,
-0.05179864168167114,
-0.058335792273283005,
0.11409564316272736,
-0.00916283018887043,
0.014931351877748966,
-0.0330851711332798,
0.035395387560129166,
-0.002976423827931285,
-0.17416003346443176,
0.051904793828725815,
-0.14881518483161926,
-0.18511325120925903,
-0.012816498056054115,
0.01567343808710575,
-0.007264392916113138,
0.08610435575246811,
0.01589338853955269,
-0.012345082126557827,
0.1268274039030075,
-0.006934255827218294,
-0.04855489730834961,
-0.09219244867563248,
0.04533068463206291,
-0.07094145566225052,
0.23769785463809967,
0.011671879328787327,
0.058938682079315186,
0.11680519580841064,
0.03069341368973255,
-0.14791056513786316,
0.030410422012209892,
0.10270582139492035,
-0.05308840051293373,
0.06089377775788307,
0.16219785809516907,
-0.03254113346338272,
0.10218710452318192,
0.06584414839744568,
-0.07562614232301712,
-0.023000316694378853,
-0.03425635024905205,
0.03316501900553703,
-0.09163404256105423,
-0.04117589443922043,
-0.04739149287343025,
0.15417644381523132,
0.22044076025485992,
-0.03848610818386078,
-0.011922290548682213,
-0.05605834722518921,
0.02551489882171154,
0.014541223645210266,
0.0830073207616806,
-0.011294706724584103,
-0.19723457098007202,
0.04215991869568825,
0.04579141363501549,
0.0501134917140007,
-0.21251939237117767,
-0.11351419240236282,
0.04044982045888901,
-0.06572849303483963,
-0.044201746582984924,
0.14032641053199768,
0.03235068917274475,
0.015749316662549973,
-0.04516805708408356,
-0.07520067691802979,
-0.04301631450653076,
0.13526886701583862,
-0.12379159033298492,
-0.06405607610940933
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | kenchenxingyu/flan-large-single-label-stance-human6 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:11:01+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers | # Model Card for Eclipse-13B-dpo
Mistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techniques shown in the paper [Self-Rewarding Language Models](https://arxiv.org/abs/2401.10020).
## Instruction format
In order to leverage instruction fine-tuning, your prompt should be surrounded by `[INST]` and `[/INST]` tokens. The very first instruction should begin with a begin-of-sentence id; subsequent instructions should not. The assistant generation will be ended by the end-of-sentence token id.
E.g.
```
text = "<s>[INST] What is your favourite condiment? [/INST]"
"Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!</s> "
"[INST] Do you have mayonnaise recipes? [/INST]"
```
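For illustration, the same prompt can be assembled by hand from a list of chat messages, as in the minimal sketch below (the `build_prompt` name and the exact spacing are illustrative assumptions; the chat template shown next is the authoritative way to build model inputs):
```python
# Illustrative sketch only: it mirrors the format above; exact spacing and
# special-token handling are best left to tokenizer.apply_chat_template().
def build_prompt(messages):
    # messages: a list of {"role": "user" | "assistant", "content": str} dicts
    prompt = "<s>"  # only the very first instruction gets the begin-of-sentence id
    for message in messages:
        if message["role"] == "user":
            prompt += f"[INST] {message['content']} [/INST]"
        else:
            # assistant turns are closed with the end-of-sentence id
            prompt += f"{message['content']}</s>"
    return prompt
```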
This format is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating) via the `apply_chat_template()` method:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
device = "cuda" # the device to load the model onto
model = AutoModelForCausalLM.from_pretrained("Xenon1/Eclipse-13B-dpo")
tokenizer = AutoTokenizer.from_pretrained("Xenon1/Eclipse-13B-dpo")
messages = [
{"role": "user", "content": "What is your favourite condiment?"},
{"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!"},
{"role": "user", "content": "Do you have mayonnaise recipes?"}
]
# Build and tokenize the [INST]-formatted prompt described above
encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")

# Move the inputs and the model onto the same device before generating
model_inputs = encodeds.to(device)
model.to(device)

# Sample a reply; generation stops at the end-of-sentence token or after 1000 new tokens
generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
```
## Model Architecture
This instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer | {"language": ["en"], "license": "apache-2.0", "tags": ["mistral", "Eclipse-13B-dpo"], "pipeline_tag": "text-generation"} | text-generation | Xenon1/Eclipse-13B-dpo | [
"transformers",
"safetensors",
"mixtral",
"text-generation",
"mistral",
"Eclipse-13B-dpo",
"en",
"arxiv:2401.10020",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:13:22+00:00 | [
"2401.10020"
] | [
"en"
] | TAGS
#transformers #safetensors #mixtral #text-generation #mistral #Eclipse-13B-dpo #en #arxiv-2401.10020 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| # Model Card for Eclipse-13B-dpo
Mistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techniques shown in the paper Self-Rewarding Language Models.
## Instruction format
In order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin-of-sentence id; subsequent instructions should not. The assistant generation will be ended by the end-of-sentence token id.
E.g.
This format is available as a chat template via the 'apply_chat_template()' method:
## Model Architecture
This instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer | [
"# Model Card for Eclipse-13B-dpo\n\nMistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techinques shown in the paper Self-Rewarding Language Models.",
"## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:",
"## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer"
] | [
"TAGS\n#transformers #safetensors #mixtral #text-generation #mistral #Eclipse-13B-dpo #en #arxiv-2401.10020 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Eclipse-13B-dpo\n\nMistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techinques shown in the paper Self-Rewarding Language Models.",
"## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:",
"## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer"
] | [
78,
46,
105,
56
] | [
"passage: TAGS\n#transformers #safetensors #mixtral #text-generation #mistral #Eclipse-13B-dpo #en #arxiv-2401.10020 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Eclipse-13B-dpo\n\nMistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techinques shown in the paper Self-Rewarding Language Models.## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer"
] | [
-0.049504298716783524,
0.009758960455656052,
-0.006403662264347076,
0.023708999156951904,
0.0774274617433548,
-0.03904717415571213,
0.12328093498945236,
0.028514118865132332,
0.03779086098074913,
0.029749028384685516,
0.006698272190988064,
0.1322273313999176,
0.09164708107709885,
0.14450311660766602,
-0.024339668452739716,
-0.2441525012254715,
0.08236086368560791,
-0.0365990474820137,
0.10275685787200928,
0.0680546909570694,
0.075037382543087,
-0.04603796824812889,
0.0649280771613121,
0.04091426730155945,
-0.07138459384441376,
-0.004746535327285528,
0.0021199441980570555,
-0.024397755041718483,
0.0977708101272583,
0.07261383533477783,
0.04840614274144173,
0.055262673646211624,
-0.012279009446501732,
-0.11746904253959656,
0.02178177982568741,
0.0850755125284195,
0.01796024851500988,
0.05744301527738571,
0.09005696326494217,
0.08770722150802612,
0.06273561716079712,
-0.031095163896679878,
0.04323282837867737,
0.0607551708817482,
-0.10511036217212677,
-0.07619845122098923,
-0.10070805996656418,
0.04536562040448189,
0.19847972691059113,
0.04901724308729172,
0.00043460799497552216,
0.10125846415758133,
0.07206171751022339,
0.09081334620714188,
0.11061476171016693,
-0.22266164422035217,
-0.041152168065309525,
0.07313908636569977,
0.04330766573548317,
0.07138354331254959,
-0.04540339112281799,
-0.008537346497178078,
-0.00003947209916077554,
0.03661688044667244,
0.021013321354985237,
-0.051166050136089325,
-0.05295776575803757,
-0.08479208499193192,
-0.10532306879758835,
-0.03863932192325592,
0.19841167330741882,
0.0012203473597764969,
-0.05680998042225838,
-0.12832562625408173,
-0.10480888932943344,
0.10539282858371735,
-0.01811066083610058,
-0.08570074290037155,
0.0536436028778553,
0.0713464692234993,
0.14836539328098297,
-0.053622327744960785,
-0.08086443692445755,
0.0054199183359742165,
-0.05961163714528084,
-0.018183469772338867,
-0.03552553430199623,
0.046002987772226334,
-0.10306616872549057,
0.11905562877655029,
-0.02849343605339527,
-0.12352733314037323,
-0.08158943057060242,
-0.06308408081531525,
-0.03639760985970497,
-0.06847431510686874,
0.008767172694206238,
-0.08834909647703171,
0.04668332263827324,
0.1775519698858261,
-0.05553337186574936,
0.07524986565113068,
-0.04408584535121918,
0.0464756153523922,
0.005467998329550028,
0.10925346612930298,
0.04410237446427345,
-0.06702186167240143,
0.07830029726028442,
0.02687249705195427,
0.13067199289798737,
-0.00658115791156888,
-0.0632207989692688,
-0.08662889897823334,
-0.03341313824057579,
0.025426307693123817,
0.06606972962617874,
0.06320874392986298,
-0.03793428838253021,
-0.03974916785955429,
0.2729331851005554,
-0.09396523237228394,
0.01784314401447773,
-0.013031663373112679,
0.008219636976718903,
0.02764413133263588,
0.08418112993240356,
0.0013153020991012454,
-0.044271320104599,
-0.06167769059538841,
-0.047248225659132004,
-0.040502678602933884,
-0.0885191559791565,
-0.036365993320941925,
0.01592312939465046,
0.05048635974526405,
-0.03327302262187004,
-0.10273005068302155,
-0.25609880685806274,
-0.028184907510876656,
0.06795671582221985,
-0.026192691177129745,
-0.04325785115361214,
-0.04576157033443451,
-0.027243688702583313,
0.017637627199292183,
-0.024297639727592468,
0.017412815243005753,
-0.020295074209570885,
0.0199618898332119,
-0.005310053937137127,
0.05985475704073906,
-0.11875589191913605,
0.030642621219158173,
-0.10240329056978226,
0.057805418968200684,
-0.3117842376232147,
0.11302918195724487,
0.019003694877028465,
0.05687880888581276,
-0.08185548335313797,
0.027928480878472328,
0.041571810841560364,
-0.012302299030125141,
0.058051787316799164,
0.15232089161872864,
-0.18477493524551392,
-0.0018298812210559845,
0.1494016796350479,
-0.1711387187242508,
-0.04916699603199959,
0.10104444622993469,
0.0011373625602573156,
0.05082504451274872,
0.04979388415813446,
0.11083589494228363,
0.07851234078407288,
-0.041679948568344116,
-0.004313242621719837,
0.05022745579481125,
-0.11098062992095947,
0.021682342514395714,
-0.036662764847278595,
-0.02026904560625553,
-0.021145178005099297,
0.06970246881246567,
-0.0035822244826704264,
0.03232979029417038,
0.03359708935022354,
0.009814681485295296,
-0.036086682230234146,
0.03597414121031761,
-0.010654907673597336,
-0.040534477680921555,
-0.06370832771062851,
-0.0319516621530056,
-0.031190570443868637,
0.060769177973270416,
0.0863933339715004,
-0.020035045221447945,
-0.011418689042329788,
-0.0444270484149456,
0.026879558339715004,
-0.07338083535432816,
0.01997009664773941,
-0.09798314422369003,
-0.061140093952417374,
0.04015267267823219,
0.012076989747583866,
-0.059361573308706284,
0.036169152706861496,
0.043159883469343185,
0.045213665813207626,
0.037101250141859055,
-0.056134894490242004,
0.06205765902996063,
-0.04963434860110283,
-0.05162540450692177,
-0.08194076269865036,
-0.022665152326226234,
-0.04968809708952904,
0.09820650517940521,
-0.18740886449813843,
0.0675407126545906,
0.015066422522068024,
0.06841538846492767,
0.03253719583153725,
-0.043817222118377686,
0.018023906275629997,
-0.036362696439027786,
-0.023592865094542503,
-0.05623184144496918,
0.05954147130250931,
0.11476292461156845,
0.05053940415382385,
0.10268445312976837,
-0.19038155674934387,
-0.17372450232505798,
0.05241992697119713,
-0.004652624484151602,
-0.026218470185995102,
-0.12509304285049438,
-0.04534509778022766,
-0.04599161073565483,
-0.021964292973279953,
-0.06802483648061752,
0.19855742156505585,
0.028253456577658653,
0.11959504336118698,
-0.05460710451006889,
-0.07268967479467392,
-0.0011743399081751704,
-0.032370708882808685,
-0.04711470380425453,
0.04223855584859848,
-0.04391633719205856,
-0.08642129600048065,
0.06381626427173615,
0.029324183240532875,
0.029445726424455643,
0.0764470100402832,
-0.013071304187178612,
-0.012488540261983871,
-0.0014300853945314884,
0.06180363520979881,
-0.012451517395675182,
0.039691511541604996,
-0.11071261018514633,
0.03296772390604019,
0.05348676070570946,
0.02021678350865841,
0.016050782054662704,
-0.09858012199401855,
0.054335132241249084,
0.0020480521488934755,
-0.013143260963261127,
0.006164942402392626,
0.048581112176179886,
0.007764643523842096,
0.0394635908305645,
-0.011541283689439297,
0.004451699089258909,
0.034768931567668915,
-0.04223734140396118,
-0.1313118189573288,
0.11188344657421112,
-0.0935434103012085,
-0.2351178526878357,
-0.18454046547412872,
-0.07734999060630798,
-0.08259434252977371,
0.025944150984287262,
0.07140358537435532,
0.009474719874560833,
-0.07515518367290497,
-0.09584636986255646,
0.0066683036275208,
0.04670652374625206,
-0.06524519622325897,
0.008829106576740742,
-0.0425102524459362,
0.03346668556332588,
-0.1362149864435196,
-0.015358420088887215,
0.010534512810409069,
-0.04973685368895531,
0.07097217440605164,
-0.016994452103972435,
0.009026024490594864,
0.12994763255119324,
-0.026191452518105507,
-0.0066566201858222485,
0.012391761876642704,
0.2179836630821228,
-0.05484241992235184,
0.12529878318309784,
0.22909991443157196,
-0.011992129497230053,
0.09069828689098358,
0.20414908230304718,
-0.038414400070905685,
-0.08045000582933426,
0.039391692727804184,
-0.061656393110752106,
-0.01325912307947874,
-0.1759905219078064,
-0.06384220719337463,
-0.08311455696821213,
-0.023833686485886574,
0.022321708500385284,
0.04450026527047157,
0.06116887554526329,
0.053822316229343414,
-0.08307521045207977,
0.08641694486141205,
0.09859351813793182,
0.12176424264907837,
0.1343122124671936,
0.006558688823133707,
0.05169953405857086,
-0.04764142259955406,
0.027143236249685287,
0.058867741376161575,
0.062177401036024094,
0.1296902745962143,
-0.03355669975280762,
0.0963607132434845,
0.05535842105746269,
-0.030081365257501602,
0.048517562448978424,
0.05802103504538536,
-0.04660160839557648,
-0.013068350031971931,
-0.01724810153245926,
-0.0920400395989418,
-0.035185523331165314,
0.041083794087171555,
-0.1838190257549286,
0.07966772466897964,
-0.03269457817077637,
0.05379248037934303,
0.038960523903369904,
0.27627450227737427,
0.04193643480539322,
-0.2240854948759079,
-0.08811087906360626,
0.08436712622642517,
-0.013926316052675247,
-0.14678005874156952,
0.0022893627174198627,
0.14365032315254211,
-0.06515061110258102,
0.057168442755937576,
-0.016548728570342064,
0.08621153980493546,
-0.11023663729429245,
0.01802068203687668,
-0.04332106187939644,
0.14875058829784393,
-0.018846098333597183,
0.03564874082803726,
-0.15137387812137604,
0.04378119483590126,
0.016724059358239174,
0.08677356690168381,
-0.03554878383874893,
0.10434888303279877,
0.037958938628435135,
0.14983855187892914,
0.08022970706224442,
0.0168781615793705,
-0.031266871839761734,
-0.009004287421703339,
-0.06329335272312164,
0.006140951998531818,
-0.03479385003447533,
-0.0026251364033669233,
0.00740571366623044,
-0.05193367600440979,
-0.02543501742184162,
-0.0026675681583583355,
0.0020984867587685585,
-0.12290262430906296,
-0.14811094105243683,
0.02421914041042328,
0.0825595036149025,
0.07448072731494904,
-0.05365966260433197,
0.024093007668852806,
-0.03344930335879326,
0.1465730220079422,
-0.058201584964990616,
-0.09473448991775513,
-0.09636776894330978,
-0.05869600549340248,
0.004724693018943071,
-0.019443096593022346,
0.017956679686903954,
-0.012856034561991692,
0.1625014990568161,
-0.02650783583521843,
-0.09720686823129654,
0.05073059722781181,
-0.1275038868188858,
-0.02470490150153637,
-0.052050717175006866,
0.04239877313375473,
0.043361153453588486,
-0.03885950893163681,
0.03932114690542221,
-0.002046820241957903,
-0.019169872626662254,
-0.06603274494409561,
0.010629847645759583,
0.22094425559043884,
0.07926589995622635,
-0.02097315713763237,
-0.044396769255399704,
-0.2354736328125,
-0.008912255987524986,
-0.016048872843384743,
0.03205075114965439,
0.220875084400177,
-0.02323518693447113,
0.1023101806640625,
0.13779450953006744,
-0.06757806241512299,
-0.1257343590259552,
0.050271403044462204,
0.02808200567960739,
0.0034641334787011147,
-0.03391316533088684,
-0.10584904253482819,
0.0913470908999443,
0.057249803096055984,
-0.0300131868571043,
0.13107934594154358,
-0.24828191101551056,
-0.0670197382569313,
0.0740668848156929,
0.06508895754814148,
0.13371193408966064,
-0.05259300023317337,
-0.07384060323238373,
-0.043507736176252365,
-0.15245187282562256,
0.0016115129692479968,
-0.013318300247192383,
0.016404161229729652,
-0.032036006450653076,
0.05164692550897598,
0.04081278666853905,
-0.029662011191248894,
0.08720067888498306,
-0.03845880925655365,
0.0791698694229126,
-0.08974691480398178,
0.026778969913721085,
0.03468870371580124,
-0.08743185549974442,
0.12100128084421158,
0.008696619421243668,
0.03987155482172966,
-0.046159837394952774,
-0.004502440802752972,
-0.06489250063896179,
0.0823037177324295,
-0.03212390094995499,
-0.06424158066511154,
0.04183824360370636,
-0.020781347528100014,
0.06877133995294571,
0.036259036511182785,
-0.050060804933309555,
-0.0752929300069809,
-0.006263457704335451,
0.10652727633714676,
0.09586188942193985,
-0.1071852296590805,
-0.020940354093909264,
-0.03089328669011593,
0.0012122251791879535,
0.05977753922343254,
-0.04663969203829765,
0.022458260878920555,
0.040561188012361526,
-0.004085134714841843,
0.12432357668876648,
0.044570472091436386,
-0.047597941011190414,
-0.01712985895574093,
0.032579027116298676,
-0.1376611888408661,
-0.08531296253204346,
-0.04887047037482262,
0.23916080594062805,
-0.05162333324551582,
0.05995293706655502,
0.1146959662437439,
-0.014299341477453709,
-0.0271085686981678,
0.04585311934351921,
0.04564131423830986,
0.0009636003524065018,
-0.031613826751708984,
-0.0838666707277298,
0.03142358735203743,
-0.0736650750041008,
0.05115025117993355,
0.05771251022815704,
-0.10694706439971924,
0.0038705335464328527,
0.1379585713148117,
-0.13378731906414032,
-0.06076003238558769,
-0.030539914965629578,
0.08905289322137833,
0.04391294717788696,
-0.04592818766832352,
0.008669235743582249,
-0.09288997948169708,
0.026869457215070724,
0.07308457791805267,
-0.007320262957364321,
-0.026893027126789093,
-0.009677167981863022,
0.027125414460897446,
-0.07145749032497406,
0.08665028214454651,
-0.05764630064368248,
0.0814359039068222,
-0.06904678791761398,
-0.012935731559991837,
-0.01663922518491745,
-0.01872284710407257,
-0.021859154105186462,
-0.07270504534244537,
-0.06509548425674438,
-0.025861920788884163,
-0.10962123423814774,
0.0716000348329544,
-0.0784890428185463,
0.005355407949537039,
-0.006930189207196236,
0.02678956277668476,
0.0073379832319915295,
0.03932943567633629,
-0.026892684400081635,
-0.010611772537231445,
-0.03610001876950264,
0.08440279215574265,
-0.07561738044023514,
-0.023707110434770584,
0.010426608845591545,
-0.10341576486825943,
0.11930757761001587,
0.0075213005766272545,
-0.054898250848054886,
-0.07251846045255661,
-0.1704823225736618,
-0.0261366106569767,
0.03432825580239296,
0.04496483877301216,
0.04412772133946419,
-0.038540471345186234,
-0.03186760097742081,
0.016291357576847076,
-0.07617504894733429,
-0.07169218361377716,
0.061384424567222595,
-0.07872815430164337,
0.03920532390475273,
0.03770922124385834,
-0.07258766889572144,
-0.08951213210821152,
0.0029476345516741276,
0.05787467211484909,
0.024045079946517944,
0.09106779843568802,
-0.08651696145534515,
0.002204492222517729,
-0.14439113438129425,
-0.03167160600423813,
0.07802081853151321,
-0.03329777717590332,
0.029601624235510826,
-0.03912429139018059,
0.025277849286794662,
-0.002342014806345105,
0.005463370122015476,
0.031849708408117294,
0.015343084000051022,
0.0136851342394948,
-0.024522554129362106,
-0.05282508209347725,
0.010361873544752598,
0.12109792977571487,
0.0030757992062717676,
0.018828272819519043,
0.04113934934139252,
0.055635012686252594,
0.06957526504993439,
0.04193240404129028,
0.08582606911659241,
0.05619899928569794,
-0.008036570623517036,
0.08265282213687897,
-0.0012737064389511943,
-0.020937908440828323,
-0.15878939628601074,
-0.010333680547773838,
-0.04637192189693451,
0.08631639927625656,
-0.04709639027714729,
0.14059512317180634,
0.17627133429050446,
-0.10466969758272171,
0.03687252849340439,
-0.008327576331794262,
-0.019972609356045723,
-0.06673012673854828,
-0.2598629891872406,
0.0012610492995008826,
-0.11413323879241943,
-0.04429592937231064,
-0.09852173179388046,
0.043650511652231216,
0.029962241649627686,
-0.018414735794067383,
0.005666309967637062,
0.04899636656045914,
0.007185029797255993,
-0.04467066004872322,
0.013964299112558365,
-0.03553440049290657,
0.041143521666526794,
-0.07189302891492844,
-0.029796091839671135,
0.08170992136001587,
0.023216912522912025,
0.042174894362688065,
0.06258371472358704,
0.12873953580856323,
0.014820157550275326,
-0.0048609222285449505,
-0.07167404145002365,
0.016012219712138176,
0.038451626896858215,
-0.06371743977069855,
0.03838516026735306,
0.10325487703084946,
-0.0057030231691896915,
0.015390682965517044,
0.15838585793972015,
-0.045202452689409256,
-0.11067670583724976,
-0.09388367086648941,
0.1446257084608078,
-0.047132980078458786,
0.01935148611664772,
-0.0019349339418113232,
-0.12561029195785522,
-0.023705007508397102,
0.14388974010944366,
0.041843898594379425,
0.02359911799430847,
-0.018901152536273003,
-0.0445115864276886,
-0.0066877626813948154,
-0.05013800784945488,
0.11404165625572205,
0.08462873101234436,
0.24713848531246185,
-0.026822512969374657,
0.06005535647273064,
0.013031906448304653,
-0.010365328751504421,
-0.05080604553222656,
0.03404577821493149,
-0.0643426775932312,
-0.026080049574375153,
0.02561822533607483,
0.05873692408204079,
-0.03757175803184509,
-0.0452297180891037,
-0.06223159655928612,
0.011705087497830391,
-0.033113397657871246,
-0.029786989092826843,
0.04438251256942749,
-0.0287789274007082,
0.0447784848511219,
-0.021284669637680054,
-0.008846654556691647,
0.25251975655555725,
-0.06184489279985428,
-0.054818522185087204,
-0.04347999766469002,
0.022406674921512604,
-0.1517801731824875,
0.16943158209323883,
-0.0036855482030659914,
0.05188552290201187,
0.06882989406585693,
0.0648869201540947,
-0.13368771970272064,
0.02393784560263157,
-0.039596077054739,
-0.04235782101750374,
0.037763308733701706,
0.07449015974998474,
-0.001855391776189208,
0.026686353608965874,
0.041747063398361206,
-0.12609617412090302,
0.016172753646969795,
0.05902290344238281,
-0.005969881545752287,
-0.06031062453985214,
0.04789412021636963,
-0.05126141011714935,
0.11275240778923035,
0.0526658333837986,
0.0028300646226853132,
0.005948117468506098,
-0.02895626239478588,
0.011785544455051422,
0.015131378546357155,
0.06209886074066162,
0.0003851765941362828,
-0.10980120301246643,
0.022825051099061966,
0.05032999813556671,
0.04601684957742691,
-0.20196926593780518,
-0.0645468533039093,
-0.07470249384641647,
-0.05249994993209839,
-0.10221794992685318,
0.057723112404346466,
0.09134519845247269,
0.04167749732732773,
-0.03680980205535889,
-0.018407832831144333,
0.0031590138096362352,
0.1377294361591339,
-0.053666554391384125,
-0.06385459005832672
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
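No official snippet is provided yet; as a hedged starting point, the sketch below assumes this repository (`Pplus/mistral-health-faq_log_50`) hosts a PEFT adapter for `mistralai/Mistral-7B-v0.1`, as listed in this card's metadata, and uses a placeholder prompt:
```python
# Hedged sketch, not an official example: it assumes this repository hosts a
# PEFT adapter for mistralai/Mistral-7B-v0.1, as listed in the card metadata.
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

adapter_id = "Pplus/mistral-health-faq_log_50"  # repository id of this card
device = "cuda" if torch.cuda.is_available() else "cpu"

# Reads the adapter config, loads the base model, and applies the PEFT weights
model = AutoPeftModelForCausalLM.from_pretrained(adapter_id).to(device)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

prompt = "What vaccinations do adults need?"  # illustrative placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```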
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"language": ["my"], "library_name": "transformers", "tags": ["text-generation-inference"], "base_model": "mistralai/Mistral-7B-v0.1", "pipeline_tag": "text-generation"} | text-generation | Pplus/mistral-health-faq_log_50 | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"text-generation-inference",
"my",
"arxiv:1910.09700",
"base_model:mistralai/Mistral-7B-v0.1",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:13:39+00:00 | [
"1910.09700"
] | [
"my"
] | TAGS
#transformers #safetensors #mistral #text-generation #text-generation-inference #my #arxiv-1910.09700 #base_model-mistralai/Mistral-7B-v0.1 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #text-generation-inference #my #arxiv-1910.09700 #base_model-mistralai/Mistral-7B-v0.1 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
74,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #text-generation-inference #my #arxiv-1910.09700 #base_model-mistralai/Mistral-7B-v0.1 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.07007022202014923,
0.17216657102108002,
-0.0044931978918612,
0.009951205924153328,
0.1131531223654747,
0.01096577662974596,
0.08301664888858795,
0.11091405898332596,
-0.017744656652212143,
0.13544400036334991,
0.027204709127545357,
0.0907534658908844,
0.11184169352054596,
0.18117913603782654,
-0.004374158103018999,
-0.22582994401454926,
0.06418736279010773,
-0.11617618054151535,
0.015118241310119629,
0.12464745342731476,
0.14238113164901733,
-0.09939833730459213,
0.06758810579776764,
-0.036237556487321854,
-0.00847383588552475,
-0.0301818810403347,
-0.0681241825222969,
-0.05691312998533249,
0.06426545232534409,
0.06862884759902954,
0.06025746464729309,
0.027254221960902214,
0.07589249312877655,
-0.30748265981674194,
0.020479226484894753,
0.0755985751748085,
0.004736555274575949,
0.0660804808139801,
0.09297988563776016,
-0.06783685088157654,
0.12824349105358124,
-0.05548833683133125,
0.14444255828857422,
0.07492650300264359,
-0.09439358860254288,
-0.18775145709514618,
-0.07577677071094513,
0.08664055168628693,
0.16314123570919037,
0.06213073432445526,
-0.03767126798629761,
0.1368471086025238,
-0.07709400355815887,
0.01745004765689373,
0.0951375812292099,
-0.08371777087450027,
-0.06298305839300156,
0.03925740346312523,
0.07888459414243698,
0.09285017102956772,
-0.1269974261522293,
-0.006057520397007465,
0.03999495878815651,
0.01758549176156521,
0.09068720787763596,
0.020391998812556267,
0.12334398925304413,
0.02139975130558014,
-0.1372612565755844,
-0.0593627467751503,
0.13676847517490387,
0.033188458532094955,
-0.04385112226009369,
-0.23196908831596375,
-0.008587258867919445,
-0.03676225244998932,
-0.03305363282561302,
-0.04521414265036583,
0.032500091940164566,
-0.02931971289217472,
0.09178682416677475,
0.010864448733627796,
-0.06584383547306061,
-0.04802355170249939,
0.08901771157979965,
0.06800875067710876,
0.028233278542757034,
-0.030495978891849518,
0.01608407497406006,
0.11591225117444992,
0.08912530541419983,
-0.12737871706485748,
-0.06457502394914627,
-0.0609903559088707,
-0.0921572893857956,
-0.04878385365009308,
0.03485938161611557,
0.08922164142131805,
0.06468601524829865,
0.18931348621845245,
-0.027914652600884438,
0.057908795773983,
0.03639543429017067,
0.011834617704153061,
0.05957827717065811,
0.05115082859992981,
-0.06416898965835571,
-0.1386060267686844,
-0.05250956490635872,
0.1188293993473053,
0.003456867765635252,
-0.021914470940828323,
-0.0237666554749012,
0.050104979425668716,
0.059296365827322006,
0.12223809957504272,
0.07050306349992752,
0.015567705035209656,
-0.0729752779006958,
-0.03959690406918526,
0.18003545701503754,
-0.15482020378112793,
0.012381725013256073,
0.023959007114171982,
-0.05113372206687927,
-0.02997114509344101,
0.003178349696099758,
0.011684882454574108,
-0.029027173295617104,
0.10634123533964157,
-0.06836062669754028,
-0.033616676926612854,
-0.10634177178144455,
-0.05229957774281502,
0.032634224742650986,
0.0023613926023244858,
-0.03421846032142639,
-0.03262892737984657,
-0.10544931888580322,
-0.07711925357580185,
0.06270017474889755,
-0.07204209268093109,
-0.07996293902397156,
-0.03229460120201111,
-0.06876879930496216,
0.01486404798924923,
0.0002548435877542943,
0.12428608536720276,
-0.03278392180800438,
0.036232464015483856,
-0.04389956220984459,
0.06568844616413116,
0.1334300935268402,
0.02972482517361641,
-0.06120942905545235,
0.06939606368541718,
-0.2016572505235672,
0.10966432839632034,
-0.10155929625034332,
0.028649024665355682,
-0.17428885400295258,
-0.03757236897945404,
0.015466705895960331,
0.024316225200891495,
-0.01214563474059105,
0.14113181829452515,
-0.18038971722126007,
-0.04048161208629608,
0.17529131472110748,
-0.1301843374967575,
-0.09400205314159393,
0.05691378563642502,
-0.043602678924798965,
0.11423614621162415,
0.04334885627031326,
-0.028929773718118668,
0.08229176700115204,
-0.13379287719726562,
-0.015990247949957848,
-0.05165699124336243,
0.019206436350941658,
0.1611340343952179,
0.07194012403488159,
-0.060496870428323746,
0.029713748022913933,
0.019016290083527565,
-0.04383252188563347,
-0.05135297402739525,
-0.043560903519392014,
-0.0978030413389206,
0.010351533070206642,
-0.06678906083106995,
0.013440856710076332,
-0.016217121854424477,
-0.0798160508275032,
-0.03391555696725845,
-0.15091665089130402,
0.005175219848752022,
0.10159014910459518,
0.009244544431567192,
-0.030563080683350563,
-0.08848219364881516,
0.004710964392870665,
0.026217307895421982,
-0.014112339355051517,
-0.1643935739994049,
-0.0704331025481224,
0.030449124053120613,
-0.17085792124271393,
0.01253602560609579,
-0.04533807188272476,
0.036921288818120956,
0.03759137541055679,
-0.05006613954901695,
-0.0169720146805048,
0.00892031379044056,
0.013210953213274479,
-0.017225008457899094,
-0.2490008920431137,
-0.017066052183508873,
-0.05404505878686905,
0.17701205611228943,
-0.23433353006839752,
0.04612627997994423,
0.07548178732395172,
0.1281396448612213,
0.0012473975075408816,
-0.04735951125621796,
0.03987478092312813,
-0.04553639143705368,
-0.033707525581121445,
-0.06379081308841705,
0.00481377961114049,
-0.0359448678791523,
-0.041427213698625565,
0.030441666021943092,
-0.18181093037128448,
-0.015902213752269745,
0.09740739315748215,
0.07919257134199142,
-0.1655759960412979,
-0.08645899593830109,
-0.03326718136668205,
-0.06301309913396835,
-0.09190620481967926,
-0.054805826395750046,
0.11487458646297455,
0.048859089612960815,
0.04560605809092522,
-0.07613297551870346,
-0.05512377992272377,
0.006562414113432169,
-0.0023015830665826797,
-0.04069887101650238,
0.10103511810302734,
0.07136524468660355,
-0.10866732895374298,
0.10353238135576248,
0.05708415433764458,
0.06237545236945152,
0.1002880185842514,
-0.00007064964302117005,
-0.09633834660053253,
-0.013897167518734932,
0.04325821250677109,
0.0059199645183980465,
0.122816301882267,
-0.08842988312244415,
0.02685580402612686,
0.044696956872940063,
-0.032585062086582184,
0.01932637020945549,
-0.10239683091640472,
0.02132706344127655,
0.023719890043139458,
-0.017367253080010414,
0.02572731114923954,
-0.04240568354725838,
0.019186943769454956,
0.10660936683416367,
0.02979191765189171,
0.02250889129936695,
0.004275399725884199,
-0.044923100620508194,
-0.1269157975912094,
0.1781591922044754,
-0.09617447108030319,
-0.25258758664131165,
-0.12530945241451263,
0.014719489961862564,
0.037887345999479294,
-0.026203157380223274,
0.018268613144755363,
-0.05262831225991249,
-0.09911014884710312,
-0.0993904322385788,
0.022795064374804497,
0.05274422839283943,
-0.09232530742883682,
-0.037059418857097626,
0.05619334802031517,
0.03720129653811455,
-0.12143722921609879,
0.031004147604107857,
0.05316212773323059,
-0.06760138273239136,
0.002292343880981207,
0.054185185581445694,
0.077271968126297,
0.17408515512943268,
0.026298170909285545,
-0.012244116514921188,
0.011016701348125935,
0.2362947314977646,
-0.14931544661521912,
0.09147528558969498,
0.14036309719085693,
-0.04276866465806961,
0.08585868775844574,
0.22114619612693787,
0.03406374529004097,
-0.09148888289928436,
0.042979560792446136,
0.03869602456688881,
-0.027051430195569992,
-0.23789075016975403,
-0.07563423365354538,
-0.0062853265553712845,
-0.08982802927494049,
0.1010618656873703,
0.09721110016107559,
0.11412711441516876,
0.05037218704819679,
-0.11309226602315903,
-0.0799541175365448,
0.06477019190788269,
0.11566304415464401,
-0.013344850391149521,
0.008980720303952694,
0.09896431863307953,
-0.0280318446457386,
0.018983203917741776,
0.07995910942554474,
0.015489152632653713,
0.19934868812561035,
0.04322971776127815,
0.1356404721736908,
0.08792906254529953,
0.06491667032241821,
0.019378261640667915,
0.013966104947030544,
0.023625172674655914,
0.03074001520872116,
-0.016531452536582947,
-0.09206534177064896,
-0.0019642736297100782,
0.13143789768218994,
0.024012312293052673,
0.036462876945734024,
0.01839156076312065,
-0.04132327064871788,
0.06566638499498367,
0.19778576493263245,
0.014198499731719494,
-0.2241695523262024,
-0.05855998024344444,
0.0779438316822052,
-0.07737105339765549,
-0.11225864291191101,
-0.006669743452221155,
0.03595970198512077,
-0.1826065182685852,
0.055689968168735504,
-0.02543032355606556,
0.1040540263056755,
-0.098399817943573,
-0.02506032958626747,
0.05872899293899536,
0.06888821721076965,
-0.03339259326457977,
0.07716517895460129,
-0.2004852592945099,
0.15839076042175293,
0.004238042514771223,
0.061132971197366714,
-0.09730493277311325,
0.08817940205335617,
0.02049981616437435,
0.01998664252460003,
0.16277162730693817,
-0.006233964115381241,
-0.07969716936349869,
-0.06710035353899002,
-0.07333749532699585,
-0.00851830467581749,
0.10374880582094193,
-0.119022898375988,
0.08728116750717163,
-0.009845673106610775,
-0.033534735441207886,
-0.008647039532661438,
-0.11049225181341171,
-0.14500078558921814,
-0.17863692343235016,
0.06486384570598602,
-0.10681203752756119,
0.04576342925429344,
-0.11316319555044174,
-0.06685791909694672,
-0.04438050091266632,
0.177337646484375,
-0.20108501613140106,
-0.09153132885694504,
-0.14666935801506042,
-0.07529181241989136,
0.1350918859243393,
-0.04595456272363663,
0.06873366981744766,
0.0030155333224684,
0.2071489542722702,
0.00417616730555892,
0.000595996854826808,
0.07922636717557907,
-0.09489859640598297,
-0.21072575449943542,
-0.08102815598249435,
0.14811863005161285,
0.12189377099275589,
0.04148714616894722,
-0.008190339431166649,
0.024378053843975067,
-0.0031244028359651566,
-0.11677618324756622,
0.012768102809786797,
0.14990170300006866,
0.0895787701010704,
0.03665802255272865,
-0.036793217062950134,
-0.15181754529476166,
-0.11253168433904648,
-0.045346830040216446,
0.015278095379471779,
0.18255352973937988,
-0.070163294672966,
0.1547125279903412,
0.1534806191921234,
-0.060473520308732986,
-0.19479450583457947,
0.03185559809207916,
0.05695788934826851,
-0.016955113038420677,
0.028245683759450912,
-0.21447153389453888,
0.07511495798826218,
0.03288934752345085,
-0.055963773280382156,
0.13512617349624634,
-0.18937912583351135,
-0.1436237096786499,
0.08063424378633499,
0.06978213787078857,
-0.2315143197774887,
-0.14839693903923035,
-0.10886243730783463,
-0.051572367548942566,
-0.12794552743434906,
0.07710566371679306,
0.008850512094795704,
0.0008830989827401936,
0.03528085723519325,
0.0250087957829237,
0.01630784571170807,
-0.046906646341085434,
0.20955048501491547,
0.0020430502481758595,
0.038992878049612045,
-0.0715077742934227,
-0.07674292474985123,
0.028848381713032722,
-0.05751539394259453,
0.0663011446595192,
-0.021192258223891258,
0.0073353080078959465,
-0.10886449366807938,
-0.059305425733327866,
-0.06568103283643723,
0.035433199256658554,
-0.08981285989284515,
-0.08954839408397675,
-0.05353783816099167,
0.10278448462486267,
0.08859456330537796,
-0.03576088324189186,
-0.0412970595061779,
-0.10614104568958282,
0.09167167544364929,
0.2134261280298233,
0.1891546994447708,
0.07615596055984497,
-0.080021433532238,
0.0011183436727151275,
-0.01876096986234188,
0.05365345999598503,
-0.1975526213645935,
0.04296958073973656,
0.050431568175554276,
0.035165105015039444,
0.13235437870025635,
-0.024213729426264763,
-0.1503533273935318,
-0.04391707479953766,
0.058047402650117874,
-0.059608906507492065,
-0.1805514395236969,
-0.0059903888031840324,
0.10045134276151657,
-0.1622408777475357,
-0.05036443844437599,
0.03251100704073906,
-0.02649686299264431,
-0.030590277165174484,
0.002945733955129981,
0.08646061271429062,
0.02590290457010269,
0.12058506906032562,
0.05032283812761307,
0.11979082226753235,
-0.10713419318199158,
0.0742834284901619,
0.08200490474700928,
-0.09405406564474106,
0.020458171144127846,
0.08504309505224228,
-0.06727524101734161,
-0.036187492311000824,
0.01820012368261814,
0.06389964371919632,
0.042645927518606186,
-0.07021811604499817,
0.0028932266868650913,
-0.11859320849180222,
0.0681084468960762,
0.14435605704784393,
0.03161701187491417,
-0.0038415840826928616,
0.05028916150331497,
0.015496023930609226,
-0.0850900262594223,
0.10899036377668381,
0.04021456465125084,
0.03565603122115135,
-0.0535336397588253,
-0.0015395901864394546,
0.03303303197026253,
-0.0180622860789299,
-0.0170016810297966,
-0.03820356726646423,
-0.06679307669401169,
-0.008154413662850857,
-0.15656228363513947,
0.032507557421922684,
-0.09030274301767349,
0.007973136380314827,
0.016889072954654694,
-0.040304601192474365,
-0.009464473463594913,
0.009600612334907055,
-0.08098160475492477,
-0.041689153760671616,
-0.008696116507053375,
0.1168205663561821,
-0.1535230427980423,
0.018481019884347916,
0.09162583202123642,
-0.12737098336219788,
0.08006944507360458,
-0.0014931897167116404,
-0.007888589054346085,
0.021528344601392746,
-0.14086826145648956,
0.05843579024076462,
-0.004117744974792004,
0.007637839764356613,
0.022557614371180534,
-0.21338015794754028,
0.005586002487689257,
-0.0496617928147316,
-0.049058202654123306,
-0.002663566265255213,
-0.04568304866552353,
-0.11893048882484436,
0.08990754932165146,
0.003580483142286539,
-0.08129750937223434,
-0.018480706959962845,
0.04276391491293907,
0.12282580137252808,
-0.052702080458402634,
0.13983815908432007,
-0.021727459505200386,
0.06252221763134003,
-0.180588036775589,
-0.01657816581428051,
-0.016596540808677673,
0.019065063446760178,
-0.023052049800753593,
-0.0033404789865016937,
0.05848023295402527,
-0.016081420704722404,
0.21039538085460663,
-0.04198361188173294,
0.03565425053238869,
0.06855200231075287,
-0.004131033550947905,
-0.0164280254393816,
0.0888247862458229,
0.05538523569703102,
0.0351928286254406,
0.016887199133634567,
0.02425265684723854,
-0.05003520846366882,
-0.02976570837199688,
-0.13282382488250732,
0.08092635869979858,
0.17945943772792816,
0.07149085402488708,
-0.006156144198030233,
0.05559084191918373,
-0.1264938861131668,
-0.06027240678668022,
0.11694290488958359,
-0.030709294602274895,
-0.008311858400702477,
-0.05723434314131737,
0.1348825991153717,
0.17173373699188232,
-0.18521584570407867,
0.0682738795876503,
-0.06612344831228256,
-0.05502048879861832,
-0.11512868851423264,
-0.15780095756053925,
-0.06656219810247421,
-0.03713955730199814,
-0.006657142657786608,
-0.07599110156297684,
0.058146361261606216,
0.10814768075942993,
0.01649489998817444,
0.012411192990839481,
0.08115821331739426,
-0.023476915434002876,
-0.017222139984369278,
0.056713175028562546,
0.05987583100795746,
0.03374504670500755,
-0.07288883626461029,
0.004022928886115551,
0.01519998162984848,
0.03514529764652252,
0.05022862181067467,
0.027062498033046722,
-0.027844702824950218,
0.00266394205391407,
-0.020648643374443054,
-0.10066942125558853,
0.04150643199682236,
-0.025371480733156204,
-0.059392500668764114,
0.1425405889749527,
0.021591631695628166,
-0.001896145404316485,
-0.02174076810479164,
0.2362985461950302,
-0.07640518248081207,
-0.08484780043363571,
-0.14615602791309357,
0.14522431790828705,
-0.048668526113033295,
0.0541016161441803,
0.053670741617679596,
-0.09649816155433655,
0.023890214040875435,
0.12918691337108612,
0.14267030358314514,
-0.024175671860575676,
0.012749777175486088,
0.01700887829065323,
0.0074583557434380054,
-0.020442897453904152,
0.03548114374279976,
0.0545295812189579,
0.11312546581029892,
-0.06511163711547852,
0.09795662760734558,
-0.0026150604244321585,
-0.07668136805295944,
-0.02761838585138321,
0.11868252605199814,
0.001529296743683517,
0.018759965896606445,
-0.0797269269824028,
0.13996590673923492,
-0.06425666064023972,
-0.2453904002904892,
0.06402044743299484,
-0.06147700920701027,
-0.16456370055675507,
-0.021682903170585632,
0.015881149098277092,
0.0013109506107866764,
0.024318749085068703,
0.06509236246347427,
-0.06498810648918152,
0.16337989270687103,
0.042915429919958115,
-0.0702371820807457,
-0.08443083614110947,
0.0819341316819191,
-0.08810044825077057,
0.2804708778858185,
0.0032089892774820328,
0.046412281692028046,
0.09941036999225616,
-0.04573097452521324,
-0.13503609597682953,
0.046525660902261734,
0.10299289971590042,
-0.06600441038608551,
0.06489510089159012,
0.19861240684986115,
-0.007146832533180714,
0.11715326458215714,
0.08334899693727493,
-0.08566517382860184,
0.0474541075527668,
-0.06844794005155563,
-0.078901506960392,
-0.09705349802970886,
0.09238405525684357,
-0.05988804250955582,
0.14773398637771606,
0.1369680017232895,
-0.047565218061208725,
-0.004905369598418474,
-0.02382342331111431,
0.05518682301044464,
0.0009391483617946506,
0.12553124129772186,
0.01690833643078804,
-0.1990947425365448,
0.028391404077410698,
-0.020028050988912582,
0.103692427277565,
-0.23527520895004272,
-0.07436610013246536,
0.046738263219594955,
-0.01355393510311842,
-0.055957842618227005,
0.11939758062362671,
0.049826230853796005,
0.04208337143063545,
-0.048016853630542755,
-0.08991681039333344,
0.0031305148731917143,
0.17133277654647827,
-0.1144007220864296,
0.0017134913941845298
] |
null | null | spacy | [Elite Testo Max](https://atozsupplement.com/elite-testo-max/) It's vital to take note of that not all upgrade techniques are therapeutically or experimentally demonstrated, and numerous items advertised for these reasons might need guideline or logical proof supporting their adequacy and wellbeing. Prior to considering any type of male upgrade, it's significant to talk with a medical care proficient to grasp expected dangers, viability, and legitimate use. Furthermore, solid way of life decisions like standard activity, a decent eating regimen, overseeing pressure, and sufficient rest can emphatically influence sexual wellbeing and execution.
VISIT HERE FOR OFFICIAL WEBSITE:-https://atozsupplement.com/elite-testo-max/
| {"language": ["en"], "license": "cc-by-nc-sa-3.0", "library_name": "spacy", "tags": ["Elite Testo Max"]} | null | elite-testo-max/elite-testo-max | [
"spacy",
"Elite Testo Max",
"en",
"license:cc-by-nc-sa-3.0",
"region:us"
] | 2024-02-15T05:13:39+00:00 | [] | [
"en"
] | TAGS
#spacy #Elite Testo Max #en #license-cc-by-nc-sa-3.0 #region-us
Elite Testo Max It's important to note that not all enhancement methods are medically or scientifically proven, and many products marketed for these purposes may lack regulation or scientific evidence supporting their effectiveness and safety. Before considering any form of male enhancement, it's important to consult a healthcare professional to understand the potential risks, effectiveness, and proper use. In addition, healthy lifestyle choices such as regular exercise, a balanced diet, stress management, and adequate sleep can positively influence sexual health and performance.
VISIT HERE FOR OFFICIAL WEBSITE:-URL
| [] | [
"TAGS\n#spacy #Elite Testo Max #en #license-cc-by-nc-sa-3.0 #region-us \n"
] | [
30
] | [
"passage: TAGS\n#spacy #Elite Testo Max #en #license-cc-by-nc-sa-3.0 #region-us \n"
] | [
-0.024020496755838394,
-0.0367460623383522,
-0.00937611237168312,
0.01722409389913082,
-0.014614095911383629,
0.07559514790773392,
0.12369019538164139,
0.052746668457984924,
0.18637074530124664,
-0.009229659102857113,
0.15488672256469727,
0.06742281466722488,
-0.014434183947741985,
0.10343649983406067,
-0.0037071353290230036,
-0.029836293309926987,
0.03358063846826553,
0.00030736022745259106,
0.0532495453953743,
0.07715319842100143,
0.015752052888274193,
-0.07270153611898422,
0.03412116318941116,
-0.054492633789777756,
-0.08295165002346039,
0.02577744424343109,
0.09311698377132416,
-0.04014236852526665,
0.08930923044681549,
-0.04076502099633217,
0.10010763257741928,
0.13521268963813782,
-0.010516847483813763,
-0.19270168244838715,
0.018071794882416725,
-0.04878418520092964,
-0.10402709990739822,
0.03789559006690979,
0.034646280109882355,
0.0350784957408905,
0.09461933374404907,
0.13540272414684296,
-0.036244455724954605,
0.06609418988227844,
-0.18420948088169098,
-0.05127379298210144,
-0.10928840935230255,
-0.02053707465529442,
0.006036162842065096,
0.06677085161209106,
0.03739415109157562,
0.2151900976896286,
-0.21661458909511566,
-0.053810685873031616,
0.09945118427276611,
-0.30382785201072693,
0.01802622340619564,
0.19861885905265808,
0.09472570568323135,
0.08819207549095154,
-0.006049321964383125,
0.07308827340602875,
0.10169561952352524,
-0.008067941293120384,
-0.013147208839654922,
-0.07139497250318527,
0.07911616563796997,
0.1450815200805664,
-0.04996142536401749,
-0.044372010976076126,
0.267823725938797,
0.044979024678468704,
-0.029151402413845062,
0.06643495708703995,
0.01562215480953455,
-0.11020875722169876,
0.04479480907320976,
-0.01034362893551588,
0.07120180130004883,
0.1026013195514679,
0.047348231077194214,
0.013563436456024647,
-0.1396663933992386,
-0.08888128399848938,
-0.1484111249446869,
0.05386529862880707,
-0.044852081686258316,
0.09241423010826111,
-0.0944114476442337,
-0.012131520546972752,
-0.10702050477266312,
-0.03170028328895569,
-0.023205729201436043,
-0.07581116259098053,
0.030176972970366478,
-0.01575780287384987,
-0.028398603200912476,
0.12220922857522964,
0.11583768576383591,
0.12663009762763977,
-0.030351800844073296,
-0.01532839983701706,
-0.017585240304470062,
0.14017564058303833,
-0.029000243172049522,
0.0032354826107621193,
0.11038471758365631,
0.12852296233177185,
-0.0038023318629711866,
-0.03679462894797325,
0.03463003784418106,
-0.00980096310377121,
-0.08059001713991165,
0.029558269307017326,
-0.1885039359331131,
0.10468936711549759,
0.005447860807180405,
-0.12334510684013367,
-0.12191057205200195,
0.060227170586586,
0.2240765243768692,
0.012023793533444405,
-0.021703220903873444,
0.01563791185617447,
0.028357068076729774,
-0.06769818067550659,
-0.11033112555742264,
0.01381140947341919,
0.11910762637853622,
0.0694495365023613,
-0.12197311967611313,
-0.040125757455825806,
-0.0034693488851189613,
0.06878285109996796,
0.11887076497077942,
-0.03376918286085129,
0.019592858850955963,
-0.14586292207241058,
-0.05806213617324829,
0.028108367696404457,
0.00687472615391016,
-0.014353783801198006,
-0.023214222863316536,
0.0800914466381073,
0.003674885956570506,
-0.0024101734161376953,
-0.05529814213514328,
-0.09105122834444046,
-0.08928026258945465,
0.08312057703733444,
0.0259360671043396,
0.003797662677243352,
-0.13013489544391632,
-0.03819587826728821,
-0.11849166452884674,
0.03524630144238472,
-0.06429658085107803,
-0.08883217722177505,
-0.055177439004182816,
0.1071869507431984,
-0.03995106369256973,
-0.038588251918554306,
-0.12288738787174225,
0.03836341202259064,
-0.03408017009496689,
0.166179820895195,
-0.16032737493515015,
-0.03774021193385124,
0.040393829345703125,
-0.12006688117980957,
-0.16909636557102203,
0.06052754074335098,
0.056586336344480515,
-0.04366513714194298,
0.06189604476094246,
0.2311328649520874,
-0.0952657088637352,
-0.16398772597312927,
-0.03489076718688011,
0.11131682246923447,
-0.07236410677433014,
-0.2122563123703003,
0.1703907549381256,
-0.09175684303045273,
-0.07755865901708603,
-0.008612010627985,
-0.08541465550661087,
0.06762015074491501,
-0.026289738714694977,
-0.08341983705759048,
-0.016995087265968323,
-0.041039157658815384,
0.05072331055998802,
0.02619745023548603,
0.03336253762245178,
-0.09066478163003922,
0.0053667910397052765,
-0.02116117626428604,
0.0364626906812191,
0.11735959351062775,
0.011382567696273327,
-0.10010268539190292,
0.06677883863449097,
-0.00991063192486763,
-0.04179874062538147,
-0.04269637167453766,
0.0392160564661026,
0.02438078075647354,
0.07845975458621979,
0.07218212634325027,
0.13890567421913147,
0.022113198414444923,
-0.09998345375061035,
-0.0223495215177536,
-0.015612554736435413,
-0.0224784966558218,
0.011895745992660522,
0.0239546038210392,
-0.08557581156492233,
0.044398702681064606,
0.0016658147796988487,
-0.044571734964847565,
-0.15966705977916718,
-0.02880834974348545,
0.14106953144073486,
-0.04462715983390808,
-0.012215491384267807,
0.008249220438301563,
-0.008846234530210495,
0.04496868699789047,
0.007172819226980209,
0.034079890698194504,
0.08405699580907822,
0.021260999143123627,
-0.10595257580280304,
0.13900993764400482,
0.02271236479282379,
0.17063026130199432,
0.1639825850725174,
-0.12933577597141266,
0.018392369151115417,
0.04222496598958969,
0.004678618628531694,
0.021480167284607887,
-0.017046142369508743,
-0.004895033314824104,
0.03702929615974426,
-0.025736728683114052,
0.03000897541642189,
-0.06481566280126572,
0.02725737914443016,
0.003916962072253227,
0.0025574900209903717,
-0.08445943146944046,
0.058032434433698654,
0.24917885661125183,
-0.07717666774988174,
0.08633361011743546,
0.416439950466156,
0.03683162108063698,
0.06622538715600967,
-0.0761617049574852,
-0.0197669118642807,
-0.06625603884458542,
0.040129125118255615,
-0.024276088923215866,
0.16075152158737183,
-0.03685234487056732,
0.01790762133896351,
0.008197407238185406,
0.017163390293717384,
0.020661331713199615,
-0.15454663336277008,
-0.14746686816215515,
-0.008579992689192295,
-0.0687297135591507,
-0.20902499556541443,
0.026106052100658417,
-0.09500544518232346,
0.031107082962989807,
0.031521931290626526,
-0.09423822164535522,
0.11666177213191986,
-0.024157961830496788,
-0.09548521041870117,
0.06883802264928818,
-0.10671907663345337,
-0.07752876728773117,
-0.07199651002883911,
0.020453032106161118,
0.015182143077254295,
0.0397440604865551,
0.026823410764336586,
-0.07726340740919113,
0.011166220530867577,
0.004575369413942099,
-0.10411445051431656,
-0.08914220333099365,
0.029504645615816116,
0.0011220010928809643,
0.07373945415019989,
-0.012590180151164532,
-0.03709965571761131,
-0.07107797265052795,
-0.043780505657196045,
-0.05588452145457268,
0.06302180886268616,
-0.08562500029802322,
0.09742926061153412,
0.12024739384651184,
0.04162715747952461,
0.028163310140371323,
-0.05965181067585945,
0.09135598689317703,
-0.07758469879627228,
-0.1114712730050087,
0.21599385142326355,
0.056189801543951035,
-0.011208967305719852,
0.1069410964846611,
0.09574881941080093,
-0.08988946676254272,
-0.005604198202490807,
-0.09105890244245529,
-0.11454511433839798,
-0.24127943813800812,
-0.05189871788024902,
-0.08509476482868195,
0.10235987603664398,
0.02397969551384449,
0.09814240038394928,
0.038298238068819046,
0.0037958920001983643,
0.056048568338155746,
-0.07506690174341202,
-0.03832349181175232,
-0.0128251351416111,
0.08440588414669037,
-0.020135927945375443,
0.009100787341594696,
-0.11082512140274048,
0.006147482432425022,
0.14944040775299072,
0.08925037831068039,
0.017533445730805397,
0.23541425168514252,
0.12855352461338043,
0.10344229638576508,
0.12449892610311508,
0.11373548954725266,
0.008164619095623493,
0.08026532083749771,
-0.06600522994995117,
-0.003989760763943195,
-0.0648675486445427,
0.02768222987651825,
0.07319990545511246,
0.042817797511816025,
-0.11593692749738693,
0.01449445728212595,
-0.2084927260875702,
0.0636204406619072,
-0.0016504385275766253,
0.09338134527206421,
-0.10289043933153152,
0.09885557740926743,
0.06749364733695984,
0.08090537786483765,
0.03733375668525696,
0.11009193956851959,
0.03980148211121559,
-0.023576701059937477,
0.07029969245195389,
0.06703703850507736,
0.055348485708236694,
0.07892129570245743,
0.05693184211850166,
-0.021982908248901367,
-0.10800294578075409,
0.0370011180639267,
0.04749936982989311,
-0.2426878958940506,
0.19793221354484558,
0.0411483533680439,
-0.04341961070895195,
-0.013481768779456615,
-0.0712975561618805,
0.015006701461970806,
0.15454427897930145,
0.14395470917224884,
0.029216468334197998,
-0.16610519587993622,
-0.047770608216524124,
-0.01234948355704546,
0.01689545065164566,
0.04106225073337555,
0.0021385117433965206,
-0.12459394335746765,
-0.005832830909639597,
0.04706284776329994,
0.016242220997810364,
0.1391330510377884,
-0.0866943746805191,
0.06492019444704056,
0.018345961347222328,
0.11818429827690125,
-0.03361363708972931,
-0.06936377286911011,
0.009550072252750397,
-0.05980244278907776,
0.10593023896217346,
-0.10250884294509888,
0.05935332179069519,
-0.03251327574253082,
-0.15945148468017578,
0.059445515275001526,
-0.016910262405872345,
-0.04503439739346504,
0.02028268575668335,
-0.05725375935435295,
-0.10456684976816177,
-0.1384628564119339,
0.09715013206005096,
-0.046813033521175385,
-0.02703974023461342,
-0.05495976284146309,
0.09405674040317535,
-0.05545003339648247,
0.1052863746881485,
-0.014217079617083073,
0.05541538819670677,
-0.026907453313469887,
-0.07504747807979584,
0.1274063140153885,
-0.16053062677383423,
-0.0011140344431623816,
0.03361775726079941,
0.030848506838083267,
0.03197833150625229,
0.004867393523454666,
-0.10585691779851913,
0.1114429384469986,
0.35901039838790894,
-0.07116366922855377,
0.14019173383712769,
0.16342157125473022,
-0.0663333535194397,
-0.20442429184913635,
-0.016061285510659218,
-0.20775869488716125,
-0.057007402181625366,
0.07299255579710007,
-0.12506288290023804,
0.02156735770404339,
0.20385585725307465,
-0.111254021525383,
0.19571612775325775,
-0.1971951723098755,
-0.056913841515779495,
0.0034542495850473642,
-0.034797754138708115,
0.32058030366897583,
-0.14417555928230286,
-0.08716431260108948,
-0.05694178119301796,
-0.019378766417503357,
0.08556412160396576,
-0.011052392423152924,
0.037768784910440445,
-0.03152783215045929,
-0.1672993004322052,
-0.020572805777192116,
-0.012510323897004128,
0.18281936645507812,
-0.037693217396736145,
0.046369750052690506,
-0.015357136726379395,
-0.08907529711723328,
0.18640123307704926,
0.01779477670788765,
-0.06515569984912872,
-0.14419738948345184,
-0.04662831127643585,
-0.12169855833053589,
0.030787236988544464,
-0.060672957450151443,
0.06600243598222733,
0.0014018237125128508,
-0.040329691022634506,
-0.08533293753862381,
-0.0030910540372133255,
-0.0939653068780899,
-0.020080655813217163,
0.23243597149848938,
-0.08569463342428207,
0.0936465784907341,
0.04137921333312988,
0.07672259956598282,
-0.14465533196926117,
-0.061670392751693726,
0.010192835703492165,
-0.09441123157739639,
0.03643687814474106,
-0.21800538897514343,
-0.025969961658120155,
0.09854716062545776,
-0.002807523123919964,
0.016217736527323723,
0.0767940953373909,
-0.048573896288871765,
0.0452745147049427,
0.20371763408184052,
-0.11857926100492477,
0.008281564339995384,
-0.030305473133921623,
0.03792873024940491,
0.0812198594212532,
0.0011751255951821804,
0.02749509923160076,
-0.01690053567290306,
0.018366670235991478,
0.002464054385200143,
-0.0008123856969177723,
-0.17458492517471313,
0.010328765958547592,
0.03246402367949486,
0.019675668329000473,
-0.08122257888317108,
0.14491930603981018,
0.03851478919386864,
-0.040137045085430145,
-0.07928396761417389,
0.06598499417304993,
-0.02760455571115017,
-0.07772012054920197,
-0.06945295631885529,
-0.08582653105258942,
-0.10043464601039886,
-0.07839986681938171,
-0.08808209747076035,
-0.10708123445510864,
0.0022260439582169056,
0.12947118282318115,
0.09025836735963821,
0.05189315602183342,
0.09808497130870819,
-0.04192613437771797,
0.07184576243162155,
-0.0774218887090683,
-0.22127744555473328,
0.0166323259472847,
-0.09653022885322571,
-0.053054168820381165,
0.008510219864547253,
0.05772601068019867,
-0.06269361078739166,
0.0013676079688593745,
-0.11882668733596802,
0.10278919339179993,
0.05159518122673035,
0.005062561482191086,
-0.03225645050406456,
-0.04437185451388359,
0.022985495626926422,
-0.017116723582148552,
-0.08155056089162827,
-0.0307324081659317,
-0.11500424891710281,
0.04789862036705017,
0.03980257734656334,
0.07617812603712082,
-0.035725466907024384,
0.008097651414573193,
0.06908933818340302,
-0.000856397149618715,
0.06847427040338516,
0.02037731371819973,
0.026332663372159004,
0.12055350840091705,
-0.1293855905532837,
-0.00555580435320735,
0.08434543758630753,
0.042540837079286575,
-0.04141209274530411,
0.04622538387775421,
-0.002905281027778983,
0.09379792213439941,
-0.07608280330896378,
0.07902497798204422,
-0.13065867125988007,
-0.09405891597270966,
-0.05118058621883392,
0.023528797551989555,
-0.12381798028945923,
0.012774805538356304,
-0.11060575395822525,
0.20848405361175537,
0.04738444834947586,
0.11472431570291519,
0.043276578187942505,
-0.023913390934467316,
-0.07096109539270401,
-0.0011613140814006329,
-0.02332199551165104,
-0.07746094465255737,
-0.1296844780445099,
-0.012087603099644184,
-0.011913731694221497,
-0.02755570411682129,
0.2878299653530121,
-0.04138209670782089,
-0.12856987118721008,
0.07163816690444946,
0.11907456815242767,
0.060080669820308685,
0.03208731859922409,
0.28766244649887085,
0.06822820752859116,
-0.016839170828461647,
-0.06168719753623009,
0.01718471199274063,
-0.020116135478019714,
-0.029625797644257545,
0.10412179678678513,
0.1502685844898224,
0.035389889031648636,
0.03584279865026474,
0.12604358792304993,
-0.04588354006409645,
0.03255966678261757,
-0.05230998247861862,
0.006464780308306217,
0.015167874284088612,
0.010305498726665974,
0.08320283889770508,
0.14412568509578705,
-0.06251217424869537,
-0.04249812290072441,
-0.10856562852859497,
0.03387986496090889,
-0.1572173833847046,
-0.05553644895553589,
-0.020375190302729607,
-0.1365293562412262,
0.08073931932449341,
-0.010437427088618279,
0.01929403841495514,
0.25427019596099854,
-0.00786540750414133,
-0.031456783413887024,
-0.08526678383350372,
-0.09348848462104797,
-0.014954169280827045,
0.05665222555398941,
-0.034971170127391815,
-0.006029731128364801,
-0.10867772996425629,
-0.06617576628923416,
0.014360054396092892,
-0.15955035388469696,
-0.03468147665262222,
-0.007844899781048298,
-0.01909187249839306,
-0.01599452830851078,
-0.07746078819036484,
-0.05166849493980408,
-0.031923092901706696,
0.05207694321870804,
-0.017750350758433342,
0.1381901055574417,
-0.0037136096507310867,
-0.00007784684567013755,
0.062031667679548264,
0.07578778266906738,
-0.008977560326457024,
-0.033293914049863815,
-0.036762773990631104,
0.11834944039583206,
-0.028640763834118843,
0.07501106709241867,
-0.035789113491773605,
-0.026764413341879845,
0.017109766602516174,
0.19236481189727783,
0.21535129845142365,
-0.04980277270078659,
0.020052462816238403,
-0.02797621116042137,
0.024041613563895226,
0.04894815757870674,
0.13564828038215637,
-0.00010589529119897634,
0.14426089823246002,
-0.08810767531394958,
-0.022553278133273125,
-0.060656365007162094,
0.0764831155538559,
-0.0010506578255444765,
-0.013395330868661404,
0.03293031081557274,
-0.04720672219991684,
-0.0808071717619896,
0.09689703583717346,
-0.05518501624464989,
0.1504453420639038,
0.18307001888751984,
-0.1475667804479599,
0.03258441016077995,
-0.03398456424474716,
0.06525049358606339,
-0.043229151517152786,
0.07384618371725082,
-0.12500204145908356,
-0.08838016539812088,
-0.2250373363494873,
0.0024549635127186775,
-0.2653076648712158,
-0.1394958794116974,
0.07105578482151031,
0.13613709807395935,
0.1489792913198471,
0.008275690488517284,
0.1797720342874527,
0.03617959842085838,
0.05668848380446434,
-0.07367151975631714,
0.1497594118118286,
0.027260717004537582,
-0.08935949206352234,
-0.09745963662862778,
-0.14738744497299194,
-0.033564984798431396,
0.07247781753540039,
0.06755490601062775,
0.09414992481470108,
0.03881697729229927,
0.06996218860149384,
-0.06503487378358841,
-0.05970132350921631,
-0.01220509223639965,
-0.10029774904251099,
0.08346304297447205,
-0.022571202367544174,
-0.011008981615304947,
-0.06297831237316132,
-0.03551122173666954,
0.007245989050716162,
0.05772782489657402,
-0.11997650563716888,
-0.06025950238108635,
0.02597557008266449,
0.019914908334612846,
0.156619131565094,
-0.04144411161541939,
-0.07977093011140823,
-0.013523791916668415,
-0.07055453956127167,
0.0515933558344841,
-0.08837614208459854,
0.030466806143522263,
0.11698136478662491,
-0.003215652657672763,
-0.0028530294075608253,
-0.18718260526657104,
0.0492398701608181,
-0.010087563656270504,
-0.0654299333691597,
-0.07009358704090118
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
<details><summary>See axolotl config</summary>
axolotl version: `0.4.0`
```yaml
base_model: mistralai/Mistral-7B-v0.1
model_type: MistralForCausalLM
tokenizer_type: LlamaTokenizer
is_mistral_derived_model: true
load_in_8bit: false
load_in_4bit: false
strict: false
datasets:
- path: utrgvseniorproject/medtext
type: completion
dataset_prepared_path: last_run_prepared
val_set_size: 0.05
output_dir: ./Mistral_Medtext
sequence_len: 4096
sample_packing: true
pad_to_sequence_len: true
adapter:
lora_model_dir:
lora_r:
lora_alpha:
lora_dropout:
lora_target_linear:
lora_fan_in_fan_out:
wandb_project:
wandb_entity:
wandb_watch:
wandb_name:
wandb_log_model:
gradient_accumulation_steps: 1
micro_batch_size: 1
num_epochs: 1
optimizer: adamw_bnb_8bit
lr_scheduler: cosine
learning_rate: 0.0002
train_on_inputs: false
group_by_length: false
bf16: auto
fp16:
tf32: false
gradient_checkpointing: true
early_stopping_patience:
resume_from_checkpoint:
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true
flash_attn_cross_entropy: false
flash_attn_rms_norm: true
flash_attn_fuse_qkv: false
flash_attn_fuse_mlp: true
warmup_steps: 100
evals_per_epoch: 4
eval_table_size:
eval_sample_packing:
saves_per_epoch: 1
debug:
deepspeed: deepspeed_configs/zero2.json
weight_decay: 0.1
fsdp:
fsdp_config:
special_tokens:
```
</details><br>
# Mistral_Medtext
This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the utrgvseniorproject/medtext dataset (per the axolotl config above).
It achieves the following results on the evaluation set:
- Loss: 7.2873
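
For a quick smoke test of the checkpoint, the snippet below loads it with the standard `transformers` causal-LM classes. This is a minimal sketch: the repository id `joseagmz/Mistral_Medtext` is taken from this card's metadata, and the prompt is purely illustrative.

```python
# Minimal sketch: load the fine-tuned checkpoint and generate a continuation.
# The repo id comes from this card's metadata; the prompt is illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "joseagmz/Mistral_Medtext"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "A 45-year-old patient presents with intermittent chest pain."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```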
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 3
- total_train_batch_size: 3
- total_eval_batch_size: 3
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- num_epochs: 1
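
For readers reproducing this run outside axolotl, the hyperparameters above map roughly onto `transformers` `TrainingArguments` as sketched below. This is an approximation for illustration, not the exact launch configuration; the 8-bit AdamW optimizer and DeepSpeed setting are carried over from the axolotl config earlier in the card.

```python
# Approximate TrainingArguments equivalent of the settings listed above.
# Reconstructed for illustration; the actual run was launched through axolotl.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./Mistral_Medtext",
    learning_rate=2e-4,
    per_device_train_batch_size=1,   # 3 GPUs -> total train batch size 3
    per_device_eval_batch_size=1,
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    weight_decay=0.1,
    optim="adamw_bnb_8bit",          # 8-bit AdamW via bitsandbytes
    bf16=True,
    gradient_checkpointing=True,
    deepspeed="deepspeed_configs/zero2.json",
    logging_steps=1,
    seed=42,
)
```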
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.4797 | 0.01 | 1 | 1.5677 |
| 2.1541 | 0.25 | 29 | 2.2183 |
| 8.1882 | 0.5 | 58 | 8.1331 |
| 7.2045 | 0.75 | 87 | 7.4206 |
| 6.9946 | 1.0 | 116 | 7.2873 |
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu118
- Datasets 2.17.0
- Tokenizers 0.15.0
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "mistralai/Mistral-7B-v0.1", "model-index": [{"name": "Mistral_Medtext", "results": []}]} | text-generation | joseagmz/Mistral_Medtext | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"mistral",
"text-generation",
"generated_from_trainer",
"base_model:mistralai/Mistral-7B-v0.1",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:14:46+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #safetensors #mistral #text-generation #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| <img src="URL" alt="Built with Axolotl" width="200" height="32"/>
See axolotl config
axolotl version: '0.4.0'
Mistral\_Medtext
================
This model is a fine-tuned version of mistralai/Mistral-7B-v0.1 on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 7.2873
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0002
* train\_batch\_size: 1
* eval\_batch\_size: 1
* seed: 42
* distributed\_type: multi-GPU
* num\_devices: 3
* total\_train\_batch\_size: 3
* total\_eval\_batch\_size: 3
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_steps: 100
* num\_epochs: 1
### Training results
### Framework versions
* Transformers 4.38.0.dev0
* Pytorch 2.1.2+cu118
* Datasets 2.17.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 3\n* total\\_train\\_batch\\_size: 3\n* total\\_eval\\_batch\\_size: 3\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 1",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #safetensors #mistral #text-generation #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 3\n* total\\_train\\_batch\\_size: 3\n* total\\_eval\\_batch\\_size: 3\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 1",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
86,
165,
4,
38
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #safetensors #mistral #text-generation #generated_from_trainer #base_model-mistralai/Mistral-7B-v0.1 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 1\n* eval\\_batch\\_size: 1\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 3\n* total\\_train\\_batch\\_size: 3\n* total\\_eval\\_batch\\_size: 3\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 1### Training results### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
-0.10205147415399551,
0.11104954779148102,
-0.005103578791022301,
0.07587040215730667,
0.09228968620300293,
0.05022608861327171,
0.13960163295269012,
0.1392371654510498,
-0.04063837230205536,
0.13996362686157227,
0.11349740624427795,
0.07068509608507156,
0.07862435281276703,
0.16190506517887115,
-0.013364739716053009,
-0.23395077884197235,
0.0284760519862175,
-0.04201376065611839,
-0.11062177270650864,
0.10562101751565933,
0.07911345362663269,
-0.11216519773006439,
0.08539211750030518,
-0.02120226062834263,
-0.10879954695701599,
-0.05408269166946411,
-0.039131686091423035,
-0.021849118173122406,
0.09209329634904861,
0.04078130051493645,
0.09414611011743546,
0.03673205152153969,
0.09989326447248459,
-0.22384509444236755,
0.003984707407653332,
0.0767521858215332,
0.01190943457186222,
0.08251094073057175,
0.10622477531433105,
0.020986948162317276,
0.108693927526474,
-0.10340917110443115,
0.05484730377793312,
0.03423544391989708,
-0.10194722563028336,
-0.18054042756557465,
-0.09238584339618683,
0.08125151693820953,
0.111581951379776,
0.06358436495065689,
-0.0012773004127666354,
0.07067886739969254,
-0.04015582799911499,
0.07882550358772278,
0.23846621811389923,
-0.2322188764810562,
-0.07112189382314682,
0.03733127564191818,
0.05366478115320206,
0.08869029581546783,
-0.08457241207361221,
-0.025490207597613335,
0.016542954370379448,
0.024229321628808975,
0.09080037474632263,
0.0050606438890099525,
0.03891090676188469,
-0.003217353718355298,
-0.1407020390033722,
-0.07684147357940674,
0.12274319678544998,
0.04981933906674385,
-0.008978858590126038,
-0.09442498534917831,
-0.0668049082159996,
-0.1996190994977951,
-0.03634101524949074,
-0.004114001058042049,
0.023772235959768295,
-0.03691738098859787,
-0.018971525132656097,
0.02961837872862816,
-0.07662142068147659,
-0.08266773074865341,
0.019334498792886734,
0.06398151814937592,
0.05563626065850258,
0.00031278884853236377,
0.021366437897086143,
0.1116562932729721,
0.04220530390739441,
-0.15772411227226257,
-0.006628209259361029,
0.005547582171857357,
-0.06784101575613022,
-0.013283567503094673,
0.006465314421802759,
0.04128652811050415,
0.07176810503005981,
0.1453552544116974,
-0.08076079934835434,
0.06710261106491089,
0.05453978851437569,
0.020707983523607254,
-0.06686288863420486,
0.12956318259239197,
-0.07844235748052597,
-0.06839589774608612,
-0.026447344571352005,
0.11439760774374008,
0.0297316312789917,
-0.01616210862994194,
-0.07562742382287979,
0.03082180954515934,
0.11014078557491302,
0.06390631943941116,
0.020844701677560806,
0.032868560403585434,
-0.0630245953798294,
-0.025894619524478912,
0.13077464699745178,
-0.10269510000944138,
0.03334518522024155,
0.0511801652610302,
-0.049929339438676834,
-0.01161257829517126,
0.009727970696985722,
-0.014516577124595642,
-0.025480175390839577,
0.07477706670761108,
-0.0848497524857521,
-0.03174283355474472,
-0.0703268051147461,
-0.10551817715167999,
0.04486449435353279,
-0.06758584827184677,
-0.024157507345080376,
-0.08611206710338593,
-0.11312054097652435,
-0.04428308084607124,
0.036909058690071106,
-0.06344007700681686,
-0.07180524617433548,
-0.03984714671969414,
-0.09548372030258179,
0.04570630565285683,
-0.002862736349925399,
0.11406044661998749,
-0.06364801526069641,
0.0751533955335617,
-0.009197449311614037,
0.051293447613716125,
0.07149115949869156,
0.039214324206113815,
-0.042207445949316025,
0.0824892595410347,
-0.14589381217956543,
0.04556167870759964,
-0.09710683673620224,
0.062347736209630966,
-0.13148672878742218,
-0.102984718978405,
0.025412730872631073,
-0.0319475494325161,
0.06892167776823044,
0.11686769872903824,
-0.1451936513185501,
-0.05349984019994736,
0.16983871161937714,
-0.08531217277050018,
-0.10555848479270935,
0.11609324812889099,
0.008279827423393726,
-0.08469782024621964,
0.011255837045609951,
0.11699224263429642,
0.1492622345685959,
-0.11824619024991989,
-0.022087037563323975,
0.0041585564613342285,
0.09527303278446198,
0.017750242725014687,
0.10086776316165924,
-0.019043073058128357,
0.05116068944334984,
0.011867841705679893,
-0.05908355861902237,
0.03428628668189049,
-0.08807048946619034,
-0.0848429724574089,
-0.0388515442609787,
-0.07554961740970612,
0.0008594731916673481,
0.0397198423743248,
0.00976471696048975,
-0.0653216764330864,
-0.12046371400356293,
-0.017745180055499077,
0.1177644357085228,
-0.07981568574905396,
0.004800752736628056,
-0.05530889332294464,
0.09495566785335541,
-0.00750574329867959,
0.004238449037075043,
-0.149492084980011,
-0.11435690522193909,
0.06358359009027481,
-0.08469713479280472,
-0.013193161226809025,
0.005532289855182171,
0.05566927418112755,
0.10018762201070786,
-0.03296889364719391,
-0.0478549525141716,
-0.017575562000274658,
-0.003409880679100752,
-0.07663580775260925,
-0.2367965131998062,
-0.06461818516254425,
-0.022196203470230103,
0.14028610289096832,
-0.19139660894870758,
0.026799729093909264,
0.036836300045251846,
0.12031581997871399,
0.007937183603644371,
-0.038060497492551804,
-0.005328606814146042,
0.05183000490069389,
-0.04101534187793732,
-0.08514610677957535,
0.036795806139707565,
-0.013913866132497787,
-0.07819896191358566,
-0.0003521908656693995,
-0.17381489276885986,
0.07366662472486496,
0.08785136044025421,
0.04631226882338524,
-0.09048265218734741,
-0.030411824584007263,
-0.06541575491428375,
-0.06446274369955063,
-0.0029899992514401674,
-0.011129345744848251,
0.09648105502128601,
0.016235923394560814,
0.09924343973398209,
-0.06821438670158386,
-0.062062203884124756,
0.03607175126671791,
0.01591695286333561,
-0.015455816872417927,
0.1419801414012909,
0.07463735342025757,
-0.08814491331577301,
0.1326836794614792,
0.10568290203809738,
-0.06050722673535347,
0.09608185291290283,
-0.07632023096084595,
-0.07424241304397583,
-0.048327311873435974,
0.05187135934829712,
0.03784717619419098,
0.09276550263166428,
-0.06883338838815689,
0.013890586793422699,
0.03445201739668846,
0.009753676131367683,
0.004632663447409868,
-0.1714213490486145,
0.00823256466537714,
0.021385831758379936,
-0.08983331173658371,
0.037888359278440475,
-0.018131602555513382,
-0.0024717412889003754,
0.08813696354627609,
0.0062223803251981735,
-0.042740289121866226,
-0.020178189501166344,
-0.019162306562066078,
-0.08347038924694061,
0.2050354778766632,
-0.1151605099439621,
-0.12280405312776566,
-0.13796234130859375,
0.036125414073467255,
-0.022782335057854652,
-0.004937958437949419,
0.02438107691705227,
-0.051554687321186066,
-0.04757517948746681,
-0.09489108622074127,
-0.011163000017404556,
0.00001829074608394876,
0.029784219339489937,
0.017060890793800354,
0.004469578620046377,
0.05518035963177681,
-0.10255907475948334,
-0.003594939596951008,
0.02032715640962124,
-0.056915316730737686,
0.02456817962229252,
0.03622644022107124,
0.09864721447229385,
0.15530310571193695,
0.039163943380117416,
-0.004458875861018896,
-0.01403838861733675,
0.17895206809043884,
-0.07958564162254333,
0.02799282781779766,
0.09971015900373459,
0.011649614199995995,
0.07243563234806061,
0.16081105172634125,
0.03954193741083145,
-0.04839462414383888,
-0.005726081319153309,
0.02198171429336071,
-0.020750055089592934,
-0.22128647565841675,
-0.04618680849671364,
-0.043391164392232895,
0.05327976495027542,
0.10778553783893585,
0.052171796560287476,
0.014116179198026657,
0.041929394006729126,
-0.044151753187179565,
0.01782136783003807,
0.028832633048295975,
0.06686892360448837,
0.052582401782274246,
0.04514636844396591,
0.11308789998292923,
-0.028248365968465805,
-0.003663119161501527,
0.051833197474479675,
0.008960695937275887,
0.225430428981781,
-0.030728919431567192,
0.22756005823612213,
0.04988579824566841,
0.15870332717895508,
0.0038864859379827976,
0.052671000361442566,
0.021279683336615562,
0.014407290145754814,
0.010143506340682507,
-0.06318136304616928,
-0.008247983641922474,
0.03886168450117111,
0.036566413938999176,
0.018773026764392853,
-0.07265113294124603,
0.04581007733941078,
0.04811064898967743,
0.2856237590312958,
0.050912272185087204,
-0.31204402446746826,
-0.09217233955860138,
0.03207968547940254,
-0.03176344186067581,
-0.0249190516769886,
0.008921177126467228,
0.14781567454338074,
-0.08915189653635025,
0.06060705706477165,
-0.059408415108919144,
0.07739534229040146,
-0.06384971737861633,
0.0012181580532342196,
0.11149589717388153,
0.11563421040773392,
0.01875612884759903,
0.06149118021130562,
-0.21723733842372894,
0.2431492656469345,
-0.004452757071703672,
0.02742261067032814,
-0.05082029104232788,
0.05330187827348709,
0.004749925807118416,
0.040212418884038925,
0.08393983542919159,
-0.006979281082749367,
-0.11864744126796722,
-0.18694503605365753,
-0.1358848363161087,
0.0022215568460524082,
0.11267425864934921,
-0.07580214738845825,
0.1081199049949646,
-0.032368555665016174,
-0.03904346749186516,
0.0341046042740345,
-0.05226081237196922,
-0.0784204825758934,
-0.10764268785715103,
0.05268080532550812,
-0.02961786277592182,
-0.0029193793889135122,
-0.08401530981063843,
-0.08730956166982651,
-0.08890248090028763,
0.16155016422271729,
-0.17019477486610413,
-0.0564071461558342,
-0.11976627260446548,
0.047904565930366516,
0.16695468127727509,
-0.09833614528179169,
0.045504674315452576,
-0.025842372328042984,
0.1027868390083313,
0.021222343668341637,
-0.04568665847182274,
0.08556189388036728,
-0.08804615586996078,
-0.2459443211555481,
-0.054173924028873444,
0.12720152735710144,
0.029272831976413727,
0.0588049478828907,
-0.03904341906309128,
0.029460696503520012,
-0.014556379988789558,
-0.10739642381668091,
0.024976998567581177,
0.10503972321748734,
0.06382006406784058,
0.02238212153315544,
-0.04435287043452263,
-0.01308432500809431,
-0.03887290135025978,
-0.03511917218565941,
0.080237977206707,
0.28449514508247375,
-0.10577777773141861,
0.05268498882651329,
0.05896349251270294,
-0.06871555745601654,
-0.1782340556383133,
-0.055770691484212875,
0.09973971545696259,
0.020647989585995674,
-0.009423399344086647,
-0.16004200279712677,
0.04917868971824646,
0.11220576614141464,
-0.025734659284353256,
0.11044054478406906,
-0.3390224277973175,
-0.13987405598163605,
0.06651093810796738,
0.08503680676221848,
-0.05923612415790558,
-0.20073430240154266,
-0.07384516298770905,
-0.002968677319586277,
-0.12153307348489761,
0.08083470910787582,
-0.018226729705929756,
0.10138954222202301,
-0.02878907136619091,
0.0002312206634087488,
0.009698199108242989,
-0.06585696339607239,
0.17453426122665405,
-0.0021879100240767,
0.05228699743747711,
-0.04972142353653908,
-0.009936424903571606,
0.061804670840501785,
-0.0717400386929512,
0.029627857729792595,
-0.11733773350715637,
0.048566341400146484,
-0.07490265369415283,
-0.014233868569135666,
-0.06992752850055695,
0.014800194650888443,
-0.05805031582713127,
-0.02561766654253006,
-0.044997524470090866,
0.046739362180233,
0.06458362936973572,
-0.01817421428859234,
0.10585500299930573,
0.015524695627391338,
0.12827421724796295,
0.15691877901554108,
0.06447616964578629,
0.03273140266537666,
-0.09633719176054001,
-0.016055431216955185,
-0.01482329424470663,
0.031430844217538834,
-0.11228447407484055,
0.015234380029141903,
0.14445379376411438,
0.020063528791069984,
0.1067691296339035,
0.045133087784051895,
-0.06633971631526947,
-0.013026577420532703,
0.06734073907136917,
-0.12396708130836487,
-0.1499142348766327,
0.005976437591016293,
0.010745447129011154,
-0.14999857544898987,
-0.0006290266173891723,
0.11335789412260056,
-0.042020320892333984,
-0.008449737913906574,
0.007529626600444317,
0.07051725685596466,
-0.011199025437235832,
0.2200963944196701,
0.03120182827115059,
0.09573440998792648,
-0.08604836463928223,
0.07360705733299255,
0.07373198866844177,
-0.11149702966213226,
0.015551237389445305,
0.12238749861717224,
-0.09223097562789917,
-0.03210105001926422,
0.08560796082019806,
0.07276103645563126,
-0.004097891040146351,
-0.03629552945494652,
-0.09887084364891052,
-0.13703382015228271,
0.08367998152971268,
0.10169609636068344,
0.03980432450771332,
0.06442722678184509,
0.010051253251731396,
0.020157987251877785,
-0.0767652615904808,
0.13252542912960052,
0.06346318125724792,
0.08010025322437286,
-0.13632480800151825,
0.10932895541191101,
-0.006037987768650055,
0.020037773996591568,
-0.001721333828754723,
0.048433706164360046,
-0.11906783282756805,
-0.028823405504226685,
-0.1278727799654007,
0.03431564196944237,
-0.06818655133247375,
-0.008639350533485413,
0.01042361930012703,
-0.0385793000459671,
-0.025693589821457863,
0.020174618810415268,
-0.07333938777446747,
-0.06379896402359009,
-0.0543021634221077,
0.07977690547704697,
-0.12454552203416824,
-0.026884635910391808,
0.0302604828029871,
-0.1051601693034172,
0.09120769053697586,
0.02297547087073326,
0.03936685621738434,
0.003244943916797638,
-0.08916986733675003,
0.03612101450562477,
0.02892244979739189,
0.021022485569119453,
0.029357101768255234,
-0.14709928631782532,
-0.009105256758630276,
-0.03762134164571762,
-0.02062767557799816,
-0.00009074130502995104,
0.03510432690382004,
-0.11657016724348068,
0.025933662429451942,
-0.027346324175596237,
-0.05048943683505058,
-0.07244198024272919,
0.040541596710681915,
0.09650032222270966,
-0.01747920736670494,
0.1318662017583847,
-0.07236181944608688,
0.06530889123678207,
-0.2220498025417328,
-0.008251804858446121,
0.009110924787819386,
-0.06874970346689224,
-0.07342319190502167,
-0.024374308064579964,
0.09251390397548676,
-0.05006644129753113,
0.09459884464740753,
-0.052744485437870026,
0.027664391323924065,
0.019233502447605133,
-0.03046373277902603,
0.039626702666282654,
0.07208621501922607,
0.13382159173488617,
0.032300446182489395,
-0.03181714937090874,
0.031700704246759415,
-0.013788570649921894,
0.049141064286231995,
0.025421587750315666,
0.18122828006744385,
0.1291033923625946,
0.01655600033700466,
0.06469593197107315,
0.07191230356693268,
-0.1486256718635559,
-0.12672659754753113,
0.09721167385578156,
-0.0959649533033371,
0.12858524918556213,
-0.03805823624134064,
0.16499552130699158,
0.09069075435400009,
-0.2075779139995575,
0.02883223257958889,
-0.04315060377120972,
-0.09605875611305237,
-0.09726133942604065,
-0.09290264546871185,
-0.08147204667329788,
-0.13580335676670074,
-0.010117446072399616,
-0.10782245546579361,
0.037490636110305786,
0.07753649353981018,
0.03560889884829521,
0.02546846494078636,
0.11262664943933487,
0.07142575085163116,
0.022533254697918892,
0.033071309328079224,
0.05087420716881752,
-0.004418181721121073,
-0.018885411322116852,
-0.09462473541498184,
0.020344460383057594,
-0.023478657007217407,
0.05044659227132797,
-0.02341393008828163,
-0.029812702909111977,
0.0833415761590004,
0.004986943211406469,
-0.08991537988185883,
0.02338995784521103,
-0.01840285211801529,
0.010003599338233471,
0.07878965139389038,
0.012386632151901722,
0.0022176741622388363,
-0.013584943488240242,
0.15407373011112213,
-0.06905080378055573,
-0.07739628851413727,
-0.10749512910842896,
0.21948622167110443,
-0.028812645003199577,
-0.014751888811588287,
0.04992855712771416,
-0.04962417855858803,
-0.029142476618289948,
0.1425287276506424,
0.20049427449703217,
-0.02845402993261814,
-0.014822565950453281,
0.030471524223685265,
-0.013883516192436218,
-0.019329678267240524,
0.09069506824016571,
0.10255855321884155,
0.09252645075321198,
-0.0681166872382164,
-0.008735118433833122,
-0.011483324691653252,
-0.027094725519418716,
-0.07163097709417343,
0.03691234439611435,
0.007827822118997574,
0.009965849108994007,
-0.01771985925734043,
0.05887484923005104,
-0.05299029499292374,
-0.09344898909330368,
0.07786358892917633,
-0.17552515864372253,
-0.16460274159908295,
-0.025985345244407654,
0.09393315017223358,
0.0005982358707115054,
0.0429937019944191,
-0.0013052013237029314,
-0.02645113877952099,
0.128783717751503,
-0.021477164700627327,
-0.08392365276813507,
-0.0674297884106636,
0.05313778296113014,
-0.07247769832611084,
0.17996175587177277,
-0.03669577091932297,
0.057184115052223206,
0.1289466768503189,
0.018885888159275055,
-0.12644648551940918,
0.022105395793914795,
0.0992271900177002,
-0.10728719085454941,
0.0366622656583786,
0.1264578253030777,
-0.03701504319906235,
0.08738075196743011,
0.04616988077759743,
-0.06701906770467758,
-0.01737711951136589,
-0.044175803661346436,
-0.01723981276154518,
-0.05086539313197136,
-0.018484242260456085,
-0.042450200766325,
0.15891270339488983,
0.19599628448486328,
-0.052533652633428574,
-0.022044358775019646,
-0.027375217527151108,
0.022153163328766823,
0.026043426245450974,
0.12551437318325043,
-0.01816023513674736,
-0.2501623034477234,
0.039540939033031464,
-0.013949686661362648,
0.060062434524297714,
-0.16598466038703918,
-0.09752063453197479,
0.02104497328400612,
-0.03152112290263176,
-0.09673244506120682,
0.11420969665050507,
0.05381792411208153,
0.037267331033945084,
-0.049947138875722885,
-0.08537225425243378,
-0.054636623710393906,
0.14928212761878967,
-0.1641344428062439,
-0.08220068365335464
] |
null | null | diffusers | # terminus-xl lora test
<Gallery />
## Model description
Finetuned on the same dataset as terminus xl gamma v2.1
## Download model
Weights for this model are available in Safetensors format.
[Download](/ptx0/simpletuner-lora-test/tree/main) them in the Files & versions tab.
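
As a minimal sketch of how the LoRA might be applied, the snippet below loads the base checkpoint with `diffusers` and attaches these weights. The base-model and adapter repo ids come from this card's metadata; the prompt is illustrative only.

```python
# Minimal sketch: load the Terminus XL base model and attach this LoRA with diffusers.
# Repo ids come from the card metadata; the prompt is illustrative only.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "ptx0/terminus-xl-gamma-v2-1", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("ptx0/simpletuner-lora-test")

image = pipe(prompt="a photograph of a lighthouse at dusk", num_inference_steps=30).images[0]
image.save("sample.png")
```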
| {"tags": ["text-to-image", "stable-diffusion", "lora", "diffusers", "template:sd-lora"], "widget": [{"text": "-", "output": {"url": "images/1707954767.739185614890e1e4a68ed3d718d03dc5acb17a3.png"}}, {"text": "-", "output": {"url": "images/1707954543.5697017a2116b9c1cffc06ec68aad451a7d0456.png"}}], "base_model": "ptx0/terminus-xl-gamma-v2-1"} | text-to-image | ptx0/simpletuner-lora-test | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"base_model:ptx0/terminus-xl-gamma-v2-1",
"region:us"
] | 2024-02-15T05:14:52+00:00 | [] | [] | TAGS
#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-ptx0/terminus-xl-gamma-v2-1 #region-us
| # terminus-xl lora test
<Gallery />
## Model description
Finetuned on the same dataset as terminus xl gamma v2.1
## Download model
Weights for this model are available in Safetensors format.
Download them in the Files & versions tab.
| [
"# terminus-xl lora test\n\n<Gallery />",
"## Model description \n\nFinetuned on the same dataset as terminus xl gamma v2.1",
"## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
"TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-ptx0/terminus-xl-gamma-v2-1 #region-us \n",
"# terminus-xl lora test\n\n<Gallery />",
"## Model description \n\nFinetuned on the same dataset as terminus xl gamma v2.1",
"## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
54,
13,
19,
28
] | [
"passage: TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-ptx0/terminus-xl-gamma-v2-1 #region-us \n# terminus-xl lora test\n\n<Gallery />## Model description \n\nFinetuned on the same dataset as terminus xl gamma v2.1## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
-0.137460395693779,
0.023597532883286476,
-0.0004847591044381261,
0.05391412973403931,
0.09277546405792236,
0.06886763870716095,
0.093262679874897,
0.08197885006666183,
0.13242864608764648,
0.08833739161491394,
0.060828592628240585,
0.012406816706061363,
-0.009636325761675835,
0.2509184777736664,
-0.09571496397256851,
-0.1989978402853012,
-0.018241221085190773,
-0.01794709451496601,
0.01630633883178234,
0.02819664590060711,
0.05531300604343414,
-0.07305832952260971,
0.10498257726430893,
-0.06272619217634201,
-0.01790449023246765,
0.027910422533750534,
0.07375551760196686,
-0.03790106251835823,
0.025812646374106407,
0.038765691220760345,
-0.00793380755931139,
0.09941624104976654,
0.07632394880056381,
-0.050581738352775574,
0.049413129687309265,
0.008400185033679008,
-0.0223927553743124,
0.039992108941078186,
-0.0014872632455080748,
-0.12928305566310883,
0.18750904500484467,
-0.11285144835710526,
-0.03691674768924713,
0.007523239124566317,
-0.015161008574068546,
-0.17177973687648773,
-0.06473781913518906,
-0.07833752036094666,
0.011876978911459446,
0.00551012996584177,
-0.016734229400753975,
0.08166364580392838,
0.025406109169125557,
0.05656854435801506,
0.2744027078151703,
-0.2597995102405548,
-0.04789337143301964,
0.24239304661750793,
0.09628188610076904,
0.22079229354858398,
-0.0014749631518498063,
0.1556086391210556,
0.12061210721731186,
-0.0733075961470604,
-0.008888991549611092,
-0.06839334964752197,
0.0973210483789444,
0.03682737424969673,
-0.09064601361751556,
0.04287601262331009,
0.3253087103366852,
0.01389878150075674,
-0.0947648286819458,
-0.09599892050027847,
-0.07925355434417725,
0.03296832740306854,
-0.08046074211597443,
0.027990860864520073,
0.044871725142002106,
-0.040503691881895065,
-0.034401558339595795,
-0.1098468154668808,
-0.061860207468271255,
-0.16433210670948029,
-0.0655418410897255,
0.22638805210590363,
-0.00036394456401467323,
0.07437513768672943,
-0.03929079696536064,
0.11367594450712204,
-0.28437331318855286,
-0.0854092612862587,
-0.015425891615450382,
-0.08591299504041672,
-0.008842130191624165,
0.0713108628988266,
-0.05889322981238365,
-0.07766015827655792,
0.14305023849010468,
0.03118223138153553,
0.030186230316758156,
-0.03337586298584938,
-0.0465213768184185,
0.06798358261585236,
0.05345563217997551,
0.041945915669202805,
0.006741875316947699,
-0.12015490233898163,
0.07211782783269882,
0.031165847554802895,
0.08340556174516678,
-0.0067383176647126675,
-0.09567391872406006,
-0.04319711774587631,
-0.046895965933799744,
0.030476339161396027,
0.013072507455945015,
0.014313328079879284,
-0.045374490320682526,
-0.012743894942104816,
0.11271078884601593,
-0.034126825630664825,
-0.00915008969604969,
-0.0007455890299752355,
-0.039241496473550797,
0.10626138001680374,
0.06305117905139923,
-0.04063558951020241,
0.13610459864139557,
-0.051199156790971756,
-0.08498992770910263,
-0.04276496171951294,
-0.05050290748476982,
-0.08639004081487656,
-0.057135652750730515,
-0.05166105553507805,
0.03505562245845795,
-0.12191210687160492,
-0.2522289454936981,
-0.016397764906287193,
-0.0071499221958220005,
-0.050079215317964554,
0.0638822540640831,
0.0036421173717826605,
0.02662568911910057,
0.028574803844094276,
0.003389138961210847,
0.05608333647251129,
-0.06324194371700287,
0.08899147063493729,
0.0757080689072609,
0.16309519112110138,
-0.039359066635370255,
0.0015738132642582059,
-0.06807298213243484,
0.02989817038178444,
-0.1927485167980194,
0.025804471224546432,
-0.1171630397439003,
-0.03159169480204582,
-0.04619528353214264,
-0.061585087329149246,
-0.08167596161365509,
0.026532573625445366,
-0.006234225817024708,
0.14763137698173523,
-0.25958630442619324,
-0.03748135641217232,
0.16006377339363098,
-0.17063191533088684,
-0.06071571633219719,
0.07483605295419693,
0.02788735367357731,
0.02654404751956463,
0.061403851956129074,
0.14858944714069366,
0.0797848030924797,
-0.24686050415039062,
-0.006171874701976776,
0.08120456337928772,
-0.020705480128526688,
-0.030919594690203667,
0.11301443725824356,
0.06204350292682648,
-0.05281619727611542,
0.0554092638194561,
-0.220864400267601,
0.024713752791285515,
-0.08624263852834702,
0.016416752710938454,
-0.07728872448205948,
-0.12497935444116592,
0.07352224737405777,
0.012218079529702663,
0.030960988253355026,
-0.0963427722454071,
-0.039969101548194885,
-0.05314815789461136,
0.19124799966812134,
-0.09295415878295898,
-0.015669334679841995,
-0.02162548527121544,
0.2279714047908783,
-0.15038852393627167,
-0.02488277293741703,
-0.07443170249462128,
-0.10602916777133942,
0.03738898038864136,
0.21757084131240845,
0.008208789862692356,
0.09500550478696823,
0.0722026452422142,
0.02670302242040634,
-0.05649781599640846,
-0.00912593025714159,
0.038061387836933136,
-0.022600846365094185,
-0.03509782627224922,
-0.10912378132343292,
0.00010326653864467517,
-0.07598011940717697,
0.10544178634881973,
-0.24888792634010315,
-0.005958152003586292,
0.05607209354639053,
0.0018818210810422897,
0.07596016675233841,
0.0017903305124491453,
0.05508428439497948,
-0.07656028866767883,
-0.0520341731607914,
-0.051616426557302475,
-0.006930425763130188,
-0.017030704766511917,
-0.060194533318281174,
0.10720903426408768,
-0.0642194151878357,
0.13670499622821808,
0.1593930870294571,
0.12516117095947266,
-0.010627444833517075,
-0.057174764573574066,
-0.01483322773128748,
0.02843264676630497,
-0.10639286786317825,
-0.05411996692419052,
-0.04150262102484703,
-0.0762968584895134,
0.0557689405977726,
-0.05572758987545967,
0.08766698837280273,
0.044250454753637314,
-0.053793732076883316,
-0.03890390321612358,
0.08761858940124512,
0.1457097828388214,
-0.020173540338873863,
0.03889359533786774,
0.08691573143005371,
-0.07015751302242279,
0.08655314147472382,
-0.0019171149469912052,
-0.1407565325498581,
-0.0073074232786893845,
-0.016559215262532234,
0.047723472118377686,
0.14841654896736145,
0.0523483045399189,
-0.03375135362148285,
0.039893317967653275,
-0.05526623874902725,
0.07782939076423645,
-0.07148868590593338,
-0.07384901493787766,
0.03376000002026558,
-0.06078851968050003,
0.040118250995874405,
0.11694933474063873,
-0.08050070703029633,
0.10058633238077164,
-0.12082070857286453,
-0.038806695491075516,
-0.05693995952606201,
-0.02424769476056099,
-0.07343582808971405,
0.08322293311357498,
-0.014624550007283688,
-0.13192541897296906,
-0.057755034416913986,
0.06807174533605576,
-0.023839322850108147,
0.02407335862517357,
0.015263318084180355,
-0.055818118155002594,
-0.0784393772482872,
-0.07421734929084778,
0.020792286843061447,
0.07570023089647293,
0.08312854170799255,
-0.0528719425201416,
-0.02736821398139,
-0.052509550005197525,
-0.10040627419948578,
-0.049607761204242706,
-0.10425961017608643,
-0.09655866771936417,
0.08246678858995438,
-0.09898731112480164,
0.13370062410831451,
0.06015082076191902,
-0.005487330257892609,
0.030445067211985588,
0.002502634422853589,
0.13760389387607574,
-0.06293974071741104,
0.04384481906890869,
0.28973516821861267,
0.1414872258901596,
-0.02364482544362545,
-0.07326813787221909,
0.009773963131010532,
-0.10138437151908875,
0.06112124025821686,
-0.044008269906044006,
-0.11087939888238907,
-0.10255885869264603,
-0.12754525244235992,
-0.0637710839509964,
0.005506320856511593,
0.016762126237154007,
0.06440749019384384,
-0.022900015115737915,
0.11029807478189468,
0.03344389423727989,
-0.012178121134638786,
0.0016797706484794617,
0.013921580277383327,
-0.020357251167297363,
-0.04678885638713837,
0.08633506298065186,
-0.06328953057527542,
0.041915424168109894,
0.17835940420627594,
0.03342708572745323,
0.2427501529455185,
-0.033642690628767014,
0.02387823350727558,
0.035070501267910004,
0.012708486057817936,
0.11586920917034149,
0.06421545147895813,
-0.06394334882497787,
-0.036399323493242264,
-0.032234009355306625,
-0.1115322932600975,
0.03116992674767971,
0.06500270217657089,
-0.014442184939980507,
-0.01961539126932621,
-0.05106882378458977,
0.04115854203701019,
0.03609121963381767,
-0.12618708610534668,
0.10162302106618881,
-0.2652341425418854,
-0.009613415226340294,
0.08778557926416397,
0.0991562232375145,
-0.01814691349864006,
0.030072800815105438,
0.14351694285869598,
-0.028567742556333542,
0.014542626217007637,
-0.018397482112050056,
0.0426945760846138,
0.01567172445356846,
-0.068250872194767,
-0.05327921733260155,
0.09692621231079102,
-0.03235291689634323,
-0.0013954604510217905,
-0.07882001250982285,
0.10448680073022842,
0.02104022726416588,
0.008901498280465603,
0.00019066035747528076,
-0.011993104591965675,
0.0855972096323967,
0.14514760673046112,
0.1502501368522644,
-0.026358088478446007,
0.01957470364868641,
0.05029696226119995,
-0.15852245688438416,
0.05711520463228226,
0.0351623110473156,
-0.08329105377197266,
0.06784223765134811,
-0.023360872641205788,
-0.035300735384225845,
0.009297105483710766,
0.028390999883413315,
-0.11941679567098618,
-0.05544554814696312,
-0.011762125417590141,
0.06051420047879219,
-0.07259482890367508,
-0.06939888745546341,
-0.10673801600933075,
-0.1514630913734436,
0.18802325427532196,
0.0958954319357872,
-0.08356241881847382,
-0.09213168919086456,
0.03526240587234497,
0.15080802142620087,
-0.03909764438867569,
0.02605351433157921,
0.058438848704099655,
0.08200567215681076,
-0.04093638435006142,
-0.11734098941087723,
0.03308498486876488,
-0.0665765330195427,
-0.13400548696517944,
-0.046912748366594315,
0.10632341355085373,
-0.024541597813367844,
0.030638378113508224,
0.012918233871459961,
0.062168851494789124,
-0.012872403487563133,
-0.10558755695819855,
0.051647260785102844,
0.16294679045677185,
0.043643560260534286,
0.08330134302377701,
0.03133160620927811,
-0.13598987460136414,
0.017612211406230927,
0.002261360641568899,
0.0697098821401596,
0.177579864859581,
-0.0764102041721344,
0.012146559543907642,
-0.054245688021183014,
-0.017155714333057404,
-0.17788280546665192,
0.10278546810150146,
-0.05900710076093674,
0.02778906747698784,
0.04072502627968788,
-0.08285543322563171,
0.12772659957408905,
0.14162366092205048,
-0.04073352366685867,
0.2707717716693878,
-0.2661217153072357,
-0.10210437327623367,
0.04076627641916275,
0.12697817385196686,
0.18503639101982117,
-0.17435072362422943,
-0.043117955327034,
-0.02635185793042183,
-0.11943680793046951,
0.039096713066101074,
-0.15025357902050018,
0.08287902921438217,
-0.033407799899578094,
0.026542624458670616,
0.03445715457201004,
-0.03230787813663483,
0.19357720017433167,
-0.0841444805264473,
0.041355300694704056,
0.013050428591668606,
-0.0566837452352047,
0.09238240867853165,
-0.0017331080744042993,
0.11703404039144516,
-0.16670316457748413,
0.0876593291759491,
-0.06351661682128906,
-0.06518213450908661,
0.0341656431555748,
0.05537539720535278,
0.012533456087112427,
-0.06073949858546257,
-0.07372208684682846,
0.04014594107866287,
-0.07896246761083603,
0.01319118682295084,
-0.005675202701240778,
-0.07390069961547852,
-0.0712636336684227,
0.02061024121940136,
0.003019084455445409,
0.09907782822847366,
-0.08327852189540863,
-0.01597948558628559,
-0.02967727556824684,
0.07216885685920715,
-0.23970776796340942,
-0.015086079016327858,
0.07739236950874329,
0.05982321873307228,
0.04130997136235237,
0.02939298376441002,
-0.0023991286288946867,
0.12115000933408737,
0.09913419932126999,
-0.12215742468833923,
-0.12654033303260803,
-0.07356304675340652,
-0.14757414162158966,
0.01010538637638092,
0.1157621219754219,
0.10885464400053024,
-0.05232337489724159,
-0.006753505207598209,
-0.06446858495473862,
0.06660686433315277,
-0.021183904260396957,
0.12281110137701035,
0.10413859784603119,
-0.013527230359613895,
-0.08205799758434296,
0.011241475120186806,
-0.03845224156975746,
-0.00184861128218472,
-0.07301723957061768,
0.012968258000910282,
-0.04273359850049019,
-0.03455590829253197,
-0.05084807053208351,
0.03805844113230705,
-0.03571321815252304,
-0.0020030145533382893,
-0.14311149716377258,
-0.016923172399401665,
-0.034586068242788315,
0.08946903049945831,
0.10341797024011612,
-0.02878745086491108,
0.004145889542996883,
-0.0059559959918260574,
-0.05435621738433838,
0.06769023090600967,
0.04989630728960037,
0.08801987022161484,
-0.18382011353969574,
-0.044769398868083954,
-0.02007746323943138,
0.016736473888158798,
-0.0915968045592308,
-0.031248094514012337,
-0.06996552646160126,
0.0043854545801877975,
-0.07748262584209442,
0.11841535568237305,
-0.0753195658326149,
-0.013790852390229702,
-0.033708956092596054,
-0.0767354965209961,
-0.05126161500811577,
0.06974034011363983,
-0.045882388949394226,
0.039619091898202896,
0.00017776885943021625,
0.026596713811159134,
-0.08179596811532974,
-0.009257161058485508,
0.012364264577627182,
-0.04716639965772629,
0.05257229879498482,
0.06864728778600693,
0.022468747571110725,
0.04438519477844238,
-0.20404168963432312,
0.04532171040773392,
0.08721066266298294,
0.04182461276650429,
0.0281515009701252,
0.017467457801103592,
0.06776580214500427,
0.009557482786476612,
0.01620912179350853,
-0.016092577949166298,
0.026353875175118446,
-0.04919694736599922,
0.08983894437551498,
-0.08077831566333771,
0.04169676452875137,
-0.013975430279970169,
0.0057080150581896305,
0.12308662384748459,
0.14336280524730682,
0.0383022241294384,
-0.07675696909427643,
0.005735110025852919,
-0.1515035182237625,
0.03258166089653969,
-0.018589023500680923,
-0.09359204024076462,
0.015652772039175034,
0.055223505944013596,
0.06342817842960358,
-0.038962844759225845,
0.24044106900691986,
0.09653247147798538,
-0.041338540613651276,
-0.05366036668419838,
0.07536676526069641,
0.2525428533554077,
0.015271416865289211,
0.2662186026573181,
0.06924238055944443,
0.09703271836042404,
-0.10419777780771255,
0.11997383087873459,
0.12031631171703339,
0.03609224036335945,
0.07301904261112213,
0.11232217401266098,
-0.08297058194875717,
0.05329824239015579,
0.002401200821623206,
-0.03845753148198128,
-0.06728579103946686,
-0.0330173633992672,
-0.11741922795772552,
0.0005217944853939116,
-0.027972795069217682,
0.015418180264532566,
0.15671223402023315,
-0.10385975241661072,
-0.04873603209853172,
0.041271958500146866,
-0.02140781655907631,
-0.13046011328697205,
-0.18446756899356842,
-0.13235707581043243,
-0.21066974103450775,
0.03529008850455284,
-0.10808315128087997,
-0.004771880339831114,
-0.016795098781585693,
-0.006802516523748636,
0.013514832593500614,
0.047036755830049515,
-0.07078928500413895,
-0.03677130118012428,
0.05989750474691391,
-0.014116765931248665,
-0.07106438279151917,
0.0328512080013752,
-0.04962896928191185,
0.07668816298246384,
-0.0652693659067154,
-0.00728321960195899,
0.003964689560234547,
0.04779805615544319,
0.05189802497625351,
0.006206411402672529,
-0.027141859754920006,
-0.10886482894420624,
0.013259034603834152,
-0.030530789867043495,
0.0766395702958107,
0.08155866712331772,
-0.019844288006424904,
0.0015886803157627583,
0.2029745727777481,
-0.0399361290037632,
-0.02034718170762062,
-0.07701234519481659,
0.11077290028333664,
-0.09896861761808395,
0.03268589824438095,
0.003665321506559849,
-0.11909688264131546,
0.027161577716469765,
0.17601318657398224,
0.2769465148448944,
-0.013218053616583347,
0.048653073608875275,
-0.020551620051264763,
-0.018636222928762436,
0.00878783967345953,
0.03812098130583763,
0.030858013778924942,
0.2918931543827057,
-0.03564852476119995,
-0.09118040651082993,
-0.08292955905199051,
-0.01183487568050623,
0.003977120853960514,
-0.016813218593597412,
-0.028902169317007065,
-0.032326724380254745,
-0.05077388137578964,
0.08732790499925613,
-0.014621217735111713,
-0.03848625347018242,
0.07595836371183395,
-0.1000303328037262,
-0.057497721165418625,
-0.06053533777594566,
-0.013488003052771091,
0.02437346801161766,
0.008790998719632626,
-0.13814781606197357,
-0.011013234034180641,
-0.10949330031871796,
0.009143919683992863,
-0.14177152514457703,
-0.060152795165777206,
0.03314841166138649,
-0.04480038583278656,
0.13809822499752045,
-0.03765357658267021,
0.06123661249876022,
0.0010512302396818995,
-0.018668316304683685,
-0.07504458725452423,
0.13079732656478882,
-0.009674127213656902,
-0.15159788727760315,
0.05162354186177254,
0.06341128051280975,
-0.0792585238814354,
0.13067765533924103,
0.0366501621901989,
-0.010954318568110466,
0.0008727664826437831,
0.08956699073314667,
-0.06419870257377625,
-0.12954655289649963,
0.004727991297841072,
-0.1669708639383316,
0.12024455517530441,
0.04703613743185997,
-0.029801761731505394,
-0.0069383434019982815,
-0.0344361811876297,
0.12368015199899673,
0.07604972273111343,
0.06589534133672714,
0.0736461877822876,
-0.11460481584072113,
-0.06027005612850189,
0.11995217204093933,
-0.0034114534500986338,
-0.18049730360507965,
0.005311942659318447,
-0.15335941314697266,
0.0068803271278738976,
0.018793199211359024,
-0.0038108783774077892,
0.2451801598072052,
-0.0021489618811756372,
-0.03264738246798515,
-0.16286414861679077,
0.04428039491176605,
0.07561882585287094,
-0.1351311206817627,
-0.08125831931829453
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_billsum_model
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5383
- Rouge1: 0.1433
- Rouge2: 0.0505
- Rougel: 0.1159
- Rougelsum: 0.1157
- Gen Len: 19.0
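As a quick usage illustration (not part of the original training run), the checkpoint can be loaded for inference with the `transformers` summarization pipeline; the repository id below is taken from this card's metadata, and the input text is only a placeholder.

```python
# Minimal inference sketch for this checkpoint (illustrative only).
from transformers import pipeline

summarizer = pipeline("summarization", model="Mouad2023/my_awesome_billsum_model")

document = "The bill establishes a grant program for state and local governments ..."  # placeholder text
# Note: T5-style checkpoints are often trained with a "summarize: " prefix; prepend it if your preprocessing did.
print(summarizer(document, max_length=60, min_length=10, do_sample=False)[0]["summary_text"])
```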
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
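For readers reproducing a similar run, a hedged sketch of how these settings map onto `Seq2SeqTrainingArguments` is shown below; the output directory, evaluation strategy, and `predict_with_generate` flag are assumptions inferred from the per-epoch results table, not values stated on this card.

```python
# Illustrative sketch only: the listed hyperparameters expressed as Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="my_awesome_billsum_model",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=4,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                     # "Native AMP" mixed precision
    evaluation_strategy="epoch",   # assumption: matches the per-epoch validation results below
    predict_with_generate=True,    # assumption: needed so ROUGE can be computed at eval time
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the optimizer default, so no extra flags are needed.
```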
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log | 1.0 | 62 | 2.8261 | 0.1266 | 0.0357 | 0.1041 | 0.1044 | 19.0 |
| No log | 2.0 | 124 | 2.6153 | 0.1398 | 0.0484 | 0.1136 | 0.1134 | 19.0 |
| No log | 3.0 | 186 | 2.5545 | 0.1443 | 0.052 | 0.1162 | 0.116 | 19.0 |
| No log | 4.0 | 248 | 2.5383 | 0.1433 | 0.0505 | 0.1159 | 0.1157 | 19.0 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "base_model": "t5-small", "model-index": [{"name": "my_awesome_billsum_model", "results": []}]} | text2text-generation | Mouad2023/my_awesome_billsum_model | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:t5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:15:38+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| my\_awesome\_billsum\_model
===========================
This model is a fine-tuned version of t5-small on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 2.5383
* Rouge1: 0.1433
* Rouge2: 0.0505
* Rougel: 0.1159
* Rougelsum: 0.1157
* Gen Len: 19.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 4
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
77,
113,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-t5-small #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1"
] | [
-0.10266303271055222,
0.09942521154880524,
-0.0027460495475679636,
0.08506198227405548,
0.09827578812837601,
-0.016694581136107445,
0.17796874046325684,
0.15241588652133942,
-0.11869107931852341,
0.06479435414075851,
0.13762591779232025,
0.11301229149103165,
0.04989040270447731,
0.18094857037067413,
-0.07865184545516968,
-0.21702048182487488,
0.0487079992890358,
0.04069749638438225,
-0.022144494578242302,
0.11783156543970108,
0.0921868234872818,
-0.1199086382985115,
0.09187910705804825,
0.02190937101840973,
-0.16739913821220398,
-0.0030882861465215683,
0.01701863296329975,
-0.08225079625844955,
0.10471375286579132,
0.03941856697201729,
0.08672547340393066,
0.04765917360782623,
0.04237496107816696,
-0.160819873213768,
0.010987335816025734,
0.06780378520488739,
-0.0039007605519145727,
0.09378297626972198,
0.05589458346366882,
-0.004321986343711615,
0.09477748721837997,
-0.08507537841796875,
0.06829997152090073,
0.025195609778165817,
-0.12519501149654388,
-0.270436555147171,
-0.10439898073673248,
0.04553831368684769,
0.09781288355588913,
0.07649438828229904,
-0.009699190966784954,
0.18823648989200592,
-0.011819587089121342,
0.1117096096277237,
0.2329760491847992,
-0.3124503195285797,
-0.05767359584569931,
-0.017354562878608704,
0.05743923783302307,
0.09337034821510315,
-0.08021918684244156,
-0.022407449781894684,
0.04033557325601578,
0.032946906983852386,
0.14834946393966675,
-0.015498110093176365,
-0.025067411363124847,
-0.022222083061933517,
-0.13189932703971863,
-0.05746176093816757,
0.16773870587348938,
0.03877754881978035,
-0.05357826501131058,
-0.08723752200603485,
-0.07638707011938095,
-0.15592488646507263,
-0.053318753838539124,
-0.0008843340910971165,
0.036514390259981155,
-0.03438882157206535,
-0.08090747892856598,
-0.02033470757305622,
-0.08844487369060516,
-0.047973956912755966,
-0.03086782619357109,
0.1302313357591629,
0.041189342737197876,
0.015882693231105804,
-0.06402517110109329,
0.0628759115934372,
-0.044563181698322296,
-0.16711127758026123,
-0.012644926086068153,
0.013418344780802727,
0.018381454050540924,
-0.04567931592464447,
-0.036929138004779816,
-0.13687144219875336,
0.023709125816822052,
0.1625734567642212,
-0.10052040964365005,
0.08213169872760773,
-0.04391355440020561,
0.034021954983472824,
-0.08673739433288574,
0.16437660157680511,
-0.015755396336317062,
0.011864244937896729,
0.03606988489627838,
0.089845210313797,
0.08179647475481033,
-0.028087440878152847,
-0.11046683043241501,
0.04640234261751175,
0.12124527990818024,
0.03273812308907509,
-0.029490871354937553,
0.0534440353512764,
-0.04012022167444229,
-0.007337839342653751,
0.07540822774171829,
-0.10175082087516785,
0.030294230207800865,
-0.008811160922050476,
-0.04248370975255966,
-0.050292450934648514,
0.01757016032934189,
0.010109174065291882,
-0.026074815541505814,
0.07424141466617584,
-0.07763741165399551,
0.005413346458226442,
-0.07534826546907425,
-0.14024892449378967,
0.0363825187087059,
-0.07813459634780884,
0.010452181100845337,
-0.10487692058086395,
-0.1447802037000656,
-0.005675955209881067,
0.049260035157203674,
-0.04101930931210518,
-0.04061088711023331,
-0.04248911887407303,
-0.09408006072044373,
0.053473006933927536,
-0.020437145605683327,
0.07196332514286041,
-0.07499518245458603,
0.08443655073642731,
0.05650881677865982,
0.0719069167971611,
-0.04536323621869087,
0.02706277184188366,
-0.09715518355369568,
0.048435211181640625,
-0.22561901807785034,
0.03767653554677963,
-0.05071815103292465,
0.09027127176523209,
-0.10288933664560318,
-0.07824349403381348,
0.024840503931045532,
-0.016366969794034958,
0.10170317441225052,
0.09988585859537125,
-0.16340391337871552,
-0.05967511981725693,
0.1988009363412857,
-0.11426207423210144,
-0.1685018688440323,
0.1416521817445755,
-0.03319467231631279,
0.01913369446992874,
0.05732490494847298,
0.22480852901935577,
0.0671531930565834,
-0.10648266971111298,
-0.013441050425171852,
-0.04127618297934532,
0.06455893069505692,
-0.06639663130044937,
0.07740022987127304,
0.004497361835092306,
0.05036419630050659,
-0.0008821621304377913,
0.006420924328267574,
0.03556085005402565,
-0.0681903138756752,
-0.07708018273115158,
-0.059334322810173035,
-0.07862328737974167,
0.004096251912415028,
0.03935942426323891,
0.05668242275714874,
-0.14975154399871826,
-0.10835389047861099,
0.04757039621472359,
0.07202425599098206,
-0.0846206396818161,
0.04550793394446373,
-0.10578818619251251,
0.11520956456661224,
-0.08046855032444,
-0.0002570033830124885,
-0.16134457290172577,
-0.03296736627817154,
0.03343024477362633,
-0.00607240991666913,
0.001018970855511725,
-0.06891804188489914,
0.07886433601379395,
0.08227881044149399,
-0.05062584951519966,
-0.04340014234185219,
-0.006420465186238289,
0.016094699501991272,
-0.11400224268436432,
-0.20500794053077698,
-0.019874362275004387,
-0.04522257670760155,
0.09961424767971039,
-0.17922531068325043,
0.0495467483997345,
0.07116594910621643,
0.11218499392271042,
0.055526264011859894,
-0.022319354116916656,
0.00024562832550145686,
0.06259395182132721,
-0.047551874071359634,
-0.07540730386972427,
0.051584865897893906,
0.035610489547252655,
-0.08733835071325302,
0.023530159145593643,
-0.18741491436958313,
0.18563275039196014,
0.13927310705184937,
0.019740937277674675,
-0.060882218182086945,
-0.0072165741585195065,
-0.04053666070103645,
-0.027119942009449005,
-0.02841464802622795,
0.007585851941257715,
0.11799774318933487,
0.01377149485051632,
0.1580929458141327,
-0.10946159064769745,
-0.05151931941509247,
0.05324254184961319,
-0.0419224388897419,
-0.012738498859107494,
0.10642477124929428,
0.01232284214347601,
-0.15083542466163635,
0.14412836730480194,
0.16575977206230164,
-0.054133083671331406,
0.1376602202653885,
-0.07591306418180466,
-0.06638303399085999,
-0.029946837574243546,
0.020254015922546387,
0.04705879092216492,
0.11843962222337723,
-0.09316182881593704,
-0.011844477616250515,
0.025639653205871582,
0.019600028172135353,
-0.001068953308276832,
-0.18689358234405518,
0.004469056148082018,
0.04397616907954216,
-0.052488017827272415,
-0.0360410213470459,
-0.00948757492005825,
0.0006864893366582692,
0.0985848605632782,
0.0013814045814797282,
-0.050749506801366806,
0.03292948752641678,
0.012107262387871742,
-0.0782194659113884,
0.19148997962474823,
-0.10121958702802658,
-0.16004718840122223,
-0.12373155355453491,
-0.07501552253961563,
-0.0526365302503109,
0.005165599286556244,
0.08680209517478943,
-0.07718652486801147,
-0.05765688791871071,
-0.13379646837711334,
-0.04207777604460716,
0.019947722554206848,
0.02537538856267929,
0.031222287565469742,
-0.005786570720374584,
0.08875072747468948,
-0.10663697123527527,
-0.0242903009057045,
-0.005538546480238438,
0.020185640081763268,
0.05550415441393852,
0.015606836415827274,
0.11035410314798355,
0.11454463005065918,
-0.02768484130501747,
0.026855457574129105,
-0.04520440474152565,
0.22365714609622955,
-0.06751158833503723,
-0.009069539606571198,
0.14321796596050262,
-0.017753466963768005,
0.07892601191997528,
0.1328100711107254,
0.040495045483112335,
-0.09489186108112335,
0.008325621485710144,
0.002026212867349386,
-0.039961088448762894,
-0.2117120325565338,
-0.005978260654956102,
-0.04835420474410057,
0.012694560922682285,
0.10253371298313141,
0.034162070602178574,
0.02409863844513893,
0.04997194930911064,
-0.0012590158730745316,
0.05210113897919655,
0.006844469346106052,
0.11161457747220993,
0.12362489849328995,
0.059593185782432556,
0.14215794205665588,
-0.06981051713228226,
-0.023042071610689163,
0.042921025305986404,
0.006111869588494301,
0.19145622849464417,
-0.0017796240281313658,
0.20047178864479065,
0.04252196103334427,
0.14637111127376556,
0.030354300513863564,
0.07418353855609894,
-0.020296210423111916,
-0.023927949368953705,
-0.005538864526897669,
-0.05893278494477272,
-0.033417388796806335,
0.027602724730968475,
-0.09276147186756134,
0.041566092520952225,
-0.11586572229862213,
0.031639207154512405,
0.053788527846336365,
0.287695050239563,
0.05063551664352417,
-0.37267160415649414,
-0.11169996857643127,
0.024737687781453133,
-0.034405745565891266,
-0.046427562832832336,
0.007671518716961145,
0.12434177845716476,
-0.0456317774951458,
0.07896226644515991,
-0.08301747590303421,
0.0987011045217514,
-0.036288194358348846,
0.030777746811509132,
0.03482005372643471,
0.08587408810853958,
-0.02172146923840046,
0.047961898148059845,
-0.289306640625,
0.26918619871139526,
0.036740999668836594,
0.08372160792350769,
-0.05923974886536598,
0.018408114090561867,
0.01058921217918396,
0.06636027246713638,
0.06373895704746246,
-0.01659916713833809,
-0.15324166417121887,
-0.16280505061149597,
-0.10514041036367416,
0.016879811882972717,
0.08567094057798386,
0.02256089821457863,
0.1175837516784668,
-0.0193764828145504,
-0.006598425097763538,
0.055891212075948715,
-0.047972749918699265,
-0.06898321211338043,
-0.10877244919538498,
0.010329768061637878,
0.05474388599395752,
-0.027534382417798042,
-0.08975715190172195,
-0.09384521842002869,
-0.05669771507382393,
0.168386310338974,
0.005470228847116232,
-0.06743709743022919,
-0.12362560629844666,
0.026500528678297997,
0.06199413165450096,
-0.0846586674451828,
0.03680703416466713,
-0.010508010163903236,
0.13098451495170593,
-0.000845418602693826,
-0.07462523877620697,
0.1255895048379898,
-0.07254725694656372,
-0.1739870309829712,
-0.04884868860244751,
0.11863284558057785,
-0.001831216854043305,
0.04478384926915169,
0.0006338421953842044,
0.035670019686222076,
-0.016424348577857018,
-0.06206925958395004,
0.02771734446287155,
-0.00452116085216403,
0.08439946919679642,
-0.049840547144412994,
-0.006989129353314638,
0.006057324819266796,
-0.062127042561769485,
-0.03699815645813942,
0.15753710269927979,
0.2861994206905365,
-0.07755649089813232,
0.052348505705595016,
0.05252021178603172,
-0.049478184431791306,
-0.15926766395568848,
0.01528804562985897,
0.03528272733092308,
0.0029511426109820604,
0.01347943302243948,
-0.14003169536590576,
0.03104008361697197,
0.08022407442331314,
-0.024563217535614967,
0.07413551956415176,
-0.2934284806251526,
-0.13413526117801666,
0.10167514532804489,
0.14532315731048584,
0.08977613598108292,
-0.17300456762313843,
-0.05126972869038582,
-0.036605726927518845,
-0.11098074913024902,
0.12279374152421951,
-0.14445345103740692,
0.09520666301250458,
-0.020718403160572052,
0.060543518513441086,
0.009990067221224308,
-0.06134441867470741,
0.12024450302124023,
-0.05103888362646103,
0.09074650704860687,
-0.07215151935815811,
0.05557102710008621,
0.11195147037506104,
-0.09390486776828766,
0.04751903563737869,
-0.13102006912231445,
0.04018243029713631,
-0.09185189008712769,
-0.013343730010092258,
-0.049702759832143784,
0.011095497757196426,
-0.03554220870137215,
-0.03016565553843975,
-0.04630908742547035,
0.0051308427937328815,
0.0573776438832283,
-0.02907496690750122,
0.20224803686141968,
0.01325426809489727,
0.16250240802764893,
0.1732770949602127,
0.10947415232658386,
-0.12218565493822098,
-0.018971547484397888,
0.01824200712144375,
-0.043683506548404694,
0.04990244656801224,
-0.17097443342208862,
0.04617168754339218,
0.1150502860546112,
-0.0012685583205893636,
0.11801750957965851,
0.05614103376865387,
-0.06263415515422821,
0.024932362139225006,
0.06670577824115753,
-0.17211948335170746,
-0.12047730386257172,
-0.0030301182996481657,
0.07191778719425201,
-0.1245526447892189,
0.05093434453010559,
0.13364943861961365,
-0.06805215775966644,
-0.011110194027423859,
0.0003092775004915893,
0.025012578815221786,
-0.009071833454072475,
0.17803573608398438,
0.029509153217077255,
0.06807717680931091,
-0.09390272945165634,
0.08138387650251389,
0.05262184888124466,
-0.12088165432214737,
0.05917257070541382,
0.09318205714225769,
-0.09721631556749344,
-0.03469793498516083,
0.0605245977640152,
0.16685067117214203,
-0.027197927236557007,
-0.07270361483097076,
-0.16381356120109558,
-0.12599049508571625,
0.0754195973277092,
0.20168153941631317,
0.059554003179073334,
0.004074434284120798,
-0.009709738194942474,
-0.005299560260027647,
-0.12363755702972412,
0.11737324297428131,
0.040743932127952576,
0.09119971841573715,
-0.13940037786960602,
0.10061512887477875,
-0.011384516954421997,
0.012776357121765614,
-0.012589380145072937,
0.03386136144399643,
-0.12098221480846405,
-0.0005737891769967973,
-0.13880057632923126,
0.017767494544386864,
-0.045330822467803955,
0.0007833231356926262,
-0.0169923547655344,
-0.035399988293647766,
-0.06358985602855682,
0.022843562066555023,
-0.10138259083032608,
-0.03165670484304428,
0.018034718930721283,
0.030357643961906433,
-0.12848028540611267,
-0.02710983343422413,
0.009629127569496632,
-0.09206820279359818,
0.07023186981678009,
0.031718816608190536,
-0.0014221190940588713,
0.023881010711193085,
-0.06426647305488586,
0.01271724421530962,
0.06100593879818916,
0.002253940561786294,
0.057487763464450836,
-0.12048253417015076,
-0.018321771174669266,
0.02381713129580021,
0.013948812149465084,
0.027821069583296776,
0.12355297803878784,
-0.10767985135316849,
0.002816900610923767,
-0.003109660232439637,
-0.05220190808176994,
-0.06180930510163307,
0.060289036482572556,
0.09783346951007843,
-0.0003612824948504567,
0.19537124037742615,
-0.09806109219789505,
0.006676600780338049,
-0.1960899829864502,
0.0024083631578832865,
0.008512699976563454,
-0.14606308937072754,
-0.08193287253379822,
-0.026717524975538254,
0.06399236619472504,
-0.07209169119596481,
0.10877018421888351,
-0.00888234842568636,
0.037215061485767365,
0.057289011776447296,
-0.05344785749912262,
-0.0006220239447429776,
0.025485793128609657,
0.1985389143228531,
0.010647653602063656,
-0.041874006390571594,
0.06017183139920235,
0.011771045625209808,
0.09661087393760681,
0.09870865195989609,
0.17870698869228363,
0.13799318671226501,
0.013202548958361149,
0.11147920787334442,
0.0314071923494339,
-0.026530331000685692,
-0.1657513976097107,
0.05492253601551056,
-0.0403125062584877,
0.14051711559295654,
-0.00561195332556963,
0.18468590080738068,
0.1622944176197052,
-0.14729461073875427,
0.025115182623267174,
-0.05077555403113365,
-0.08006945252418518,
-0.11111024022102356,
-0.09130322933197021,
-0.10081513226032257,
-0.15068954229354858,
-0.01496124267578125,
-0.11965283006429672,
0.04001060873270035,
0.04620897397398949,
0.01653970777988434,
0.0002623218751978129,
0.1445862203836441,
0.04075893387198448,
0.023079311475157738,
0.052309732884168625,
-0.0029216953553259373,
-0.043824080377817154,
-0.02870883233845234,
-0.08448407799005508,
0.02895146980881691,
-0.013583807274699211,
0.04091192036867142,
-0.0011072547640651464,
-0.0027579618617892265,
0.05546439811587334,
-0.01961744762957096,
-0.12019144743680954,
0.014756981283426285,
0.034735068678855896,
0.06313183158636093,
0.04032192379236221,
0.022336795926094055,
-0.0033388689626008272,
-0.007404383271932602,
0.20696811378002167,
-0.07801946997642517,
-0.06288088113069534,
-0.1081545278429985,
0.2337484061717987,
0.00884774699807167,
-0.03294113650918007,
0.01947205886244774,
-0.07880499958992004,
0.00913690123707056,
0.181210458278656,
0.15779082477092743,
-0.019358670338988304,
-0.003919410053640604,
-0.046972837299108505,
-0.012201392091810703,
-0.0416761189699173,
0.107706218957901,
0.12026834487915039,
0.0025540459901094437,
-0.06326358020305634,
-0.035592954605817795,
-0.051305461674928665,
-0.010678485967218876,
-0.06502880156040192,
0.07262595742940903,
0.013440313749015331,
0.002860026666894555,
-0.025311710312962532,
0.06222166121006012,
-0.019485119730234146,
-0.05108613520860672,
0.0012514075497165322,
-0.1980455070734024,
-0.15112994611263275,
-0.0015842573484405875,
0.09525816887617111,
-0.0218973346054554,
0.045034218579530716,
-0.006427586544305086,
0.010699852369725704,
0.06557680666446686,
-0.02100391685962677,
-0.05933181196451187,
-0.0781278908252716,
0.07714269310235977,
-0.16642840206623077,
0.20427480340003967,
-0.02483687363564968,
0.0267473291605711,
0.14546597003936768,
0.028050348162651062,
-0.12199324369430542,
0.07674708962440491,
0.045504581183195114,
-0.05747728794813156,
0.024333389475941658,
0.12646135687828064,
-0.03079926408827305,
0.10015774518251419,
0.04794524610042572,
-0.11440655589103699,
-0.01181810349225998,
-0.09389258176088333,
-0.02958603948354721,
-0.024973060935735703,
-0.039545029401779175,
-0.049931012094020844,
0.12692764401435852,
0.1687367707490921,
-0.050729766488075256,
0.0017514824867248535,
-0.05075788125395775,
0.02167186327278614,
0.07317686825990677,
-0.00506155751645565,
-0.03136763349175453,
-0.26867395639419556,
0.015635555610060692,
0.0934654101729393,
0.0009502703323960304,
-0.29234492778778076,
-0.08321619778871536,
-0.009031860157847404,
-0.036056939512491226,
-0.11139780282974243,
0.09026020765304565,
0.1131095141172409,
0.04359126463532448,
-0.07401403784751892,
-0.04045771807432175,
-0.06713926792144775,
0.1674027144908905,
-0.12061624974012375,
-0.07073422521352768
] |
null | null | transformers | # Necessary installations for running in Google Colab
!apt install swig cmake
!pip install -r https://raw.githubusercontent.com/huggingface/deep-rl-class/main/notebooks/unit1/requirements-unit1.txt
!sudo apt-get update
!sudo apt-get install -y python3-opengl
!apt install ffmpeg
!apt install xvfb
!pip3 install pyvirtualdisplay
# Setup a virtual display for rendering environments in Colab
import os
from pyvirtualdisplay import Display
virtual_display = Display(visible=0, size=(1400, 900))
virtual_display.start()
# Import necessary libraries
import gymnasium as gym
from stable_baselines3 import PPO
from stable_baselines3.common.env_util import make_vec_env
from stable_baselines3.common.monitor import Monitor
from stable_baselines3.common.evaluation import evaluate_policy
from huggingface_sb3 import package_to_hub, load_from_hub
from huggingface_hub import notebook_login
# Define the environment
env_id = "LunarLander-v2"
env = gym.make(env_id)
# SOLUTION: Parameters to accelerate the training of PPO model
model = PPO(
policy='MlpPolicy',
env=env,
n_steps=1024,
batch_size=64,
n_epochs=4,
gamma=0.999,
gae_lambda=0.98,
ent_coef=0.01,
verbose=1
)
# Train the PPO agent
model.learn(total_timesteps=1000000)
# Save the model
model_name = "lunarLander1"
model.save(model_name)
# Evaluate the agent
eval_env = Monitor(gym.make(env_id))
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward}")
# Login to Hugging Face and configure git
notebook_login()
!git config --global credential.helper store
# Define the Hugging Face Hub repository details
repo_id = "FTU/lunarLanderPPO"
commit_message = "The first deep reinforced learning model"
# Assuming 'model' is your trained model object, package and push it to the Hugging Face Hub
package_to_hub(
model=model,
model_name=model_name,
model_architecture="PPO",
env_id=env_id,
eval_env=eval_env,
repo_id=repo_id,
commit_message=commit_message
)
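# --- Illustrative follow-up, not part of the original notebook ---
# Assumption: package_to_hub stores the checkpoint as "<model_name>.zip" in the repo,
# so the filename below is inferred from model_name rather than confirmed.
checkpoint_path = load_from_hub(repo_id=repo_id, filename=f"{model_name}.zip")
reloaded_model = PPO.load(checkpoint_path)

# Re-evaluate the reloaded policy to sanity-check the round trip.
mean_reward, std_reward = evaluate_policy(reloaded_model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"reloaded mean_reward={mean_reward:.2f} +/- {std_reward}")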
| {} | null | FTU/lunarLanderPPO | [
"transformers",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:16:22+00:00 | [] | [] | TAGS
#transformers #endpoints_compatible #region-us
| # Necessary installations for running in Google Colab
!apt install swig cmake
!pip install -r URL
!sudo apt-get update
!sudo apt-get install -y python3-opengl
!apt install ffmpeg
!apt install xvfb
!pip3 install pyvirtualdisplay
# Setup a virtual display for rendering environments in Colab
import os
from pyvirtualdisplay import Display
virtual_display = Display(visible=0, size=(1400, 900))
virtual_display.start()
# Import necessary libraries
import gymnasium as gym
from stable_baselines3 import PPO
from stable_baselines3.common.env_util import make_vec_env
from stable_baselines3.common.monitor import Monitor
from stable_baselines3.common.evaluation import evaluate_policy
from huggingface_sb3 import package_to_hub, load_from_hub
from huggingface_hub import notebook_login
# Define the environment
env_id = "LunarLander-v2"
env = URL(env_id)
# SOLUTION: Parameters to accelerate the training of PPO model
model = PPO(
policy='MlpPolicy',
env=env,
n_steps=1024,
batch_size=64,
n_epochs=4,
gamma=0.999,
gae_lambda=0.98,
ent_coef=0.01,
verbose=1
)
# Train the PPO agent
URL(total_timesteps=1000000)
# Save the model
model_name = "lunarLander1"
URL(model_name)
# Evaluate the agent
eval_env = Monitor(URL(env_id))
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward}")
# Login to Hugging Face and configure git
notebook_login()
!git config --global URL store
# Define the Hugging Face Hub repository details
repo_id = "FTU/lunarLanderPPO"
commit_message = "The first deep reinforced learning model"
# Assuming 'model' is your trained model object, package and push it to the Hugging Face Hub
package_to_hub(
model=model,
model_name=model_name,
model_architecture="PPO",
env_id=env_id,
eval_env=eval_env,
repo_id=repo_id,
commit_message=commit_message
)
| [
"# Necessary installations for running in Google Colab\n!apt install swig cmake\n!pip install -r URL\n!sudo apt-get update\n!sudo apt-get install -y python3-opengl\n!apt install ffmpeg\n!apt install xvfb\n!pip3 install pyvirtualdisplay",
"# Setup a virtual display for rendering environments in Colab\nimport os\nfrom pyvirtualdisplay import Display\n\nvirtual_display = Display(visible=0, size=(1400, 900))\nvirtual_display.start()",
"# Import necessary libraries\nimport gymnasium as gym\nfrom stable_baselines3 import PPO\nfrom stable_baselines3.common.env_util import make_vec_env\nfrom stable_baselines3.common.monitor import Monitor\nfrom stable_baselines3.common.evaluation import evaluate_policy\nfrom huggingface_sb3 import package_to_hub, load_from_hub\nfrom huggingface_hub import notebook_login",
"# Define the environment\nenv_id = \"LunarLander-v2\"\nenv = URL(env_id)",
"# SOLUTION: Parameters to accelerate the training of PPO model\nmodel = PPO(\n policy='MlpPolicy',\n env=env,\n n_steps=1024,\n batch_size=64,\n n_epochs=4,\n gamma=0.999,\n gae_lambda=0.98,\n ent_coef=0.01,\n verbose=1\n)",
"# Train the PPO agent\nURL(total_timesteps=1000000)",
"# Save the model\nmodel_name = \"lunarLander1\"\nURL(model_name)",
"# Evaluate the agent\neval_env = Monitor(URL(env_id))\nmean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)\nprint(f\"mean_reward={mean_reward:.2f} +/- {std_reward}\")",
"# Login to Hugging Face and configure git\nnotebook_login()\n!git config --global URL store",
"# Define the Hugging Face Hub repository details\nrepo_id = \"FTU/lunarLanderPPO\"\ncommit_message = \"The first deep reinforced learning model\"",
"# Assuming 'model' is your trained model object, package and push it to the Hugging Face Hub\npackage_to_hub(\n model=model,\n model_name=model_name,\n model_architecture=\"PPO\",\n env_id=env_id,\n eval_env=eval_env,\n repo_id=repo_id,\n commit_message=commit_message\n)"
] | [
"TAGS\n#transformers #endpoints_compatible #region-us \n",
"# Necessary installations for running in Google Colab\n!apt install swig cmake\n!pip install -r URL\n!sudo apt-get update\n!sudo apt-get install -y python3-opengl\n!apt install ffmpeg\n!apt install xvfb\n!pip3 install pyvirtualdisplay",
"# Setup a virtual display for rendering environments in Colab\nimport os\nfrom pyvirtualdisplay import Display\n\nvirtual_display = Display(visible=0, size=(1400, 900))\nvirtual_display.start()",
"# Import necessary libraries\nimport gymnasium as gym\nfrom stable_baselines3 import PPO\nfrom stable_baselines3.common.env_util import make_vec_env\nfrom stable_baselines3.common.monitor import Monitor\nfrom stable_baselines3.common.evaluation import evaluate_policy\nfrom huggingface_sb3 import package_to_hub, load_from_hub\nfrom huggingface_hub import notebook_login",
"# Define the environment\nenv_id = \"LunarLander-v2\"\nenv = URL(env_id)",
"# SOLUTION: Parameters to accelerate the training of PPO model\nmodel = PPO(\n policy='MlpPolicy',\n env=env,\n n_steps=1024,\n batch_size=64,\n n_epochs=4,\n gamma=0.999,\n gae_lambda=0.98,\n ent_coef=0.01,\n verbose=1\n)",
"# Train the PPO agent\nURL(total_timesteps=1000000)",
"# Save the model\nmodel_name = \"lunarLander1\"\nURL(model_name)",
"# Evaluate the agent\neval_env = Monitor(URL(env_id))\nmean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)\nprint(f\"mean_reward={mean_reward:.2f} +/- {std_reward}\")",
"# Login to Hugging Face and configure git\nnotebook_login()\n!git config --global URL store",
"# Define the Hugging Face Hub repository details\nrepo_id = \"FTU/lunarLanderPPO\"\ncommit_message = \"The first deep reinforced learning model\"",
"# Assuming 'model' is your trained model object, package and push it to the Hugging Face Hub\npackage_to_hub(\n model=model,\n model_name=model_name,\n model_architecture=\"PPO\",\n env_id=env_id,\n eval_env=eval_env,\n repo_id=repo_id,\n commit_message=commit_message\n)"
] | [
17,
75,
50,
103,
29,
86,
17,
21,
93,
23,
43,
92
] | [
"passage: TAGS\n#transformers #endpoints_compatible #region-us \n# Necessary installations for running in Google Colab\n!apt install swig cmake\n!pip install -r URL\n!sudo apt-get update\n!sudo apt-get install -y python3-opengl\n!apt install ffmpeg\n!apt install xvfb\n!pip3 install pyvirtualdisplay# Setup a virtual display for rendering environments in Colab\nimport os\nfrom pyvirtualdisplay import Display\n\nvirtual_display = Display(visible=0, size=(1400, 900))\nvirtual_display.start()# Import necessary libraries\nimport gymnasium as gym\nfrom stable_baselines3 import PPO\nfrom stable_baselines3.common.env_util import make_vec_env\nfrom stable_baselines3.common.monitor import Monitor\nfrom stable_baselines3.common.evaluation import evaluate_policy\nfrom huggingface_sb3 import package_to_hub, load_from_hub\nfrom huggingface_hub import notebook_login# Define the environment\nenv_id = \"LunarLander-v2\"\nenv = URL(env_id)# SOLUTION: Parameters to accelerate the training of PPO model\nmodel = PPO(\n policy='MlpPolicy',\n env=env,\n n_steps=1024,\n batch_size=64,\n n_epochs=4,\n gamma=0.999,\n gae_lambda=0.98,\n ent_coef=0.01,\n verbose=1\n)# Train the PPO agent\nURL(total_timesteps=1000000)# Save the model\nmodel_name = \"lunarLander1\"\nURL(model_name)# Evaluate the agent\neval_env = Monitor(URL(env_id))\nmean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)\nprint(f\"mean_reward={mean_reward:.2f} +/- {std_reward}\")"
] | [
-0.06963688135147095,
0.19220958650112152,
-0.007916860282421112,
0.03517264127731323,
0.1445450335741043,
0.03047131560742855,
0.09424978494644165,
0.13149836659431458,
0.0341273695230484,
0.10861429572105408,
0.02681245468556881,
0.11315413564443588,
0.07792917639017105,
0.1816299855709076,
0.041919853538274765,
-0.2030896544456482,
0.055132828652858734,
-0.06737969815731049,
0.0005056443624198437,
0.062040966004133224,
0.04081902652978897,
-0.051707275211811066,
0.08162340521812439,
0.05901845544576645,
-0.031068390235304832,
-0.002273190300911665,
-0.05838629975914955,
-0.054018158465623856,
0.02207033894956112,
0.03190556913614273,
0.04462766647338867,
-0.01713462546467781,
0.08781146258115768,
-0.2454437017440796,
-0.010955401696264744,
0.11259859800338745,
0.014194704592227936,
0.07482615858316422,
0.18818670511245728,
0.0008222329779528081,
0.10191081464290619,
-0.09568878263235092,
0.06793827563524246,
0.07355964183807373,
-0.08338712155818939,
-0.16554555296897888,
-0.11329418420791626,
0.06709463149309158,
0.12644602358341217,
0.10489410907030106,
-0.017154935747385025,
0.10350070893764496,
-0.042740222066640854,
0.03924771025776863,
0.14680218696594238,
-0.21348188817501068,
-0.01656954176723957,
0.02440827153623104,
0.01431361399590969,
-0.018574722111225128,
-0.047708842903375626,
-0.040176354348659515,
-0.023204533383250237,
0.007050462998449802,
0.02120732143521309,
-0.036875657737255096,
-0.050033584237098694,
-0.031335894018411636,
-0.10472268611192703,
-0.030390074476599693,
0.13526207208633423,
0.03104536421597004,
-0.02696053311228752,
-0.11603162437677383,
-0.026521839201450348,
-0.06629080325365067,
-0.010926959104835987,
-0.027772963047027588,
0.03781406581401825,
-0.00641576386988163,
0.09242478013038635,
-0.07033805549144745,
-0.06367862224578857,
-0.02612709440290928,
0.007516387850046158,
0.023237058892846107,
0.03701947256922722,
-0.011944599449634552,
0.01165003888309002,
0.13415464758872986,
0.03556811437010765,
-0.09936611354351044,
0.014143547043204308,
-0.049900833517313004,
-0.15020447969436646,
-0.03777895122766495,
-0.01145730260759592,
-0.054776109755039215,
-0.006867536809295416,
0.12257477641105652,
0.004549665376543999,
0.07145985215902328,
0.01731359027326107,
0.023880094289779663,
0.008577685803174973,
0.20439088344573975,
-0.09570344537496567,
-0.1063249483704567,
-0.0484292171895504,
0.044361382722854614,
0.01469512190669775,
0.054648566991090775,
-0.011745716445147991,
-0.023841610178351402,
-0.017333734780550003,
0.028974460437893867,
0.015608502551913261,
0.047129612416028976,
-0.10153914988040924,
-0.021722059696912766,
0.10663538426160812,
-0.09942449629306793,
0.024393389001488686,
0.023229749873280525,
-0.09275180101394653,
0.12491032481193542,
0.10468504577875137,
-0.011835061013698578,
-0.10779910534620285,
0.077930748462677,
-0.07259530574083328,
0.02716919220983982,
-0.03428208455443382,
-0.06005093455314636,
0.023327158764004707,
-0.05238175019621849,
-0.0007419352768920362,
-0.06618107855319977,
-0.14779360592365265,
-0.05653578042984009,
0.06459735333919525,
-0.049761898815631866,
-0.011219847947359085,
-0.024331508204340935,
0.0077590057626366615,
-0.06184423342347145,
0.011698592454195023,
-0.03091420978307724,
-0.04690840467810631,
-0.02481001242995262,
-0.028238115832209587,
0.018168434500694275,
0.1384824812412262,
0.030696194618940353,
-0.06148572266101837,
0.039072755724191666,
-0.19634756445884705,
0.09909885376691818,
-0.09889401495456696,
0.0686681866645813,
-0.14094656705856323,
-0.05172278359532356,
0.025231704115867615,
0.021502148360013962,
0.028672192245721817,
0.13293318450450897,
-0.13690389692783356,
-0.02335979975759983,
0.16398969292640686,
-0.050280652940273285,
-0.054393667727708817,
0.01232232991605997,
0.0012882495066151023,
0.028511041775345802,
0.04368196427822113,
0.12188497185707092,
0.11118779331445694,
-0.24422428011894226,
0.019901100546121597,
0.0035856757313013077,
-0.06360554695129395,
0.026705263182520866,
0.037563908845186234,
-0.10634142905473709,
0.07087111473083496,
0.07517698407173157,
-0.0985175296664238,
0.08778675645589828,
0.013720101676881313,
-0.017765019088983536,
0.005318921059370041,
-0.04436320438981056,
-0.035237062722444534,
-0.018968237563967705,
-0.007038835436105728,
0.016257384791970253,
-0.10726051777601242,
0.07735253870487213,
0.11555123329162598,
-0.024662431329488754,
-0.015355370007455349,
-0.020968561992049217,
0.030500708147883415,
-0.0011258284794166684,
-0.011725477874279022,
-0.11127348989248276,
-0.13727828860282898,
0.03553105890750885,
-0.17463397979736328,
0.020414795726537704,
0.04877656698226929,
0.0480746366083622,
0.0481247715651989,
0.03993416205048561,
-0.08552000671625137,
-0.011158083565533161,
-0.019890720024704933,
-0.003762277076020837,
-0.0805656760931015,
-0.03613932803273201,
-0.012943893671035767,
0.22025497257709503,
-0.1535376012325287,
0.03560740128159523,
-0.08466053754091263,
0.1464804857969284,
0.024579374119639397,
-0.06629013270139694,
0.0636577233672142,
-0.0550852008163929,
0.025447294116020203,
-0.0921352207660675,
-0.0028553379233926535,
0.02463173121213913,
0.026006177067756653,
0.020453354343771935,
-0.09977816045284271,
-0.06971055269241333,
0.09593082219362259,
0.03363291919231415,
-0.07807303965091705,
-0.0877397358417511,
-0.0517343133687973,
-0.030723197385668755,
-0.022647468373179436,
-0.015238303691148758,
0.10200726240873337,
0.09952551126480103,
0.10798672586679459,
-0.04622318595647812,
-0.058460891246795654,
0.0058922553434967995,
-0.0594547763466835,
-0.024642402306199074,
0.07307591289281845,
0.08299628645181656,
0.0429738312959671,
0.04066726937890053,
0.025160280987620354,
-0.016404252499341965,
0.1069175973534584,
0.021390492096543312,
-0.0894259512424469,
-0.03330004960298538,
0.04130350053310394,
-0.00589100131765008,
0.00944566074758768,
0.012268466874957085,
0.06897379457950592,
0.055273815989494324,
-0.021604688838124275,
0.06965233385562897,
-0.09012802690267563,
0.015278677456080914,
0.021542921662330627,
-0.014808581210672855,
0.04701937362551689,
0.01584942825138569,
0.02911819890141487,
0.012234292924404144,
-0.031118879094719887,
0.048730477690696716,
-0.015456268563866615,
-0.0011011926690116525,
-0.07755976170301437,
0.11196134984493256,
-0.1228078082203865,
-0.1760774552822113,
-0.1833818554878235,
-0.04099991172552109,
-0.04599452763795853,
-0.004124317783862352,
-0.007882311940193176,
-0.0649770051240921,
-0.11521061509847641,
-0.07633049041032791,
0.041239745914936066,
0.030674001201987267,
-0.09310534596443176,
-0.047909192740917206,
0.09415362030267715,
0.06648537516593933,
-0.0872916504740715,
-0.009529633447527885,
-0.01029499713331461,
-0.015418575145304203,
-0.013393638655543327,
0.01074265968054533,
0.05391016602516174,
0.07121087610721588,
0.03588711470365524,
0.02230236679315567,
0.030993467196822166,
0.18505224585533142,
-0.051947955042123795,
0.049604736268520355,
0.16901907324790955,
0.04982820525765419,
0.06002228334546089,
0.11195104569196701,
0.004198658745735884,
-0.011267431080341339,
0.009460891596972942,
0.035471681505441666,
0.023702042177319527,
-0.2887578308582306,
-0.09733586013317108,
-0.051405712962150574,
-0.055520884692668915,
0.10828236490488052,
0.02795949950814247,
0.007518954575061798,
0.07468371093273163,
-0.004073490388691425,
0.024388395249843597,
0.020281212404370308,
0.09947595745325089,
0.1330011487007141,
0.01837630569934845,
0.03604568913578987,
-0.0032422442454844713,
0.012365990318357944,
0.057959482073783875,
0.04278181865811348,
0.15113112330436707,
-0.05544840916991234,
0.06713403016328812,
0.026223450899124146,
0.10256391763687134,
-0.04793200269341469,
0.08347972482442856,
0.04796506464481354,
0.018377261236310005,
0.020748239010572433,
-0.07261347770690918,
-0.07249747961759567,
0.08295020461082458,
0.03888972848653793,
-0.06100045144557953,
-0.02269533835351467,
0.01989579014480114,
0.054614197462797165,
0.11879415810108185,
0.009042593650519848,
-0.26264098286628723,
0.03078867308795452,
0.005119368899613619,
0.03919028490781784,
-0.0795259103178978,
-0.01302300114184618,
0.002407411579042673,
-0.14121127128601074,
0.04149526730179787,
-0.03828902542591095,
0.06963982433080673,
-0.0846211165189743,
-0.012055641040205956,
0.09403606504201889,
0.18819980323314667,
0.027853405103087425,
0.052300795912742615,
-0.012435708194971085,
0.09760496765375137,
0.00827899668365717,
0.05791531130671501,
-0.014238421805202961,
0.11377942562103271,
0.0031097084283828735,
-0.03947213292121887,
0.12378373742103577,
0.01695806346833706,
0.01820584014058113,
-0.14488485455513,
-0.0637562945485115,
-0.03071642480790615,
0.04374613240361214,
-0.0018130602547898889,
0.07656049728393555,
-0.020921221002936363,
-0.05059704929590225,
-0.046623677015304565,
-0.06627971678972244,
-0.11958624422550201,
-0.17424149811267853,
0.05713232234120369,
-0.08433258533477783,
0.05097140371799469,
-0.05264171212911606,
-0.026518117636442184,
-0.15479207038879395,
0.17888091504573822,
0.011474225670099258,
-0.12103765457868576,
-0.1349315643310547,
-0.05969925597310066,
0.1994582712650299,
-0.06247392296791077,
0.06904890388250351,
-0.036834802478551865,
0.0928129255771637,
-0.01610245555639267,
-0.05419077351689339,
0.02187778614461422,
-0.10557381808757782,
-0.10875923931598663,
-0.06668837368488312,
0.10735368728637695,
0.0274247657507658,
-0.016566120088100433,
0.017554238438606262,
-0.03505971282720566,
-0.04267922043800354,
-0.09785144031047821,
-0.027042867615818977,
0.14569315314292908,
0.036234159022569656,
0.015498441644012928,
-0.10351382195949554,
-0.06076828017830849,
-0.0718977078795433,
0.039042871445417404,
0.02116532064974308,
0.1954181045293808,
-0.060669753700494766,
0.05914747342467308,
0.06843166798353195,
-0.05077153444290161,
-0.036286383867263794,
0.009016819298267365,
0.12630970776081085,
-0.03986165300011635,
0.13098622858524323,
-0.15415959060192108,
0.1503930538892746,
0.14642265439033508,
-0.0014338813489302993,
0.05291726440191269,
-0.17719212174415588,
-0.06953606009483337,
0.07336131483316422,
0.035913266241550446,
-0.23546773195266724,
-0.06980140507221222,
-0.05713086202740669,
-0.015544110909104347,
-0.21658092737197876,
0.05050712451338768,
0.030601223930716515,
0.04843975976109505,
0.0074968645349144936,
0.0632096529006958,
0.06287768483161926,
-0.06345875561237335,
0.14727941155433655,
0.019676631316542625,
0.015560159459710121,
-0.02945798821747303,
0.05669846013188362,
0.01999817043542862,
-0.07222982496023178,
0.14073467254638672,
-0.01590200699865818,
0.03679583594202995,
-0.18600159883499146,
-0.020673837512731552,
-0.053357772529125214,
0.05965564027428627,
-0.028548896312713623,
-0.05071917921304703,
-0.0009758489322848618,
0.055136602371931076,
0.10637608170509338,
0.00800290983170271,
-0.018372010439634323,
0.013544764369726181,
-0.02362128719687462,
0.18278004229068756,
-0.030657107010483742,
0.0877370536327362,
-0.17973794043064117,
-0.024441620334982872,
-0.0007270295172929764,
-0.01422818098217249,
-0.03260068967938423,
0.01457749493420124,
0.11907579749822617,
-0.03583485633134842,
0.10283682495355606,
-0.015959380194544792,
-0.1133611798286438,
-0.023573676124215126,
0.04418940842151642,
-0.0978214293718338,
0.005302654579281807,
-0.0208475012332201,
0.038420405238866806,
-0.0301387757062912,
-0.040045756846666336,
0.13459345698356628,
0.0014634474646300077,
-0.03357839211821556,
0.019302187487483025,
0.047637760639190674,
-0.016453109681606293,
0.17260655760765076,
0.00716078607365489,
0.02310789003968239,
-0.08261864632368088,
0.12041497975587845,
0.05640152469277382,
-0.04230368137359619,
0.00011791629367507994,
0.12199028581380844,
-0.095799520611763,
-0.04959152638912201,
-0.0399545393884182,
0.02121674455702305,
0.0004492060106713325,
-0.015100616030395031,
-0.0004393510171212256,
-0.0484342947602272,
0.02199767902493477,
-0.11462213099002838,
0.02173137106001377,
-0.025539632886648178,
-0.022285692393779755,
-0.02607692778110504,
-0.06251214444637299,
0.04449186846613884,
0.14589793980121613,
0.021685712039470673,
-0.10053599625825882,
0.07872405648231506,
0.015537431463599205,
-0.0034315907396376133,
0.017756259068846703,
-0.00850555021315813,
-0.09929073601961136,
-0.00897421408444643,
-0.09141859412193298,
0.014084470458328724,
-0.10366711765527725,
-0.010701127350330353,
0.002488349797204137,
0.02641376107931137,
-0.008471652865409851,
0.007335626985877752,
-0.0838761180639267,
-0.08761324733495712,
-0.03300852328538895,
0.11302857100963593,
-0.11227592825889587,
-0.009780858643352985,
0.029908305034041405,
-0.11366237699985504,
0.04863927140831947,
0.05621135234832764,
0.026896486058831215,
0.010447605513036251,
-0.11287304013967514,
-0.050480637699365616,
0.05244046822190285,
0.035536039620637894,
0.04974810779094696,
-0.09233853965997696,
0.026515044271945953,
-0.029804565012454987,
-0.05804045870900154,
-0.07811476290225983,
-0.02579781971871853,
-0.13339227437973022,
0.03505087271332741,
0.01873401179909706,
-0.042359307408332825,
-0.07952182739973068,
0.010214583948254585,
0.09439258277416229,
0.017847158014774323,
0.12412727624177933,
-0.018267003819346428,
0.05802908539772034,
-0.1597953736782074,
-0.041471514850854874,
0.003510851413011551,
0.031030287966132164,
0.039692219346761703,
-0.03039771504700184,
0.05527925491333008,
0.016812613233923912,
0.12887990474700928,
0.029729796573519707,
0.06090190261602402,
-0.0032847649417817593,
-0.006935500539839268,
0.016059551388025284,
-0.0049685887061059475,
-0.007239520084112883,
0.010578008368611336,
0.0520600825548172,
0.007215763907879591,
-0.04900140315294266,
0.03127161040902138,
-0.09809641540050507,
-0.004138403106480837,
0.10827191919088364,
-0.051433760672807693,
0.08561379462480545,
0.005359708331525326,
-0.11146795004606247,
-0.09044922888278961,
0.062183964997529984,
-0.06103985011577606,
0.07442431151866913,
-0.05587594956159592,
0.13443809747695923,
0.06265150010585785,
-0.154693603515625,
0.040271248668432236,
0.06042436137795448,
-0.05315902829170227,
-0.05141136422753334,
-0.09253624081611633,
-0.011192367412149906,
-0.10840116441249847,
0.045950133353471756,
-0.06132020428776741,
0.047523144632577896,
0.024673879146575928,
0.013022012077271938,
-0.0016490031266584992,
0.06038380041718483,
0.01239082682877779,
-0.13076946139335632,
0.003253126982599497,
0.03030288964509964,
0.0014207252534106374,
-0.02604878507554531,
-0.0120284678414464,
-0.015542490407824516,
0.04224367067217827,
0.04617713391780853,
0.02397209405899048,
0.01875811070203781,
0.02683277241885662,
0.005571546498686075,
-0.0287553071975708,
0.008896888233721256,
-0.019240176305174828,
-0.0942753255367279,
0.10960298776626587,
0.04034401476383209,
0.01690051145851612,
-0.04893378168344498,
0.13320299983024597,
-0.0586823895573616,
-0.022295791655778885,
-0.13991861045360565,
0.11713352054357529,
0.06407574564218521,
0.048079680651426315,
-0.020071152597665787,
-0.09123100340366364,
-0.06139300391077995,
0.21513161063194275,
0.14395646750926971,
-0.019376572221517563,
-0.04195467382669449,
0.016052115708589554,
-0.01826249435544014,
-0.04611518979072571,
0.07179625332355499,
0.052405036985874176,
0.13217101991176605,
-0.037877362221479416,
0.13889726996421814,
-0.02212960459291935,
-0.049384817481040955,
-0.04851628839969635,
-0.0007288653287105262,
-0.040562037378549576,
0.005457542836666107,
0.004513839725404978,
0.09325966984033585,
-0.09456615895032883,
-0.15384943783283234,
0.022917531430721283,
-0.03581756353378296,
-0.12657010555267334,
-0.02850111946463585,
0.04183639585971832,
-0.004092765040695667,
0.08521362394094467,
-0.013009694404900074,
-0.0473221018910408,
0.2662406861782074,
-0.02185973711311817,
-0.08559349924325943,
-0.08853120356798172,
0.013199305161833763,
-0.049400679767131805,
0.211677685379982,
0.014820646494626999,
-0.016547948122024536,
0.06640314310789108,
0.008654211647808552,
-0.1466364562511444,
0.04705427214503288,
0.03421039134263992,
-0.14664074778556824,
0.015472138300538063,
0.04279173165559769,
-0.029490012675523758,
0.02456381916999817,
0.02833370864391327,
-0.04485740140080452,
0.027414996176958084,
0.017893973737955093,
0.04229152947664261,
-0.0988948792219162,
0.011197136715054512,
-0.06888056546449661,
0.10427165776491165,
0.1723640114068985,
0.01087367907166481,
0.025555511936545372,
-0.05729641392827034,
0.06312385201454163,
0.047061488032341,
0.06194094941020012,
-0.01575443148612976,
-0.08630769699811935,
0.050210386514663696,
0.0387570783495903,
0.03088856302201748,
-0.04343597590923309,
-0.04731707274913788,
-0.035848382860422134,
-0.0847383588552475,
-0.014013003557920456,
0.12317151576280594,
-0.013108215294778347,
0.025817984715104103,
0.015699883922934532,
-0.05153605714440346,
0.0035087266005575657,
0.11378219723701477,
-0.0995204746723175,
-0.04894336685538292
] |
null | null | transformers |
# <b>AceGPT</b>
[AceGPT](https://huggingface.co/FreedomIntelligence/AceGPT-13B-chat) is a fully fine-tuned generative text model collection based on LlaMA2, particularly in the
Arabic language domain. This is the repository for the 13B-chat pre-trained model.
## Model Developers
We are from the School of Data Science, the Chinese University of Hong Kong, Shenzhen (CUHKSZ), the Shenzhen Research Institute of Big Data (SRIBD), and the King Abdullah University of Science and Technology (KAUST).
## Variations
AceGPT comes in a range of parameter sizes, 7B and 13B; each size is available as a base model and a -chat model.
## Input
Models input text only.
## Output
Models output text only.
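As a hedged local-inference sketch (not taken from the upstream AceGPT documentation), a GGUF build such as this one can typically be run with `llama-cpp-python`; the exact `.gguf` filename and quantization below are assumptions, so check the repository's file listing before use.

```python
# Illustrative only: local chat inference on a GGUF build via llama-cpp-python.
from llama_cpp import Llama

# Assumption: replace the path with the actual .gguf file shipped in this repository.
llm = Llama(model_path="./acegpt-13b-chat.Q4_K_M.gguf", n_ctx=2048)

messages = [{"role": "user", "content": "ما هي عاصمة المملكة العربية السعودية؟"}]  # "What is the capital of Saudi Arabia?"
reply = llm.create_chat_completion(messages=messages, max_tokens=128)
print(reply["choices"][0]["message"]["content"])
```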
## Model Evaluation Results
Experiments on Arabic Vicuna-80 and Arabic AlpacaEval. Numbers are the average performance ratio relative to ChatGPT over three runs. We do not report results for the raw Llama-2 models since they cannot properly generate Arabic text.
| | Arabic Vicuna-80 | Arabic AlpacaEval |
|------------------------------|--------------------|---------------------|
| Phoenix Chen et al. (2023a) | 71.92% ± 0.2% | 65.62% ± 0.3% |
| Phoenix–multiple-langs Chen et al. (2023b) | 71.67% ± 0.7% | 65.36% ± 0.1% |
| Jais-13B-chat Sengupta et al. (2023) | 75.40% ± 1.6% | 74.95% ± 0.2% |
| AceGPT-7B-chat | 94.82% ± 0.2% | 93.81% ± 0.1% |
| AceGPT-13B-chat | 100.88% ± 0.4% | 97.95% ± 0.1% |
# You can get more detail at https://github.com/FreedomIntelligence/AceGPT/tree/main | {"language": ["ar", "en"], "license": "apache-2.0", "library_name": "transformers", "pipeline_tag": "text-generation"} | text-generation | MhmdSyd/AceGPT-13B-chat-GGUF | [
"transformers",
"gguf",
"text-generation",
"ar",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:16:44+00:00 | [] | [
"ar",
"en"
] | TAGS
#transformers #gguf #text-generation #ar #en #license-apache-2.0 #endpoints_compatible #region-us
| **AceGPT**
==========
AceGPT is a fully fine-tuned generative text model collection based on LlaMA2, particularly in the
Arabic language domain. This is the repository for the 13B-chat pre-trained model.
Model Developers
----------------
We are from the School of Data Science, the Chinese University of Hong Kong, Shenzhen (CUHKSZ), the Shenzhen Research Institute of Big Data (SRIBD), and the King Abdullah University of Science and Technology (KAUST).
Variations
----------
AceGPT comes in a range of parameter sizes, 7B and 13B; each size is available as a base model and a -chat model.
Input
-----
Models input text only.
Output
------
Models output text only.
Model Evaluation Results
------------------------
Experiments on Arabic Vicuna-80 and Arabic AlpacaEval. Numbers are the average performance ratio relative to ChatGPT over three runs. We do not report results for the raw Llama-2 models since they cannot properly generate Arabic text.
Model: Phoenix Chen et al. (2023a), Arabic Vicuna-80: 71.92% ± 0.2%, Arabic AlpacaEval: 65.62% ± 0.3%
Model: Phoenix–multiple-langs Chen et al. (2023b), Arabic Vicuna-80: 71.67% ± 0.7%, Arabic AlpacaEval: 65.36% ± 0.1%
Model: Jais-13B-chat Sengupta et al. (2023), Arabic Vicuna-80: 75.40% ± 1.6%, Arabic AlpacaEval: 74.95% ± 0.2%
Model: AceGPT-7B-chat, Arabic Vicuna-80: 94.82% ± 0.2%, Arabic AlpacaEval: 93.81% ± 0.1%
Model: AceGPT-13B-chat, Arabic Vicuna-80: 100.88% ± 0.4%, Arabic AlpacaEval: 97.95% ± 0.1%
You can get more detail at URL
==============================
| [] | [
"TAGS\n#transformers #gguf #text-generation #ar #en #license-apache-2.0 #endpoints_compatible #region-us \n"
] | [
37
] | [
"passage: TAGS\n#transformers #gguf #text-generation #ar #en #license-apache-2.0 #endpoints_compatible #region-us \n"
] | [
-0.023276561871170998,
0.0755726769566536,
-0.006161889061331749,
-0.024881955236196518,
0.08120552450418472,
0.012826803140342236,
0.11012514680624008,
0.11240921914577484,
0.026647957041859627,
-0.035186074674129486,
0.1699749380350113,
0.14615756273269653,
0.019152997061610222,
0.03730889409780502,
-0.0564739890396595,
-0.17964649200439453,
0.11766041815280914,
0.03436604142189026,
-0.1072850376367569,
0.05578524246811867,
0.10768216103315353,
0.02902216464281082,
0.07210594415664673,
0.0044547077268362045,
-0.09302859008312225,
0.011177112348377705,
0.03264320269227028,
-0.08118116110563278,
0.06836853921413422,
0.09945820271968842,
0.005125577095896006,
0.029813870787620544,
-0.08626099675893784,
-0.23556886613368988,
0.01906028762459755,
0.008821442723274231,
-0.09929205477237701,
0.02631736733019352,
0.005246378481388092,
-0.03188706561923027,
0.10539618134498596,
0.0432116836309433,
-0.11966506391763687,
0.0844283178448677,
-0.0974518284201622,
-0.20444521307945251,
-0.11112808436155319,
0.03599478304386139,
0.05702062323689461,
0.0855056568980217,
0.03454052656888962,
-0.012952452525496483,
-0.03944520652294159,
0.01942574977874756,
0.17366904020309448,
-0.3666975796222687,
0.000030261351639637724,
0.1262531727552414,
0.04290684685111046,
0.017733413726091385,
-0.036313753575086594,
0.10228525847196579,
0.06159699335694313,
-0.001606145640835166,
-0.01674288883805275,
-0.06434156745672226,
-0.030184496194124222,
0.09559721499681473,
-0.04447653889656067,
-0.08252603560686111,
0.2524309456348419,
0.009154530242085457,
0.001738762017339468,
0.007597954012453556,
-0.04829477518796921,
0.054180145263671875,
-0.05060447007417679,
0.07925127446651459,
0.03214262053370476,
0.1779118776321411,
0.09461621195077896,
-0.09826743602752686,
-0.1262405663728714,
-0.0313192680478096,
-0.1808099001646042,
0.12837809324264526,
-0.012636539526283741,
0.0927920788526535,
-0.13916346430778503,
0.0233768317848444,
-0.1355164796113968,
-0.09756426513195038,
-0.0834740549325943,
-0.073096364736557,
0.12618373334407806,
0.09244322031736374,
-0.1018533930182457,
0.05112073943018913,
0.20162838697433472,
0.22035354375839233,
-0.0005404280382208526,
0.028539910912513733,
-0.04374666139483452,
0.12240727245807648,
-0.07526219636201859,
0.044619590044021606,
0.008299203589558601,
-0.04271223023533821,
0.10794772207736969,
-0.18684855103492737,
0.060907550156116486,
-0.03239995241165161,
-0.14956296980381012,
-0.05589922517538071,
-0.09366301447153091,
0.09011004865169525,
0.049844834953546524,
0.0310314130038023,
-0.003411100013181567,
0.054260481148958206,
0.11353107541799545,
-0.032921358942985535,
-0.028537865728139877,
-0.017553018406033516,
0.05227160453796387,
0.07684217393398285,
0.04170374572277069,
0.051371023058891296,
-0.0271091777831316,
0.029358381405472755,
-0.04283921793103218,
-0.03249257430434227,
-0.0011931628687307239,
0.04399256408214569,
0.07648544758558273,
-0.08442535251379013,
0.06592680513858795,
-0.11571988463401794,
-0.17131942510604858,
0.03516492620110512,
0.08762071281671524,
-0.009711164981126785,
-0.04840615764260292,
0.03344252333045006,
-0.05298330634832382,
0.06516547501087189,
-0.08567200601100922,
-0.011578695848584175,
-0.09015173465013504,
0.050549622625112534,
-0.057988304644823074,
0.027588486671447754,
-0.22725775837898254,
0.05632519721984863,
-0.04648052901029587,
0.008697699755430222,
-0.07950690388679504,
0.019401585683226585,
-0.0945499986410141,
0.19708122313022614,
-0.07100702822208405,
0.007348526734858751,
-0.061711832880973816,
0.01140629407018423,
-0.04795625060796738,
0.13346447050571442,
-0.04932761937379837,
-0.05505289137363434,
0.2534349858760834,
-0.10417250543832779,
-0.2009427398443222,
0.0728919580578804,
0.049147386103868484,
-0.021988477557897568,
0.055969204753637314,
0.20028777420520782,
0.03396109119057655,
-0.013380303978919983,
0.06705010682344437,
0.19640164077281952,
-0.06364299356937408,
-0.17658527195453644,
0.08604857325553894,
-0.09695585817098618,
-0.11149213463068008,
0.04729445278644562,
-0.08463636785745621,
0.13707362115383148,
0.036220014095306396,
-0.05430053547024727,
-0.08299712091684341,
-0.0463794507086277,
-0.05122124403715134,
-0.036179400980472565,
0.0507967546582222,
-0.05420916900038719,
-0.01873035915195942,
-0.12482638657093048,
0.011087517254054546,
0.024356016889214516,
0.07711831480264664,
-0.05584484338760376,
0.06850255280733109,
-0.020857073366642,
0.08428364247083664,
-0.06785191595554352,
-0.009543395601212978,
-0.04039030894637108,
0.007378318812698126,
-0.005081642884761095,
0.05538870021700859,
0.06092728674411774,
-0.10831113904714584,
-0.015312997624278069,
0.05108857899904251,
0.11120794713497162,
0.017519133165478706,
0.017132535576820374,
-0.11558470875024796,
0.09494146704673767,
-0.014402735978364944,
0.02385394275188446,
-0.01769186556339264,
0.022176822647452354,
0.11539532244205475,
0.04995183274149895,
-0.06437547504901886,
0.04627485200762749,
-0.015688680112361908,
-0.07845830172300339,
-0.05363249033689499,
-0.029876859858632088,
0.08972707390785217,
0.06854763627052307,
-0.13815072178840637,
0.2499186396598816,
-0.03149867802858353,
0.23979775607585907,
0.1936594545841217,
-0.13415829837322235,
0.1049293652176857,
-0.016649441793560982,
-0.012849769555032253,
0.01174030639231205,
0.07040535658597946,
-0.008552596904337406,
0.061862751841545105,
0.00261153606697917,
0.12466277182102203,
-0.048802491277456284,
-0.025741634890437126,
-0.006584848742932081,
-0.05428240820765495,
-0.01258009858429432,
0.006916818208992481,
0.08237932622432709,
-0.14221806824207306,
0.1911940723657608,
0.3011881709098816,
0.03468962758779526,
0.09266205877065659,
-0.08991637825965881,
-0.014716878533363342,
0.04871600493788719,
0.03469163179397583,
-0.011871112510561943,
0.01896408200263977,
-0.2145782858133316,
0.0046659428626298904,
0.09122131764888763,
0.06836240738630295,
0.09570600837469101,
-0.1243278831243515,
-0.06947697699069977,
0.0067056757397949696,
-0.08785302191972733,
-0.04432704299688339,
0.07566720992326736,
-0.0747433453798294,
0.07160511612892151,
0.0001743610337143764,
-0.0068114688619971275,
0.13681884109973907,
0.0013969307765364647,
-0.0842047855257988,
0.1600053310394287,
-0.1511688232421875,
-0.20327319204807281,
-0.15878474712371826,
-0.13900049030780792,
-0.06720392405986786,
0.04034673795104027,
0.16221275925636292,
-0.081741102039814,
-0.05982306972146034,
0.021050728857517242,
-0.022166751325130463,
-0.06249639019370079,
0.031537558883428574,
0.08392579108476639,
0.0239842738956213,
-0.008700068108737469,
-0.11154016852378845,
-0.0525163970887661,
0.0380428284406662,
-0.0394759401679039,
0.05757215619087219,
-0.11289696395397186,
0.1099560558795929,
0.08848415315151215,
0.08107487857341766,
0.058358728885650635,
0.0032937207724899054,
0.1844698190689087,
-0.07475234568119049,
-0.03042547218501568,
0.21369245648384094,
0.011559097096323967,
0.06219550594687462,
0.0968332439661026,
0.012898123823106289,
-0.10404513776302338,
-0.008629266172647476,
-0.03372027352452278,
-0.11407846212387085,
-0.24210688471794128,
-0.055439967662096024,
-0.12902569770812988,
0.016746701672673225,
-0.018993107602000237,
0.1086023598909378,
0.11184495687484741,
0.09655573964118958,
-0.024840857833623886,
0.03754850849509239,
0.039267417043447495,
0.06002257391810417,
0.23175205290317535,
-0.01886778324842453,
0.06624739617109299,
-0.14113786816596985,
0.010636142455041409,
0.15414251387119293,
0.11531420797109604,
0.15636837482452393,
0.13290844857692719,
0.1309976726770401,
0.09569588303565979,
0.07219436764717102,
0.07529865950345993,
0.09027360379695892,
0.00304053770378232,
-0.03706042468547821,
-0.061007045209407806,
-0.020669391378760338,
-0.04285459965467453,
0.07707508653402328,
-0.09549515694379807,
-0.19549623131752014,
0.025715092197060585,
-0.16236840188503265,
0.11248873174190521,
0.14823368191719055,
0.011919665150344372,
-0.08766735345125198,
-0.005272697191685438,
0.1055743396282196,
0.0013768323697149754,
-0.0616871677339077,
0.09730438143014908,
-0.0337710902094841,
-0.073941670358181,
0.15898554027080536,
-0.027354063466191292,
0.12861154973506927,
0.03563886135816574,
0.04042022302746773,
-0.029965588822960854,
-0.10972537100315094,
0.06082861125469208,
0.1374710202217102,
-0.3107079863548279,
0.17431674897670746,
-0.0029041520319879055,
-0.01685408316552639,
-0.05616317316889763,
0.030532442033290863,
0.06452921032905579,
0.21946381032466888,
0.1611575335264206,
0.034031301736831665,
-0.16249486804008484,
0.06517545133829117,
-0.01371923927217722,
0.07676871865987778,
0.06944926083087921,
-0.03228896111249924,
-0.06923208385705948,
-0.051296547055244446,
0.0003601699718274176,
0.006718328222632408,
0.08947952836751938,
-0.0387103371322155,
-0.20372796058654785,
0.06703179329633713,
0.1339239627122879,
0.08158400654792786,
-0.07323288172483444,
0.08779534697532654,
-0.09591315686702728,
0.19518156349658966,
-0.1350187212228775,
-0.08225267380475998,
-0.11098843067884445,
-0.12565977871418,
0.043231457471847534,
-0.03690456598997116,
0.07258176803588867,
-0.10132146626710892,
-0.007546853739768267,
-0.07527027279138565,
-0.1989217847585678,
0.09370209276676178,
-0.13749262690544128,
0.02015865221619606,
-0.0026712571270763874,
0.10008695721626282,
-0.08690966665744781,
-0.013954084366559982,
0.03178583085536957,
0.004346902947872877,
-0.10528378188610077,
-0.1874193400144577,
0.010383307002484798,
0.0344257652759552,
0.008758476935327053,
-0.03209175541996956,
-0.03653687238693237,
0.06316906958818436,
0.04142289608716965,
-0.08523353189229965,
0.2102593183517456,
0.18020406365394592,
-0.0654136911034584,
0.16979898512363434,
0.19093337655067444,
-0.09855446219444275,
-0.2875858545303345,
-0.16518659889698029,
-0.19051434099674225,
-0.0813416987657547,
-0.08964667469263077,
-0.18693014979362488,
0.11552584171295166,
0.059789080172777176,
-0.0903056412935257,
0.17623884975910187,
-0.19851833581924438,
-0.04976765811443329,
0.14411717653274536,
-0.04595423489809036,
0.42939266562461853,
-0.19589759409427643,
-0.11115071922540665,
-0.127628892660141,
-0.2785263657569885,
0.11241429299116135,
-0.11520825326442719,
0.07664484530687332,
0.00040287288720719516,
0.005515292752534151,
-0.04162563756108284,
-0.059476833790540695,
0.18622630834579468,
0.015090908855199814,
0.02068859338760376,
-0.11393607407808304,
0.10542911291122437,
0.11319305747747421,
-0.005137590225785971,
0.03811649978160858,
-0.20319224894046783,
0.004387238994240761,
-0.09116349369287491,
-0.034064240753650665,
-0.04371026158332825,
0.07809018343687057,
0.035671867430210114,
-0.03795947879552841,
-0.09239693731069565,
-0.05989811569452286,
0.037625059485435486,
0.02229933813214302,
0.25459524989128113,
-0.021623952314257622,
0.06449855118989944,
0.06585056334733963,
0.030009597539901733,
-0.23944364488124847,
-0.004609856754541397,
-0.07234261929988861,
-0.04906316474080086,
0.058443836867809296,
-0.25950974225997925,
0.05080731213092804,
0.06230161711573601,
-0.060688890516757965,
0.04013720899820328,
0.08883136510848999,
-0.011680941097438335,
-0.014774724841117859,
0.1347791850566864,
-0.1361032873392105,
-0.09518012404441833,
-0.030103445053100586,
0.03714943677186966,
0.0963781550526619,
0.08104061335325241,
0.12452206015586853,
0.03851164132356644,
0.004428500309586525,
-0.001813731505535543,
0.0345538966357708,
-0.11669300496578217,
-0.004765540361404419,
0.01925995573401451,
-0.019570589065551758,
-0.13678237795829773,
0.1159721240401268,
-0.04053886607289314,
-0.12381035834550858,
-0.006115455646067858,
0.07116254419088364,
-0.1555875986814499,
-0.10896828025579453,
-0.05238216742873192,
0.0447726808488369,
-0.20639145374298096,
-0.11030592769384384,
-0.00803393218666315,
-0.13576170802116394,
0.05752231925725937,
0.10324204713106155,
0.07502082735300064,
0.08767125755548477,
0.0399027094244957,
-0.038497719913721085,
0.06174023076891899,
-0.07425834983587265,
-0.1170637458562851,
0.021842675283551216,
-0.07152006775140762,
-0.16248029470443726,
-0.01851959340274334,
0.08861545473337173,
-0.03578269109129906,
0.017115790396928787,
-0.09300819039344788,
0.010478893294930458,
-0.1872207075357437,
0.0007030560518614948,
-0.11696798354387283,
-0.022481288760900497,
0.021027661859989166,
-0.0762501209974289,
-0.04313672333955765,
0.01636447384953499,
-0.11419066786766052,
-0.04378185793757439,
-0.05682570859789848,
0.03917974233627319,
-0.10050930827856064,
-0.02249733917415142,
0.10580579191446304,
-0.013270271942019463,
0.13223423063755035,
0.12690109014511108,
-0.06407217681407928,
0.1023106575012207,
-0.24668829143047333,
-0.08197634667158127,
0.076627217233181,
-0.004372868686914444,
-0.011681959964334965,
0.010889182798564434,
0.012527331709861755,
0.10392705351114273,
-0.02616814523935318,
0.03143467381596565,
-0.011880789883434772,
-0.13349270820617676,
-0.09786307066679001,
-0.07512642443180084,
-0.04930471256375313,
-0.015076079405844212,
-0.10049550980329514,
0.1820743829011917,
0.06646633893251419,
0.12015726417303085,
-0.0009017768315970898,
0.028548048809170723,
0.0018634141888469458,
0.038412246853113174,
-0.0003580093616619706,
-0.15188829600811005,
-0.05665552243590355,
-0.058862295001745224,
-0.06477298587560654,
-0.004740712698549032,
0.3212067484855652,
-0.05905963107943535,
-0.07647062838077545,
0.06498513370752335,
0.043897926807403564,
0.07251013815402985,
-0.013339506462216377,
0.3209492862224579,
0.09481343626976013,
0.01076635904610157,
-0.10885608196258545,
0.04611445590853691,
0.03863825276494026,
-0.12316886335611343,
0.05906081944704056,
0.0772201344370842,
0.0306931771337986,
0.1391046792268753,
-0.02030019834637642,
-0.02286570891737938,
-0.072509765625,
-0.0013496062019839883,
-0.0024640755727887154,
0.06193726882338524,
0.012767615728080273,
0.10948531329631805,
0.21744418144226074,
-0.07485602796077728,
0.04372509941458702,
-0.014848022721707821,
-0.005490123759955168,
-0.15995685756206512,
-0.1307494193315506,
-0.057452499866485596,
-0.1973835974931717,
0.03481684625148773,
-0.07905029505491257,
0.0798519030213356,
0.1461956948041916,
0.04907847195863724,
-0.025037305429577827,
0.03528234362602234,
-0.016675418242812157,
-0.08539876341819763,
0.05456584692001343,
-0.053437720984220505,
-0.01894894801080227,
0.0409630611538887,
-0.04067869484424591,
-0.035857997834682465,
-0.08019540458917618,
-0.012042197398841381,
0.07825042307376862,
0.04523523896932602,
0.07148617506027222,
-0.11926731467247009,
-0.03768781200051308,
-0.07533006370067596,
0.08693098276853561,
-0.01814131997525692,
0.16261336207389832,
0.009609810076653957,
-0.013352809473872185,
0.11191777139902115,
0.18199358880519867,
-0.07832281291484833,
-0.1000518798828125,
-0.062112048268318176,
0.0448271818459034,
0.016774442046880722,
0.09015222638845444,
-0.05358610674738884,
0.02345212735235691,
-0.03472713753581047,
0.3493066132068634,
0.1969047635793686,
-0.05034978687763214,
0.00725562172010541,
-0.0031722565181553364,
0.03191050887107849,
0.12112582474946976,
0.09152760356664658,
0.12746287882328033,
0.20853088796138763,
-0.06733695417642593,
-0.07567428052425385,
-0.011957352980971336,
-0.0030267653055489063,
-0.1729206144809723,
0.12422246485948563,
-0.040159281343221664,
-0.09727714210748672,
0.012924990616738796,
0.12241502851247787,
-0.11716841161251068,
0.05823730677366257,
-0.011642428115010262,
-0.05793998762965202,
0.018984606489539146,
-0.022891025990247726,
0.11436373740434647,
0.023061562329530716,
0.03369443491101265,
-0.02781488746404648,
-0.08712740987539291,
0.10900632292032242,
0.010213391855359077,
-0.24472366273403168,
-0.024156156927347183,
0.0510605089366436,
-0.013890191912651062,
0.09748604893684387,
-0.0004002299392595887,
-0.022910205647349358,
0.08829575031995773,
0.046098172664642334,
-0.09349032491445541,
0.09085521101951599,
-0.013252520002424717,
-0.028171727433800697,
-0.02191055938601494,
-0.07501012831926346,
-0.043618977069854736,
-0.09050396084785461,
0.05232694000005722,
-0.046454742550849915,
0.06185968220233917,
0.0921182781457901,
-0.06019584834575653,
-0.02390289679169655,
-0.01976850815117359,
-0.14529351890087128,
0.05225794017314911,
-0.023679219186306,
-0.029478691518306732,
-0.039813779294490814,
-0.04916274920105934,
0.00989289116114378,
0.020220376551151276,
-0.09780392050743103,
-0.05720142647624016,
0.026190228760242462,
-0.0498066172003746,
0.15454086661338806,
0.028789497911930084,
-0.08988301455974579,
-0.0010402332991361618,
-0.05190783739089966,
0.08315116912126541,
-0.10183680057525635,
0.03933577239513397,
0.12780506908893585,
-0.0015780834946781397,
-0.02670148015022278,
-0.13588668406009674,
0.050726696848869324,
0.018365047872066498,
-0.024458637461066246,
-0.09747596085071564
] |
null | null | transformers |


This model is a finetune of jondurbin's excellent [bagel](https://huggingface.co/jondurbin/bagel-34b-v0.2) model.
It has been trained with new datasets and a new technique, which we will share with the community soon.
This model has not utilised any form of merging.
### Evaluation Results
| Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
| --- | --- | --- | --- | --- | --- | --- |
| 77.29 | 74.23 | 86.76 | 76.66 | 70.22 | 83.66 | 72.18 |
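
As a quick sanity check (a minimal sketch using only the numbers reported in the table above), the Average column is the plain arithmetic mean of the six benchmark scores:

```python
# Recompute the reported Average from the six per-task scores in the table above.
scores = {
    "ARC": 74.23,
    "HellaSwag": 86.76,
    "MMLU": 76.66,
    "TruthfulQA": 70.22,
    "Winogrande": 83.66,
    "GSM8K": 72.18,
}

average = sum(scores.values()) / len(scores)
print(average)  # ≈ 77.285, i.e. the reported 77.29 after rounding
assert abs(average - 77.29) < 0.01
```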
### Contamination Results
With reference model jondurbin/bagel-34b-v0.2:
| ARC | TruthfulQA | GSM8K |
| --- | --- | --- |
| 0.08| 0.38| 0.88| | {"license": "other", "license_name": "yi-license", "license_link": "https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE", "base_model": "jondurbin/bagel-34b-v0.2"} | text-generation | LoneStriker/Smaug-34B-v0.1-2.65bpw-h6-exl2 | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"base_model:jondurbin/bagel-34b-v0.2",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:16:47+00:00 | [] | [] | TAGS
#transformers #safetensors #llama #text-generation #conversational #base_model-jondurbin/bagel-34b-v0.2 #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| !image/png
!image/png
This model is a finetune of jondurbin's excellent bagel model.
It has been trained with new datasets and a new technique, which we will share with the community soon.
This model has not utilised any form of merging.
### Evaluation Results
### Contamination Results
With reference model jondurbin/bagel-34b-v0.2:
ARC: 0.08, TruthfulQA: 0.38, GSM8K: 0.88
| [
"### Evaluation Results",
"### Contamination Results\n\n\nWith reference model jondurbin/bagel-34b-v0.2:\n\n\nARC: 0.08, TruthfulQA: 0.38, GSM8K: 0.88"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #conversational #base_model-jondurbin/bagel-34b-v0.2 #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Evaluation Results",
"### Contamination Results\n\n\nWith reference model jondurbin/bagel-34b-v0.2:\n\n\nARC: 0.08, TruthfulQA: 0.38, GSM8K: 0.88"
] | [
72,
5,
40
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #conversational #base_model-jondurbin/bagel-34b-v0.2 #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Evaluation Results### Contamination Results\n\n\nWith reference model jondurbin/bagel-34b-v0.2:\n\n\nARC: 0.08, TruthfulQA: 0.38, GSM8K: 0.88"
] | [
-0.04525256156921387,
0.11974851042032242,
-0.0023604375310242176,
0.012170374393463135,
0.03191312029957771,
-0.05886117368936539,
0.17323105037212372,
0.05917754024267197,
-0.04918491840362549,
0.019062822684645653,
0.16053636372089386,
0.18094675242900848,
-0.008187057450413704,
0.1386977881193161,
-0.12849180400371552,
-0.028414348140358925,
0.10181622952222824,
-0.001769914524629712,
0.06670404970645905,
0.11774345487356186,
0.1055540069937706,
-0.04559142142534256,
0.11616022884845734,
-0.000478314672363922,
-0.047079816460609436,
0.057999301701784134,
0.07410825043916702,
-0.11545513570308685,
0.10300738364458084,
0.07263334095478058,
0.022486131638288498,
0.08843635022640228,
-0.020663965493440628,
-0.12247072160243988,
0.0387629009783268,
-0.0251071285456419,
-0.06758589297533035,
0.06031336635351181,
0.01187992561608553,
-0.06556987017393112,
0.07975111901760101,
0.038842957466840744,
-0.01809644140303135,
0.08711165934801102,
-0.10160006582736969,
-0.013214037753641605,
-0.07276489585638046,
0.07464108616113663,
0.12269559502601624,
0.07799924165010452,
-0.033364187926054,
0.17100784182548523,
-0.03681858628988266,
0.10829465091228485,
0.12859967350959778,
-0.2861159145832062,
0.012810084968805313,
0.14649921655654907,
0.015632623806595802,
-0.014787672087550163,
-0.02818293496966362,
0.127168670296669,
0.11412127315998077,
-0.038246817886829376,
-0.05368415266275406,
-0.05218542739748955,
-0.05121970549225807,
0.04191635921597481,
-0.028297731652855873,
-0.031205181032419205,
0.24603690207004547,
0.047117650508880615,
-0.09339313209056854,
-0.03837243467569351,
-0.08512642234563828,
-0.04135356843471527,
0.005270859692245722,
0.008432362228631973,
-0.026538344100117683,
0.04785020276904106,
-0.07052703201770782,
0.038619764149188995,
-0.09188854694366455,
-0.07066746056079865,
-0.10828105360269547,
0.18378926813602448,
-0.010209337808191776,
0.03299311175942421,
-0.08632496744394302,
0.07517071068286896,
-0.06521965563297272,
-0.14724700152873993,
-0.08512477576732635,
-0.06403377652168274,
0.11848216503858566,
0.01366229448467493,
0.016455238685011864,
-0.0035586729645729065,
0.1555667370557785,
0.1479959487915039,
-0.028327224776148796,
0.0006063813925720751,
-0.09035154432058334,
0.0205267071723938,
-0.011179042980074883,
0.017841985449194908,
-0.046613987535238266,
-0.00803209189325571,
0.09749525040388107,
-0.025481615215539932,
0.13511274755001068,
-0.022282762452960014,
-0.08684057742357254,
-0.002710809698328376,
0.042806316167116165,
0.10156551003456116,
-0.0140672093257308,
0.08425431698560715,
-0.052823781967163086,
0.040074653923511505,
0.07843570411205292,
-0.05424026772379875,
-0.05436987802386284,
0.028668098151683807,
0.015002930536866188,
-0.01679180935025215,
0.09796661883592606,
0.052758943289518356,
0.01722336933016777,
0.02221459522843361,
-0.09820661693811417,
-0.05040527135133743,
-0.007780633866786957,
-0.07567847520112991,
0.044102612882852554,
0.026247646659612656,
0.03990452364087105,
-0.20050622522830963,
-0.2024843990802765,
0.017882604151964188,
-0.021485023200511932,
-0.028603633865714073,
-0.023840170353651047,
-0.01781654916703701,
-0.03690173849463463,
0.020091397687792778,
-0.05038277804851532,
-0.0453876256942749,
-0.09905695915222168,
0.07193168252706528,
0.015691842883825302,
0.05465209111571312,
-0.11070174723863602,
0.010288162156939507,
-0.12189900130033493,
0.055784162133932114,
-0.0008082325221039355,
-0.024562334641814232,
-0.06984174251556396,
0.12232629209756851,
-0.09646580368280411,
0.004350407049059868,
-0.06777644157409668,
-0.009058627299964428,
0.031188344582915306,
0.2439412772655487,
-0.056106749922037125,
-0.0483710803091526,
0.14184477925300598,
-0.10290747135877609,
-0.146474689245224,
0.09316866844892502,
-0.02648594044148922,
0.09551043808460236,
0.13569827377796173,
0.10372859984636307,
-0.08362486213445663,
-0.12071352452039719,
-0.04245147854089737,
0.031019972637295723,
-0.021407349035143852,
0.032973140478134155,
0.056698575615882874,
-0.04596508666872978,
-0.15885303914546967,
0.040428876876831055,
0.10893809050321579,
0.05181001126766205,
-0.03325556591153145,
-0.06067856773734093,
-0.06387163698673248,
-0.09264726936817169,
0.0875277891755104,
-0.056905005127191544,
0.03776533529162407,
-0.11693596839904785,
-0.06886328756809235,
-0.09717018157243729,
0.07803769409656525,
-0.028727812692523003,
-0.011137898080050945,
-0.128780797123909,
0.16040876507759094,
-0.12373936921358109,
0.010161287151277065,
-0.036752041429281235,
-0.006272847764194012,
-0.0440339632332325,
0.017466934397816658,
0.021833565086126328,
-0.047452982515096664,
0.04236302152276039,
0.050389427691698074,
-0.047711387276649475,
-0.0021017601247876883,
0.07062821090221405,
0.023353898897767067,
-0.023983022198081017,
-0.12796476483345032,
0.09416372328996658,
-0.028211945667862892,
0.1304721236228943,
-0.09916528314352036,
0.01916375569999218,
0.0667835995554924,
0.06526730209589005,
0.0018856950337067246,
0.009114283137023449,
0.08137739449739456,
-0.04938162863254547,
-0.08771296590566635,
-0.01176049280911684,
0.03573398292064667,
-0.0019473363645374775,
-0.12689007818698883,
0.13121262192726135,
-0.1341925412416458,
0.24663500487804413,
0.15531542897224426,
-0.049411214888095856,
-0.026963306590914726,
0.01810063049197197,
-0.0019616030622273684,
0.026632843539118767,
-0.029577618464827538,
-0.007773878984153271,
0.05558868125081062,
-0.03407000005245209,
0.10185427963733673,
-0.08900423347949982,
0.014143328182399273,
0.01616692915558815,
-0.10224787890911102,
-0.049550969153642654,
0.11395908892154694,
0.022674305364489555,
-0.21802012622356415,
0.1282704472541809,
0.15238741040229797,
-0.012105832807719707,
0.12698812782764435,
0.024525528773665428,
-0.031773846596479416,
0.0069692861288785934,
0.01455776859074831,
-0.008089111186563969,
0.07266543805599213,
-0.14309173822402954,
0.05488405376672745,
0.07744058221578598,
-0.02179626002907753,
0.04645966365933418,
-0.0939142256975174,
-0.04648003354668617,
0.011251467280089855,
-0.05396752431988716,
-0.07915814965963364,
0.10041439533233643,
-0.03633032739162445,
0.14521215856075287,
-0.03023960255086422,
-0.05886389687657356,
0.0597975067794323,
0.00873294286429882,
-0.14194972813129425,
0.20696020126342773,
-0.05018157884478569,
-0.2539389729499817,
-0.15807662904262543,
-0.1259155571460724,
-0.12086830288171768,
0.07073884457349777,
0.10442011058330536,
-0.12064488232135773,
-0.10227535665035248,
-0.09715291857719421,
0.005357338115572929,
0.05081578716635704,
0.025437192991375923,
0.06566254049539566,
0.02001352049410343,
0.03598915413022041,
-0.1263357549905777,
-0.03752841427922249,
0.009426312521100044,
-0.044434282928705215,
0.06842487305402756,
-0.10476984828710556,
0.136917844414711,
0.13368754088878632,
-0.005514014046639204,
0.010711542330682278,
-0.010348239913582802,
0.2177879810333252,
-0.03046317957341671,
-0.011799454689025879,
0.2519260346889496,
-0.05130639299750328,
0.023998664692044258,
0.1930372565984726,
0.00034387398045510054,
-0.09298261255025864,
0.060426268726587296,
-0.03400789573788643,
-0.07959618419408798,
-0.2291390299797058,
-0.08585230261087418,
-0.005240465048700571,
0.09024275094270706,
-0.01967555843293667,
0.06269354373216629,
0.17713089287281036,
0.11363354325294495,
-0.0586823895573616,
-0.042059000581502914,
0.07949195057153702,
0.10106640309095383,
0.1497822254896164,
-0.033522699028253555,
0.13658195734024048,
-0.0753566175699234,
-0.07923572510480881,
0.0819254070520401,
-0.011328653432428837,
0.0074868290685117245,
0.10603635758161545,
0.022833775728940964,
0.0897650346159935,
0.04280072823166847,
0.11630502343177795,
0.10067429393529892,
0.018635738641023636,
-0.06781565397977829,
-0.03464825078845024,
-0.08098717033863068,
-0.012124256230890751,
0.0859992504119873,
-0.1179543063044548,
0.029849160462617874,
0.025358248502016068,
-0.043092723935842514,
0.07799109816551208,
-0.04413076490163803,
0.16482216119766235,
-0.2662866413593292,
-0.08682014048099518,
0.09255517274141312,
0.018726011738181114,
-0.08015884459018707,
0.05407383292913437,
-0.0405050627887249,
-0.04566274955868721,
0.09152435511350632,
0.010035226121544838,
0.07530494779348373,
0.057576797902584076,
0.06461914628744125,
-0.12900123000144958,
-0.14413250982761383,
-0.06746280193328857,
0.0970132127404213,
-0.2956734597682953,
0.20926906168460846,
0.05372067540884018,
-0.003084513358771801,
-0.05033077299594879,
-0.02428804524242878,
0.030122704803943634,
0.1391783356666565,
0.1509656459093094,
-0.01981334201991558,
-0.06564483791589737,
-0.06485388427972794,
-0.12255921214818954,
0.05480664595961571,
-0.00819749291986227,
0.03382544592022896,
0.034050218760967255,
0.010311155579984188,
-0.004561418201774359,
-0.0026016244664788246,
0.09061220288276672,
-0.21963302791118622,
-0.046655286103487015,
0.05248333513736725,
0.10832510888576508,
0.12074069678783417,
-0.06447115540504456,
-0.07513546943664551,
-0.08508055657148361,
0.23702093958854675,
-0.0625227615237236,
-0.10194088518619537,
-0.09014973789453506,
-0.021780971437692642,
0.01762191392481327,
-0.0663316398859024,
0.02208435907959938,
-0.08036525547504425,
0.024505993351340294,
-0.010240960866212845,
-0.14934593439102173,
0.10480043292045593,
-0.049296315759420395,
-0.12117796391248703,
-0.025198617950081825,
0.15699182450771332,
-0.05874878540635109,
0.03404039889574051,
0.05122647061944008,
0.014375274069607258,
-0.030037662014365196,
-0.10779748857021332,
0.009121141396462917,
0.0329253114759922,
0.030129294842481613,
0.04136652126908302,
-0.048795122653245926,
-0.17945589125156403,
-0.05609149858355522,
-0.06241508945822716,
0.1563376933336258,
0.3060047924518585,
-0.0401444248855114,
0.09593530744314194,
0.1683492660522461,
-0.04123794659972191,
-0.2577572762966156,
-0.07495813071727753,
-0.15076082944869995,
0.03437121585011482,
-0.036793097853660583,
-0.016836166381835938,
0.060471221804618835,
0.055813200771808624,
-0.08341040462255478,
0.12228112667798996,
-0.1719401627779007,
-0.1605696976184845,
0.22616682946681976,
0.05346270650625229,
0.41733214259147644,
-0.12373313307762146,
-0.06255658715963364,
-0.1327088475227356,
-0.2468731850385666,
0.046127799898386,
-0.13185331225395203,
0.07590366154909134,
0.013204272836446762,
0.09867428243160248,
0.0189632847905159,
-0.058187421411275864,
0.1694808453321457,
-0.01832689717411995,
0.07232312113046646,
-0.08308630436658859,
-0.07397674769163132,
0.023600811138749123,
0.007810712326318026,
0.11084768921136856,
-0.08314908295869827,
0.07256767153739929,
-0.02639179117977619,
-0.059907618910074234,
-0.022033901885151863,
0.07684612274169922,
-0.02306857518851757,
-0.08536715060472488,
-0.08441435545682907,
-0.04730981960892677,
-0.017372172325849533,
-0.022709108889102936,
0.19115155935287476,
-0.11369631439447403,
0.17018923163414001,
0.19033950567245483,
0.07518313825130463,
-0.13623355329036713,
0.09502947330474854,
0.027228476479649544,
-0.10156794637441635,
0.05541050061583519,
-0.20051199197769165,
0.07639018446207047,
0.061430923640728,
-0.0012111717369407415,
0.10185527056455612,
0.043502096086740494,
-0.030594022944569588,
0.061301153153181076,
0.09786342829465866,
-0.22464708983898163,
0.10059235244989395,
-0.05757242441177368,
-0.0007416980806738138,
-0.008538971655070782,
0.10571233928203583,
0.1879286766052246,
-0.028340907767415047,
0.0011921939440071583,
-0.008149935863912106,
0.04514331743121147,
-0.057173922657966614,
0.13196025788784027,
0.04099012538790703,
0.01646367833018303,
-0.11045177280902863,
0.07405988872051239,
-0.015266580507159233,
-0.0029268215876072645,
0.002587819704785943,
-0.068906769156456,
-0.11984097957611084,
-0.07245661318302155,
-0.0835028663277626,
0.14852671325206757,
-0.0926840528845787,
-0.06982482969760895,
-0.13130107522010803,
-0.12495394051074982,
0.005092257168143988,
0.2045643925666809,
0.07511714845895767,
0.1291884481906891,
-0.006392376031726599,
-0.11234872788190842,
0.0015836305683478713,
0.09020483493804932,
0.04129631817340851,
0.013021300546824932,
-0.11610196530818939,
0.029401004314422607,
-0.027276942506432533,
0.04721243306994438,
-0.08050666749477386,
0.0022118939086794853,
-0.13819047808647156,
0.003819393692538142,
-0.20548875629901886,
0.0315348319709301,
-0.06913459300994873,
-0.028310168534517288,
-0.02152065746486187,
-0.05790617689490318,
-0.08014465868473053,
0.023150626569986343,
-0.06844570487737656,
0.027660731226205826,
0.015278948470950127,
0.0657072514295578,
-0.09135891497135162,
-0.019728990271687508,
0.07895719259977341,
-0.008307642303407192,
0.07862573117017746,
0.0676976889371872,
-0.07427100837230682,
0.05139647051692009,
-0.24940131604671478,
0.0018649037228897214,
0.11401069164276123,
-0.037043459713459015,
-0.013385551981627941,
-0.0676761120557785,
0.011837771162390709,
0.08617475628852844,
0.0029287610668689013,
0.06422247737646103,
0.06184908375144005,
-0.059269651770591736,
-0.08718816190958023,
-0.05930265039205551,
-0.07730169594287872,
0.0080960588529706,
-0.04978354275226593,
0.06966052949428558,
-0.01240470726042986,
0.13221296668052673,
-0.08040022104978561,
-0.023575492203235626,
-0.0887303352355957,
0.04240598529577255,
-0.012022001668810844,
-0.13256709277629852,
-0.21663416922092438,
-0.05064918100833893,
0.0026295320130884647,
-0.014342094771564007,
0.21714670956134796,
-0.02622874826192856,
-0.08874227851629257,
0.04999305680394173,
0.06934843957424164,
0.16823916137218475,
-0.0008650150266475976,
0.25804299116134644,
0.035310011357069016,
0.03383481875061989,
-0.06693612784147263,
0.023270899429917336,
0.045881520956754684,
-0.10601496696472168,
0.05687053129076958,
0.09904928505420685,
-0.02789405547082424,
0.09040209650993347,
-0.024938447400927544,
-0.04645003005862236,
0.05246574431657791,
-0.09910298138856888,
-0.05317483842372894,
0.023233721032738686,
0.009847880341112614,
0.036207474768161774,
0.23055145144462585,
-0.06486792117357254,
-0.03515565022826195,
-0.02273271046578884,
-0.021145790815353394,
-0.19307371973991394,
-0.14153456687927246,
-0.14047746360301971,
-0.10968784242868423,
0.06616965681314468,
-0.07175037264823914,
-0.01185464859008789,
0.09433726966381073,
0.03147280588746071,
-0.05254008248448372,
0.11112251877784729,
-0.07656412571668625,
-0.0848955363035202,
0.024617794901132584,
-0.02266838401556015,
0.0006497151916846633,
-0.020755551755428314,
-0.017286498099565506,
0.03757702186703682,
0.00966620072722435,
0.013766100630164146,
0.021621860563755035,
0.06936486810445786,
0.0357549786567688,
-0.10845433920621872,
-0.05757184326648712,
-0.06448255479335785,
0.07902612537145615,
0.04891064390540123,
0.13602788746356964,
0.0036216711159795523,
0.003788225119933486,
0.05403226241469383,
0.17218798398971558,
-0.03813575953245163,
-0.052631642669439316,
-0.07094701379537582,
0.13891363143920898,
-0.019248971715569496,
0.11031370609998703,
-0.013745217584073544,
-0.04466104134917259,
0.03597381338477135,
0.23662206530570984,
0.18739384412765503,
-0.03022376075387001,
0.01773729734122753,
-0.09219540655612946,
0.03275680169463158,
0.04832259193062782,
-0.022437097504734993,
0.04993830993771553,
0.1407332569360733,
-0.06442586332559586,
-0.03680576756596565,
0.014846207574009895,
0.00917789712548256,
-0.09620551764965057,
0.07313837110996246,
-0.03960747644305229,
-0.07642187923192978,
-0.08454439043998718,
0.07770311087369919,
-0.10824597626924515,
0.09002173691987991,
-0.07597295939922333,
-0.13074138760566711,
-0.0626433864235878,
0.012174737639725208,
0.056368015706539154,
0.0513683557510376,
0.03065972402691841,
-0.07269394397735596,
-0.0027795112691819668,
0.007419312372803688,
0.005608659703284502,
-0.1359582543373108,
-0.03462883085012436,
0.05317911133170128,
0.04610811918973923,
0.12341704219579697,
0.025619041174650192,
0.053931187838315964,
0.11613762378692627,
-0.05143020674586296,
-0.07409734278917313,
0.14772026240825653,
0.01969115622341633,
-0.10262318700551987,
0.040930528193712234,
-0.03211837634444237,
0.03614179790019989,
0.07360602170228958,
0.054300498217344284,
-0.08762802183628082,
0.04163828119635582,
-0.007832719944417477,
-0.09112028777599335,
-0.06365055590867996,
0.048358459025621414,
-0.09754876792430878,
0.08241371810436249,
0.03510749340057373,
-0.10332797467708588,
0.0019209374440833926,
-0.019443746656179428,
0.08583100140094757,
0.030733279883861542,
-0.1817665547132492,
0.04598311707377434,
-0.17440858483314514,
0.041772615164518356,
0.014958573505282402,
-0.014534308575093746,
-0.32099953293800354,
-0.012852340936660767,
-0.09132730960845947,
0.034026045352220535,
-0.1058378517627716,
0.04856368154287338,
0.1373571902513504,
0.0041321394965052605,
-0.019345324486494064,
-0.21664179861545563,
0.0576627179980278,
0.0628793016076088,
-0.050371285527944565,
-0.14817924797534943
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
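
The card leaves this section blank. As a hedged starting point only — a minimal sketch that assumes the checkpoint at ddemilla/Mixtral-8x7B-Instruct-v0.1-allium-core-base loads through the standard transformers causal-LM interface and ships a chat template, neither of which the card confirms — usage might look like:

```python
# Hypothetical usage sketch; not taken from the model card itself.
# Assumes a standard causal-LM checkpoint with a chat template, plus `accelerate`
# installed so that device_map="auto" can place the weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ddemilla/Mixtral-8x7B-Instruct-v0.1-allium-core-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

messages = [{"role": "user", "content": "Hello, what can you do?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```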
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | ddemilla/Mixtral-8x7B-Instruct-v0.1-allium-core-base | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:17:30+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers | Quants thanks to @konz00: https://huggingface.co/konz00/EvilxEchidna-7b-GGUF
The following models were included in the merge:
* [Test157t/Echidna-7b-128k](https://huggingface.co/Test157t/Echidna-7b-128k)
* [maywell/PiVoT-0.1-Evil-a](https://huggingface.co/maywell/PiVoT-0.1-Evil-a)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
- model: maywell/PiVoT-0.1-Evil-a
normalize: true
layer_range: [0, 32]
- model: Test157t/Echidna-7b-128k
layer_range: [0, 32]
merge_method: slerp
base_model: Test157t/Echidna-7b-128k
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5
dtype: bfloat16
```
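The `merge_method: slerp` above blends corresponding weight tensors from the two models with a per-layer interpolation factor `t` (separate schedules for self-attention and MLP tensors, `0.5` everywhere else). As a rough sketch of the idea only, not mergekit's actual implementation, spherical linear interpolation between two weight tensors could look like the snippet below; the function name and the fallback to plain linear interpolation are choices made here for illustration.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two same-shaped weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values follow the arc between
    the two (flattened) weight directions rather than a straight line.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    # Angle between the two weight directions.
    omega = torch.arccos(torch.clamp(torch.dot(a_dir, b_dir), -1.0, 1.0))
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel tensors: fall back to ordinary linear interpolation.
        mixed = (1.0 - t) * a_flat + t * b_flat
    else:
        mixed = (
            torch.sin((1.0 - t) * omega) / sin_omega * a_flat
            + torch.sin(t * omega) / sin_omega * b_flat
        )
    return mixed.reshape(a.shape).to(a.dtype)
```

In the configuration above, the `t` value lists act as per-layer blend schedules across the 32 merged layers, and `dtype: bfloat16` sets the precision of the merged output.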
| {"library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": ["Test157t/Echidna-7b-128k", "maywell/PiVoT-0.1-Evil-a"]} | text-generation | Test157t/EvilxEchidna-7b | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"mergekit",
"merge",
"base_model:Test157t/Echidna-7b-128k",
"base_model:maywell/PiVoT-0.1-Evil-a",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:23:04+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-Test157t/Echidna-7b-128k #base_model-maywell/PiVoT-0.1-Evil-a #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| Quants thanks to @konz00: URL
The following models were included in the merge:
* Test157t/Echidna-7b-128k
* maywell/PiVoT-0.1-Evil-a
### Configuration
The following YAML configuration was used to produce this model:
| [
"### Configuration\n\nThe following YAML configuration was used to produce this model:"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-Test157t/Echidna-7b-128k #base_model-maywell/PiVoT-0.1-Evil-a #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Configuration\n\nThe following YAML configuration was used to produce this model:"
] | [
89,
17
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-Test157t/Echidna-7b-128k #base_model-maywell/PiVoT-0.1-Evil-a #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Configuration\n\nThe following YAML configuration was used to produce this model:"
] | [
-0.06968680024147034,
-0.10638370364904404,
-0.0013910392299294472,
-0.006426576990634203,
0.10228051245212555,
0.03859635815024376,
0.18152950704097748,
0.04734649136662483,
-0.024282781407237053,
0.0464051254093647,
0.17198914289474487,
0.07378556579351425,
0.01967877522110939,
0.13705874979496002,
-0.05275262892246246,
-0.1577739268541336,
0.09511573612689972,
0.029763463884592056,
-0.10387644916772842,
0.11603325605392456,
0.08302424103021622,
-0.05185728147625923,
0.11892610788345337,
0.005765047390013933,
-0.1440761536359787,
0.06871002912521362,
-0.029978357255458832,
-0.031962182372808456,
0.08915740251541138,
0.09166626632213593,
0.10604996234178543,
0.030361097306013107,
-0.024047542363405228,
-0.06730655580759048,
0.04889954626560211,
-0.00406520115211606,
-0.018287893384695053,
0.0570986270904541,
0.04089880734682083,
-0.02474709413945675,
0.05963125079870224,
-0.00435355631634593,
-0.0019682336132973433,
-0.002494943095371127,
-0.10835663229227066,
0.024202698841691017,
-0.01903468184173107,
0.1010059267282486,
0.15550051629543304,
0.10315421968698502,
-0.02517777308821678,
0.08406884968280792,
0.01920168660581112,
0.08157329261302948,
0.09169011563062668,
-0.2457040548324585,
-0.02528471127152443,
0.05401977151632309,
0.0801086276769638,
-0.008658833801746368,
0.06074335053563118,
0.02257649227976799,
0.12130475789308548,
-0.003274340881034732,
-0.0390501543879509,
-0.020417707040905952,
0.11961329728364944,
0.0006339203100651503,
-0.1221163347363472,
-0.054365772753953934,
0.21341454982757568,
-0.04014526680111885,
0.03234706446528435,
0.002007615752518177,
-0.12288692593574524,
0.015568061731755733,
0.002060233149677515,
-0.028538357466459274,
-0.0574464350938797,
0.034046709537506104,
0.01645655371248722,
-0.023389499634504318,
-0.10023616999387741,
-0.04768342152237892,
-0.13579092919826508,
0.24609491229057312,
0.0419490821659565,
0.033622823655605316,
-0.09424634277820587,
0.08705529570579529,
-0.026966417208313942,
-0.11863818019628525,
0.07380709797143936,
-0.04961875453591347,
0.03817364200949669,
-0.022200170904397964,
-0.08151450008153915,
-0.092034250497818,
0.14602836966514587,
0.09670267254114151,
-0.01857946626842022,
0.031888820230960846,
0.038251616060733795,
0.057339876890182495,
0.04837289825081825,
-0.0007704431191086769,
-0.13581496477127075,
-0.02755349688231945,
0.0303924772888422,
0.07865475118160248,
0.0814012810587883,
-0.0035895821638405323,
-0.1116955578327179,
-0.028377532958984375,
0.11060225963592529,
0.025328323245048523,
0.04563184827566147,
0.11993614584207535,
-0.07689289003610611,
-0.005405154079198837,
0.12701381742954254,
-0.054138097912073135,
0.004414976574480534,
0.012922599911689758,
0.0013030702248215675,
-0.041763532906770706,
0.06556864827871323,
0.039764728397130966,
-0.03798731788992882,
0.09377820044755936,
-0.10196908563375473,
-0.009251780807971954,
-0.056721821427345276,
-0.0869264230132103,
0.01889491081237793,
0.01921897381544113,
0.02503526397049427,
-0.09175115078687668,
-0.2664487063884735,
-0.017912857234477997,
0.029999272897839546,
-0.02515307255089283,
-0.06233154609799385,
-0.06497561931610107,
-0.021219005808234215,
0.02131843939423561,
-0.04491151124238968,
0.04335084930062294,
-0.05213620141148567,
0.02945651300251484,
0.022766677662730217,
0.03419617563486099,
-0.09707248210906982,
0.03774666041135788,
-0.10211264342069626,
0.08577977120876312,
-0.11932257562875748,
0.061368126422166824,
-0.01970558799803257,
0.1145431399345398,
-0.03414246067404747,
0.02845398150384426,
-0.09048858284950256,
0.07334250211715698,
0.08420149981975555,
0.2243841290473938,
-0.08970950543880463,
-0.09830468147993088,
0.010874244384467602,
-0.1688365638256073,
-0.1786130964756012,
0.08044105768203735,
-0.01370999775826931,
0.036221276968717575,
0.07739130407571793,
0.21152080595493317,
-0.014049483463168144,
-0.05781499296426773,
0.03131580725312233,
0.019315989688038826,
-0.10864373296499252,
-0.042417995631694794,
0.018211839720606804,
0.0041585443541407585,
-0.2703537940979004,
0.04108813777565956,
0.023840133100748062,
0.10721215605735779,
-0.07635143399238586,
-0.027468986809253693,
-0.11437396705150604,
-0.046957891434431076,
0.06669364869594574,
-0.0018619562033563852,
0.06376586109399796,
-0.06920825690031052,
-0.01434042677283287,
0.0842268317937851,
0.05575205013155937,
-0.049871496856212616,
-0.009675429202616215,
0.003946361597627401,
0.13624608516693115,
-0.1686854213476181,
0.051220834255218506,
-0.10617513954639435,
-0.030558321624994278,
-0.04310576245188713,
0.06067390739917755,
0.03583820164203644,
0.046556584537029266,
0.08050227910280228,
0.01277906820178032,
-0.031835511326789856,
-0.0640527606010437,
0.16461500525474548,
0.05772939324378967,
-0.13521096110343933,
-0.18427957594394684,
0.000515591527801007,
-0.05300175026059151,
0.10344604402780533,
-0.1340176910161972,
0.0403694324195385,
-0.04373641312122345,
0.16398362815380096,
-0.05766547843813896,
0.10058628767728806,
0.06471224874258041,
0.029988646507263184,
-0.06129405274987221,
0.005961070768535137,
0.0650121197104454,
0.00933363288640976,
-0.10209853202104568,
0.14735829830169678,
-0.17975088953971863,
0.16948549449443817,
0.15487413108348846,
-0.122565858066082,
-0.0036258569452911615,
-0.04345311224460602,
-0.0320008248090744,
-0.061936672776937485,
0.057061124593019485,
-0.08010847121477127,
0.02674882858991623,
-0.013456021435558796,
0.15987801551818848,
-0.056499846279621124,
0.0076779332011938095,
-0.02658763714134693,
-0.10044314712285995,
-0.08205678313970566,
0.10543453693389893,
-0.07526255398988724,
-0.1994154453277588,
0.16593414545059204,
0.17392419278621674,
-0.05727256089448929,
0.12775039672851562,
0.0074867685325443745,
-0.028915369883179665,
-0.010214479640126228,
-0.021535780280828476,
-0.03822091221809387,
-0.022171003744006157,
-0.09048005193471909,
0.027454476803541183,
0.051125746220350266,
-0.024480342864990234,
0.10870502889156342,
-0.11678484082221985,
0.010992329567670822,
0.03266414627432823,
0.008957039564847946,
0.06840694695711136,
0.05620237812399864,
-0.0038834786973893642,
0.08512188494205475,
-0.014297425746917725,
-0.04349455237388611,
0.06946570426225662,
0.040801480412483215,
-0.11833162605762482,
0.16961288452148438,
-0.1356085240840912,
-0.22937150299549103,
-0.18730266392230988,
-0.016720734536647797,
-0.12666887044906616,
0.03355645015835762,
0.08180295675992966,
-0.0692884773015976,
-0.03821304813027382,
-0.09886344522237778,
0.0821772888302803,
-0.009893027134239674,
0.008645003661513329,
-0.051084764301776886,
-0.008749866858124733,
0.026328682899475098,
-0.07812283933162689,
-0.020866919308900833,
-0.01895419880747795,
0.029151679947972298,
0.07740634679794312,
-0.14261937141418457,
0.06752435863018036,
0.1229807585477829,
0.019668234512209892,
0.014776177704334259,
0.002649827627465129,
0.21783745288848877,
-0.0024285651743412018,
0.02279255911707878,
0.1354086548089981,
-0.08973397314548492,
0.07012073695659637,
0.20612576603889465,
-0.03310204669833183,
-0.029007354751229286,
0.02288181707262993,
-0.05328142270445824,
-0.04913875088095665,
-0.13953150808811188,
-0.1310776174068451,
-0.06735289096832275,
0.005248267203569412,
0.10378576070070267,
0.06602563709020615,
0.1026466116309166,
0.12707628309726715,
-0.055620383471250534,
0.01338747888803482,
-0.001437737955711782,
0.07184591144323349,
0.2039671093225479,
-0.005730145610868931,
0.0857837125658989,
-0.0529717393219471,
-0.10509539395570755,
0.04694988578557968,
0.05219684913754463,
0.1289561539888382,
0.05026644840836525,
0.006495431996881962,
0.0717959925532341,
0.05626489594578743,
0.11267504096031189,
0.15097397565841675,
-0.019856804981827736,
-0.03229551017284393,
-0.022380484268069267,
-0.0654500275850296,
-0.029746215790510178,
0.07439908385276794,
-0.0611627995967865,
0.03685804829001427,
-0.05280524864792824,
-0.04866722598671913,
0.0651945099234581,
0.109388068318367,
0.08419396728277206,
-0.3111100494861603,
-0.04729898273944855,
0.06078208237886429,
-0.009409409947693348,
-0.05906176194548607,
-0.008504618890583515,
-0.03939316049218178,
0.01697010174393654,
0.11673219501972198,
0.01059913169592619,
0.09747092425823212,
0.026372790336608887,
0.06525497138500214,
-0.11568544059991837,
0.057122133672237396,
-0.029447175562381744,
0.0862346664071083,
-0.2461707890033722,
0.25887006521224976,
0.018946034833788872,
0.008838681504130363,
-0.027761515229940414,
-0.014192993752658367,
0.04068874940276146,
0.200252503156662,
-0.018358556553721428,
-0.004344411194324493,
0.01012356672435999,
-0.0009374207002110779,
-0.07791998237371445,
0.03337617218494415,
-0.03373314440250397,
-0.0062441276386380196,
0.06804028898477554,
-0.005048520863056183,
-0.011748841032385826,
0.026954082772135735,
0.04992891475558281,
-0.12813328206539154,
-0.11297981441020966,
0.036681681871414185,
0.06287141144275665,
0.06856857985258102,
-0.051836103200912476,
-0.06449444591999054,
-0.052090514451265335,
0.1665075123310089,
0.07743396610021591,
-0.10766776651144028,
-0.11181595176458359,
0.02017979882657528,
0.14252883195877075,
-0.029268888756632805,
0.011904105544090271,
-0.04390757158398628,
0.0436837412416935,
-0.00931227020919323,
-0.16455787420272827,
0.08890711516141891,
-0.10573213547468185,
-0.06713558733463287,
-0.024351300671696663,
0.0730966329574585,
-0.06070619076490402,
-0.009754820726811886,
0.005216771271079779,
0.048472966998815536,
-0.16487175226211548,
-0.04238685965538025,
-0.010462431237101555,
0.04217401146888733,
0.051627255976200104,
0.06390287727117538,
-0.02189781703054905,
-0.132115438580513,
-0.022438328713178635,
-0.004659069702029228,
0.10124167799949646,
0.17229804396629333,
0.009166968055069447,
0.007422308437526226,
0.2310933917760849,
-0.08445040136575699,
-0.23290207982063293,
-0.11525404453277588,
-0.028475208207964897,
0.058991093188524246,
-0.08644221723079681,
-0.0319305956363678,
0.0946672111749649,
0.07018912583589554,
-0.02538149431347847,
0.00859724823385477,
-0.2055978924036026,
-0.16056127846240997,
0.14427390694618225,
0.06962636858224869,
0.3713166415691376,
-0.13684915006160736,
-0.08544369041919708,
-0.12183745950460434,
-0.03781963512301445,
0.019522057846188545,
-0.17645789682865143,
0.0735638216137886,
-0.023128487169742584,
0.04892697185277939,
0.01528053730726242,
-0.05340758338570595,
0.14539393782615662,
-0.05207551643252373,
0.06235339492559433,
-0.09542090445756912,
-0.033276136964559555,
0.0541955940425396,
-0.04682501405477524,
0.10052859783172607,
-0.13932622969150543,
0.0562131367623806,
-0.08268583565950394,
-0.038320768624544144,
-0.0167813990265131,
0.07850925624370575,
0.002447338541969657,
-0.012937678955495358,
-0.03402784466743469,
-0.06076347082853317,
-0.03436656296253204,
-0.02347608283162117,
0.1083545833826065,
-0.03525987267494202,
0.10803849250078201,
0.1994573175907135,
0.09966851770877838,
-0.11215894669294357,
0.06904932856559753,
0.05799665302038193,
-0.08492816239595413,
0.09668581187725067,
-0.10563357174396515,
-0.003683395916596055,
0.11774266511201859,
-0.03782476484775543,
0.07595477253198624,
0.05220598354935646,
0.010986861772835255,
-0.013401400297880173,
0.11892583221197128,
-0.22519749402999878,
-0.11425266414880753,
-0.037790875881910324,
-0.002741707256063819,
-0.03140305355191231,
0.07122208923101425,
0.16374920308589935,
-0.03214016929268837,
-0.020378246903419495,
0.0029220080468803644,
-0.004609079100191593,
-0.08457069098949432,
0.15459993481636047,
0.032263919711112976,
0.033028215169906616,
-0.12532269954681396,
0.05527342110872269,
-0.018456658348441124,
-0.08385095000267029,
0.04386860504746437,
0.04226594418287277,
-0.1313742697238922,
-0.1183789074420929,
-0.04090134799480438,
0.28456780314445496,
-0.06552961468696594,
-0.09803616255521774,
-0.10719403624534607,
-0.15368252992630005,
0.03919537365436554,
0.11889065057039261,
0.06601790338754654,
0.04778077080845833,
-0.004616215359419584,
-0.06565918773412704,
-0.04364531859755516,
0.07004299759864807,
0.10914666205644608,
0.06299713253974915,
-0.11792246252298355,
0.047779109328985214,
-0.03306197375059128,
0.055321719497442245,
-0.08592545241117477,
0.020845789462327957,
-0.149104043841362,
-0.011481175199151039,
-0.20839683711528778,
-0.025995755568146706,
-0.12591946125030518,
-0.046894267201423645,
0.04427328705787659,
-0.08721683919429779,
-0.040046900510787964,
0.0036276583559811115,
-0.07239268720149994,
0.029608067125082016,
-0.017357874661684036,
0.03332974761724472,
-0.03557778149843216,
-0.017829744145274162,
0.06006861850619316,
-0.04932722449302673,
0.030151790007948875,
0.12504474818706512,
-0.1098673939704895,
-0.0051405462436378,
-0.16762715578079224,
-0.04403456673026085,
0.059206701815128326,
-0.019846517592668533,
0.03900966793298721,
-0.059595733880996704,
-0.0051438999362289906,
0.11324027925729752,
0.025968728587031364,
-0.007467953953891993,
0.08269108831882477,
-0.042491886764764786,
-0.014465119689702988,
-0.015073930844664574,
-0.09084383398294449,
0.001525539206340909,
-0.04377005249261856,
0.10484293848276138,
0.04412061348557472,
0.17050527036190033,
-0.09762878715991974,
-0.01933436468243599,
-0.10969565808773041,
0.02578023076057434,
-0.005444525741040707,
-0.15379881858825684,
-0.058913569897413254,
-0.12421819567680359,
0.011115902103483677,
0.021327752619981766,
0.19945141673088074,
-0.017686109989881516,
-0.05976992845535278,
0.011462042108178139,
0.06971297413110733,
0.089778371155262,
0.06981568038463593,
0.29376280307769775,
-0.022609882056713104,
0.017236528918147087,
-0.13442644476890564,
0.05351096764206886,
0.042839165776968,
-0.028159551322460175,
-0.03835447132587433,
0.03104478493332863,
-0.04963652044534683,
0.1190580353140831,
0.040696654468774796,
0.039976440370082855,
0.01413526851683855,
-0.16284538805484772,
-0.09807649254798889,
0.05122619494795799,
0.051115501672029495,
0.15928448736667633,
0.21024364233016968,
-0.011639924719929695,
-0.019896628335118294,
-0.02929726056754589,
-0.040248479694128036,
-0.07066308706998825,
-0.07791195064783096,
-0.13174228370189667,
-0.19176261126995087,
-0.014634243212640285,
-0.08553792536258698,
-0.04088364914059639,
0.08743185549974442,
0.03547704964876175,
-0.018912184983491898,
0.14546597003936768,
0.20019856095314026,
-0.035176221281290054,
0.0280920322984457,
-0.0415463000535965,
0.0009618796175345778,
-0.02038242109119892,
-0.042716607451438904,
-0.005448105279356241,
-0.04190775007009506,
-0.05006856843829155,
0.03318651020526886,
0.01039590872824192,
0.061801981180906296,
-0.05387937277555466,
-0.08746596425771713,
-0.029970554634928703,
0.04647878557443619,
0.019916005432605743,
0.0668656975030899,
0.005700564943253994,
-0.04224191978573799,
0.004327933769673109,
0.07777766138315201,
-0.049484748393297195,
-0.1302744746208191,
-0.07750862836837769,
0.17956076562404633,
-0.036608435213565826,
0.05618643760681152,
-0.012739064171910286,
-0.02661936543881893,
0.03366239368915558,
0.20837034285068512,
0.28103503584861755,
-0.05211134999990463,
-0.0007719608838669956,
-0.06104151904582977,
0.019671667367219925,
0.028943244367837906,
0.04347727447748184,
0.045428741723299026,
0.08262871950864792,
-0.09260687977075577,
0.03389466926455498,
-0.06574969738721848,
-0.04277461767196655,
-0.12931780517101288,
-0.04899999499320984,
0.02271847054362297,
-0.09541913866996765,
-0.01564745046198368,
0.10155462473630905,
-0.07487832009792328,
0.07063882797956467,
-0.0587603785097599,
-0.07936415076255798,
-0.05593094974756241,
-0.03248804435133934,
0.15604029595851898,
-0.004899418447166681,
0.07478433847427368,
-0.07602453231811523,
0.008398988284170628,
0.023547865450382233,
-0.048191264271736145,
-0.11136455833911896,
-0.02846018597483635,
0.04712031036615372,
-0.04608874022960663,
-0.01936551369726658,
0.009372121654450893,
0.06646809726953506,
0.10703183710575104,
-0.005145181901752949,
-0.12197166681289673,
0.06132306903600693,
-0.018384337425231934,
-0.011659631505608559,
0.06039181351661682,
-0.05235476419329643,
0.04404130578041077,
-0.15526142716407776,
0.03059437870979309,
-0.14949281513690948,
0.027828894555568695,
-0.007080362178385258,
-0.008779010735452175,
-0.03382730484008789,
0.03309336677193642,
-0.025829261168837547,
0.11123377829790115,
0.08430503308773041,
-0.053977854549884796,
0.012751301750540733,
-0.017019236460328102,
0.036332301795482635,
0.002762230345979333,
0.04515645653009415,
0.01505332626402378,
-0.23192115128040314,
-0.026380017399787903,
0.13988962769508362,
-0.011273225769400597,
-0.25892215967178345,
-0.04692826420068741,
-0.15822885930538177,
-0.025918753817677498,
-0.060728441923856735,
0.09611138701438904,
0.1785295456647873,
0.029862798750400543,
-0.006869829725474119,
-0.13668721914291382,
0.01641283743083477,
0.11771785467863083,
-0.05580104514956474,
-0.12261778116226196
] |
null | null | transformers |
# Zero凉宫春日
Warm-started from Qwen_7B_base and trained at a 2k context length on 150k high-quality NPC extraction samples.
epoch=2,batch_size=64,lr=2e-5
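A minimal usage sketch (not from the original card) is given below. It assumes the checkpoint loads through the standard `transformers` auto classes; because this is a Qwen-based repository that ships custom modeling code, `trust_remote_code=True` is required, and the prompt shown is only a placeholder for the novel text from which NPC dialogue/actions should be extracted.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "silk-road/Haruhi-dialogue-action-extract-7B"

# The custom Qwen modeling code shipped with the repo requires trust_remote_code.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", trust_remote_code=True
)

# Placeholder input: a passage from which NPC dialogue/actions should be extracted.
prompt = "..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```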
| {} | text-generation | silk-road/Haruhi-dialogue-action-extract-7B | [
"transformers",
"pytorch",
"qwen",
"text-generation",
"custom_code",
"autotrain_compatible",
"region:us"
] | 2024-02-15T05:23:07+00:00 | [] | [] | TAGS
#transformers #pytorch #qwen #text-generation #custom_code #autotrain_compatible #region-us
|
# Zero凉宫春日
Warm-started from Qwen_7B_base and trained at a 2k context length on 150k high-quality NPC extraction samples.
epoch=2,batch_size=64,lr=2e-5
| [
"# Zero凉宫春日\n\n基于Qwen_7B_base 热启,在15w高质量的NPC抽取样本上进行2k训练\n\n\nepoch=2,batch_size=64,lr=2e-5"
] | [
"TAGS\n#transformers #pytorch #qwen #text-generation #custom_code #autotrain_compatible #region-us \n",
"# Zero凉宫春日\n\n基于Qwen_7B_base 热启,在15w高质量的NPC抽取样本上进行2k训练\n\n\nepoch=2,batch_size=64,lr=2e-5"
] | [
34,
53
] | [
"passage: TAGS\n#transformers #pytorch #qwen #text-generation #custom_code #autotrain_compatible #region-us \n# Zero凉宫春日\n\n基于Qwen_7B_base 热启,在15w高质量的NPC抽取样本上进行2k训练\n\n\nepoch=2,batch_size=64,lr=2e-5"
] | [
-0.07570357620716095,
0.06770378351211548,
-0.006306175142526627,
0.05619881674647331,
0.07064640522003174,
-0.021577684208750725,
0.09454905241727829,
0.13661615550518036,
0.025025121867656708,
-0.025743093341588974,
0.13482841849327087,
0.10630793869495392,
0.019156618043780327,
0.121566541492939,
-0.02600725367665291,
-0.18550895154476166,
0.05380392447113991,
0.10048771649599075,
-0.05264177545905113,
0.12058663368225098,
0.02184700407087803,
-0.06362084299325943,
0.07607664912939072,
-0.03484070301055908,
-0.1512148231267929,
-0.01771715097129345,
-0.015420333482325077,
-0.07776495069265366,
0.1078241690993309,
0.009309913963079453,
0.04190477356314659,
0.08469061553478241,
0.02109920047223568,
-0.09916452318429947,
0.028527773916721344,
-0.03841805458068848,
-0.06618434935808182,
0.053053632378578186,
0.010906240902841091,
0.09900232404470444,
0.04176972061395645,
0.05996585264801979,
-0.057598695158958435,
0.03758466988801956,
-0.06603369116783142,
-0.01537676714360714,
0.04168863967061043,
0.08655340224504471,
0.09982292354106903,
0.08059624582529068,
-0.013139137998223305,
0.18854382634162903,
-0.16804035007953644,
0.09017431735992432,
0.051171962171792984,
-0.3750106990337372,
-0.015552552416920662,
0.09605487436056137,
0.08271834999322891,
0.1039951741695404,
-0.06814473867416382,
-0.011240698397159576,
0.06747447699308395,
-0.0014522382989525795,
-0.06206758692860603,
-0.06385651975870132,
-0.06594852358102798,
0.030810551717877388,
-0.0900210291147232,
0.042999450117349625,
0.13524708151817322,
-0.0058580925688147545,
0.006331669632345438,
0.05059065669775009,
-0.08423767238855362,
-0.21070480346679688,
-0.001866351580247283,
-0.030621835961937904,
-0.0427887886762619,
0.04303713142871857,
-0.10130418092012405,
-0.0296240895986557,
-0.05479198321700096,
-0.11135799437761307,
-0.1430026739835739,
0.15031112730503082,
0.04458184912800789,
0.013853459618985653,
-0.129562646150589,
0.10063611716032028,
0.01674405112862587,
-0.0866893008351326,
-0.04963473975658417,
-0.05720176175236702,
-0.009272752329707146,
0.03796197101473808,
-0.028123797848820686,
0.05255746841430664,
0.10873822122812271,
0.13381168246269226,
0.03456227853894234,
0.07453952729701996,
0.023974932730197906,
0.04955502972006798,
-0.06078648939728737,
0.09759856015443802,
0.021521495655179024,
-0.08226636052131653,
0.07355950027704239,
0.055431388318538666,
-0.007781671825796366,
-0.06448644399642944,
-0.14938847720623016,
-0.10511596500873566,
0.07984800636768341,
0.037407852709293365,
0.013887853361666203,
0.0648936927318573,
0.016288040205836296,
-0.032683756202459335,
0.11001380532979965,
-0.027244308963418007,
-0.03545401245355606,
0.03340062126517296,
-0.04367391765117645,
0.021933527663350105,
-0.03538699820637703,
0.03074127994477749,
-0.044950731098651886,
-0.10118350386619568,
-0.06773937493562698,
-0.03675416111946106,
0.006820071954280138,
-0.02790032885968685,
0.005428674630820751,
0.026382096111774445,
0.028211582452058792,
-0.15617603063583374,
-0.13965043425559998,
0.03511402755975723,
-0.04015059024095535,
-0.009148462675511837,
-0.0721767246723175,
-0.0883803740143776,
-0.08964360505342484,
-0.007619316689670086,
-0.008744346909224987,
0.027604367583990097,
-0.011201865039765835,
0.10514238476753235,
0.1218353733420372,
0.1150272786617279,
-0.1264684945344925,
0.02051714062690735,
-0.1293361932039261,
0.022386325523257256,
0.005910452920943499,
0.07199384272098541,
-0.006942972540855408,
0.023581435903906822,
0.006427735555917025,
-0.06159094348549843,
-0.049875013530254364,
-0.020943276584148407,
0.01460698526352644,
0.11763220280408859,
-0.1457929015159607,
-0.09417568147182465,
0.1825108379125595,
-0.07975510507822037,
-0.11909178644418716,
0.13024143874645233,
0.01625964231789112,
-0.04896878823637962,
0.027664486318826675,
0.18521256744861603,
0.10910598188638687,
0.022916052490472794,
-0.011577090248465538,
0.0830836370587349,
-0.05734378099441528,
-0.14364634454250336,
0.08328188210725784,
0.10703359544277191,
-0.09294296056032181,
0.06485796719789505,
-0.06382111459970474,
0.07320646196603775,
-0.08155309408903122,
-0.09403415024280548,
-0.0470401793718338,
-0.09007451683282852,
0.11569082736968994,
0.0008520655683241785,
0.13598991930484772,
-0.04256940260529518,
-0.029924748465418816,
-0.057966992259025574,
0.10840047895908356,
-0.01866009458899498,
0.010663056746125221,
-0.12741874158382416,
0.12987090647220612,
0.010030707344412804,
0.04802843928337097,
-0.08840782940387726,
0.05165332928299904,
0.049179427325725555,
0.01524398848414421,
0.02055499516427517,
0.08514264971017838,
0.05277194455265999,
0.07520323246717453,
-0.02471756748855114,
-0.0759892463684082,
-0.01757109723985195,
-0.025171494111418724,
-0.07868269830942154,
-0.05564410984516144,
-0.01213738415390253,
-0.037174042314291,
0.09099771827459335,
-0.10492954403162003,
0.031641941517591476,
-0.07968438416719437,
0.04300154745578766,
-0.025103535503149033,
0.03361045941710472,
-0.01887936145067215,
0.08413965255022049,
-0.06872377544641495,
0.03830092400312424,
0.07482584565877914,
-0.009127042256295681,
-0.14839451014995575,
0.08384787291288376,
-0.12691152095794678,
0.20315510034561157,
0.14692220091819763,
-0.1751519739627838,
0.0545240193605423,
-0.008688884787261486,
-0.031132718548178673,
-0.03528924286365509,
0.05453783646225929,
0.01527803298085928,
0.13864879310131073,
-0.010978573933243752,
0.1266528218984604,
-0.05706808716058731,
-0.007727476302534342,
-0.0024292394518852234,
-0.03799604997038841,
0.02256467379629612,
0.09621619433164597,
0.09868943691253662,
-0.1118212342262268,
0.07976885139942169,
0.13785837590694427,
-0.06469918042421341,
0.1912287473678589,
0.007991977035999298,
-0.018134431913495064,
-0.021283142268657684,
0.018668830394744873,
-0.03733330965042114,
0.0031543446239084005,
-0.16055108606815338,
-0.03771167993545532,
0.03193110600113869,
-0.01980232633650303,
0.05259130150079727,
-0.1218937486410141,
-0.08090432733297348,
-0.046951714903116226,
-0.051157910376787186,
-0.10949550569057465,
0.04315293952822685,
-0.012545483186841011,
0.08485859632492065,
-0.003613740438595414,
-0.05192935839295387,
0.05963573232293129,
0.007476184982806444,
-0.07454366236925125,
0.13958418369293213,
-0.08710766583681107,
-0.16184133291244507,
-0.11090213805437088,
-0.006909549236297607,
-0.07498985528945923,
-0.0003483023610897362,
0.05229075998067856,
-0.16650788486003876,
0.059455785900354385,
-0.002122540958225727,
-0.0033224294893443584,
-0.03149932995438576,
0.0455966517329216,
0.05248098075389862,
0.04243476688861847,
0.014651666395366192,
-0.0711359977722168,
0.001663953997194767,
-0.05867644026875496,
-0.12655898928642273,
0.14585264027118683,
-0.14490079879760742,
0.06829190254211426,
0.17622236907482147,
-0.011956814676523209,
0.02996572107076645,
0.0022614700719714165,
0.22765624523162842,
-0.028803672641515732,
0.0030678596813231707,
0.1603403240442276,
0.004426007159054279,
0.022804541513323784,
0.048336438834667206,
-0.007157941348850727,
-0.04405127093195915,
0.02443532645702362,
-0.02331295609474182,
-0.08006630092859268,
-0.15206484496593475,
-0.039519377052783966,
-0.0387633852660656,
0.10772761702537537,
-0.008334153331816196,
0.0722714439034462,
0.10663104802370071,
0.05584069713950157,
0.03215174376964569,
0.054198432713747025,
-0.059015221893787384,
0.022653695195913315,
0.07253237813711166,
0.05306725576519966,
0.12084932625293732,
-0.027975527569651604,
-0.05839760601520538,
0.03421799838542938,
0.07116939127445221,
0.11149243265390396,
0.029307417571544647,
0.005151977762579918,
-0.0030240111518651247,
0.2605188488960266,
0.16498112678527832,
0.03696995973587036,
0.026063038036227226,
-0.05496903881430626,
0.0059786103665828705,
-0.007973061874508858,
-0.05801773443818092,
0.044894974678754807,
0.060049813240766525,
-0.03673520311713219,
-0.05307791382074356,
0.10310389846563339,
0.03354170173406601,
0.08765684813261032,
0.07936442643404007,
-0.15659500658512115,
-0.05032866448163986,
0.0011227393988519907,
-0.026846570894122124,
0.000571345561183989,
0.10672159492969513,
0.10591839998960495,
-0.10215527564287186,
-0.011201188899576664,
-0.008995506912469864,
0.10906574875116348,
-0.016869105398654938,
0.09777624160051346,
-0.054915379732847214,
-0.05156693235039711,
0.0309890229254961,
0.07545118778944016,
-0.27571654319763184,
0.17406557500362396,
-0.01979091763496399,
0.04781493544578552,
-0.03261109068989754,
-0.07746780663728714,
0.054099272936582565,
0.04052911326289177,
0.032009925693273544,
0.018998105078935623,
0.00044844241347163916,
-0.2100859433412552,
-0.12757621705532074,
0.07768701761960983,
0.06467839330434799,
0.027903007343411446,
0.04762062057852745,
-0.0072594815865159035,
-0.007441462948918343,
-0.010334713384509087,
0.08973304182291031,
-0.06302355229854584,
-0.0009829322807490826,
0.02204115316271782,
0.07629828155040741,
0.05662906914949417,
-0.03017439693212509,
-0.06289906054735184,
-0.07574985921382904,
0.06875954568386078,
-0.11776775121688843,
-0.05389517918229103,
-0.03218516334891319,
0.014804847538471222,
0.06945240497589111,
-0.1082281619310379,
0.0109410360455513,
-0.03132361173629761,
-0.007559907156974077,
-0.0002901557309087366,
-0.13638408482074738,
0.05671463534235954,
-0.06317664682865143,
-0.17073559761047363,
-0.018146706745028496,
0.08778060972690582,
0.005000291392207146,
0.09881570935249329,
0.019425567239522934,
-0.06338256597518921,
-0.06391816586256027,
-0.11440867185592651,
0.03749899938702583,
-0.12219975888729095,
-0.08839548379182816,
-0.06763885915279388,
-0.04317500442266464,
0.04660319164395332,
-0.1035245954990387,
-0.019691962748765945,
0.20504769682884216,
0.25869980454444885,
-0.05834461748600006,
0.012483332306146622,
0.16393537819385529,
0.015120350755751133,
-0.20693296194076538,
-0.11998897790908813,
-0.0014877996873110533,
0.024423697963356972,
-0.07015643268823624,
-0.16880619525909424,
0.05452177673578262,
0.08679696917533875,
0.009618234820663929,
0.060556910932064056,
-0.24069733917713165,
-0.07562875747680664,
0.12745265662670135,
-0.004154921974986792,
0.30804476141929626,
-0.0867915228009224,
-0.05936665087938309,
0.056191328912973404,
-0.11986339837312698,
0.07985445857048035,
-0.08749429136514664,
0.10609634965658188,
-0.0916268453001976,
0.04089602455496788,
0.017625946551561356,
-0.03504040464758873,
0.09949278831481934,
0.03694632649421692,
0.019238995388150215,
-0.03052210621535778,
-0.12811794877052307,
-0.03608885779976845,
-0.00413482217118144,
-0.01664160192012787,
-0.0735982358455658,
0.05610155686736107,
-0.10624981671571732,
-0.0007176660583354533,
-0.09487461298704147,
-0.012450749054551125,
0.011710688471794128,
-0.05732938274741173,
0.0067345695570111275,
-0.005743863992393017,
-0.026957668364048004,
0.03446078673005104,
0.1194416955113411,
-0.00038703883183188736,
0.09193605929613113,
0.20551784336566925,
0.020516954362392426,
-0.023970196023583412,
0.1183953732252121,
-0.04393523558974266,
-0.04309613257646561,
0.11543389409780502,
-0.08428940176963806,
0.03138245642185211,
0.1390702724456787,
0.012878695502877235,
0.021353473886847496,
0.09375650435686111,
0.010568618774414062,
0.09657393395900726,
0.0831955149769783,
-0.1365324854850769,
0.011323360726237297,
-0.07866793870925903,
-0.08816602826118469,
0.06631410866975784,
0.07160457223653793,
0.08609973639249802,
-0.05085908621549606,
-0.03557780012488365,
-0.005957701243460178,
-0.023748358711600304,
-0.07713111490011215,
0.14298269152641296,
0.0796174705028534,
0.05724303424358368,
-0.13102464377880096,
0.05206528306007385,
0.0019519261550158262,
0.020216336473822594,
0.037141744047403336,
0.12869594991207123,
-0.13054786622524261,
-0.08759450167417526,
-0.026833798736333847,
0.12202183157205582,
-0.09704385697841644,
-0.04866033047437668,
-0.10059338063001633,
-0.0778282955288887,
0.0611262209713459,
0.11913487315177917,
0.0630221962928772,
0.03944152966141701,
-0.02632017247378826,
-0.003167493734508753,
-0.06181471794843674,
0.026477400213479996,
-0.002888618502765894,
0.03880655765533447,
-0.10864128917455673,
0.13583378493785858,
0.00831614900380373,
0.1846754252910614,
-0.07635776698589325,
-0.07223185151815414,
-0.12972256541252136,
0.0426415279507637,
-0.19308187067508698,
-0.06051016226410866,
-0.06335420906543732,
-0.038308918476104736,
-0.02200331725180149,
-0.0885826125741005,
-0.08240040391683578,
0.005724815651774406,
-0.09220821410417557,
-0.00012851612700615078,
-0.004888975527137518,
-0.013247264549136162,
-0.02418716624379158,
0.011517569422721863,
0.06833826005458832,
-0.03218638524413109,
0.0979495719075203,
0.11878620088100433,
-0.043323975056409836,
0.10068630427122116,
0.00683688186109066,
-0.013433740474283695,
0.038600947707891464,
0.07446115463972092,
0.03653711453080177,
0.12098890542984009,
0.006383325904607773,
0.030111314728856087,
0.02550588920712471,
0.04949483275413513,
0.15812472999095917,
-0.07075759768486023,
-0.02525181882083416,
-0.1137198954820633,
-0.09787821769714355,
-0.07133008539676666,
-0.0036870050244033337,
0.08109183609485626,
0.030160922557115555,
0.09168300777673721,
-0.06985782086849213,
0.058380864560604095,
-0.11081646382808685,
-0.003603411139920354,
0.019589725881814957,
-0.08396658301353455,
-0.07451490312814713,
-0.05646110326051712,
0.03404569998383522,
-0.027288571000099182,
0.12802450358867645,
-0.11996381729841232,
-0.04234609752893448,
-0.029347660019993782,
0.004474950488656759,
-0.06219739094376564,
-0.025860991328954697,
0.27436563372612,
0.14969667792320251,
-0.01976468227803707,
0.015256688930094242,
0.0600857175886631,
0.02140086330473423,
0.024115588515996933,
0.04037630558013916,
0.0043855407275259495,
0.021430937573313713,
0.1112983375787735,
0.035968564450740814,
0.004055775236338377,
-0.13871829211711884,
-0.040552422404289246,
-0.11976252496242523,
-0.0016708715120330453,
-0.010312715545296669,
0.07905987650156021,
0.16333499550819397,
-0.017125587910413742,
-0.021900031715631485,
0.023844100534915924,
-0.05751676484942436,
-0.11717768013477325,
-0.07335100322961807,
-0.10836151987314224,
-0.1572847068309784,
0.018791403621435165,
-0.06424775719642639,
-0.03151325508952141,
0.014404048211872578,
0.03881068155169487,
-0.05845623090863228,
0.09713108092546463,
0.13750629127025604,
-0.04356660321354866,
0.03940020129084587,
-0.0412728525698185,
-0.017737656831741333,
-0.013181707821786404,
-0.012881881557404995,
0.0420701690018177,
-0.003875536611303687,
0.014427868649363518,
0.029653718695044518,
-0.1049254983663559,
0.04680843651294708,
-0.07248469442129135,
-0.10755082219839096,
-0.021901024505496025,
-0.002000650390982628,
0.008896212093532085,
0.09165773540735245,
0.009773756377398968,
-0.035386186093091965,
-0.001180290593765676,
0.15040843188762665,
-0.038382403552532196,
-0.05546073615550995,
-0.05826660990715027,
0.1894126981496811,
0.02058832161128521,
0.04860733449459076,
-0.01356594916433096,
-0.009890349581837654,
-0.07011756300926208,
0.32781070470809937,
0.18088342249393463,
-0.12203750014305115,
0.0016079199267551303,
0.05216573178768158,
0.007597350049763918,
0.0030746760312467813,
0.09920114278793335,
0.11570319533348083,
0.15512537956237793,
-0.07981527596712112,
-0.09425801783800125,
-0.09825088083744049,
-0.020015275105834007,
-0.020556297153234482,
0.0613044835627079,
0.06670614331960678,
-0.0455416701734066,
-0.1130615696310997,
0.005428573116660118,
-0.20025768876075745,
0.17478902637958527,
-0.04491671547293663,
-0.1093512549996376,
-0.08637256175279617,
0.008816810324788094,
0.06428889185190201,
0.06199146807193756,
0.03091256693005562,
-0.03892334923148155,
0.04522429406642914,
-0.02224177122116089,
-0.026939939707517624,
-0.10661308467388153,
0.0008300398476421833,
0.08905668556690216,
0.023138118907809258,
0.06774399429559708,
-0.007317755371332169,
0.049870725721120834,
0.07544974237680435,
0.05119364708662033,
-0.04324299097061157,
0.16978819668293,
-0.01028356421738863,
-0.03297712653875351,
-0.04906710982322693,
0.07135773450136185,
0.05041571334004402,
0.0464843288064003,
0.07448945939540863,
-0.04542461782693863,
0.020749850198626518,
0.02418842725455761,
-0.017887935042381287,
-0.07126925885677338,
0.028856663033366203,
-0.002309981035068631,
0.12477488070726395,
0.09306945651769638,
-0.05229618772864342,
0.006280825007706881,
-0.035262580960989,
0.030341146513819695,
-0.002577711595222354,
-0.03361380100250244,
-0.08666740357875824,
-0.17557114362716675,
-0.024610910564661026,
0.012075847014784813,
-0.012862498871982098,
-0.14634883403778076,
-0.014107391238212585,
-0.12883757054805756,
-0.005459176376461983,
-0.0768054649233818,
0.07355497777462006,
0.09265565127134323,
0.018190208822488785,
0.0010947250993922353,
-0.06064988672733307,
-0.012615058571100235,
0.05807702988386154,
-0.16473595798015594,
-0.12627167999744415
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-squad
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
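For illustration only (this is not taken from the original training script), the values above correspond roughly to a `TrainingArguments` setup like the following; `output_dir` is a placeholder, and the Adam betas/epsilon are passed explicitly even though they match the library defaults.

```python
from transformers import TrainingArguments

# Illustrative sketch mirroring the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="bert-finetuned-squad",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=3,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # native AMP mixed-precision training
)
```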
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "bert-base-cased", "model-index": [{"name": "bert-finetuned-squad", "results": []}]} | question-answering | ttamer/bert-finetuned-squad | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"question-answering",
"generated_from_trainer",
"base_model:bert-base-cased",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:25:35+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #bert #question-answering #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-finetuned-squad
This model is a fine-tuned version of bert-base-cased on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| [
"# bert-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #bert #question-answering #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
60,
35,
6,
12,
8,
3,
103,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #bert #question-answering #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #endpoints_compatible #region-us \n# bert-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.061459798365831375,
0.061447951942682266,
-0.0023345649242401123,
0.05422242730855942,
0.14727948606014252,
0.020580867305397987,
0.133071169257164,
0.1109636053442955,
-0.099789097905159,
0.0471339225769043,
0.04277966171503067,
0.027452314272522926,
0.04439014568924904,
0.10173879563808441,
-0.04047640040516853,
-0.2326717972755432,
0.0243214163929224,
0.015390816144645214,
-0.09670136868953705,
0.07859394699335098,
0.10860180854797363,
-0.12125761061906815,
0.04846896603703499,
0.009541328996419907,
-0.14627864956855774,
0.044559869915246964,
-0.017151489853858948,
-0.0479029081761837,
0.10530135035514832,
0.043245647102594376,
0.1464504599571228,
0.009714404121041298,
0.11924252659082413,
-0.23196730017662048,
0.013637912459671497,
0.08863382786512375,
0.0286929402500391,
0.06396042555570602,
0.057905957102775574,
0.024468891322612762,
0.06762804836034775,
-0.10307248681783676,
0.12413535267114639,
0.01943388395011425,
-0.07848954945802689,
-0.20726418495178223,
-0.08991456776857376,
0.021897481754422188,
0.10657299309968948,
0.06853597611188889,
-0.0014294445281848311,
0.13080278038978577,
-0.12465435266494751,
0.05454849824309349,
0.18297156691551208,
-0.2775663435459137,
-0.0742194801568985,
0.044039271771907806,
0.066988006234169,
0.05576687306165695,
-0.10548405349254608,
-0.02604071982204914,
0.04436066374182701,
0.045092348009347916,
0.09539152681827545,
-0.01563568040728569,
-0.1185605600476265,
0.012210484594106674,
-0.14206890761852264,
0.012168851681053638,
0.09484157711267471,
0.04601401463150978,
-0.04385574162006378,
-0.09959746152162552,
-0.032543595880270004,
-0.01876622810959816,
-0.04760074242949486,
-0.0338420532643795,
0.03985750675201416,
-0.032872460782527924,
-0.06725551187992096,
-0.05386665090918541,
-0.08165749907493591,
-0.08186745643615723,
-0.021592821925878525,
0.12031173706054688,
0.049023035913705826,
0.0020565874874591827,
-0.05340089648962021,
0.08596295863389969,
-0.04782279580831528,
-0.0972023606300354,
0.009081766940653324,
-0.0008378953789360821,
-0.09486114233732224,
-0.07832273095846176,
-0.0621207058429718,
-0.06785807758569717,
0.0348195806145668,
0.14466795325279236,
-0.03551895171403885,
0.08212119340896606,
-0.04245670139789581,
0.011188340373337269,
-0.023814590647816658,
0.11640627682209015,
-0.035820312798023224,
-0.01497437059879303,
-0.0031569628044962883,
0.09074471145868301,
0.003783012507483363,
0.0028517041355371475,
-0.0839226022362709,
0.00034435841371305287,
0.06287102401256561,
0.04945514351129532,
-0.05412912368774414,
0.03270968049764633,
-0.05242898687720299,
-0.014940381981432438,
-0.013021047227084637,
-0.11373132467269897,
0.050237782299518585,
0.005551122594624758,
-0.05133688449859619,
-0.03309698402881622,
0.013901998288929462,
0.02568003721535206,
0.0024044683668762445,
0.10671638697385788,
-0.06286582350730896,
0.001527723390609026,
-0.10509428381919861,
-0.10192393511533737,
0.026301341131329536,
-0.04589732736349106,
-0.0001480861392337829,
-0.07383490353822708,
-0.16255640983581543,
-0.028034863993525505,
0.0522027425467968,
-0.04300333186984062,
-0.009100627154111862,
-0.036343641579151154,
-0.03701746091246605,
0.012845677323639393,
-0.021096475422382355,
0.17049647867679596,
-0.05358185991644859,
0.06594842672348022,
-0.019995305687189102,
0.026813561096787453,
-0.013226971961557865,
0.041318267583847046,
-0.07363813370466232,
0.03031962923705578,
-0.16215889155864716,
0.05207372456789017,
-0.11184148490428925,
0.013519112020730972,
-0.12260060012340546,
-0.08264897763729095,
0.013123290613293648,
0.009542727842926979,
0.08599310368299484,
0.0929957926273346,
-0.16837601363658905,
-0.03492205590009689,
0.132857546210289,
-0.09205308556556702,
-0.06740245223045349,
0.10494925081729889,
-0.06968790292739868,
0.03958335891366005,
0.06577906012535095,
0.14707745611667633,
0.07223697751760483,
-0.14001241326332092,
0.028791893273591995,
-0.0025663531851023436,
0.09249944984912872,
0.0470452681183815,
0.04789413884282112,
-0.02686789073050022,
-0.035463228821754456,
0.000380219571525231,
-0.056678012013435364,
0.024655528366565704,
-0.08302492648363113,
-0.07098092138767242,
-0.03393018990755081,
-0.08446280658245087,
0.057038187980651855,
0.023242488503456116,
0.03703390434384346,
-0.08535196632146835,
-0.09979818761348724,
0.18490739166736603,
0.10960789024829865,
-0.06901982426643372,
0.015217185020446777,
-0.09181997179985046,
0.02399349957704544,
-0.029291432350873947,
-0.0160883329808712,
-0.1937323659658432,
-0.124361552298069,
0.029358062893152237,
-0.0499911904335022,
0.042770374566316605,
0.047171566635370255,
0.07468511909246445,
0.06069914251565933,
-0.07250207662582397,
-0.009682667441666126,
-0.08157332986593246,
0.01827055588364601,
-0.10930684208869934,
-0.21364574134349823,
-0.0561688095331192,
-0.04888840764760971,
0.14983327686786652,
-0.2371571958065033,
0.01744062267243862,
-0.0158857349306345,
0.14663065969944,
0.04917171224951744,
-0.028531232848763466,
-0.025619521737098694,
0.08447664231061935,
0.006524949800223112,
-0.07085540145635605,
0.0456293523311615,
0.0035106351133435965,
-0.10825884342193604,
-0.05158180370926857,
-0.1254018247127533,
0.05313687399029732,
0.08509916067123413,
0.010861638933420181,
-0.09549152106046677,
-0.06123816594481468,
-0.04289982467889786,
-0.03770140931010246,
-0.09153592586517334,
-0.006009100005030632,
0.19756795465946198,
0.008866924792528152,
0.13024085760116577,
-0.06384221464395523,
-0.04316077008843422,
-0.011104362085461617,
-0.003277657087892294,
0.0065968395210802555,
0.08201264590024948,
0.06290043890476227,
-0.10100670158863068,
0.08966144919395447,
0.1427372545003891,
-0.08809559792280197,
0.12802675366401672,
-0.06669536232948303,
-0.08382763713598251,
-0.0035146523732692003,
0.018052376806735992,
-0.0009520510211586952,
0.12291523069143295,
-0.11392845958471298,
0.01214613951742649,
0.020349163562059402,
0.024035392329096794,
0.033255793154239655,
-0.1866011917591095,
-0.0001348506921203807,
0.01857338659465313,
-0.031072042882442474,
-0.005707079544663429,
-0.034159861505031586,
0.04566868394613266,
0.0961286723613739,
0.026690978556871414,
-0.026047369465231895,
0.030917497351765633,
-0.017370129004120827,
-0.07888409495353699,
0.19535638391971588,
-0.134768545627594,
-0.10342710465192795,
-0.0994999036192894,
-0.01285966020077467,
-0.029272371903061867,
-0.025446409359574318,
0.0300839152187109,
-0.10529284924268723,
-0.06118984892964363,
-0.08911913633346558,
0.026114005595445633,
-0.015716243535280228,
-0.01027036365121603,
0.016980791464447975,
0.007922949269413948,
0.09388486295938492,
-0.13049335777759552,
0.014737911522388458,
-0.02808591164648533,
-0.10750939697027206,
0.006074283272027969,
0.05640389025211334,
0.07634573429822922,
0.12668631970882416,
-0.02135971561074257,
-0.012784563936293125,
-0.05513405054807663,
0.20797152817249298,
-0.05621897801756859,
-0.021660933271050453,
0.11368010938167572,
-0.02144048735499382,
0.040778279304504395,
0.14336541295051575,
0.056096289306879044,
-0.09655733406543732,
0.04084358364343643,
0.09027957916259766,
-0.007372742053121328,
-0.2477952539920807,
-0.035485442727804184,
-0.03706059232354164,
-0.09619531035423279,
0.09314313530921936,
0.04598979651927948,
-0.03920724615454674,
0.046859025955200195,
-0.010759986005723476,
0.04798964783549309,
0.004607341252267361,
0.08061698079109192,
0.13040615618228912,
0.026539716869592667,
0.10464926809072495,
-0.031242653727531433,
-0.03924533352255821,
0.06145051121711731,
0.01601589098572731,
0.2622612416744232,
0.0032494233455508947,
0.06713822484016418,
0.07083655148744583,
0.12690454721450806,
-0.006245773751288652,
0.015555835328996181,
-0.000842420500703156,
-0.019448664039373398,
-0.00855666771531105,
-0.06403692066669464,
-0.00960704404860735,
0.024987639859318733,
-0.02751791477203369,
0.03134535253047943,
-0.09673149138689041,
-0.04363711178302765,
0.01712709292769432,
0.259813129901886,
0.033559225499629974,
-0.24476152658462524,
-0.06446326524019241,
0.03345059975981712,
-0.054381150752305984,
-0.06443323194980621,
0.01147520449012518,
0.13037928938865662,
-0.1164151281118393,
0.05030885711312294,
-0.062406305223703384,
0.10151725262403488,
-0.004453735426068306,
0.0011111715575680137,
0.03996695950627327,
0.12747614085674286,
-0.012449676170945168,
0.07362329959869385,
-0.24941353499889374,
0.22383655607700348,
0.02319640852510929,
0.11503933370113373,
-0.046525727957487106,
0.03555026277899742,
0.026707086712121964,
0.06300769001245499,
0.0560712032020092,
-0.022611534222960472,
-0.07392343878746033,
-0.19212935864925385,
-0.03508710861206055,
0.059186238795518875,
0.10614626854658127,
-0.003535062773153186,
0.08273082971572876,
-0.03742257505655289,
0.018663646653294563,
0.0643555074930191,
-0.05351274833083153,
-0.19107525050640106,
-0.12351815402507782,
-0.016547195613384247,
0.020241545513272285,
-0.011910421773791313,
-0.11267317086458206,
-0.10366198420524597,
-0.051850952208042145,
0.16490136086940765,
0.01943778060376644,
-0.029776055365800858,
-0.12596632540225983,
0.0854724869132042,
0.1040196567773819,
-0.0347907580435276,
0.03230150043964386,
0.017272591590881348,
0.12235206365585327,
0.02306991070508957,
-0.08323754370212555,
0.08243215829133987,
-0.0786900743842125,
-0.15695762634277344,
-0.08197148889303207,
0.10315465182065964,
0.07422619313001633,
0.05436359718441963,
0.01108220312744379,
0.016497192904353142,
0.02324766479432583,
-0.07448835670948029,
-0.012275688350200653,
0.09931504726409912,
0.07955443859100342,
0.0918746367096901,
-0.10934382677078247,
-0.014584965072572231,
-0.018572811037302017,
-0.014820524491369724,
0.1233733668923378,
0.22609290480613708,
-0.07465904206037521,
0.0728081688284874,
0.10633830726146698,
-0.08155998587608337,
-0.1912233829498291,
0.10248123109340668,
0.1091802567243576,
0.014924203976988792,
0.0620192289352417,
-0.20946091413497925,
0.1343722641468048,
0.12071874737739563,
-0.023595836013555527,
0.02381988614797592,
-0.29926279187202454,
-0.12816034257411957,
0.1080908253788948,
0.1308019906282425,
0.029860712587833405,
-0.14037245512008667,
-0.029754674062132835,
-0.03576713800430298,
-0.1251954287290573,
0.12356105446815491,
-0.15127962827682495,
0.09242217242717743,
0.014962718822062016,
0.0752267837524414,
0.02469545230269432,
-0.03275773674249649,
0.14851243793964386,
-0.022257039323449135,
0.1015954315662384,
-0.0469214990735054,
0.052410323172807693,
0.052272725850343704,
-0.051470305770635605,
0.0027104844339191914,
-0.051835622638463974,
0.0386844128370285,
-0.1199895590543747,
-0.025339633226394653,
-0.06346026062965393,
0.058189574629068375,
-0.04288136959075928,
-0.06441446393728256,
-0.05519048497080803,
0.05625461786985397,
0.039457056671381,
-0.031044045463204384,
0.07836628705263138,
-0.018602069467306137,
0.15986160933971405,
0.05192641168832779,
0.14238160848617554,
-0.028629746288061142,
-0.08330649882555008,
0.016823383048176765,
-0.037374380975961685,
0.09144406765699387,
-0.11978122591972351,
0.040791310369968414,
0.11425328254699707,
0.02885865420103073,
0.15491695702075958,
0.04755830019712448,
-0.059327129274606705,
0.01985377073287964,
0.04295020550489426,
-0.08762049674987793,
-0.15524885058403015,
0.019345000386238098,
0.05994473397731781,
-0.14515921473503113,
0.01810130663216114,
0.11084803938865662,
-0.04176182299852371,
-0.018473336473107338,
-0.000029236023692646995,
0.011293904855847359,
-0.03950151801109314,
0.17660193145275116,
0.03044256381690502,
0.050335001200437546,
-0.08342214673757553,
0.10434763133525848,
0.08080239593982697,
-0.11423652619123459,
0.050117090344429016,
0.04423784092068672,
-0.06290752440690994,
-0.0224289707839489,
0.09095968306064606,
0.2181142121553421,
0.0026791319251060486,
-0.05532265082001686,
-0.07284814119338989,
-0.12529242038726807,
0.033083245158195496,
0.1457516849040985,
0.04752615466713905,
-0.021746931597590446,
-0.02143864706158638,
0.048404790461063385,
-0.10574515163898468,
0.07167717069387436,
0.02479260042309761,
0.07577353715896606,
-0.10183551907539368,
0.10504424571990967,
0.00980623159557581,
0.00805351510643959,
-0.017236854881048203,
0.02171151712536812,
-0.1222962886095047,
-0.00987266469746828,
-0.18757635354995728,
0.0020522947888821363,
-0.02818240597844124,
0.0168794896453619,
0.010995532386004925,
-0.04504646360874176,
-0.036076620221138,
0.030973490327596664,
-0.08395108580589294,
-0.031977470964193344,
0.022327955812215805,
0.07035724818706512,
-0.1359185129404068,
-0.015433784574270248,
0.02833874523639679,
-0.08806543052196503,
0.059240829199552536,
0.02517775446176529,
0.028964368626475334,
0.052969977259635925,
-0.17566116154193878,
-0.03748887777328491,
0.026944991201162338,
0.027665691450238228,
0.05950703099370003,
-0.08567380160093307,
-0.022085702046751976,
-0.016151318326592445,
0.05988408997654915,
0.015331972390413284,
0.05757523700594902,
-0.1131846234202385,
-0.040379103273153305,
-0.046796757727861404,
-0.0734470933675766,
-0.05157812312245369,
0.032347459346055984,
0.10132639855146408,
0.05063449591398239,
0.16778309643268585,
-0.1006341353058815,
0.033364713191986084,
-0.18055792152881622,
-0.02917565405368805,
-0.019195297732949257,
-0.022025134414434433,
-0.08018834888935089,
-0.043590616434812546,
0.06981530040502548,
-0.04999225586652756,
0.1256609410047531,
-0.020875349640846252,
0.09235033392906189,
0.045605119317770004,
-0.07377336919307709,
0.023352887481451035,
0.030738437548279762,
0.22314849495887756,
0.062051914632320404,
-0.01610133796930313,
0.044526077806949615,
-0.0056547923013567924,
0.047879233956336975,
0.08856824040412903,
0.17098933458328247,
0.18817758560180664,
-0.01588284969329834,
0.05176379159092903,
0.07569438964128494,
-0.09040294587612152,
-0.09791208058595657,
0.12116839736700058,
-0.005164319649338722,
0.10924898087978363,
-0.06450361758470535,
0.22765009105205536,
0.07122702151536942,
-0.17626233398914337,
0.04897518455982208,
-0.079558365046978,
-0.08922912180423737,
-0.11066397279500961,
-0.018690351396799088,
-0.08227329701185226,
-0.1427392214536667,
0.00919851753860712,
-0.12561637163162231,
0.02118326909840107,
0.09895186126232147,
0.014988322742283344,
0.02805721014738083,
0.14173580706119537,
-0.039292510598897934,
0.023229781538248062,
0.04869456589221954,
0.00500459223985672,
0.008690242655575275,
-0.07347477972507477,
-0.08363408595323563,
0.05529560148715973,
0.0012694550678133965,
0.04475992172956467,
-0.03477095440030098,
-0.0022524481173604727,
0.027730969712138176,
-0.006817549001425505,
-0.06992533802986145,
0.034952547401189804,
0.014332410879433155,
0.04505559802055359,
0.08219046145677567,
0.047078315168619156,
0.0041319685988128185,
-0.03671559691429138,
0.2745203673839569,
-0.07755859196186066,
-0.09643987566232681,
-0.14884765446186066,
0.22370687127113342,
-0.003403999609872699,
0.008373335003852844,
0.04428301006555557,
-0.10526137799024582,
-0.006543160416185856,
0.16100084781646729,
0.1284177154302597,
-0.07510718703269958,
-0.005658114794641733,
-0.005321638658642769,
-0.027412617579102516,
-0.08328115940093994,
0.13014888763427734,
0.0994640663266182,
0.01851866953074932,
-0.05985711142420769,
-0.03247898444533348,
0.002075428608804941,
-0.01568879559636116,
-0.0859164297580719,
0.040165092796087265,
0.026456035673618317,
-0.0037903201300650835,
-0.03225818648934364,
0.08521488308906555,
0.030575698241591454,
-0.20965440571308136,
0.03453199565410614,
-0.1313685029745102,
-0.16948415338993073,
-0.04227505996823311,
0.09327809512615204,
-0.013944268226623535,
0.04107850417494774,
-0.03446254879236221,
0.012208864092826843,
0.10743625462055206,
-0.02745700627565384,
-0.004946270026266575,
-0.14141987264156342,
0.12466928362846375,
-0.052271340042352676,
0.22071899473667145,
0.006029118318110704,
0.0742056742310524,
0.11567369103431702,
0.0317792184650898,
-0.09987157583236694,
0.050585098564624786,
0.0658205896615982,
-0.11023902893066406,
0.017630815505981445,
0.13049155473709106,
-0.055503349751234055,
0.12244829535484314,
0.04489539563655853,
-0.1413429230451584,
0.02666780725121498,
-0.08042153716087341,
-0.06361427158117294,
-0.07365655899047852,
0.010948935523629189,
-0.08452470600605011,
0.14192570745944977,
0.19942158460617065,
-0.025426695123314857,
0.0029907685238868,
-0.06473544239997864,
0.026517853140830994,
0.04611179232597351,
0.09800279140472412,
-0.04463860020041466,
-0.20993530750274658,
0.025476839393377304,
0.05283667892217636,
0.018994735553860664,
-0.2569529712200165,
-0.0952228531241417,
0.04606932774186134,
-0.020690133795142174,
-0.0612851046025753,
0.10012989491224289,
0.09256619960069656,
0.035076919943094254,
-0.04678044468164444,
-0.2243194729089737,
-0.0418829619884491,
0.14177238941192627,
-0.11487370729446411,
-0.04110102728009224
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# scenario-KD-PR-MSV-EN-CL-D2_data-en-massive_all_1_1
This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the massive dataset.
It achieves the following results on the evaluation set:
- Loss: 6.0402
- Accuracy: 0.4900
- F1: 0.4424
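As a quick, hedged illustration of inference with this checkpoint, the sketch below assumes the model exposes a standard sequence-classification head over the MASSIVE scenario labels; the card itself does not document the inference API, so the head type and label mapping are assumptions.
```python
# Minimal sketch, assuming a standard sequence-classification head; the card does
# not document the exact inference setup, so treat this as illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "haryoaw/scenario-KD-PR-MSV-EN-CL-D2_data-en-massive_all_1_1"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("wake me up at nine am on friday", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label.get(pred_id, pred_id))  # predicted MASSIVE scenario label
```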
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 666
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
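For orientation, these settings map roughly onto the `TrainingArguments` sketch below; the output path is a placeholder, and any knowledge-distillation-specific options from the original training script are not reproduced here.
```python
# Rough mapping of the listed hyperparameters onto transformers.TrainingArguments.
# The output directory is a placeholder; KD-specific settings are not shown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./scenario-kd-pr-msv-en-cl-d2",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=666,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    adam_beta1=0.9,   # Adam betas/epsilon as listed (these match the library defaults)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```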
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| No log | 0.28 | 100 | 7.5378 | 0.2054 | 0.0683 |
| No log | 0.56 | 200 | 6.7584 | 0.3029 | 0.1679 |
| No log | 0.83 | 300 | 6.1838 | 0.3605 | 0.2430 |
| No log | 1.11 | 400 | 6.0327 | 0.3699 | 0.2692 |
| 4.8934 | 1.39 | 500 | 5.6667 | 0.4111 | 0.3002 |
| 4.8934 | 1.67 | 600 | 5.7790 | 0.4050 | 0.3124 |
| 4.8934 | 1.94 | 700 | 5.4854 | 0.4381 | 0.3466 |
| 4.8934 | 2.22 | 800 | 5.5038 | 0.4274 | 0.3551 |
| 4.8934 | 2.5 | 900 | 5.7379 | 0.4214 | 0.3570 |
| 2.2762 | 2.78 | 1000 | 5.6617 | 0.4315 | 0.3586 |
| 2.2762 | 3.06 | 1100 | 5.3737 | 0.4670 | 0.3688 |
| 2.2762 | 3.33 | 1200 | 5.8262 | 0.4357 | 0.3586 |
| 2.2762 | 3.61 | 1300 | 5.5787 | 0.4451 | 0.3870 |
| 2.2762 | 3.89 | 1400 | 5.8382 | 0.4307 | 0.3650 |
| 1.5016 | 4.17 | 1500 | 5.9036 | 0.4421 | 0.3738 |
| 1.5016 | 4.44 | 1600 | 5.9435 | 0.4366 | 0.3749 |
| 1.5016 | 4.72 | 1700 | 6.2289 | 0.4303 | 0.3617 |
| 1.5016 | 5.0 | 1800 | 6.2064 | 0.4294 | 0.3723 |
| 1.5016 | 5.28 | 1900 | 6.1007 | 0.4375 | 0.3655 |
| 1.0866 | 5.56 | 2000 | 5.9348 | 0.4489 | 0.3800 |
| 1.0866 | 5.83 | 2100 | 6.0184 | 0.4482 | 0.3840 |
| 1.0866 | 6.11 | 2200 | 6.4128 | 0.4329 | 0.3823 |
| 1.0866 | 6.39 | 2300 | 6.3238 | 0.4431 | 0.3917 |
| 1.0866 | 6.67 | 2400 | 6.2291 | 0.4482 | 0.3990 |
| 0.8677 | 6.94 | 2500 | 6.0659 | 0.4588 | 0.3970 |
| 0.8677 | 7.22 | 2600 | 6.4519 | 0.4397 | 0.3919 |
| 0.8677 | 7.5 | 2700 | 5.9209 | 0.4735 | 0.4074 |
| 0.8677 | 7.78 | 2800 | 6.8011 | 0.4311 | 0.3764 |
| 0.8677 | 8.06 | 2900 | 6.6258 | 0.4286 | 0.3745 |
| 0.6959 | 8.33 | 3000 | 6.0205 | 0.4598 | 0.3965 |
| 0.6959 | 8.61 | 3100 | 6.4682 | 0.4452 | 0.3991 |
| 0.6959 | 8.89 | 3200 | 6.2001 | 0.4563 | 0.3975 |
| 0.6959 | 9.17 | 3300 | 6.3492 | 0.4551 | 0.3971 |
| 0.6959 | 9.44 | 3400 | 6.2074 | 0.4576 | 0.4026 |
| 0.6083 | 9.72 | 3500 | 7.1146 | 0.4255 | 0.3813 |
| 0.6083 | 10.0 | 3600 | 6.5633 | 0.4487 | 0.4069 |
| 0.6083 | 10.28 | 3700 | 6.1010 | 0.4707 | 0.4088 |
| 0.6083 | 10.56 | 3800 | 6.2278 | 0.4727 | 0.4166 |
| 0.6083 | 10.83 | 3900 | 7.1772 | 0.4332 | 0.3886 |
| 0.5292 | 11.11 | 4000 | 7.0066 | 0.4387 | 0.3958 |
| 0.5292 | 11.39 | 4100 | 7.4419 | 0.4220 | 0.3945 |
| 0.5292 | 11.67 | 4200 | 6.5553 | 0.4653 | 0.4125 |
| 0.5292 | 11.94 | 4300 | 6.6278 | 0.4586 | 0.4135 |
| 0.5292 | 12.22 | 4400 | 6.2774 | 0.4692 | 0.4059 |
| 0.4804 | 12.5 | 4500 | 6.0512 | 0.4801 | 0.4246 |
| 0.4804 | 12.78 | 4600 | 6.3185 | 0.4701 | 0.4186 |
| 0.4804 | 13.06 | 4700 | 6.4401 | 0.4631 | 0.4108 |
| 0.4804 | 13.33 | 4800 | 6.2653 | 0.4663 | 0.4213 |
| 0.4804 | 13.61 | 4900 | 6.5306 | 0.4583 | 0.4106 |
| 0.4325 | 13.89 | 5000 | 6.7602 | 0.4476 | 0.3999 |
| 0.4325 | 14.17 | 5100 | 6.2450 | 0.4744 | 0.4098 |
| 0.4325 | 14.44 | 5200 | 6.4444 | 0.4675 | 0.4249 |
| 0.4325 | 14.72 | 5300 | 6.8074 | 0.4482 | 0.4130 |
| 0.4325 | 15.0 | 5400 | 6.7639 | 0.4562 | 0.4111 |
| 0.3985 | 15.28 | 5500 | 6.1950 | 0.4749 | 0.4198 |
| 0.3985 | 15.56 | 5600 | 5.9213 | 0.4885 | 0.4311 |
| 0.3985 | 15.83 | 5700 | 6.2779 | 0.4640 | 0.4228 |
| 0.3985 | 16.11 | 5800 | 6.2863 | 0.4764 | 0.4259 |
| 0.3985 | 16.39 | 5900 | 6.3712 | 0.4719 | 0.4206 |
| 0.3744 | 16.67 | 6000 | 6.1818 | 0.4763 | 0.4311 |
| 0.3744 | 16.94 | 6100 | 5.9495 | 0.4932 | 0.4295 |
| 0.3744 | 17.22 | 6200 | 6.2344 | 0.4758 | 0.4278 |
| 0.3744 | 17.5 | 6300 | 6.2192 | 0.4773 | 0.4254 |
| 0.3744 | 17.78 | 6400 | 6.0373 | 0.4825 | 0.4292 |
| 0.3523 | 18.06 | 6500 | 6.0578 | 0.4844 | 0.4343 |
| 0.3523 | 18.33 | 6600 | 6.4294 | 0.4701 | 0.4219 |
| 0.3523 | 18.61 | 6700 | 6.3761 | 0.4754 | 0.4259 |
| 0.3523 | 18.89 | 6800 | 6.1985 | 0.4768 | 0.4264 |
| 0.3523 | 19.17 | 6900 | 6.4615 | 0.4751 | 0.4220 |
| 0.3322 | 19.44 | 7000 | 6.1346 | 0.4851 | 0.4302 |
| 0.3322 | 19.72 | 7100 | 6.0352 | 0.4871 | 0.4282 |
| 0.3322 | 20.0 | 7200 | 6.1533 | 0.4844 | 0.4350 |
| 0.3322 | 20.28 | 7300 | 6.2422 | 0.4767 | 0.4290 |
| 0.3322 | 20.56 | 7400 | 6.6000 | 0.4664 | 0.4275 |
| 0.3219 | 20.83 | 7500 | 6.2801 | 0.4773 | 0.4322 |
| 0.3219 | 21.11 | 7600 | 6.2024 | 0.4792 | 0.4283 |
| 0.3219 | 21.39 | 7700 | 6.2759 | 0.4764 | 0.4309 |
| 0.3219 | 21.67 | 7800 | 6.5024 | 0.4656 | 0.4226 |
| 0.3219 | 21.94 | 7900 | 6.1907 | 0.4809 | 0.4306 |
| 0.3074 | 22.22 | 8000 | 6.0753 | 0.4850 | 0.4332 |
| 0.3074 | 22.5 | 8100 | 5.9962 | 0.4906 | 0.4332 |
| 0.3074 | 22.78 | 8200 | 6.0495 | 0.4910 | 0.4322 |
| 0.3074 | 23.06 | 8300 | 6.2469 | 0.4791 | 0.4329 |
| 0.3074 | 23.33 | 8400 | 6.0686 | 0.4897 | 0.4364 |
| 0.296 | 23.61 | 8500 | 5.9709 | 0.4914 | 0.4409 |
| 0.296 | 23.89 | 8600 | 6.0164 | 0.4923 | 0.4414 |
| 0.296 | 24.17 | 8700 | 6.0804 | 0.4870 | 0.4384 |
| 0.296 | 24.44 | 8800 | 6.0943 | 0.4869 | 0.4351 |
| 0.296 | 24.72 | 8900 | 6.1010 | 0.4857 | 0.4326 |
| 0.2918 | 25.0 | 9000 | 6.0140 | 0.4877 | 0.4396 |
| 0.2918 | 25.28 | 9100 | 6.1158 | 0.4879 | 0.4319 |
| 0.2918 | 25.56 | 9200 | 6.0549 | 0.4907 | 0.4386 |
| 0.2918 | 25.83 | 9300 | 6.0634 | 0.4862 | 0.4349 |
| 0.2918 | 26.11 | 9400 | 6.0695 | 0.4900 | 0.4356 |
| 0.2822 | 26.39 | 9500 | 6.0804 | 0.4919 | 0.4392 |
| 0.2822 | 26.67 | 9600 | 6.1051 | 0.4848 | 0.4371 |
| 0.2822 | 26.94 | 9700 | 5.9125 | 0.4951 | 0.4443 |
| 0.2822 | 27.22 | 9800 | 6.1057 | 0.4868 | 0.4372 |
| 0.2822 | 27.5 | 9900 | 6.0316 | 0.4906 | 0.4366 |
| 0.2786 | 27.78 | 10000 | 6.0800 | 0.4889 | 0.4395 |
| 0.2786 | 28.06 | 10100 | 6.0678 | 0.4894 | 0.4397 |
| 0.2786 | 28.33 | 10200 | 6.1626 | 0.4830 | 0.4345 |
| 0.2786 | 28.61 | 10300 | 6.0285 | 0.4892 | 0.4411 |
| 0.2786 | 28.89 | 10400 | 6.0053 | 0.4897 | 0.4396 |
| 0.2731 | 29.17 | 10500 | 6.0126 | 0.4910 | 0.4421 |
| 0.2731 | 29.44 | 10600 | 6.0908 | 0.4867 | 0.4371 |
| 0.2731 | 29.72 | 10700 | 6.0650 | 0.4903 | 0.4422 |
| 0.2731 | 30.0 | 10800 | 6.0402 | 0.4900 | 0.4424 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3
| {"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["massive"], "metrics": ["accuracy", "f1"], "base_model": "FacebookAI/xlm-roberta-base", "model-index": [{"name": "scenario-KD-PR-MSV-EN-CL-D2_data-en-massive_all_1_1", "results": []}]} | null | haryoaw/scenario-KD-PR-MSV-EN-CL-D2_data-en-massive_all_1_1 | [
"transformers",
"pytorch",
"xlm-roberta",
"generated_from_trainer",
"dataset:massive",
"base_model:FacebookAI/xlm-roberta-base",
"license:mit",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:25:57+00:00 | [] | [] | TAGS
#transformers #pytorch #xlm-roberta #generated_from_trainer #dataset-massive #base_model-FacebookAI/xlm-roberta-base #license-mit #endpoints_compatible #region-us
| scenario-KD-PR-MSV-EN-CL-D2\_data-en-massive\_all\_1\_1
=======================================================
This model is a fine-tuned version of FacebookAI/xlm-roberta-base on the massive dataset.
It achieves the following results on the evaluation set:
* Loss: 6.0402
* Accuracy: 0.4900
* F1: 0.4424
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 666
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 30
### Training results
### Framework versions
* Transformers 4.33.3
* Pytorch 2.1.1+cu121
* Datasets 2.14.5
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 666\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 30",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.33.3\n* Pytorch 2.1.1+cu121\n* Datasets 2.14.5\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #xlm-roberta #generated_from_trainer #dataset-massive #base_model-FacebookAI/xlm-roberta-base #license-mit #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 666\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 30",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.33.3\n* Pytorch 2.1.1+cu121\n* Datasets 2.14.5\n* Tokenizers 0.13.3"
] | [
60,
99,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #xlm-roberta #generated_from_trainer #dataset-massive #base_model-FacebookAI/xlm-roberta-base #license-mit #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 666\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 30### Training results### Framework versions\n\n\n* Transformers 4.33.3\n* Pytorch 2.1.1+cu121\n* Datasets 2.14.5\n* Tokenizers 0.13.3"
] | [
-0.1330670863389969,
0.0655493214726448,
-0.0012373015051707625,
0.11280655115842819,
0.17408329248428345,
0.028312986716628075,
0.09240809082984924,
0.1102430522441864,
-0.06465045362710953,
0.01907394267618656,
0.13175037503242493,
0.13442014157772064,
-0.0016737250844016671,
0.1298762857913971,
-0.03608129546046257,
-0.2258286327123642,
-0.02108626253902912,
0.05513632297515869,
-0.12558355927467346,
0.14012660086154938,
0.09341404587030411,
-0.14459916949272156,
0.10658343136310577,
-0.0052469889633357525,
-0.24841636419296265,
0.03391527757048607,
0.014954534359276295,
-0.03545151278376579,
0.14422549307346344,
0.028828943148255348,
0.11390675604343414,
0.005278408993035555,
0.07602538913488388,
-0.15433984994888306,
0.017219634726643562,
0.04467126727104187,
-0.013578587211668491,
0.0807473361492157,
0.03321497514843941,
-0.008510121144354343,
0.139021635055542,
-0.08935678750276566,
0.026255054399371147,
0.030980661511421204,
-0.13857991993427277,
-0.2578940987586975,
-0.08671098947525024,
0.06095224618911743,
0.05935320258140564,
0.10907561331987381,
0.0021662365179508924,
0.12390600889921188,
-0.09559598565101624,
0.08687329292297363,
0.2592088282108307,
-0.26958101987838745,
-0.08117066323757172,
0.03640292212367058,
0.017017513513565063,
0.07825407385826111,
-0.08905907720327377,
-0.013253073208034039,
0.0718451738357544,
0.038824014365673065,
0.11786729842424393,
-0.04307128116488457,
-0.019504275172948837,
0.037977173924446106,
-0.14125162363052368,
-0.02972658909857273,
0.15267711877822876,
0.06428548693656921,
-0.036164551973342896,
-0.005262387450784445,
-0.04547421634197235,
-0.1116059422492981,
-0.03607344254851341,
-0.0004998587537556887,
0.04126007482409477,
-0.04158004745841026,
-0.1212504431605339,
-0.009240245446562767,
-0.08742199838161469,
-0.087135449051857,
-0.05764997750520706,
0.1531733274459839,
0.023939315229654312,
0.02463807910680771,
-0.03553431108593941,
0.11438362300395966,
-0.016097914427518845,
-0.13064691424369812,
0.016861358657479286,
0.010130714625120163,
-0.005410587415099144,
-0.03698154538869858,
-0.060967884957790375,
0.00774499261751771,
0.021686973050236702,
0.09526891261339188,
-0.05372472107410431,
0.028681213036179543,
0.05930493399500847,
0.04251595214009285,
-0.09144819527864456,
0.16741298139095306,
-0.08483598381280899,
-0.055960867553949356,
0.007107889279723167,
0.04573967307806015,
-0.023420734331011772,
0.02184131368994713,
-0.09807248413562775,
-0.016621636226773262,
0.06580251455307007,
0.001141075394116342,
-0.07774918526411057,
0.05454251915216446,
-0.05104255676269531,
-0.026965610682964325,
-0.008633832447230816,
-0.06601157784461975,
0.02161628007888794,
-0.0008877916261553764,
-0.09519684314727783,
-0.010690105147659779,
0.019217563793063164,
-0.0025748375337570906,
0.0229126438498497,
0.04804505407810211,
-0.100639209151268,
0.03269432485103607,
-0.0980878695845604,
-0.09788519889116287,
-0.0010313265956938267,
-0.08581482619047165,
0.049220312386751175,
-0.09199085831642151,
-0.1850062608718872,
-0.012464051134884357,
0.036776311695575714,
-0.028429118916392326,
-0.05566892772912979,
-0.03401137515902519,
-0.06718549132347107,
-0.011239467188715935,
-0.01315669622272253,
0.1382739394903183,
-0.0633920356631279,
0.11876924335956573,
0.06213345751166344,
0.06950061768293381,
-0.04784597083926201,
0.05382154509425163,
-0.09358520060777664,
0.019329404458403587,
-0.20972980558872223,
0.0645296722650528,
-0.053493842482566833,
0.08426771312952042,
-0.07638143748044968,
-0.1147809624671936,
0.01527340617030859,
0.006875565741211176,
0.08048547804355621,
0.10208547860383987,
-0.17492328584194183,
-0.09839248657226562,
0.1613481640815735,
-0.06496699154376984,
-0.14795413613319397,
0.10394053906202316,
-0.07933265715837479,
0.07321249693632126,
0.07549454271793365,
0.1467667520046234,
0.07057546079158783,
-0.0703599601984024,
0.017712829634547234,
0.004387320019304752,
0.023982180282473564,
-0.09835291653871536,
0.06515555083751678,
0.012668026611208916,
-0.03299596533179283,
0.02408967912197113,
-0.07713941484689713,
0.08161409944295883,
-0.12930920720100403,
-0.07911732792854309,
-0.0417085625231266,
-0.12124387919902802,
0.03406819328665733,
0.06583300232887268,
0.06571229547262192,
-0.11330289393663406,
-0.05249740555882454,
0.09767499566078186,
0.09981899708509445,
-0.05249321460723877,
0.003840748453512788,
-0.05305395647883415,
0.06451227515935898,
-0.07357422262430191,
-0.044085800647735596,
-0.17624033987522125,
-0.008957887999713421,
-0.010302596725523472,
0.045603230595588684,
0.022216055542230606,
0.047409117221832275,
0.07735923677682877,
0.051013343036174774,
-0.05071808025240898,
-0.02381294220685959,
-0.06745515763759613,
-0.006108622532337904,
-0.11821025609970093,
-0.18566620349884033,
-0.0483989417552948,
-0.01978641003370285,
0.10571807622909546,
-0.20149630308151245,
0.03349201753735542,
-0.012255567125976086,
0.06782747805118561,
0.00475912494584918,
-0.015678949654102325,
-0.04298820346593857,
0.0829518660902977,
-0.006067316513508558,
-0.0492093488574028,
0.06307130306959152,
0.011044839397072792,
-0.10692797601222992,
-0.054580047726631165,
-0.035772230476140976,
0.2444654107093811,
0.1288740634918213,
-0.1115284264087677,
-0.07200749218463898,
-0.003153627971187234,
-0.06481299549341202,
-0.02805113047361374,
-0.02565307542681694,
0.035194989293813705,
0.1871495246887207,
-0.02323215641081333,
0.13550446927547455,
-0.08810639381408691,
-0.03309871628880501,
0.02109670452773571,
-0.033046890050172806,
0.03704997897148132,
0.13460424542427063,
0.1235489472746849,
-0.11153337359428406,
0.13569122552871704,
0.14480669796466827,
-0.06223141402006149,
0.13646118342876434,
-0.04846205189824104,
-0.07501446455717087,
-0.04100213944911957,
-0.031249336898326874,
-0.022257745265960693,
0.13804949820041656,
-0.151831716299057,
0.0050390963442623615,
0.012281256727874279,
0.007160841953009367,
0.03455473855137825,
-0.20188160240650177,
-0.07314807921648026,
0.04122688248753548,
-0.04822276160120964,
-0.057813871651887894,
-0.007129728328436613,
0.01895453967154026,
0.10097049921751022,
-0.0011388424318283796,
-0.08381732553243637,
0.03668266907334328,
0.009696681052446365,
-0.07224465161561966,
0.2103305459022522,
-0.06057191640138626,
-0.11387144774198532,
-0.0976247563958168,
-0.06455551832914352,
-0.04240228608250618,
-0.008179357275366783,
0.05153847113251686,
-0.09697749465703964,
-0.0027323290705680847,
-0.04020358994603157,
0.04048943147063255,
-0.0010758412536233664,
0.03621921315789223,
0.034100208431482315,
0.0008156982949003577,
0.08080364018678665,
-0.10795795917510986,
-0.007098485715687275,
-0.07219968736171722,
-0.07720712572336197,
0.026226066052913666,
0.01937231235206127,
0.13122330605983734,
0.10632888972759247,
-0.012834262102842331,
0.02338704839348793,
-0.02640892006456852,
0.2651960849761963,
-0.06859419494867325,
-0.047225311398506165,
0.15199610590934753,
0.037715476006269455,
0.04263259842991829,
0.07547842711210251,
0.07582005113363266,
-0.10035370290279388,
-0.006005915347486734,
0.034776099026203156,
-0.04378467798233032,
-0.1988353133201599,
-0.040261946618556976,
-0.05761894956231117,
-0.039381243288517,
0.07620971649885178,
0.01330427173525095,
0.005641170311719179,
0.07566501945257187,
0.052885752171278,
0.04419727623462677,
-0.07830870151519775,
0.03831104561686516,
0.055999793112277985,
0.04454077035188675,
0.11568257957696915,
-0.05152131989598274,
-0.04659159108996391,
0.05088600516319275,
-0.023869933560490608,
0.2454889416694641,
-0.012670870870351791,
0.09651640802621841,
0.08106142282485962,
0.2070920169353485,
-0.009835885837674141,
0.08076895773410797,
-0.019935721531510353,
-0.06512328237295151,
-0.016826462000608444,
-0.028441661968827248,
-0.019368484616279602,
0.0051236641593277454,
-0.06329291313886642,
0.08592351526021957,
-0.15156826376914978,
-0.006323166657239199,
0.05807922035455704,
0.2629905641078949,
0.019288435578346252,
-0.32263198494911194,
-0.1026882603764534,
-0.014736535958945751,
-0.01603933982551098,
-0.0061035798862576485,
0.018896380439400673,
0.11800308525562286,
-0.09103252738714218,
0.024266818538308144,
-0.05002853646874428,
0.09406516700983047,
-0.0021323191467672586,
0.0469839833676815,
0.07492635399103165,
0.10227532684803009,
-0.009802749380469322,
0.06528176367282867,
-0.28003352880477905,
0.29554006457328796,
0.0015762998955324292,
0.07033231854438782,
-0.050972480326890945,
-0.026589708402752876,
0.0305726770311594,
0.0709192156791687,
0.06845416128635406,
-0.005715351086109877,
0.0022235519718378782,
-0.19680161774158478,
-0.03275055065751076,
0.05242737755179405,
0.06510931998491287,
-0.05515364557504654,
0.10362964123487473,
0.0006165073136799037,
0.01818562112748623,
0.0745161771774292,
0.015480081550776958,
-0.05486442148685455,
-0.0709998682141304,
-0.0526922307908535,
-0.008632483892142773,
-0.04614676535129547,
-0.07870882004499435,
-0.10224229842424393,
-0.09460225701332092,
0.13491782546043396,
0.012626693584024906,
-0.03040234185755253,
-0.11097066849470139,
0.09405110776424408,
0.09276412427425385,
-0.09132672101259232,
0.037583403289318085,
0.03394998237490654,
0.03705977275967598,
0.04098430648446083,
-0.06661371886730194,
0.10999839752912521,
-0.06923292577266693,
-0.14061430096626282,
-0.029901282861828804,
0.11684845387935638,
0.06522466987371445,
0.058622077107429504,
-0.0024402078706771135,
-0.013821843080222607,
-0.04195606708526611,
-0.1008770689368248,
0.03845090791583061,
-0.047267064452171326,
0.06294414401054382,
0.04500659182667732,
-0.018988672643899918,
0.05827553942799568,
-0.050683774054050446,
-0.009830041788518429,
0.1637435108423233,
0.24585318565368652,
-0.09821378439664841,
-0.0028345377650111914,
0.022934913635253906,
-0.05876215547323227,
-0.15656810998916626,
0.08222685009241104,
0.05286957696080208,
0.01750781759619713,
0.03750944510102272,
-0.18806694447994232,
0.13432838022708893,
0.12563171982765198,
0.009947532787919044,
0.07716301828622818,
-0.349504292011261,
-0.10707573592662811,
0.093353271484375,
0.14380644261837006,
0.1847435086965561,
-0.12610892951488495,
-0.007951829582452774,
-0.011323806829750538,
-0.14019227027893066,
0.09572131186723709,
-0.13186439871788025,
0.11181320995092392,
-0.045587699860334396,
0.082867331802845,
0.0023358557373285294,
-0.051446981728076935,
0.12203429639339447,
0.035433329641819,
0.14315500855445862,
-0.03730442374944687,
-0.03408501297235489,
0.05805359408259392,
-0.02362111769616604,
0.00685246242210269,
-0.07808180898427963,
0.03889842331409454,
-0.12778989970684052,
-0.017037229612469673,
-0.11250491440296173,
0.04423312097787857,
-0.02975432761013508,
-0.05298991501331329,
-0.04283084347844124,
0.034843046218156815,
0.02240324579179287,
-0.010136271826922894,
0.07712480425834656,
0.009335169568657875,
0.14531445503234863,
0.06847080588340759,
0.06885194778442383,
-0.08380047976970673,
-0.08537911623716354,
-0.007624573074281216,
-0.008906981907784939,
0.055563583970069885,
-0.14696234464645386,
0.018572352826595306,
0.13285543024539948,
0.04286251962184906,
0.12174110859632492,
0.08197005093097687,
-0.023706896230578423,
0.03594198450446129,
0.06122016906738281,
-0.1440112441778183,
-0.11135464161634445,
-0.025470612570643425,
-0.11330534517765045,
-0.10270009189844131,
0.06605040282011032,
0.08766794949769974,
-0.06202448159456253,
-0.01711369678378105,
-0.037674661725759506,
-0.01620132476091385,
-0.0910649523139,
0.19532258808612823,
0.07913419604301453,
0.04766923561692238,
-0.11792562156915665,
0.06650710105895996,
-0.002412372035905719,
-0.027035078033804893,
-0.005223927553743124,
0.06967071443796158,
-0.07132024317979813,
-0.029616596177220345,
0.07253623008728027,
0.2234647572040558,
-0.1014941930770874,
-0.02744438499212265,
-0.15787950158119202,
-0.12854604423046112,
0.08364717662334442,
0.2039162665605545,
0.1189558357000351,
0.0031631523743271828,
-0.042749229818582535,
-0.0031802046578377485,
-0.12289827316999435,
0.06005626171827316,
0.05771248787641525,
0.04921385273337364,
-0.13829828798770905,
0.17226023972034454,
-0.0018882084405049682,
0.05459098145365715,
-0.024897802621126175,
0.028857707977294922,
-0.11384342610836029,
0.018796704709529877,
-0.16804780066013336,
-0.05660194158554077,
-0.020270202308893204,
0.007633215747773647,
-0.022535329684615135,
-0.07425607740879059,
-0.07071869820356369,
0.023321891203522682,
-0.12312495708465576,
-0.0054783509112894535,
0.06391561031341553,
0.05229000747203827,
-0.11571390181779861,
-0.02861974388360977,
0.028371542692184448,
-0.0481766015291214,
0.07167473435401917,
0.07507415115833282,
0.03772949427366257,
0.03965813294053078,
-0.13540516793727875,
-0.003556106938049197,
0.054423317313194275,
-0.0028709210455417633,
0.08119473606348038,
-0.08838575333356857,
0.017924973741173744,
-0.004542126320302486,
0.057640623301267624,
0.012564721517264843,
0.03905895724892616,
-0.12293010205030441,
0.0008144614985212684,
-0.009809868410229683,
-0.047779638320207596,
-0.06456948071718216,
0.02611224539577961,
0.13244573771953583,
0.0534234344959259,
0.1647307276725769,
-0.07264027744531631,
0.03402845188975334,
-0.24163013696670532,
-0.018250562250614166,
-0.015964508056640625,
-0.09721887111663818,
-0.10474381595849991,
-0.06134363263845444,
0.07321524620056152,
-0.03908602520823479,
0.15089911222457886,
-0.00030349180451594293,
0.024770131334662437,
0.012565121054649353,
-0.04601035639643669,
0.05882679671049118,
0.012972760945558548,
0.23174737393856049,
0.01459025964140892,
-0.02357357367873192,
0.09400291740894318,
0.07744797319173813,
0.10123053193092346,
0.1301320642232895,
0.19174824655056,
0.17895731329917908,
0.014930269680917263,
0.0659550130367279,
0.06758768111467361,
-0.05001916363835335,
-0.11469022929668427,
0.014273060485720634,
-0.018676748499274254,
0.05932821333408356,
-0.032700084149837494,
0.18790535628795624,
0.07986962050199509,
-0.1715816706418991,
0.045587360858917236,
-0.02769434079527855,
-0.07556068152189255,
-0.07942119985818863,
-0.005270777270197868,
-0.07418077439069748,
-0.17042624950408936,
0.01241841446608305,
-0.12812510132789612,
-0.009218218736350536,
0.11007381230592728,
-0.009367824532091618,
-0.029396651312708855,
0.15450339019298553,
0.06570978462696075,
0.025700509548187256,
0.07231318950653076,
-0.008794652298092842,
-0.04066424444317818,
-0.091834157705307,
-0.06005309894680977,
-0.0027727168053388596,
-0.045278873294591904,
0.01717858575284481,
-0.06348573416471481,
-0.08618859201669693,
0.030975976958870888,
-0.010733695700764656,
-0.0948307067155838,
0.0034219338558614254,
0.02124924398958683,
0.07409905642271042,
0.04145868867635727,
0.01153910718858242,
0.006984091829508543,
-0.024841826409101486,
0.21634404361248016,
-0.07048234343528748,
-0.040566012263298035,
-0.09832336753606796,
0.22759008407592773,
0.039271749556064606,
-0.012347935698926449,
0.026925910264253616,
-0.05036177858710289,
0.024959728121757507,
0.21582835912704468,
0.1941530704498291,
-0.11821743845939636,
0.0027248498518019915,
-0.020955227315425873,
-0.012017324566841125,
-0.02322113700211048,
0.10616062581539154,
0.14017558097839355,
0.0322115421295166,
-0.10149423778057098,
-0.03795783594250679,
-0.06792178004980087,
-0.003949529491364956,
-0.009116577915847301,
0.057087186723947525,
0.06351859867572784,
0.006146910134702921,
-0.04499514400959015,
0.06464245170354843,
-0.06545783579349518,
-0.098065584897995,
0.08250896632671356,
-0.19380028545856476,
-0.1575201451778412,
-0.03275774046778679,
0.03434097766876221,
0.027178794145584106,
0.0834408774971962,
-0.04267555847764015,
0.004687108099460602,
0.07463282346725464,
-0.014276224188506603,
-0.114166259765625,
-0.13397790491580963,
0.11977392435073853,
-0.0825439989566803,
0.17878828942775726,
-0.056288979947566986,
0.04342855513095856,
0.12711311876773834,
0.04932258278131485,
-0.08615948259830475,
0.05501668527722359,
0.035408396273851395,
-0.06207876652479172,
-0.006348856259137392,
0.05382842198014259,
-0.02040688507258892,
0.07928092032670975,
0.03271723538637161,
-0.12273318320512772,
0.011885667219758034,
-0.07312555611133575,
-0.04058809578418732,
-0.06336808949708939,
-0.0551820769906044,
-0.05323707312345505,
0.12324342131614685,
0.2190917283296585,
-0.04575323686003685,
0.014570722356438637,
-0.06744256615638733,
0.04664952680468559,
0.08079225569963455,
0.05846783518791199,
-0.07071883976459503,
-0.21244153380393982,
0.007497979328036308,
0.08987580984830856,
-0.04884840175509453,
-0.2285352498292923,
-0.09780117869377136,
-0.004731953609734774,
-0.08135044574737549,
-0.049364473670721054,
0.07724910974502563,
0.0642005205154419,
0.07437372207641602,
-0.03848041966557503,
-0.048128947615623474,
-0.09351063519716263,
0.16643022000789642,
-0.13840016722679138,
-0.08491929620504379
] |
null | null | null |
# Lora of trieste/トリエステ/的里雅斯特 (Azur Lane)
## What Is This?
This is the LoRA model of waifu trieste/トリエステ/的里雅斯特 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* The dataset used for training is `stage3-p480-800` in [CyberHarem/trieste_azurlane](https://huggingface.co/datasets/CyberHarem/trieste_azurlane), which contains 222 images.
* Batch size is 4, resolution is 720x720, and images are clustered into 5 buckets.
* Batch size for the regularization dataset is 16, resolution is 720x720, and images are clustered into 20 buckets.
* The model was trained for 2240 steps; 40 checkpoints were saved and evaluated.
* **Trigger word is `trieste_azurlane`.**
* Pruned core tags for this waifu are `breasts, hair_over_one_eye, large_breasts, long_hair, green_eyes, pink_hair, ponytail, hair_ornament, mole, hairclip, bangs, mole_under_eye, earrings, purple_hair`. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like a classic LoRA**. The LoRA files we provide are bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 1288, you need to download [`1288/trieste_azurlane.pt`](https://huggingface.co/CyberHarem/trieste_azurlane/resolve/main/1288/trieste_azurlane.pt) as the embedding and [`1288/trieste_azurlane.safetensors`](https://huggingface.co/CyberHarem/trieste_azurlane/resolve/main/1288/trieste_azurlane.safetensors) for loading Lora. By using both files together, you can generate images for the desired characters.
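Outside the WebUI, the same pair of files can in principle be loaded with the `diffusers` library. The snippet below is only a sketch under that assumption: the local file paths are illustrative, and the base checkpoint (Meina/MeinaMix_V11, the one used for the preview images below) is a choice, not a requirement.
```python
# Sketch only: loading the step-1288 files with diffusers instead of the A1111 WebUI.
# Paths are placeholders; the card itself documents WebUI usage, not this API.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "Meina/MeinaMix_V11", torch_dtype=torch.float16
).to("cuda")

# safetensors file -> LoRA weights, pt file -> textual-inversion embedding (trigger word)
pipe.load_lora_weights("1288", weight_name="trieste_azurlane.safetensors")
pipe.load_textual_inversion("1288/trieste_azurlane.pt", token="trieste_azurlane")

image = pipe(
    "trieste_azurlane, 1girl, best quality",
    negative_prompt="lowres, bad anatomy",
    num_inference_steps=28,
).images[0]
image.save("trieste_preview.png")
```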
## Which Step Should I Use?
We selected 5 good steps for you to choose from. The best one is step 1288.
1720 images (1.79 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the preview of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_1_0 | pattern_1_1 | pattern_1_2 | pattern_2_0 | pattern_2_1 | pattern_2_2 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:------------------------------------------------------------------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:--------------------------------------------|:--------------------------------------------|:--------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:--------------------------------|:------------------------------------|:--------------------------------|:----------------------------------|:----------------------------------------|:----------------------------------------|:----------------------------------------|:------------------------------|:----------------------------------|:----------------------------------|:--------------------------------|:------------------------------------------------|:----------------------------------|:----------------------------------|:------------------------------|:--------------------------------|:--------------------------------------|:--------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------------|:--------------------------------------|:--------------------------------------|
| 1288 | 24 | **0.874** | 0.932 | 0.837 | **0.775** | [Download](https://huggingface.co/CyberHarem/trieste_azurlane/resolve/main/1288/trieste_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1064 | 20 | 0.847 | 0.936 | 0.830 | 0.737 | [Download](https://huggingface.co/CyberHarem/trieste_azurlane/resolve/main/1064/trieste_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 2016 | 37 | 0.820 | **0.967** | **0.842** | 0.730 | [Download](https://huggingface.co/CyberHarem/trieste_azurlane/resolve/main/2016/trieste_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1680 | 31 | 0.825 | 0.921 | 0.838 | 0.728 | [Download](https://huggingface.co/CyberHarem/trieste_azurlane/resolve/main/1680/trieste_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 952 | 18 | 0.812 | 0.949 | 0.841 | 0.720 | [Download](https://huggingface.co/CyberHarem/trieste_azurlane/resolve/main/952/trieste_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download them via the following links:
* [Steps From 1736 to 2240](all/0.md)
* [Steps From 1176 to 1680](all/1.md)
* [Steps From 616 to 1120](all/2.md)
* [Steps From 56 to 560](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/trieste_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/trieste_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/trieste_azurlane",
"license:mit",
"region:us"
] | 2024-02-15T05:31:53+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/trieste_azurlane #license-mit #region-us
| Lora of trieste/トリエステ/的里雅斯特 (Azur Lane)
=======================================
What Is This?
-------------
This is the LoRA model of waifu trieste/トリエステ/的里雅斯特 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* The dataset used for training is 'stage3-p480-800' in CyberHarem/trieste\_azurlane, which contains 222 images.
* Batch size is 4, resolution is 720x720, and images are clustered into 5 buckets.
* Batch size for the regularization dataset is 16, resolution is 720x720, and images are clustered into 20 buckets.
* The model was trained for 2240 steps; 40 checkpoints were saved and evaluated.
* Trigger word is 'trieste\_azurlane'.
* Pruned core tags for this waifu are 'breasts, hair\_over\_one\_eye, large\_breasts, long\_hair, green\_eyes, pink\_hair, ponytail, hair\_ornament, mole, hairclip, bangs, mole\_under\_eye, earrings, purple\_hair'. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like a classic LoRA. The LoRA files we provide are bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 1288, you need to download '1288/trieste\_azurlane.pt' as the embedding and '1288/trieste\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose from. The best one is step 1288.
1720 images (1.79 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the preview of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We uploaded the files for all steps. You can check the images and metrics, and download them via the following links:
* Steps From 1736 to 2240
* Steps From 1176 to 1680
* Steps From 616 to 1120
* Steps From 56 to 560
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 1288, you need to download '1288/trieste\\_azurlane.pt' as the embedding and '1288/trieste\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 1288.\n\n\n1720 images (1.79 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1736 to 2240\n* Steps From 1176 to 1680\n* Steps From 616 to 1120\n* Steps From 56 to 560"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/trieste_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 1288, you need to download '1288/trieste\\_azurlane.pt' as the embedding and '1288/trieste\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 1288.\n\n\n1720 images (1.79 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1736 to 2240\n* Steps From 1176 to 1680\n* Steps From 616 to 1120\n* Steps From 56 to 560"
] | [
44,
38,
472
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/trieste_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
-0.0007935950998216867,
-0.0028011335525661707,
-0.004275194369256496,
0.08900074660778046,
0.07778718322515488,
0.07575833797454834,
0.23083150386810303,
0.07516220211982727,
0.13553911447525024,
-0.06138211488723755,
0.08212766796350479,
0.06540497392416,
-0.010550104081630707,
0.03219239413738251,
-0.034442633390426636,
-0.14789152145385742,
-0.06019710376858711,
-0.040670525282621384,
-0.0029346568044275045,
0.011435266584157944,
0.08319089561700821,
0.0028219048399478197,
0.10931042581796646,
-0.05494394525885582,
-0.03304550424218178,
0.06838271021842957,
-0.032886866480112076,
-0.03767966106534004,
0.03719813749194145,
0.08615638315677643,
0.12032853811979294,
0.01669151894748211,
0.07222403585910797,
-0.14828233420848846,
0.06327583640813828,
-0.0034621995873749256,
-0.11322078108787537,
-0.009037167765200138,
0.030645741149783134,
-0.044831834733486176,
0.1363353133201599,
0.03646703064441681,
-0.10896508395671844,
0.03399058058857918,
-0.12587131559848785,
-0.007938364520668983,
-0.05252603068947792,
0.03418223559856415,
0.12628202140331268,
0.05504482239484787,
0.02805505506694317,
0.06320057064294815,
-0.056258756667375565,
0.08690555393695831,
0.11849872767925262,
-0.13475055992603302,
-0.06998587399721146,
0.08373335003852844,
0.011880366131663322,
0.1523391306400299,
-0.08450855314731598,
0.10349786281585693,
0.06944093853235245,
-0.04682314768433571,
-0.15192115306854248,
-0.08887696266174316,
-0.207678884267807,
-0.013918007723987103,
0.008872389793395996,
0.015353783965110779,
0.41383418440818787,
0.05087793618440628,
0.037579718977212906,
0.06930413842201233,
-0.06564269214868546,
0.033693574368953705,
-0.0996018648147583,
0.1316099762916565,
0.04109565541148186,
0.09339246153831482,
-0.03750242665410042,
-0.1042906641960144,
-0.11419316381216049,
-0.06723350286483765,
-0.07098893821239471,
-0.004939934704452753,
0.026243681088089943,
0.11644813418388367,
-0.1757468581199646,
0.009627175517380238,
-0.05819759890437126,
-0.12334231287240982,
0.026155712082982063,
-0.10010746866464615,
0.17308497428894043,
0.06431663036346436,
-0.010045614093542099,
0.010590924881398678,
0.24693311750888824,
0.11701790988445282,
0.2081090807914734,
0.03967498615384102,
-0.10911405086517334,
0.13974213600158691,
0.029688414186239243,
-0.0865120142698288,
-0.016579218208789825,
-0.11797238141298294,
0.12870745360851288,
-0.05396934598684311,
0.1038365438580513,
-0.057936009019613266,
-0.12755130231380463,
0.011226953938603401,
-0.1135735809803009,
0.06877731531858444,
0.040013693273067474,
0.0023731342516839504,
-0.0492926761507988,
0.04121771454811096,
0.02826060727238655,
-0.03355425223708153,
0.002365589141845703,
-0.008934208191931248,
-0.06286270916461945,
0.05670320242643356,
0.10244493931531906,
0.026177095249295235,
0.049078281968832016,
0.010893799364566803,
-0.021048998460173607,
-0.00371909118257463,
-0.04817671701312065,
-0.0072062695398926735,
0.04170337691903114,
0.02767878584563732,
0.07971268147230148,
-0.1539967954158783,
-0.07238496840000153,
-0.009810811839997768,
0.061642590910196304,
-0.00226387451402843,
0.11143646389245987,
-0.0142965167760849,
0.059413548558950424,
0.0013020720798522234,
-0.020761530846357346,
0.017385859042406082,
-0.10376079380512238,
0.08084281533956528,
-0.031044138595461845,
0.09137273579835892,
-0.2155594527721405,
-0.005813505966216326,
-0.051129940897226334,
0.007365083321928978,
0.04771701246500015,
-0.0027329581789672375,
-0.11468133330345154,
0.11420997232198715,
-0.013919463381171227,
0.06291613727807999,
-0.0997021496295929,
0.04378337785601616,
0.03850441053509712,
0.08672038465738297,
-0.10740788280963898,
0.00671619875356555,
0.10959259420633316,
-0.14547400176525116,
-0.15622460842132568,
0.10000615566968918,
-0.02190709300339222,
0.03325177729129791,
0.04853034391999245,
0.1566423922777176,
0.16886724531650543,
-0.20379780232906342,
-0.012170683592557907,
0.060407854616642,
-0.0214365404099226,
-0.0839964970946312,
-0.011983644217252731,
0.10407958924770355,
0.02740609459578991,
0.037213824689388275,
-0.051597438752651215,
0.1135958731174469,
-0.03304148092865944,
-0.08402415364980698,
-0.032256245613098145,
-0.07786040753126144,
-0.05453150346875191,
0.055674031376838684,
-0.007337970659136772,
-0.057408299297094345,
0.007054424379020929,
-0.1550789773464203,
0.16211086511611938,
0.010306877084076405,
0.014819706790149212,
-0.06598243117332458,
0.14221040904521942,
0.017621606588363647,
-0.005590524524450302,
0.013621870428323746,
-0.06659235805273056,
-0.10123507678508759,
0.2491295337677002,
0.07842810451984406,
0.10499706119298935,
0.06988070160150528,
-0.03526851534843445,
-0.07190833240747452,
0.027024002745747566,
0.006767737213522196,
-0.04064226895570755,
0.024957794696092606,
-0.11844464391469955,
0.04768776521086693,
-0.015097532421350479,
0.02744598127901554,
-0.014111658558249474,
-0.029059916734695435,
0.07658611983060837,
0.02127828449010849,
-0.021657751873135567,
0.08628877997398376,
0.026418054476380348,
-0.022904973477125168,
-0.07803013920783997,
0.0021696179173886776,
0.07805684953927994,
-0.012267506681382656,
-0.059267736971378326,
0.01866859197616577,
0.006119720637798309,
0.041137829422950745,
0.19624051451683044,
-0.22342710196971893,
0.032812219113111496,
-0.0031285323202610016,
0.04390281066298485,
0.0439116470515728,
-0.001195163233205676,
-0.036233656108379364,
0.03181653469800949,
-0.02212146483361721,
0.06936722248792648,
-0.021556878462433815,
0.07378900051116943,
-0.020916394889354706,
-0.12983131408691406,
-0.020882131531834602,
-0.023187341168522835,
0.17229965329170227,
-0.16149865090847015,
0.06940753757953644,
0.18548202514648438,
-0.12451786547899246,
0.13784748315811157,
0.009350569918751717,
-0.013140973635017872,
0.018198076635599136,
0.05411124601960182,
0.005268484354019165,
0.11187553405761719,
-0.09653479605913162,
-0.023452917113900185,
0.02196870930492878,
-0.09060400724411011,
0.03172002732753754,
-0.12306056916713715,
-0.10322652757167816,
-0.07402976602315903,
-0.03868401050567627,
-0.03252031281590462,
0.014831756241619587,
-0.05534553527832031,
0.08171429485082626,
-0.08728346228599548,
-0.10359372943639755,
-0.02868550829589367,
-0.081316739320755,
0.025617796927690506,
0.01076306588947773,
-0.05376677215099335,
-0.1260339766740799,
-0.12843278050422668,
-0.08560195565223694,
-0.1535673290491104,
0.004199977032840252,
0.06268236041069031,
-0.10424485057592392,
-0.04240875318646431,
0.02028808929026127,
-0.045522309839725494,
0.0995936170220375,
-0.08484531193971634,
0.003462699009105563,
0.05914941057562828,
-0.04857531189918518,
-0.16991643607616425,
-0.002858746098354459,
-0.07750903815031052,
-0.056824058294296265,
0.12871378660202026,
-0.1645035594701767,
0.18855394423007965,
-0.0261238943785429,
0.039698902517557144,
0.07005682587623596,
0.03381958231329918,
0.12286397069692612,
-0.1156320869922638,
0.07796340435743332,
0.18477073311805725,
0.041598983108997345,
0.08158747106790543,
0.1266680657863617,
0.08360850811004639,
-0.10407697409391403,
0.030667508020997047,
0.07269121706485748,
-0.10180888324975967,
-0.07981987297534943,
-0.07042741030454636,
-0.11641611158847809,
-0.05960950255393982,
0.06355361640453339,
0.0598401241004467,
0.04469072073698044,
0.13187216222286224,
-0.05681838467717171,
-0.023562472313642502,
0.1055450364947319,
0.04682439938187599,
0.08277516812086105,
0.015590704046189785,
0.06201586127281189,
-0.14444969594478607,
-0.062342576682567596,
0.16973185539245605,
0.22676877677440643,
0.2329258769750595,
0.014592999592423439,
0.07159813493490219,
0.11681365221738815,
0.07229283452033997,
0.08873751759529114,
0.05519508942961693,
0.010667364113032818,
0.015084338374435902,
-0.07178770750761032,
-0.061553798615932465,
0.02461443468928337,
-0.00003258381548221223,
-0.043439555913209915,
-0.14766348898410797,
0.09363400936126709,
-0.004668982699513435,
0.09272593259811401,
0.1442737877368927,
0.0445081889629364,
-0.1248580664396286,
0.1597360521554947,
0.09169624000787735,
0.10298287868499756,
-0.0710795447230339,
0.11908885091543198,
0.06568089872598648,
-0.013245346955955029,
0.15859919786453247,
0.030192891135811806,
0.14999385178089142,
-0.03962213173508644,
-0.07453560829162598,
-0.06660392880439758,
-0.047910403460264206,
0.006192426197230816,
0.03166859224438667,
-0.1972767859697342,
0.10964415967464447,
0.056516773998737335,
0.01464055571705103,
-0.015721648931503296,
-0.05262647196650505,
0.17831546068191528,
0.17928119003772736,
0.09835944324731827,
0.031390849500894547,
-0.03479716181755066,
-0.008596029132604599,
-0.08589467406272888,
0.049393732100725174,
0.01895993947982788,
0.06749691069126129,
-0.042566847056150436,
-0.09891799092292786,
-0.028155077248811722,
-0.0028849379159510136,
0.01842581294476986,
-0.06971007585525513,
-0.11912912875413895,
-0.041904017329216,
0.25553691387176514,
-0.08469299226999283,
0.060057006776332855,
0.0467466376721859,
0.021523255854845047,
-0.003961868118494749,
0.039734795689582825,
-0.02526126056909561,
-0.02167174406349659,
-0.044464293867349625,
-0.00003114308128715493,
0.011620542034506798,
-0.04610094055533409,
-0.05719324201345444,
-0.02315904200077057,
-0.10323980450630188,
-0.0935259535908699,
0.01236552931368351,
-0.05432620644569397,
0.029635373502969742,
-0.028573593124747276,
0.021609527990221977,
-0.0980057641863823,
-0.03727656602859497,
0.033408813178539276,
0.03411223739385605,
-0.0759579986333847,
-0.1287563294172287,
-0.008650499396026134,
-0.027612991631031036,
-0.054928552359342575,
0.03818025067448616,
-0.13159289956092834,
-0.10054569691419601,
-0.049900494515895844,
-0.01295156218111515,
0.13220776617527008,
0.2202393114566803,
-0.034125953912734985,
-0.0018600382609292865,
0.14828051626682281,
-0.10561885684728622,
-0.33010438084602356,
-0.13744564354419708,
-0.15784195065498352,
-0.10045003145933151,
0.029116583988070488,
-0.06562521308660507,
0.021786345168948174,
0.08692441135644913,
-0.04267719388008118,
0.2105213850736618,
-0.21427728235721588,
-0.09434593468904495,
0.08150775730609894,
0.09839684516191483,
0.3184796869754791,
-0.2236349880695343,
0.01640482246875763,
-0.11870303004980087,
-0.02869178168475628,
-0.011273540556430817,
-0.07344120740890503,
0.11460814625024796,
0.04141850769519806,
0.06296388059854507,
-0.0019184901611879468,
-0.009023046121001244,
0.14372916519641876,
-0.06227494403719902,
0.13204866647720337,
-0.11360009014606476,
-0.09318234026432037,
0.21341083943843842,
-0.02729560248553753,
0.004218512214720249,
-0.19947846233844757,
-0.034682586789131165,
-0.026023875921964645,
0.03275096043944359,
-0.006746178492903709,
0.06871836632490158,
-0.0058388542383909225,
-0.013378078117966652,
-0.12095646560192108,
-0.01929454691708088,
-0.03249781206250191,
0.06908851116895676,
0.20450128614902496,
-0.07382097095251083,
-0.06224594637751579,
0.055429019033908844,
-0.005878719966858625,
0.11357888579368591,
0.002424529753625393,
-0.04285653308033943,
-0.037405192852020264,
0.09106646478176117,
-0.16965681314468384,
0.06420912593603134,
0.017120910808444023,
-0.007564136292785406,
0.003303517121821642,
0.016801437363028526,
0.02498992718756199,
0.10629518330097198,
0.18099424242973328,
-0.005659106653183699,
-0.026340825483202934,
-0.02223270945250988,
0.018293635919690132,
0.1311124712228775,
-0.006399140227586031,
0.10983893275260925,
0.022990716621279716,
0.04507127031683922,
0.001866473932750523,
0.04643150418996811,
-0.08140993863344193,
-0.08026499301195145,
0.09430886059999466,
-0.05370323732495308,
-0.08481691032648087,
0.08710420876741409,
0.044366348534822464,
0.06825502216815948,
-0.006935692857950926,
0.045715197920799255,
0.012387451715767384,
-0.12574659287929535,
0.03223485127091408,
0.20925258100032806,
-0.0780639573931694,
-0.06634440273046494,
-0.08218611776828766,
0.012806040234863758,
-0.11859144270420074,
0.06869518011808395,
0.04221333563327789,
-0.03837699815630913,
0.10511782020330429,
-0.04949042573571205,
-0.03980141133069992,
0.014688560739159584,
-0.05697420984506607,
0.04073970019817352,
-0.13034848868846893,
-0.1980532556772232,
0.040899790823459625,
-0.018035370856523514,
-0.06781241297721863,
-0.09080140292644501,
-0.0804632157087326,
0.0652754157781601,
-0.15161161124706268,
0.1302192360162735,
-0.062023114413022995,
0.05858364701271057,
-0.043441418558359146,
-0.05295374244451523,
-0.11244344711303711,
-0.015123795717954636,
-0.06141547113656998,
-0.01989070698618889,
0.05941883474588394,
0.015344980172812939,
-0.12488780915737152,
-0.11013944447040558,
0.05979402735829353,
-0.013441435061395168,
-0.0021953675895929337,
0.015825539827346802,
-0.06576002389192581,
0.010867184959352016,
-0.22923628985881805,
-0.06913426518440247,
0.078861765563488,
0.04180165380239487,
-0.08243048191070557,
0.13620080053806305,
0.0584590882062912,
-0.01390441320836544,
0.03773060068488121,
0.0023111284244805574,
0.175437331199646,
-0.07086282223463058,
0.04443388432264328,
-0.1124318391084671,
-0.18159538507461548,
-0.017725959420204163,
0.04881654679775238,
0.22672389447689056,
0.08238939940929413,
0.10851142555475235,
-0.05160336196422577,
0.020675402134656906,
-0.0018356038490310311,
0.071876659989357,
0.019665831699967384,
-0.11126456409692764,
-0.05194927752017975,
-0.16880643367767334,
-0.06192762404680252,
-0.057760342955589294,
0.1591442972421646,
0.05825021490454674,
-0.12564456462860107,
-0.0009010356152430177,
0.11641620099544525,
-0.17280641198158264,
-0.010151066817343235,
0.1538575142621994,
-0.053017038851976395,
0.02751380391418934,
-0.16331321001052856,
0.02802293188869953,
0.07399635761976242,
-0.016884388402104378,
0.010323292575776577,
0.13389697670936584,
-0.004632923286408186,
-0.002074782270938158,
0.05113968253135681,
-0.03284880146384239,
0.06361433118581772,
-0.07184033840894699,
0.07660970091819763,
0.007803742308169603,
-0.05755811929702759,
-0.1062721461057663,
0.17810121178627014,
-0.023560313507914543,
0.010145277716219425,
-0.049627676606178284,
-0.005238097161054611,
-0.10928678512573242,
-0.10152336955070496,
-0.06510954350233078,
-0.13756848871707916,
0.07054178416728973,
-0.057030316442251205,
0.024926191195845604,
-0.009773215278983116,
0.020479189231991768,
-0.08012764900922775,
0.014803093858063221,
-0.20202699303627014,
-0.04967259243130684,
0.022126303985714912,
-0.013718241825699806,
-0.001111341523937881,
-0.060045380145311356,
-0.04126109182834625,
0.010310283862054348,
-0.05949021875858307,
-0.06675147265195847,
0.06042638421058655,
0.0988243892788887,
0.04375205934047699,
-0.15628165006637573,
-0.09596515446901321,
-0.0708332359790802,
0.030691685155034065,
0.07670382410287857,
0.18258848786354065,
0.0324554480612278,
-0.012559998780488968,
0.04418647661805153,
0.13229793310165405,
0.011304935440421104,
-0.0740649551153183,
-0.0629793331027031,
-0.12686774134635925,
-0.1293102651834488,
0.006237010471522808,
-0.06827180087566376,
-0.021703409031033516,
0.020670199766755104,
0.2291315644979477,
0.20025195181369781,
-0.14590635895729065,
0.031844448298215866,
-0.07103422284126282,
0.03966165706515312,
-0.02152101881802082,
0.15866929292678833,
0.034830231219530106,
0.15219014883041382,
-0.03398715332150459,
-0.030931755900382996,
-0.06109955534338951,
0.028964083641767502,
-0.10500732809305191,
0.05017223581671715,
-0.011116445064544678,
-0.06805328279733658,
-0.07135727256536484,
0.11491400748491287,
-0.09860064089298248,
0.07016041874885559,
0.19064371287822723,
-0.1567673683166504,
-0.02474255859851837,
-0.036003902554512024,
0.04863864928483963,
0.1049773320555687,
0.02665131911635399,
-0.0799311026930809,
-0.03679608926177025,
0.0006801635026931763,
0.028339186683297157,
-0.17667295038700104,
-0.13406196236610413,
-0.004573036450892687,
-0.12845279276371002,
0.14791950583457947,
-0.01305434200912714,
0.0006847942713648081,
0.04008728265762329,
-0.06630373746156693,
0.0043871100060641766,
0.14930224418640137,
0.021049456670880318,
-0.040777526795864105,
-0.022815903648734093,
-0.07644408941268921,
-0.11506567150354385,
0.07531974464654922,
0.08081968128681183,
0.04485555738210678,
0.004569314420223236,
0.14999113976955414,
-0.02655159682035446,
-0.03763274848461151,
0.13576580584049225,
-0.17489421367645264,
0.089394710958004,
0.0016632821643725038,
-0.011638551019132137,
-0.06803406029939651,
-0.04802046716213226,
0.03920365124940872,
0.09252705425024033,
-0.15924212336540222,
-0.039533957839012146,
0.04675113409757614,
-0.09989336133003235,
0.06875581294298172,
0.0459829680621624,
-0.10019927471876144,
0.014360903762280941,
-0.11796178668737411,
0.007476360071450472,
-0.1021372526884079,
0.038964223116636276,
0.20167088508605957,
-0.03451177477836609,
0.013259368017315865,
-0.14313644170761108,
0.054516393691301346,
-0.03467240557074547,
-0.025299470871686935,
-0.06989193707704544
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
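Until the authors add official instructions, the minimal sketch below shows one plausible way to load this repository with 🤗 Transformers and PEFT. It assumes the repo holds a LoRA adapter for `bigscience/bloom-3b` — an assumption based only on the repository name `rohdimp24/LORA-bloom-3b`, not confirmed by this card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Assumed base model; the card does not state it explicitly.
base_id = "bigscience/bloom-3b"
adapter_id = "rohdimp24/LORA-bloom-3b"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)

# Attach the (assumed) LoRA adapter weights on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the repository turns out to contain full model weights rather than an adapter, loading it directly with `AutoModelForCausalLM.from_pretrained(adapter_id)` should be enough.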
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | rohdimp24/LORA-bloom-3b | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:38:49+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# scenario-KD-PR-MSV-EN-EN-D2_data-en-massive_all_1_1
This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the massive dataset.
It achieves the following results on the evaluation set:
- Loss: 7.4557
- Accuracy: 0.4542
- F1: 0.3997
## Model description
More information needed
## Intended uses & limitations
More information needed
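The card does not state which task head the checkpoint carries, so the snippet below is only a hedged guess at typical usage: it assumes a sequence-classification head for MASSIVE-style scenario/intent prediction, which is suggested by the accuracy and F1 metrics but not confirmed here.

```python
from transformers import pipeline

# "text-classification" is an assumption; adjust if the checkpoint
# actually exposes a different head (e.g. token classification).
clf = pipeline(
    "text-classification",
    model="haryoaw/scenario-KD-PR-MSV-EN-EN-D2_data-en-massive_all_1_1",
)
print(clf("wake me up at seven tomorrow morning"))
```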
## Training and evaluation data
More information needed
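The card only names the `massive` dataset. For reference, one common way to pull an English split of MASSIVE is sketched below; the exact Hub id and configuration used for this model are assumptions, not taken from the card, and depending on your 🤗 Datasets version you may also need `trust_remote_code=True`.

```python
from datasets import load_dataset

# "AmazonScience/massive" with the "en-US" config is an assumed stand-in
# for whatever copy of MASSIVE was actually used for training/evaluation.
massive = load_dataset("AmazonScience/massive", "en-US")
print(massive["train"][0])
```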
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 47
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
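For readers who want to approximate the run, the settings above map onto 🤗 `TrainingArguments` roughly as sketched below; `output_dir` and every option not listed above are placeholders, not values taken from the original training script.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters only.
training_args = TrainingArguments(
    output_dir="scenario-KD-PR-MSV-EN-EN-D2_data-en-massive_all_1_1",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=47,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```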
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| No log | 0.28 | 100 | 7.5858 | 0.2343 | 0.0893 |
| No log | 0.56 | 200 | 6.6990 | 0.3112 | 0.1823 |
| No log | 0.83 | 300 | 6.0789 | 0.3867 | 0.2572 |
| No log | 1.11 | 400 | 5.9605 | 0.4037 | 0.2779 |
| 5.219 | 1.39 | 500 | 5.9969 | 0.3801 | 0.2897 |
| 5.219 | 1.67 | 600 | 5.5708 | 0.4340 | 0.3063 |
| 5.219 | 1.94 | 700 | 5.5486 | 0.4323 | 0.3408 |
| 5.219 | 2.22 | 800 | 5.7112 | 0.4175 | 0.3328 |
| 5.219 | 2.5 | 900 | 5.6918 | 0.4241 | 0.3309 |
| 2.5566 | 2.78 | 1000 | 5.8073 | 0.4071 | 0.3317 |
| 2.5566 | 3.06 | 1100 | 6.4247 | 0.3718 | 0.3127 |
| 2.5566 | 3.33 | 1200 | 5.5318 | 0.4400 | 0.3548 |
| 2.5566 | 3.61 | 1300 | 6.1294 | 0.4157 | 0.3393 |
| 2.5566 | 3.89 | 1400 | 6.6057 | 0.3980 | 0.3468 |
| 1.7451 | 4.17 | 1500 | 5.7690 | 0.4434 | 0.3574 |
| 1.7451 | 4.44 | 1600 | 6.8472 | 0.3823 | 0.3434 |
| 1.7451 | 4.72 | 1700 | 6.0105 | 0.4302 | 0.3547 |
| 1.7451 | 5.0 | 1800 | 6.2133 | 0.4303 | 0.3651 |
| 1.7451 | 5.28 | 1900 | 6.1706 | 0.4450 | 0.3682 |
| 1.2832 | 5.56 | 2000 | 6.2779 | 0.4215 | 0.3666 |
| 1.2832 | 5.83 | 2100 | 6.2887 | 0.4386 | 0.3657 |
| 1.2832 | 6.11 | 2200 | 7.3324 | 0.3916 | 0.3450 |
| 1.2832 | 6.39 | 2300 | 6.5663 | 0.4371 | 0.3675 |
| 1.2832 | 6.67 | 2400 | 7.9241 | 0.3881 | 0.3289 |
| 0.9829 | 6.94 | 2500 | 7.1264 | 0.4101 | 0.3597 |
| 0.9829 | 7.22 | 2600 | 6.9262 | 0.4299 | 0.3637 |
| 0.9829 | 7.5 | 2700 | 7.3824 | 0.4249 | 0.3538 |
| 0.9829 | 7.78 | 2800 | 7.1468 | 0.4360 | 0.3630 |
| 0.9829 | 8.06 | 2900 | 8.1472 | 0.3922 | 0.3473 |
| 0.7736 | 8.33 | 3000 | 7.5301 | 0.4274 | 0.3677 |
| 0.7736 | 8.61 | 3100 | 8.5605 | 0.3840 | 0.3447 |
| 0.7736 | 8.89 | 3200 | 7.5599 | 0.4208 | 0.3663 |
| 0.7736 | 9.17 | 3300 | 8.0662 | 0.4022 | 0.3531 |
| 0.7736 | 9.44 | 3400 | 7.7026 | 0.4183 | 0.3572 |
| 0.6388 | 9.72 | 3500 | 7.3751 | 0.4351 | 0.3678 |
| 0.6388 | 10.0 | 3600 | 7.7079 | 0.4304 | 0.3679 |
| 0.6388 | 10.28 | 3700 | 8.1274 | 0.4168 | 0.3675 |
| 0.6388 | 10.56 | 3800 | 7.4026 | 0.4489 | 0.3834 |
| 0.6388 | 10.83 | 3900 | 7.8569 | 0.4347 | 0.3739 |
| 0.5491 | 11.11 | 4000 | 8.4369 | 0.4104 | 0.3712 |
| 0.5491 | 11.39 | 4100 | 7.6399 | 0.4294 | 0.3671 |
| 0.5491 | 11.67 | 4200 | 7.5023 | 0.4434 | 0.3744 |
| 0.5491 | 11.94 | 4300 | 7.7085 | 0.4315 | 0.3693 |
| 0.5491 | 12.22 | 4400 | 8.1275 | 0.4250 | 0.3695 |
| 0.4683 | 12.5 | 4500 | 7.7533 | 0.4468 | 0.3709 |
| 0.4683 | 12.78 | 4600 | 7.9580 | 0.4373 | 0.3787 |
| 0.4683 | 13.06 | 4700 | 7.7855 | 0.4377 | 0.3811 |
| 0.4683 | 13.33 | 4800 | 7.9553 | 0.4385 | 0.3789 |
| 0.4683 | 13.61 | 4900 | 7.5295 | 0.4490 | 0.3850 |
| 0.3993 | 13.89 | 5000 | 7.1115 | 0.4572 | 0.3868 |
| 0.3993 | 14.17 | 5100 | 8.5595 | 0.4124 | 0.3663 |
| 0.3993 | 14.44 | 5200 | 7.6619 | 0.4321 | 0.3723 |
| 0.3993 | 14.72 | 5300 | 7.8443 | 0.4255 | 0.3666 |
| 0.3993 | 15.0 | 5400 | 8.0150 | 0.4318 | 0.3822 |
| 0.3647 | 15.28 | 5500 | 8.4854 | 0.4144 | 0.3693 |
| 0.3647 | 15.56 | 5600 | 7.8110 | 0.4329 | 0.3757 |
| 0.3647 | 15.83 | 5700 | 7.7835 | 0.4451 | 0.3810 |
| 0.3647 | 16.11 | 5800 | 8.2197 | 0.4232 | 0.3743 |
| 0.3647 | 16.39 | 5900 | 8.0185 | 0.4334 | 0.3789 |
| 0.3214 | 16.67 | 6000 | 8.3538 | 0.4319 | 0.3835 |
| 0.3214 | 16.94 | 6100 | 7.8909 | 0.4354 | 0.3807 |
| 0.3214 | 17.22 | 6200 | 7.7767 | 0.4294 | 0.3834 |
| 0.3214 | 17.5 | 6300 | 8.1544 | 0.4283 | 0.3782 |
| 0.3214 | 17.78 | 6400 | 9.1679 | 0.4002 | 0.3678 |
| 0.3041 | 18.06 | 6500 | 8.5720 | 0.4133 | 0.3709 |
| 0.3041 | 18.33 | 6600 | 7.6984 | 0.4477 | 0.3926 |
| 0.3041 | 18.61 | 6700 | 7.5970 | 0.4499 | 0.3904 |
| 0.3041 | 18.89 | 6800 | 7.8427 | 0.4377 | 0.3773 |
| 0.3041 | 19.17 | 6900 | 7.4411 | 0.4497 | 0.3912 |
| 0.2783 | 19.44 | 7000 | 7.8410 | 0.4370 | 0.3820 |
| 0.2783 | 19.72 | 7100 | 7.8680 | 0.4463 | 0.3874 |
| 0.2783 | 20.0 | 7200 | 7.7858 | 0.4410 | 0.3729 |
| 0.2783 | 20.28 | 7300 | 7.0824 | 0.4589 | 0.3946 |
| 0.2783 | 20.56 | 7400 | 7.6646 | 0.4467 | 0.3884 |
| 0.261 | 20.83 | 7500 | 7.3004 | 0.4580 | 0.3888 |
| 0.261 | 21.11 | 7600 | 7.7126 | 0.4489 | 0.3863 |
| 0.261 | 21.39 | 7700 | 8.0802 | 0.4291 | 0.3828 |
| 0.261 | 21.67 | 7800 | 7.4386 | 0.4536 | 0.3931 |
| 0.261 | 21.94 | 7900 | 8.1070 | 0.4337 | 0.3846 |
| 0.2524 | 22.22 | 8000 | 7.4291 | 0.4540 | 0.3950 |
| 0.2524 | 22.5 | 8100 | 7.7144 | 0.4469 | 0.3891 |
| 0.2524 | 22.78 | 8200 | 8.2414 | 0.4295 | 0.3793 |
| 0.2524 | 23.06 | 8300 | 7.9833 | 0.4393 | 0.3864 |
| 0.2524 | 23.33 | 8400 | 7.3628 | 0.4541 | 0.3922 |
| 0.2414 | 23.61 | 8500 | 7.3701 | 0.4546 | 0.4025 |
| 0.2414 | 23.89 | 8600 | 7.7196 | 0.4451 | 0.3954 |
| 0.2414 | 24.17 | 8700 | 7.6399 | 0.4500 | 0.3948 |
| 0.2414 | 24.44 | 8800 | 7.4280 | 0.4543 | 0.3993 |
| 0.2414 | 24.72 | 8900 | 7.8374 | 0.4397 | 0.3915 |
| 0.2315 | 25.0 | 9000 | 7.4209 | 0.4561 | 0.3965 |
| 0.2315 | 25.28 | 9100 | 7.4840 | 0.4493 | 0.3969 |
| 0.2315 | 25.56 | 9200 | 7.5720 | 0.4492 | 0.3932 |
| 0.2315 | 25.83 | 9300 | 7.6239 | 0.4470 | 0.3952 |
| 0.2315 | 26.11 | 9400 | 7.9898 | 0.4362 | 0.3856 |
| 0.2238 | 26.39 | 9500 | 7.6913 | 0.4463 | 0.3955 |
| 0.2238 | 26.67 | 9600 | 7.5922 | 0.4494 | 0.3975 |
| 0.2238 | 26.94 | 9700 | 7.4909 | 0.4500 | 0.3979 |
| 0.2238 | 27.22 | 9800 | 7.5671 | 0.4492 | 0.3957 |
| 0.2238 | 27.5 | 9900 | 7.4782 | 0.4538 | 0.3964 |
| 0.2241 | 27.78 | 10000 | 7.4428 | 0.4550 | 0.4017 |
| 0.2241 | 28.06 | 10100 | 7.5369 | 0.4506 | 0.3962 |
| 0.2241 | 28.33 | 10200 | 7.5258 | 0.4479 | 0.3950 |
| 0.2241 | 28.61 | 10300 | 7.5294 | 0.4512 | 0.3934 |
| 0.2241 | 28.89 | 10400 | 7.2467 | 0.4606 | 0.4002 |
| 0.2167 | 29.17 | 10500 | 7.4440 | 0.4543 | 0.4012 |
| 0.2167 | 29.44 | 10600 | 7.6319 | 0.4470 | 0.3965 |
| 0.2167 | 29.72 | 10700 | 7.4810 | 0.4537 | 0.3996 |
| 0.2167 | 30.0 | 10800 | 7.4557 | 0.4542 | 0.3997 |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.13.3
| {"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["massive"], "metrics": ["accuracy", "f1"], "base_model": "FacebookAI/xlm-roberta-base", "model-index": [{"name": "scenario-KD-PR-MSV-EN-EN-D2_data-en-massive_all_1_1", "results": []}]} | null | haryoaw/scenario-KD-PR-MSV-EN-EN-D2_data-en-massive_all_1_1 | [
"transformers",
"pytorch",
"xlm-roberta",
"generated_from_trainer",
"dataset:massive",
"base_model:FacebookAI/xlm-roberta-base",
"license:mit",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:40:03+00:00 | [] | [] | TAGS
#transformers #pytorch #xlm-roberta #generated_from_trainer #dataset-massive #base_model-FacebookAI/xlm-roberta-base #license-mit #endpoints_compatible #region-us
| scenario-KD-PR-MSV-EN-EN-D2\_data-en-massive\_all\_1\_1
=======================================================
This model is a fine-tuned version of FacebookAI/xlm-roberta-base on the massive dataset.
It achieves the following results on the evaluation set:
* Loss: 7.4557
* Accuracy: 0.4542
* F1: 0.3997
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 47
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 30
### Training results
### Framework versions
* Transformers 4.33.3
* Pytorch 2.1.1+cu121
* Datasets 2.14.5
* Tokenizers 0.13.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 47\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 30",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.33.3\n* Pytorch 2.1.1+cu121\n* Datasets 2.14.5\n* Tokenizers 0.13.3"
] | [
"TAGS\n#transformers #pytorch #xlm-roberta #generated_from_trainer #dataset-massive #base_model-FacebookAI/xlm-roberta-base #license-mit #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 47\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 30",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.33.3\n* Pytorch 2.1.1+cu121\n* Datasets 2.14.5\n* Tokenizers 0.13.3"
] | [
60,
98,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #xlm-roberta #generated_from_trainer #dataset-massive #base_model-FacebookAI/xlm-roberta-base #license-mit #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 47\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 30### Training results### Framework versions\n\n\n* Transformers 4.33.3\n* Pytorch 2.1.1+cu121\n* Datasets 2.14.5\n* Tokenizers 0.13.3"
] | [
-0.13031280040740967,
0.06937964260578156,
-0.0014504295540973544,
0.11406318843364716,
0.17141394317150116,
0.031947679817676544,
0.09491652250289917,
0.11005323380231857,
-0.06468161940574646,
0.022727128118276596,
0.12611760199069977,
0.12994113564491272,
0.0020992408972233534,
0.14872165024280548,
-0.03757438436150551,
-0.23125143349170685,
-0.017463011667132378,
0.05183197185397148,
-0.11298424750566483,
0.13618086278438568,
0.09605979174375534,
-0.14715032279491425,
0.10172931849956512,
-0.009086145088076591,
-0.24042925238609314,
0.027230415493249893,
0.013979928568005562,
-0.03861868008971214,
0.14375248551368713,
0.01845855824649334,
0.10932449251413345,
0.004541611764580011,
0.08161770552396774,
-0.15520694851875305,
0.015959059819579124,
0.04568933695554733,
-0.011349128559231758,
0.08190394937992096,
0.035194866359233856,
-0.0024399638641625643,
0.1229412853717804,
-0.08161235600709915,
0.02480870857834816,
0.026889782398939133,
-0.13782812654972076,
-0.2620670795440674,
-0.08921921253204346,
0.06503807008266449,
0.05674009770154953,
0.1026732474565506,
0.0006081442697905004,
0.12821272015571594,
-0.08960866928100586,
0.08662353456020355,
0.2658783495426178,
-0.27499279379844666,
-0.079537533223629,
0.04342438280582428,
0.020515277981758118,
0.07646788656711578,
-0.09254278987646103,
-0.01423199288547039,
0.06981520354747772,
0.042318541556596756,
0.12134852260351181,
-0.044550374150276184,
-0.015397747978568077,
0.03292936086654663,
-0.14423136413097382,
-0.03146921098232269,
0.14313486218452454,
0.058063626289367676,
-0.03704330697655678,
-0.010249702259898186,
-0.04908653721213341,
-0.12696783244609833,
-0.04226986691355705,
-0.0035296580754220486,
0.042194463312625885,
-0.04086008295416832,
-0.11421383917331696,
-0.0020188374910503626,
-0.08451595157384872,
-0.08406414836645126,
-0.05176399275660515,
0.14840580523014069,
0.029619207605719566,
0.02358168363571167,
-0.03235645219683647,
0.11535210907459259,
-0.029335565865039825,
-0.13358192145824432,
0.023800892755389214,
0.0071549611166119576,
-0.0023839739151299,
-0.03750597685575485,
-0.06241697072982788,
0.008286085911095142,
0.020253261551260948,
0.10113842785358429,
-0.06500774621963501,
0.021940262988209724,
0.05798808112740517,
0.04558056965470314,
-0.08237461745738983,
0.158425971865654,
-0.08482743054628372,
-0.05981569364666939,
0.006810741499066353,
0.05019528791308403,
-0.02707589976489544,
0.02103285863995552,
-0.09525163471698761,
-0.01441587507724762,
0.07647717744112015,
0.005340021103620529,
-0.08011606335639954,
0.05585438758134842,
-0.05058930814266205,
-0.025141827762126923,
-0.004122624639421701,
-0.0672256126999855,
0.02509509213268757,
-0.0019173934124410152,
-0.09883511066436768,
-0.007417774293571711,
0.01502587553113699,
-0.003204338252544403,
0.016363196074962616,
0.05705143138766289,
-0.10058628767728806,
0.03322136402130127,
-0.10117052495479584,
-0.09751148521900177,
-0.0004369558300822973,
-0.08587035536766052,
0.03901517391204834,
-0.09243114292621613,
-0.18481799960136414,
-0.015687504783272743,
0.04095486178994179,
-0.02846863493323326,
-0.048355527222156525,
-0.029554205015301704,
-0.07078181207180023,
-0.013711504638195038,
-0.01679627038538456,
0.13218720257282257,
-0.06148694083094597,
0.12237299978733063,
0.05415552854537964,
0.0752304419875145,
-0.054038356989622116,
0.055762384086847305,
-0.09564545750617981,
0.017864491790533066,
-0.21836085617542267,
0.05430203676223755,
-0.05396661534905434,
0.07849852740764618,
-0.07763992249965668,
-0.12006359547376633,
0.018661020323634148,
0.0022135747130960226,
0.08468397706747055,
0.09969757497310638,
-0.1843019723892212,
-0.09166782349348068,
0.16514535248279572,
-0.06827251613140106,
-0.1404317319393158,
0.10369808971881866,
-0.07634158432483673,
0.07705377042293549,
0.07746028900146484,
0.15227657556533813,
0.06786082684993744,
-0.07192473858594894,
0.016052115708589554,
-0.00705972034484148,
0.027958080172538757,
-0.0939130187034607,
0.0612955316901207,
0.01985807716846466,
-0.022417422384023666,
0.02989760972559452,
-0.06793089956045151,
0.07825100421905518,
-0.1320420205593109,
-0.08063968271017075,
-0.03959861025214195,
-0.12337025254964828,
0.037677135318517685,
0.06636935472488403,
0.06631580740213394,
-0.10949254035949707,
-0.04737323895096779,
0.0883527621626854,
0.09508557617664337,
-0.05281220003962517,
0.005702847149223089,
-0.05286421626806259,
0.0720200389623642,
-0.06017330661416054,
-0.040238916873931885,
-0.17896832525730133,
-0.003916604910045862,
-0.00266240700148046,
0.04041941836476326,
0.022702546790242195,
0.04645002260804176,
0.07271720468997955,
0.05102334916591644,
-0.05233559384942055,
-0.025181999430060387,
-0.06329894810914993,
-0.0067722187377512455,
-0.11827091127634048,
-0.18262654542922974,
-0.04850516840815544,
-0.02055334858596325,
0.09343468397855759,
-0.19629451632499695,
0.032295119017362595,
-0.01886354386806488,
0.07501021027565002,
0.0032878692727535963,
-0.016735151410102844,
-0.04759928584098816,
0.08896750211715698,
-0.0018650059355422854,
-0.04882722347974777,
0.06626927107572556,
0.00802820548415184,
-0.09438479691743851,
-0.05639002472162247,
-0.03752066567540169,
0.24392718076705933,
0.12806616723537445,
-0.10878831148147583,
-0.08008889853954315,
-0.005070923827588558,
-0.06521018594503403,
-0.026824168860912323,
-0.027503304183483124,
0.03453146666288376,
0.180476114153862,
-0.021901298314332962,
0.13474731147289276,
-0.08642084151506424,
-0.03330833837389946,
0.02389402873814106,
-0.033788446336984634,
0.041235633194446564,
0.12512175738811493,
0.12441986799240112,
-0.11514230072498322,
0.13442440330982208,
0.14176250994205475,
-0.06483909487724304,
0.13842830061912537,
-0.048835210502147675,
-0.0720546692609787,
-0.043457597494125366,
-0.03381456062197685,
-0.023650208488106728,
0.13811837136745453,
-0.13388386368751526,
0.007125255651772022,
0.01271645538508892,
0.00825434923171997,
0.03687981888651848,
-0.20367121696472168,
-0.06838269531726837,
0.03267791122198105,
-0.048255953937768936,
-0.05737832933664322,
-0.0028433676343411207,
0.021294288337230682,
0.10493043065071106,
-0.005483072251081467,
-0.09096711874008179,
0.03457675874233246,
0.009457939304411411,
-0.07015258073806763,
0.2060426026582718,
-0.06264128535985947,
-0.11743286997079849,
-0.09632200747728348,
-0.058046855032444,
-0.04277905076742172,
-0.01057354174554348,
0.05541180446743965,
-0.08826985210180283,
-0.004408363252878189,
-0.04043864831328392,
0.03346889838576317,
0.007962160743772984,
0.0407419353723526,
0.027887778356671333,
0.0002961773134302348,
0.07473426312208176,
-0.10734377801418304,
-0.008517991751432419,
-0.06894674897193909,
-0.07305099815130234,
0.028259506449103355,
0.023245923221111298,
0.1267927587032318,
0.1070036068558693,
-0.01883893460035324,
0.021906616166234016,
-0.026766624301671982,
0.2733730971813202,
-0.06838652491569519,
-0.03827540948987007,
0.15717363357543945,
0.035445548593997955,
0.04839326813817024,
0.07901041209697723,
0.07473569363355637,
-0.10004360228776932,
-0.005981551017612219,
0.02843857742846012,
-0.04315922036767006,
-0.20023761689662933,
-0.04475655034184456,
-0.05409897491335869,
-0.03774743154644966,
0.07900593429803848,
0.015291198156774044,
0.01324166264384985,
0.07388807088136673,
0.05551082640886307,
0.04798445105552673,
-0.07643407583236694,
0.039361633360385895,
0.055300358682870865,
0.043627675622701645,
0.11238761246204376,
-0.04827447980642319,
-0.052454542368650436,
0.04933447763323784,
-0.020394163206219673,
0.24765212833881378,
-0.01710248552262783,
0.08691399544477463,
0.07827980816364288,
0.20384427905082703,
-0.005426658783107996,
0.08815453946590424,
-0.023039685562253,
-0.06139019876718521,
-0.018939152359962463,
-0.027696549892425537,
-0.021930065006017685,
0.007027535233646631,
-0.04930361732840538,
0.07383797317743301,
-0.14628317952156067,
-0.013797187246382236,
0.06268243491649628,
0.26387524604797363,
0.013175460509955883,
-0.32268601655960083,
-0.095619335770607,
-0.0164212416857481,
-0.018793871626257896,
-0.009200275875627995,
0.021544212475419044,
0.12110793590545654,
-0.08985503762960434,
0.02383352816104889,
-0.05228393152356148,
0.09361404925584793,
-0.0055168974213302135,
0.04895811155438423,
0.07962381094694138,
0.10030799359083176,
-0.00307678640820086,
0.0611310712993145,
-0.27487459778785706,
0.290510892868042,
0.0008479738025926054,
0.06945855170488358,
-0.054529543966054916,
-0.025094831362366676,
0.02842610329389572,
0.06689278036355972,
0.06655050814151764,
-0.0045248717069625854,
-0.0030675986781716347,
-0.19935737550258636,
-0.0312824472784996,
0.047215353697538376,
0.07221955806016922,
-0.05022153630852699,
0.10178954899311066,
-0.00009481183224124834,
0.01569078303873539,
0.06909257173538208,
0.006806843914091587,
-0.06367576122283936,
-0.06319639831781387,
-0.04268498718738556,
-0.0006946761277504265,
-0.03478251025080681,
-0.07876425981521606,
-0.10415629297494888,
-0.11024022102355957,
0.1299837827682495,
0.02561027929186821,
-0.02605278417468071,
-0.11194358021020889,
0.09327154606580734,
0.10312390327453613,
-0.08688870817422867,
0.04189062491059303,
0.034838125109672546,
0.038152892142534256,
0.03450563922524452,
-0.06240028887987137,
0.11130882054567337,
-0.06528906524181366,
-0.14577163755893707,
-0.03382984548807144,
0.1151994913816452,
0.06402396410703659,
0.05855466052889824,
0.0011132672661915421,
-0.012235553003847599,
-0.03743724152445793,
-0.10191923379898071,
0.039669379591941833,
-0.052357565611600876,
0.06398916244506836,
0.03983603045344353,
-0.02389279194176197,
0.04371979087591171,
-0.04963056743144989,
-0.009761028923094273,
0.1624693125486374,
0.24629254639148712,
-0.09951475262641907,
-0.002364460611715913,
0.0337316058576107,
-0.05876404047012329,
-0.16262869536876678,
0.0756259560585022,
0.05496915429830551,
0.011543246917426586,
0.034409865736961365,
-0.18494214117527008,
0.12713554501533508,
0.12257175892591476,
0.00572827085852623,
0.0873328149318695,
-0.3513120412826538,
-0.1084279790520668,
0.08994073420763016,
0.1417018622159958,
0.17843835055828094,
-0.12816035747528076,
-0.006879426073282957,
-0.013284516520798206,
-0.14002986252307892,
0.09378067404031754,
-0.1249067634344101,
0.11496291309595108,
-0.04233478754758835,
0.08574512600898743,
-0.0010877930326387286,
-0.055220670998096466,
0.12067895382642746,
0.041557203978300095,
0.1430204063653946,
-0.033917106688022614,
-0.03664431720972061,
0.06343092769384384,
-0.02280902862548828,
0.014339515008032322,
-0.061759479343891144,
0.04030860960483551,
-0.11689268052577972,
-0.013975527137517929,
-0.11085643619298935,
0.04133883863687515,
-0.028783809393644333,
-0.055444035679101944,
-0.04325680434703827,
0.0456397645175457,
0.02595704048871994,
-0.011318185366690159,
0.0658012256026268,
0.011239588260650635,
0.14441505074501038,
0.07818378508090973,
0.06251700967550278,
-0.07852913439273834,
-0.08497191965579987,
-0.0075917853973805904,
-0.012288229539990425,
0.05782012268900871,
-0.13377398252487183,
0.019706416875123978,
0.13767440617084503,
0.03640361130237579,
0.12094122916460037,
0.0818648487329483,
-0.026696374639868736,
0.032417409121990204,
0.0664927288889885,
-0.15169057250022888,
-0.10295505076646805,
-0.0220351442694664,
-0.1044682040810585,
-0.09250271320343018,
0.07071582227945328,
0.09271473437547684,
-0.06522510945796967,
-0.017430216073989868,
-0.035544440150260925,
-0.012682697735726833,
-0.08929329365491867,
0.19736480712890625,
0.07714524120092392,
0.0492706298828125,
-0.11761882901191711,
0.06209509074687958,
0.0005535733071155846,
-0.027850789949297905,
-0.00664710346609354,
0.07627064734697342,
-0.07413231581449509,
-0.03107595257461071,
0.07399273663759232,
0.2139652669429779,
-0.09783110022544861,
-0.03104233369231224,
-0.1580706685781479,
-0.12345458567142487,
0.08296646177768707,
0.20794305205345154,
0.1159629374742508,
0.004556561354547739,
-0.045222729444503784,
-0.0017409248976036906,
-0.11868469417095184,
0.05746699497103691,
0.056088510900735855,
0.05268735811114311,
-0.13408918678760529,
0.1788681149482727,
-0.0012691880110651255,
0.04897841438651085,
-0.029410017654299736,
0.030937906354665756,
-0.11798574030399323,
0.021711669862270355,
-0.15795989334583282,
-0.060462940484285355,
-0.021371496841311455,
0.004602424800395966,
-0.022745458409190178,
-0.07017282396554947,
-0.06918808817863464,
0.02056143619120121,
-0.12579013407230377,
-0.0057230242528021336,
0.05894479528069496,
0.052856992930173874,
-0.12009967118501663,
-0.02558867447078228,
0.024114318192005157,
-0.04609232023358345,
0.07331952452659607,
0.0666009858250618,
0.029199395328760147,
0.04090892896056175,
-0.13805627822875977,
-0.007344060577452183,
0.05287393927574158,
-0.0030411877669394016,
0.08183197677135468,
-0.08873441070318222,
0.018483178690075874,
-0.0023105484433472157,
0.0577271543443203,
0.016181837767362595,
0.03534667193889618,
-0.12601745128631592,
0.001490049879066646,
-0.009729603305459023,
-0.04839001223444939,
-0.06254326552152634,
0.016935205087065697,
0.1301526576280594,
0.05110819637775421,
0.16166479885578156,
-0.07141312956809998,
0.038999829441308975,
-0.23709028959274292,
-0.019261173903942108,
-0.013767803087830544,
-0.0997454822063446,
-0.09656460583209991,
-0.06053442507982254,
0.07294560223817825,
-0.03806235268712044,
0.16247183084487915,
0.011801173910498619,
0.01795758493244648,
0.011401413939893246,
-0.050320953130722046,
0.041027363389730453,
0.016097819432616234,
0.22497963905334473,
0.017813213169574738,
-0.023712176829576492,
0.08356475085020065,
0.07792354375123978,
0.09204218536615372,
0.1252494603395462,
0.19548222422599792,
0.17354178428649902,
0.035207562148571014,
0.06332981586456299,
0.06569454073905945,
-0.049992240965366364,
-0.11721239984035492,
0.016379281878471375,
-0.018887748941779137,
0.06205487996339798,
-0.03664763271808624,
0.18415696918964386,
0.07247012108564377,
-0.17337676882743835,
0.04883132502436638,
-0.03168124333024025,
-0.07490456849336624,
-0.07617262750864029,
-0.0069414172321558,
-0.07311050593852997,
-0.1680804193019867,
0.012481629848480225,
-0.12579034268856049,
-0.005602534860372543,
0.11231290549039841,
-0.009390563704073429,
-0.027223261073231697,
0.149333193898201,
0.05190418288111687,
0.028739260509610176,
0.07583795487880707,
-0.0033588469959795475,
-0.038448184728622437,
-0.09026708453893661,
-0.06183324754238129,
-0.01138093788176775,
-0.03891877457499504,
0.01979830488562584,
-0.05995412543416023,
-0.0883500799536705,
0.029736841097474098,
-0.008499816060066223,
-0.09253942221403122,
0.005463805980980396,
0.017841516062617302,
0.07193883508443832,
0.0476662702858448,
0.009964259341359138,
0.0070400964468717575,
-0.024828674271702766,
0.20450863242149353,
-0.07148462533950806,
-0.05170057341456413,
-0.10329848527908325,
0.2276635318994522,
0.03853292018175125,
-0.010432587936520576,
0.027501095086336136,
-0.0492110438644886,
0.019296344369649887,
0.21158553659915924,
0.20613498985767365,
-0.12028413265943527,
0.001912390347570181,
-0.013247611932456493,
-0.012712995521724224,
-0.027146514505147934,
0.11381888389587402,
0.14236481487751007,
0.04464622959494591,
-0.10279840230941772,
-0.0291543398052454,
-0.07448938488960266,
-0.008305089548230171,
-0.013923817314207554,
0.05840189382433891,
0.06084766611456871,
0.010426688008010387,
-0.05059775710105896,
0.07031905651092529,
-0.0730811133980751,
-0.103003591299057,
0.07647652924060822,
-0.19009478390216827,
-0.15997271239757538,
-0.0276548583060503,
0.042036037892103195,
0.023785024881362915,
0.07856542617082596,
-0.04229287430644035,
0.00902967806905508,
0.0755595862865448,
-0.01777803525328636,
-0.09824003279209137,
-0.13276764750480652,
0.11805438250303268,
-0.08423139154911041,
0.1789989024400711,
-0.05572013556957245,
0.05333072319626808,
0.12546654045581818,
0.04492175579071045,
-0.08270072937011719,
0.056050416082143784,
0.036876413971185684,
-0.06678672134876251,
-0.01167071983218193,
0.06290806084871292,
-0.021136078983545303,
0.08170244097709656,
0.030341913923621178,
-0.12089730054140091,
0.012295414693653584,
-0.06790122389793396,
-0.03714360296726227,
-0.06402076035737991,
-0.05546052008867264,
-0.050931449979543686,
0.1254909634590149,
0.21629592776298523,
-0.04427899047732353,
0.014664768241345882,
-0.06668917834758759,
0.0485767237842083,
0.07709062099456787,
0.04767797887325287,
-0.07306317985057831,
-0.21629616618156433,
0.0056095910258591175,
0.09618499875068665,
-0.04486307501792908,
-0.21728277206420898,
-0.09498947858810425,
-0.006122061982750893,
-0.07853620499372482,
-0.05906984955072403,
0.0782386064529419,
0.05753979831933975,
0.07108708471059799,
-0.03646618127822876,
-0.06104789301753044,
-0.0952862873673439,
0.1584334373474121,
-0.1405889093875885,
-0.08777708560228348
] |
null | null | null |
# Lora of joffre/ジョッフル/霞飞 (Azur Lane)
## What Is This?
This is the LoRA model of waifu joffre/ジョッフル/霞飞 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* Dataset used for training is the `stage3-p480-800` in [CyberHarem/joffre_azurlane](https://huggingface.co/datasets/CyberHarem/joffre_azurlane), which contains 190 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1920 steps; 40 checkpoints were saved and evaluated.
* **Trigger word is `joffre_azurlane`.**
* Pruned core tags for this waifu are `breasts, twintails, large_breasts, hair_ornament, bangs, red_eyes, grey_hair, long_hair, white_hair`. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like a classic LoRA**. The LoRA files we provide are bundled with the embedding file. For example, a prompt along the lines of `joffre_azurlane, twintails, red_eyes, <lora:joffre_azurlane:0.8>` (the weight `0.8` is only an illustrative starting point) should activate the character.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them together. The pt file is used as an embedding, while the safetensors file is loaded as the LoRA.
For example, if you want to use the model from step 720, download [`720/joffre_azurlane.pt`](https://huggingface.co/CyberHarem/joffre_azurlane/resolve/main/720/joffre_azurlane.pt) as the embedding and [`720/joffre_azurlane.safetensors`](https://huggingface.co/CyberHarem/joffre_azurlane/resolve/main/720/joffre_azurlane.safetensors) as the LoRA. Using both files together, you can generate images of the character.
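If you prefer to fetch these files from a script instead of clicking the links, here is a minimal sketch using the `huggingface_hub` package (an assumption on our side; any download method works). The WebUI folder names mentioned in the comments are the usual A1111 defaults and may differ in your installation.

```python
# Minimal sketch: download the step-720 embedding and LoRA files from this repo.
# Requires `pip install huggingface_hub`.
from huggingface_hub import hf_hub_download

repo_id = "CyberHarem/joffre_azurlane"
step = 720

# The .pt file goes into the WebUI `embeddings/` folder (usual A1111 default).
embedding_path = hf_hub_download(repo_id=repo_id, filename=f"{step}/joffre_azurlane.pt")

# The .safetensors file goes into the WebUI `models/Lora/` folder (usual A1111 default).
lora_path = hf_hub_download(repo_id=repo_id, filename=f"{step}/joffre_azurlane.safetensors")

print(embedding_path)
print(lora_path)
```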
## Which Step Should I Use?
We selected 5 good steps for you to choose from. The best one is step 720.
1600 images (1.70 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the previews of the recommended steps (a scripted alternative to the WebUI workflow is sketched after the table):
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_1 | pattern_2 | pattern_3 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:----------------------------------------------------------------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------------|:--------------------------------------------|:--------------------------------------------|:--------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:--------------------------------|:------------------------------------|:--------------------------------|:----------------------------------|:----------------------------------------|:----------------------------------------|:----------------------------------------|:------------------------------|:----------------------------------|:----------------------------------|:--------------------------------|:------------------------------------------------|:----------------------------------|:----------------------------------|:------------------------------|:--------------------------------|:--------------------------------------|:--------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------------|:--------------------------------------|:--------------------------------------|
| 720 | 16 | **0.989** | 0.849 | 0.845 | **0.723** | [Download](https://huggingface.co/CyberHarem/joffre_azurlane/resolve/main/720/joffre_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 768 | 17 | 0.984 | **0.866** | **0.848** | 0.720 | [Download](https://huggingface.co/CyberHarem/joffre_azurlane/resolve/main/768/joffre_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1248 | 27 | 0.983 | 0.806 | 0.842 | 0.712 | [Download](https://huggingface.co/CyberHarem/joffre_azurlane/resolve/main/1248/joffre_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1344 | 29 | 0.980 | 0.813 | 0.841 | 0.706 | [Download](https://huggingface.co/CyberHarem/joffre_azurlane/resolve/main/1344/joffre_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 672 | 15 | 0.955 | 0.845 | 0.843 | 0.680 | [Download](https://huggingface.co/CyberHarem/joffre_azurlane/resolve/main/672/joffre_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
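If you prefer a scripted preview workflow instead of the WebUI, the sketch below uses the `diffusers` library with the same preview base model. This is an assumption on our side rather than a supported path: whether the HCP-Diffusion LoRA and embedding exports load directly with these helpers, and whether the base model is available in diffusers format, depends on your library version and the exported file format.

```python
# Rough sketch only (see the assumptions above): scripted preview generation with diffusers.
import torch
from diffusers import StableDiffusionPipeline
from huggingface_hub import hf_hub_download

# Fetch the step-720 files from this repo.
lora_path = hf_hub_download("CyberHarem/joffre_azurlane", "720/joffre_azurlane.safetensors")
emb_path = hf_hub_download("CyberHarem/joffre_azurlane", "720/joffre_azurlane.pt")

# Assumes Meina/MeinaMix_V11 is available in diffusers format and a CUDA GPU is present.
pipe = StableDiffusionPipeline.from_pretrained(
    "Meina/MeinaMix_V11", torch_dtype=torch.float16
).to("cuda")

# Assumes the exported files are compatible with these loaders.
pipe.load_textual_inversion(emb_path, token="joffre_azurlane")  # the .pt embedding
pipe.load_lora_weights(lora_path)                               # the .safetensors LoRA

image = pipe(
    "joffre_azurlane, twintails, red_eyes, best quality",
    negative_prompt="lowres, bad anatomy",
    num_inference_steps=28,
).images[0]
image.save("joffre_preview.png")
```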
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals whose application scenarios demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download the files, via the following links:
* [Steps From 1488 to 1920](all/0.md)
* [Steps From 1008 to 1440](all/1.md)
* [Steps From 528 to 960](all/2.md)
* [Steps From 48 to 480](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/joffre_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/joffre_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/joffre_azurlane",
"license:mit",
"region:us"
] | 2024-02-15T05:45:29+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/joffre_azurlane #license-mit #region-us
| Lora of joffre/ジョッフル/霞飞 (Azur Lane)
===================================
What Is This?
-------------
This is the LoRA model of waifu joffre/ジョッフル/霞飞 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* Dataset used for training is the 'stage3-p480-800' in CyberHarem/joffre\_azurlane, which contains 190 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1920 steps, 40 checkpoints were saved and evaluated.
* Trigger word is 'joffre\_azurlane'.
* Pruned core tags for this waifu are 'breasts, twintails, large\_breasts, hair\_ornament, bangs, red\_eyes, grey\_hair, long\_hair, white\_hair'. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like a classic LoRA. The LoRA files we provide are bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 720, you need to download '720/joffre\_azurlane.pt' as the embedding and '720/joffre\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose. The best one is step 720.
1600 images (1.70 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the previews of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We uploaded the files for all steps. You can check the images and metrics, and download the files, via the following links:
* Steps From 1488 to 1920
* Steps From 1008 to 1440
* Steps From 528 to 960
* Steps From 48 to 480
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 720, you need to download '720/joffre\\_azurlane.pt' as the embedding and '720/joffre\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 720.\n\n\n1600 images (1.70 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1488 to 1920\n* Steps From 1008 to 1440\n* Steps From 528 to 960\n* Steps From 48 to 480"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/joffre_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 720, you need to download '720/joffre\\_azurlane.pt' as the embedding and '720/joffre\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 720.\n\n\n1600 images (1.70 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1488 to 1920\n* Steps From 1008 to 1440\n* Steps From 528 to 960\n* Steps From 48 to 480"
] | [
44,
38,
468
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/joffre_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.016675828024744987,
0.0002515364612918347,
-0.004219320137053728,
0.0794907882809639,
0.07235827296972275,
0.07582960277795792,
0.22533951699733734,
0.08279091864824295,
0.11932580173015594,
-0.07131464779376984,
0.09931936115026474,
0.06655319780111313,
-0.003932388033717871,
0.05096694082021713,
-0.03610950708389282,
-0.15142259001731873,
-0.06309021264314651,
-0.02544151060283184,
0.010521728545427322,
0.013723926618695259,
0.08487533777952194,
0.0063403295353055,
0.10486774891614914,
-0.05303420498967171,
-0.0423070564866066,
0.06225832551717758,
-0.030046509578824043,
-0.035967666655778885,
0.030326416715979576,
0.08912193775177002,
0.11830079555511475,
0.018837235867977142,
0.06333427876234055,
-0.16860340535640717,
0.07078772038221359,
-0.006052935030311346,
-0.10951469093561172,
0.0017891349270939827,
0.021377108991146088,
-0.03774739429354668,
0.12274400889873505,
0.028619516640901566,
-0.11481725424528122,
0.043377164751291275,
-0.13181859254837036,
-0.031034043058753014,
-0.05410497263073921,
0.044524893164634705,
0.1344154179096222,
0.044843826442956924,
0.022932877764105797,
0.0476008802652359,
-0.04260929673910141,
0.08587183803319931,
0.13365349173545837,
-0.1300186961889267,
-0.07058311253786087,
0.10542339831590652,
0.026022030040621758,
0.1346416175365448,
-0.09076932072639465,
0.09729432314634323,
0.07888398319482803,
-0.04936404153704643,
-0.1422712802886963,
-0.09798376262187958,
-0.22883360087871552,
-0.004010580480098724,
0.010535708628594875,
0.020512821152806282,
0.41978684067726135,
0.057000089436769485,
0.04127877950668335,
0.06166888773441315,
-0.0756044089794159,
0.02076956070959568,
-0.09497890621423721,
0.1328088343143463,
0.04461570829153061,
0.09403080493211746,
-0.030115384608507156,
-0.10458256304264069,
-0.11408407986164093,
-0.07295083999633789,
-0.0821446031332016,
-0.04193853214383125,
0.025069445371627808,
0.11470469832420349,
-0.19988131523132324,
0.011461775749921799,
-0.05520367622375488,
-0.12583647668361664,
0.02361041121184826,
-0.10735684633255005,
0.17051802575588226,
0.06496831774711609,
-0.012838533148169518,
-0.009228970855474472,
0.2514413893222809,
0.12330441921949387,
0.18674935400485992,
0.045180171728134155,
-0.11112669855356216,
0.13438183069229126,
0.04100155085325241,
-0.08438096195459366,
-0.0150649044662714,
-0.11372391879558563,
0.14690852165222168,
-0.043741870671510696,
0.10316058993339539,
-0.054746102541685104,
-0.10822418332099915,
0.013751969672739506,
-0.1088419258594513,
0.062723807990551,
0.03973834216594696,
0.00867907702922821,
-0.04931408911943436,
0.051744770258665085,
0.03917165845632553,
-0.0340619720518589,
-0.0046688951551914215,
-0.0024359873495996,
-0.05499712750315666,
0.05156056210398674,
0.12313859164714813,
0.033765245229005814,
0.05215655267238617,
-0.016903245821595192,
-0.015260578133165836,
-0.001194530981592834,
-0.04722721502184868,
-0.008776835165917873,
0.047013331204652786,
0.05318956449627876,
0.0879560112953186,
-0.15766946971416473,
-0.08361678570508957,
-0.015999620780348778,
0.05919269472360611,
0.0066881561651825905,
0.09042336791753769,
-0.008507197722792625,
0.05033477395772934,
0.003491085721179843,
-0.025659097358584404,
0.0417119599878788,
-0.10105320811271667,
0.08560874313116074,
-0.019063223153352737,
0.09360596537590027,
-0.20218521356582642,
-0.004890440497547388,
-0.040756579488515854,
0.009950410574674606,
0.05541248247027397,
-0.005848518572747707,
-0.11210348457098007,
0.11387981474399567,
-0.018300464376807213,
0.07335661351680756,
-0.0944635346531868,
0.045367248356342316,
0.02154107391834259,
0.08368707448244095,
-0.10344171524047852,
0.00935821607708931,
0.11880743503570557,
-0.14088423550128937,
-0.15447670221328735,
0.09551344066858292,
-0.025017766281962395,
0.039940666407346725,
0.047454580664634705,
0.1513519436120987,
0.1749468743801117,
-0.1926065981388092,
-0.013633753173053265,
0.06373018771409988,
-0.01595180667936802,
-0.0708608329296112,
-0.014153463765978813,
0.10749610513448715,
0.017335614189505577,
0.028504882007837296,
-0.026384716853499413,
0.122421495616436,
-0.02919466607272625,
-0.08047657459974289,
-0.027725985273718834,
-0.07803694903850555,
-0.06666155904531479,
0.05540855973958969,
-0.010159151628613472,
-0.048975490033626556,
0.022887641564011574,
-0.15726260840892792,
0.16097040474414825,
0.01386114303022623,
0.018715741112828255,
-0.0664605051279068,
0.09913837164640427,
0.0029109453316777945,
0.008932428434491158,
0.014596011489629745,
-0.053884562104940414,
-0.10351943224668503,
0.22109641134738922,
0.07531709223985672,
0.10873390734195709,
0.06326470524072647,
-0.0478212833404541,
-0.06673427671194077,
0.015613619238138199,
0.009332452900707722,
-0.03819689899682999,
0.016683997586369514,
-0.10631343722343445,
0.04639529064297676,
-0.019677327945828438,
0.029051775112748146,
-0.017771486192941666,
-0.023050911724567413,
0.056456200778484344,
0.01959140971302986,
-0.012195352464914322,
0.09143992513418198,
0.055869728326797485,
-0.02240734174847603,
-0.06572604924440384,
0.006808024365454912,
0.07046249508857727,
-0.01577298529446125,
-0.06489841639995575,
0.031933121383190155,
0.0043914662674069405,
0.04812527820467949,
0.19666947424411774,
-0.22265008091926575,
0.04557468369603157,
0.01113012433052063,
0.04702577367424965,
0.034883350133895874,
-0.002421076176688075,
-0.0341680571436882,
0.02543322741985321,
-0.023260468617081642,
0.07360335439443588,
-0.020109860226511955,
0.059885136783123016,
-0.027193784713745117,
-0.14248453080654144,
-0.005937585141509771,
-0.02053513005375862,
0.16908639669418335,
-0.1756177544593811,
0.06332890689373016,
0.17629247903823853,
-0.13058990240097046,
0.14336241781711578,
0.0005667455843649805,
-0.012309879995882511,
0.014988726936280727,
0.03757864981889725,
0.0000680361918057315,
0.10205317288637161,
-0.07153607159852982,
-0.030116017907857895,
0.029548611491918564,
-0.09141270816326141,
0.032938841730356216,
-0.11626903712749481,
-0.10668066143989563,
-0.07486158609390259,
-0.03941367566585541,
-0.033219724893569946,
0.0322958379983902,
-0.045267969369888306,
0.07939428836107254,
-0.09398087859153748,
-0.07919123023748398,
-0.028300058096647263,
-0.08782251179218292,
0.020989304408431053,
0.008585957810282707,
-0.05788342282176018,
-0.12812551856040955,
-0.11206013709306717,
-0.08900777250528336,
-0.14645329117774963,
-0.004809534642845392,
0.0690048485994339,
-0.10517635941505432,
-0.04292584955692291,
0.012428266927599907,
-0.05047379434108734,
0.09916239976882935,
-0.07921309769153595,
0.015226142480969429,
0.04389716684818268,
-0.030085153877735138,
-0.16971947252750397,
0.0016769430367276073,
-0.06349045783281326,
-0.06162651628255844,
0.1481463462114334,
-0.15786045789718628,
0.17977787554264069,
-0.04079175367951393,
0.04917370527982712,
0.06493596732616425,
0.03099939413368702,
0.12762384116649628,
-0.10506117343902588,
0.07750902324914932,
0.19668273627758026,
0.044469837099313736,
0.07471253722906113,
0.12110555917024612,
0.07797332853078842,
-0.10831458121538162,
0.036620840430259705,
0.07506819069385529,
-0.09257695078849792,
-0.08472295105457306,
-0.04694916307926178,
-0.1124834269285202,
-0.04317628964781761,
0.06432275474071503,
0.06286535412073135,
0.036021701991558075,
0.11827480792999268,
-0.06562858074903488,
-0.000341494771419093,
0.11200691759586334,
0.041545331478118896,
0.07586418837308884,
0.011346748098731041,
0.05305252969264984,
-0.15031369030475616,
-0.04603392630815506,
0.16005276143550873,
0.22471243143081665,
0.20975162088871002,
0.023195559158921242,
0.07193408906459808,
0.12316485494375229,
0.08250821381807327,
0.09845445305109024,
0.055609237402677536,
0.004646818153560162,
0.017849063500761986,
-0.07213002443313599,
-0.05102556198835373,
0.010675211437046528,
0.002853028941899538,
-0.04487110674381256,
-0.13795587420463562,
0.10342346131801605,
-0.0032973941415548325,
0.08531484007835388,
0.13235008716583252,
0.037418533116579056,
-0.11115219444036484,
0.16177228093147278,
0.09827128052711487,
0.07854795455932617,
-0.06865716725587845,
0.12802082300186157,
0.06308085471391678,
-0.007103291340172291,
0.16054624319076538,
0.02407822012901306,
0.1500089019536972,
-0.045599836856126785,
-0.07575497776269913,
-0.07346826046705246,
-0.055368222296237946,
0.010814204812049866,
0.03129415959119797,
-0.22927933931350708,
0.08577020466327667,
0.05588822439312935,
0.018464427441358566,
-0.001992112025618553,
-0.049632832407951355,
0.17841969430446625,
0.1668962836265564,
0.0757354348897934,
0.02115035243332386,
-0.017551904544234276,
-0.002354640979319811,
-0.07882322371006012,
0.0557224415242672,
0.0035227034240961075,
0.062027692794799805,
-0.03861807659268379,
-0.10177138447761536,
-0.01852443255484104,
-0.006795223336666822,
0.03531383350491524,
-0.08227430284023285,
-0.11522122472524643,
-0.04597804322838783,
0.2510966956615448,
-0.045226022601127625,
0.04582784324884415,
0.05948047712445259,
0.023449527099728584,
-0.04192844033241272,
0.020102156326174736,
-0.030633457005023956,
-0.018264295533299446,
-0.032825399190187454,
-0.002948982873931527,
0.010331547819077969,
-0.055167779326438904,
-0.06124085187911987,
-0.026068976148962975,
-0.11091040819883347,
-0.10275261104106903,
0.0011034238850697875,
-0.052584871649742126,
0.007967532612383366,
-0.02567140758037567,
0.008747372776269913,
-0.09506992995738983,
-0.035253580659627914,
0.028686044737696648,
0.02957380935549736,
-0.08642052859067917,
-0.12754657864570618,
-0.003921732772141695,
-0.0027937153354287148,
-0.05592303350567818,
0.024554654955863953,
-0.10454157739877701,
-0.09453723579645157,
-0.04521174728870392,
-0.02248460426926613,
0.12644995748996735,
0.22247472405433655,
-0.024230819195508957,
0.0012905814219266176,
0.1515691727399826,
-0.09862084686756134,
-0.32647421956062317,
-0.1705133318901062,
-0.15384139120578766,
-0.10297603160142899,
0.02850336767733097,
-0.06737053394317627,
0.022622376680374146,
0.07435891032218933,
-0.0394318550825119,
0.19252759218215942,
-0.20331096649169922,
-0.09139968454837799,
0.08218846470117569,
0.0901290699839592,
0.31845420598983765,
-0.24504202604293823,
0.014078265056014061,
-0.11578098684549332,
-0.05355362594127655,
0.01017453707754612,
-0.10146576911211014,
0.1176520511507988,
0.03840513899922371,
0.06890743970870972,
-0.011946776881814003,
-0.008554625324904919,
0.14488323032855988,
-0.07632241398096085,
0.1412411779165268,
-0.11123453080654144,
-0.09186150878667831,
0.17982439696788788,
-0.03491615876555443,
0.008071627467870712,
-0.2139826864004135,
-0.036076292395591736,
-0.04473008215427399,
0.03970077261328697,
-0.013383859768509865,
0.06786651909351349,
-0.007267436478286982,
-0.008174329064786434,
-0.12566612660884857,
-0.028355492278933525,
-0.027499789372086525,
0.06688731908798218,
0.23180067539215088,
-0.07368717342615128,
-0.05310007557272911,
0.06306596100330353,
-0.0058793011121451855,
0.11148348450660706,
0.01811129041016102,
-0.0538393072783947,
-0.04271119832992554,
0.10171310603618622,
-0.20568324625492096,
0.06219348683953285,
0.0023258153814822435,
-0.009683436714112759,
0.020743336528539658,
0.00676429970189929,
0.019475923851132393,
0.11765190958976746,
0.17249108850955963,
-0.018588529899716377,
-0.01965269073843956,
-0.01872144639492035,
0.021506523713469505,
0.1258414387702942,
-0.015190765261650085,
0.1196366548538208,
0.017283830791711807,
0.03479313105344772,
0.011852219700813293,
0.06143397092819214,
-0.08670253306627274,
-0.08176515996456146,
0.09566409140825272,
-0.04533713310956955,
-0.09044661372900009,
0.08704487979412079,
0.05096805468201637,
0.07793693244457245,
0.0009532225085422397,
0.0516820102930069,
0.019238648936152458,
-0.12620291113853455,
0.015125388279557228,
0.23487214744091034,
-0.06715021282434464,
-0.06316187977790833,
-0.06852563470602036,
0.011975896544754505,
-0.11843827366828918,
0.08716725558042526,
0.03455021232366562,
-0.029199950397014618,
0.12160010635852814,
-0.046545494347810745,
-0.03503862023353577,
0.01829107291996479,
-0.05715205892920494,
0.039549779146909714,
-0.1350024789571762,
-0.1996173858642578,
0.04162351042032242,
-0.0033482874277979136,
-0.0641186386346817,
-0.08192875981330872,
-0.08577296137809753,
0.0598333440721035,
-0.16365401446819305,
0.13882748782634735,
-0.05684339255094528,
0.06006575748324394,
-0.035136524587869644,
-0.054825201630592346,
-0.1077236458659172,
-0.016925526782870293,
-0.053503964096307755,
-0.0205727256834507,
0.05829973146319389,
0.025071173906326294,
-0.12642736732959747,
-0.11107621341943741,
0.06023110821843147,
-0.001495683565735817,
-0.0010917402105405927,
0.009474864229559898,
-0.07327783852815628,
0.017665645107626915,
-0.22465218603610992,
-0.07465062290430069,
0.08700603246688843,
0.03808441385626793,
-0.08749718964099884,
0.12842367589473724,
0.04377561807632446,
-0.02654159814119339,
0.031697165220975876,
0.001570350956171751,
0.17651785910129547,
-0.07378862798213959,
0.027391480281949043,
-0.12550243735313416,
-0.15237902104854584,
-0.03085837885737419,
0.034142304211854935,
0.22318391501903534,
0.09510725736618042,
0.11789204180240631,
-0.056884925812482834,
0.01867016591131687,
-0.014630005694925785,
0.07210476696491241,
0.01930936984717846,
-0.11180795729160309,
-0.05821927636861801,
-0.1671394407749176,
-0.06358269602060318,
-0.06605665385723114,
0.15941719710826874,
0.04204154014587402,
-0.1511780023574829,
-0.0023216065019369125,
0.09751222282648087,
-0.19061361253261566,
-0.01414076704531908,
0.15596292912960052,
-0.052218008786439896,
0.030456410720944405,
-0.15688134729862213,
0.033208124339580536,
0.07990100979804993,
-0.03329591080546379,
-0.005385906435549259,
0.11850643157958984,
0.0016962189693003893,
0.0011168140918016434,
0.031191395595669746,
-0.02976255677640438,
0.1002201959490776,
-0.049998700618743896,
0.0664602592587471,
0.006498505361378193,
-0.050898678600788116,
-0.12176649272441864,
0.20299068093299866,
-0.011168142780661583,
0.005839398596435785,
-0.05387692525982857,
0.002361849183216691,
-0.10547170042991638,
-0.10643130540847778,
-0.07194021344184875,
-0.1348770409822464,
0.06636373698711395,
-0.06335773319005966,
0.017212338745594025,
-0.003962966613471508,
0.01886921375989914,
-0.0748608261346817,
0.0052225408144295216,
-0.16599804162979126,
-0.05285998806357384,
0.018147235736250877,
-0.01873687095940113,
-0.018014153465628624,
-0.05340718850493431,
-0.035566046833992004,
0.03400992974638939,
-0.05662786215543747,
-0.06380753964185715,
0.06251925975084305,
0.09012024104595184,
0.055808428674936295,
-0.16311798989772797,
-0.1121508777141571,
-0.07195433974266052,
0.038470566272735596,
0.06766404211521149,
0.18600554764270782,
0.04163861647248268,
-0.010326561518013477,
0.04968218877911568,
0.13249251246452332,
0.0161376241594553,
-0.08048070967197418,
-0.07386568933725357,
-0.13419215381145477,
-0.1304938793182373,
-0.0037943145725876093,
-0.06502494215965271,
-0.018554680049419403,
0.01689046248793602,
0.2338385432958603,
0.18024498224258423,
-0.14771035313606262,
0.040904346853494644,
-0.07149875164031982,
0.04080015793442726,
-0.030460240319371223,
0.14684076607227325,
0.051152147352695465,
0.1399570107460022,
-0.0325285904109478,
-0.0241378266364336,
-0.055345695465803146,
0.019192367792129517,
-0.10207807272672653,
0.03659261018037796,
-0.01218082569539547,
-0.06781280040740967,
-0.058132946491241455,
0.10045289993286133,
-0.10573149472475052,
0.04748612269759178,
0.1780495047569275,
-0.13778576254844666,
-0.01026421133428812,
-0.04288391023874283,
0.06413495540618896,
0.1097964346408844,
0.010660555213689804,
-0.07815781980752945,
-0.020525185391306877,
0.019957978278398514,
0.01772293448448181,
-0.1625925600528717,
-0.10427451878786087,
-0.00644707540050149,
-0.1450713574886322,
0.12423975765705109,
-0.004777793772518635,
-0.008532118052244186,
0.03184903413057327,
-0.07655055820941925,
-0.0040219249203801155,
0.1661052405834198,
0.01893143355846405,
-0.03157643601298332,
-0.03459710627794266,
-0.05704154074192047,
-0.10171983391046524,
0.08164718747138977,
0.08629855513572693,
0.04967915639281273,
-0.005818655248731375,
0.1455758661031723,
-0.03510192036628723,
-0.03601725399494171,
0.13561081886291504,
-0.16302399337291718,
0.08834612369537354,
-0.0032246140763163567,
-0.017506517469882965,
-0.062240179628133774,
-0.0503913089632988,
0.03932808339595795,
0.08441409468650818,
-0.16593123972415924,
-0.04182010516524315,
0.06984322518110275,
-0.09550084918737411,
0.06576232612133026,
0.049675945192575455,
-0.07460902631282806,
0.027238035574555397,
-0.12874756753444672,
-0.003966334741562605,
-0.09476994723081589,
0.04982598498463631,
0.20168791711330414,
-0.03469250723719597,
0.013888583518564701,
-0.14858689904212952,
0.05819014459848404,
-0.024830467998981476,
-0.037965886294841766,
-0.07828888297080994
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | raucha/peft-test | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:46:39+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
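Since this section is still a placeholder, here is only a minimal, hedged sketch of how such a checkpoint might be tried with 🤗 Transformers. The repository ID is taken from this row's metadata; treating it as a Whisper-style speech-recognition model, and the audio file name, are assumptions rather than information from the card.

```python
# Hedged sketch only; the card itself provides no usage code.
# Assumptions: the repo behaves as a Whisper-style ASR checkpoint (inferred from
# its name), and "sample_hindi_clip.wav" is a placeholder path to a local audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="himanshugrad/Whisper-Hindi",  # repo ID from this dataset row
)

result = asr("sample_hindi_clip.wav")
print(result["text"])
```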
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | himanshugrad/Whisper-Hindi | [
"transformers",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:46:48+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
26,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.08389580249786377,
0.19830818474292755,
-0.0013316317927092314,
0.02313883788883686,
0.11396584659814835,
0.01961737498641014,
0.053626976907253265,
0.14538456499576569,
0.0060051376931369305,
0.10656800121068954,
0.066679947078228,
0.09131570905447006,
0.09678101539611816,
0.20042605698108673,
0.04371999576687813,
-0.17659740149974823,
0.010636410675942898,
-0.06930278241634369,
-0.010073255747556686,
0.11651819199323654,
0.141214057803154,
-0.10151198506355286,
0.07627976685762405,
-0.03319970890879631,
-0.02870541252195835,
-0.0070160143077373505,
-0.07769215852022171,
-0.05755697935819626,
0.07573003321886063,
0.054863471537828445,
0.04207949340343475,
-0.0008347301045432687,
0.08447454124689102,
-0.2674994468688965,
0.013753628358244896,
0.07452993094921112,
0.010659529827535152,
0.05990942195057869,
0.07833302766084671,
-0.04036625102162361,
0.12881849706172943,
-0.06320446729660034,
0.13035163283348083,
0.0906217098236084,
-0.0681561604142189,
-0.24378153681755066,
-0.08239314705133438,
0.06505522131919861,
0.12533815205097198,
0.07694927603006363,
-0.02823091857135296,
0.16422191262245178,
-0.07247646898031235,
0.019290022552013397,
0.09481704235076904,
-0.1151006743311882,
-0.060644298791885376,
0.08318385481834412,
0.14101974666118622,
0.10340547561645508,
-0.1255619376897812,
-0.012289565056562424,
0.04275871813297272,
0.045979104936122894,
0.07389909774065018,
0.011339850723743439,
0.1143413558602333,
0.05629947781562805,
-0.13526225090026855,
-0.05700986459851265,
0.14547574520111084,
0.023872992023825645,
-0.057064127177000046,
-0.2138909548521042,
-0.002902575535699725,
-0.07730814069509506,
-0.011685127392411232,
-0.06846728920936584,
0.0291305985301733,
-0.01194276288151741,
0.060226380825042725,
-0.0496203787624836,
-0.09797755628824234,
-0.046314824372529984,
0.1015089675784111,
0.054820988327264786,
0.011354796588420868,
-0.01489334274083376,
0.03576440364122391,
0.13432876765727997,
0.04213530570268631,
-0.10012737661600113,
-0.07065672427415848,
-0.0701170489192009,
-0.09620913118124008,
-0.03947552293539047,
0.04272124543786049,
0.020167991518974304,
0.042202774435281754,
0.2283228635787964,
0.024096308276057243,
0.05459817871451378,
0.029667891561985016,
0.0026177873369306326,
0.03211980313062668,
0.1073630079627037,
-0.041210614144802094,
-0.188126802444458,
-0.03292805701494217,
0.0931866466999054,
-0.009821015410125256,
-0.028658604249358177,
-0.033444397151470184,
0.035014089196920395,
0.08379437029361725,
0.11821532249450684,
0.08875755965709686,
-0.012828069739043713,
-0.037612639367580414,
-0.03493109717965126,
0.2115669697523117,
-0.14141373336315155,
0.045799970626831055,
-0.022097334265708923,
-0.018195297569036484,
-0.06905751675367355,
0.030103791505098343,
0.01831657998263836,
-0.003142025787383318,
0.06966056674718857,
-0.061253178864717484,
-0.05794486775994301,
-0.11518853157758713,
-0.045523155480623245,
0.04711875319480896,
-0.024105608463287354,
-0.024469668045639992,
-0.07765042781829834,
-0.11219723522663116,
-0.06417357176542282,
0.06612563133239746,
-0.04156653955578804,
-0.03974827378988266,
0.005308232270181179,
-0.07131324708461761,
0.008387917652726173,
0.008993842639029026,
0.12122467905282974,
-0.030063031241297722,
0.05833350867033005,
-0.002476902212947607,
0.05916252359747887,
0.10643328726291656,
0.03227818012237549,
-0.08492200076580048,
0.057466037571430206,
-0.20633617043495178,
0.08371785283088684,
-0.11420095711946487,
0.034276340156793594,
-0.17048145830631256,
-0.024183684960007668,
0.008447963744401932,
0.023597201332449913,
0.023726604878902435,
0.1338067352771759,
-0.2097422182559967,
-0.016196569427847862,
0.14133213460445404,
-0.09649793803691864,
-0.12422871589660645,
0.07990546524524689,
-0.03459475561976433,
0.1747698187828064,
0.038475677371025085,
-0.019652999937534332,
0.09909367561340332,
-0.15559963881969452,
-0.05852397903800011,
-0.026064254343509674,
-0.008927824907004833,
0.08823978155851364,
0.07542291283607483,
-0.05844951793551445,
0.02285866066813469,
0.02562655322253704,
-0.04727208614349365,
-0.0268824752420187,
-0.05256075784564018,
-0.10127434879541397,
-0.023140445351600647,
-0.09642518311738968,
0.026515161618590355,
0.000058677000197349116,
-0.07310442626476288,
-0.028560271486639977,
-0.17347893118858337,
-0.02563360333442688,
0.10103316605091095,
0.004820956848561764,
-0.007559072691947222,
-0.08540112525224686,
0.022149885073304176,
-0.05362366884946823,
-0.006164622958749533,
-0.16996455192565918,
-0.03558015450835228,
0.051895126700401306,
-0.14917676150798798,
0.015460150316357613,
-0.07327745854854584,
0.07047311216592789,
0.02098717913031578,
-0.05859505757689476,
-0.03108096309006214,
0.0007694467785768211,
0.004292082041501999,
-0.06229274719953537,
-0.1903683841228485,
-0.058886781334877014,
-0.041500482708215714,
0.15720732510089874,
-0.24841000139713287,
0.0300158578902483,
0.03247617185115814,
0.13185922801494598,
0.007058668415993452,
-0.06344027817249298,
0.02096918225288391,
-0.04676475748419762,
-0.050621338188648224,
-0.06898977607488632,
-0.009901339188218117,
-0.014539826661348343,
-0.031393732875585556,
0.012980648316442966,
-0.14970256388187408,
-0.060514215379953384,
0.09452559798955917,
0.11224991828203201,
-0.14555825293064117,
0.00204002158716321,
-0.0460561066865921,
-0.07002599537372589,
-0.07487804442644119,
-0.0761631652712822,
0.07739497721195221,
0.044650159776210785,
0.049250341951847076,
-0.06317461282014847,
-0.06234706938266754,
0.023210179060697556,
0.005524294450879097,
-0.019023682922124863,
0.0948529988527298,
0.074309803545475,
-0.09122881293296814,
0.07973480224609375,
0.08461450785398483,
0.04414684325456619,
0.086973637342453,
0.005991141777485609,
-0.11396963149309158,
-0.03062884695827961,
0.037754856050014496,
0.024159027263522148,
0.15351562201976776,
-0.08692087233066559,
0.030462130904197693,
0.052177220582962036,
-0.03854219615459442,
0.03157065063714981,
-0.0923321321606636,
0.025362705811858177,
0.021495236083865166,
-0.006555700208991766,
0.05864228308200836,
-0.018769768998026848,
-0.01403577346354723,
0.06336429715156555,
0.05677810311317444,
0.044270504266023636,
0.02595379762351513,
-0.02093072421848774,
-0.1278371512889862,
0.16537296772003174,
-0.09028079360723495,
-0.2540280222892761,
-0.17074446380138397,
0.015454737469553947,
0.03706491366028786,
-0.021728800609707832,
0.039588842540979385,
-0.06286025792360306,
-0.10237989574670792,
-0.09417891502380371,
0.0029635571409016848,
0.023925531655550003,
-0.058347854763269424,
-0.0817074254155159,
0.060779985040426254,
0.04047083482146263,
-0.13689260184764862,
0.0349188968539238,
0.06170675903558731,
-0.03042641654610634,
0.0018567070364952087,
0.07321398705244064,
0.12743599712848663,
0.14838241040706635,
-0.006730219814926386,
-0.012446845881640911,
0.035035960376262665,
0.229813352227211,
-0.1490442156791687,
0.10630457103252411,
0.14053207635879517,
-0.021705523133277893,
0.06635113060474396,
0.1461038440465927,
0.023231739178299904,
-0.07546708732843399,
0.04147516191005707,
0.04027445614337921,
-0.04228919371962547,
-0.2589097023010254,
-0.05694316700100899,
-0.00946022942662239,
-0.07043391466140747,
0.09718906134366989,
0.09238530695438385,
0.11972260475158691,
0.0337289460003376,
-0.05568677559494972,
-0.025771914049983025,
-0.003401360474526882,
0.114128477871418,
-0.027640055865049362,
-0.004564122296869755,
0.07965842634439468,
-0.05878787487745285,
0.011684526689350605,
0.09941446036100388,
0.019347423687577248,
0.17601320147514343,
0.02533329278230667,
0.10681075602769852,
0.06725578010082245,
0.09347675740718842,
-0.0015635732561349869,
0.034774236381053925,
0.05337131395936012,
0.022044572979211807,
0.010453542694449425,
-0.09408048540353775,
-0.012431944720447063,
0.13713060319423676,
0.019816776737570763,
0.009031654335558414,
0.008926562033593655,
-0.01010479498654604,
0.03131420537829399,
0.20501568913459778,
0.0009575071162544191,
-0.22537250816822052,
-0.09500737488269806,
0.059459153562784195,
-0.06931101530790329,
-0.143676295876503,
-0.02094252221286297,
0.030270220711827278,
-0.17292405664920807,
0.016790566965937614,
-0.0316389761865139,
0.09112390875816345,
-0.07145322859287262,
-0.028050832450389862,
0.06891903281211853,
0.07569212466478348,
-0.012108199298381805,
0.07973295450210571,
-0.19069278240203857,
0.12254468351602554,
0.03037673607468605,
0.08605273067951202,
-0.11708726733922958,
0.07849059253931046,
-0.0019813794642686844,
-0.014807495288550854,
0.17999744415283203,
-0.014062200672924519,
-0.0586031936109066,
-0.08878950774669647,
-0.08704045414924622,
-0.011727320961654186,
0.10361312329769135,
-0.09322915226221085,
0.09586969763040543,
-0.02775636687874794,
-0.03705112263560295,
0.012418309226632118,
-0.10469507426023483,
-0.1636953055858612,
-0.18679304420948029,
0.06244563311338425,
-0.07802703976631165,
0.012347841635346413,
-0.11227322369813919,
-0.06334327906370163,
-0.01575082167983055,
0.23160123825073242,
-0.16648635268211365,
-0.07049825042486191,
-0.1498587429523468,
-0.03997112438082695,
0.17463743686676025,
-0.042160745710134506,
0.06849376112222672,
-0.021383514627814293,
0.1873992383480072,
-0.008081548847258091,
-0.013158116489648819,
0.06569221615791321,
-0.09637628495693207,
-0.16879262030124664,
-0.05748843029141426,
0.14160962402820587,
0.10863390564918518,
0.05731578543782234,
-0.0038195757661014795,
0.013171887956559658,
-0.03383830562233925,
-0.09896382689476013,
0.013824623078107834,
0.13817466795444489,
0.0034514935687184334,
0.00682973163202405,
-0.03995988517999649,
-0.07027145475149155,
-0.05825701728463173,
-0.07912654429674149,
0.057147104293107986,
0.187900573015213,
-0.09512355923652649,
0.1602867990732193,
0.12431421875953674,
-0.06468851119279861,
-0.2306901067495346,
0.03996593505144119,
0.04701630026102066,
0.007666614837944508,
0.022401191294193268,
-0.19138796627521515,
0.09788824617862701,
0.0009011493530124426,
-0.06807263940572739,
0.14616990089416504,
-0.16564498841762543,
-0.1461436152458191,
0.08002161979675293,
0.025075770914554596,
-0.22560662031173706,
-0.14821304380893707,
-0.1037549376487732,
-0.03735695406794548,
-0.13707835972309113,
0.048581719398498535,
0.02614329755306244,
0.019834673032164574,
0.025222565978765488,
0.005338077899068594,
0.029657263308763504,
-0.07272187620401382,
0.1870686560869217,
-0.020297454670071602,
0.0072362530045211315,
-0.050640691071748734,
-0.04617878794670105,
0.09227550774812698,
-0.06150037795305252,
0.11741586774587631,
0.018679620698094368,
0.018796883523464203,
-0.1431548148393631,
-0.049209367483854294,
-0.060803934931755066,
0.04456847906112671,
-0.07284719496965408,
-0.09393193572759628,
-0.04137463867664337,
0.08888561278581619,
0.07211937010288239,
-0.032792408019304276,
-0.0027768779546022415,
-0.07569456845521927,
0.09405932575464249,
0.184477761387825,
0.17357055842876434,
0.009977072477340698,
-0.07020942866802216,
0.024555526673793793,
-0.042279548943042755,
0.03349342197179794,
-0.24652716517448425,
0.03456863760948181,
0.066053606569767,
0.03803660348057747,
0.08509242534637451,
-0.016836483031511307,
-0.1781480610370636,
-0.04086102172732353,
0.08498652279376984,
-0.06206206604838371,
-0.19876568019390106,
-0.02703288197517395,
0.08424776047468185,
-0.20383712649345398,
-0.032998621463775635,
0.041543323546648026,
-0.03834589570760727,
-0.02396267279982567,
-0.002415500348433852,
0.06396626681089401,
-0.008327016606926918,
0.12156640738248825,
0.06747189164161682,
0.10266115516424179,
-0.09284433722496033,
0.08920657634735107,
0.10416955500841141,
-0.09140542894601822,
0.03545991703867912,
0.10264154523611069,
-0.05670900270342827,
-0.04460543021559715,
0.033935222774744034,
0.05925208330154419,
-0.028357384726405144,
-0.06409841030836105,
-0.000502707262057811,
-0.0359574519097805,
0.04993389546871185,
0.08058220148086548,
0.036113787442445755,
-0.01202210783958435,
0.06544706225395203,
0.028145326301455498,
-0.11693570017814636,
0.10949387401342392,
0.04405685141682625,
0.04509059712290764,
-0.07182393968105316,
-0.012280966155230999,
0.015999672934412956,
0.032540347427129745,
-0.019734015688300133,
-0.014576527290046215,
-0.03146412968635559,
-0.007561005651950836,
-0.1553635597229004,
-0.02064543403685093,
-0.06516171246767044,
0.006067827809602022,
0.022207623347640038,
-0.03830232471227646,
-0.012014663778245449,
0.01381110493093729,
-0.07979435473680496,
-0.07571027427911758,
-0.01700955256819725,
0.08539021760225296,
-0.1381402313709259,
0.006627439055591822,
0.07182712107896805,
-0.10980239510536194,
0.07347989827394485,
-0.0048679932951927185,
0.017079560086131096,
0.010923396795988083,
-0.11654401570558548,
0.04386281594634056,
-0.005810429807752371,
0.01551580335944891,
0.022556742653250694,
-0.171111062169075,
0.011553828604519367,
-0.038553636521101,
-0.03114982508122921,
0.011926400475203991,
-0.025060230866074562,
-0.11875922232866287,
0.08676479011774063,
-0.028097305446863174,
-0.037512701004743576,
-0.03292486071586609,
0.06296087801456451,
0.08736220002174377,
-0.011740099638700485,
0.09667140990495682,
-0.025766119360923767,
0.04818311333656311,
-0.1756584197282791,
-0.01910574547946453,
-0.050167568027973175,
0.02537350542843342,
-0.01759655587375164,
-0.0070639788173139095,
0.055272240191698074,
-0.004191063344478607,
0.20991376042366028,
-0.03921036794781685,
0.1548677533864975,
0.05199402943253517,
-0.009925156831741333,
0.010884369723498821,
0.05032730847597122,
0.06423956155776978,
0.031145188957452774,
0.00853167474269867,
0.04660189896821976,
-0.004552975296974182,
-0.020357951521873474,
-0.13699717819690704,
0.02791593410074711,
0.16117429733276367,
0.061918217688798904,
0.0392887257039547,
0.03704594820737839,
-0.1422400325536728,
-0.09538721293210983,
0.10306388139724731,
-0.0331864058971405,
0.014331420883536339,
-0.08317886292934418,
0.17621558904647827,
0.12328410148620605,
-0.1574767529964447,
0.0577850341796875,
-0.07234696298837662,
-0.05066767707467079,
-0.1024852767586708,
-0.11832084506750107,
-0.06293155997991562,
-0.06027044355869293,
-0.004747506696730852,
-0.042489297688007355,
0.05734556168317795,
0.026751231402158737,
-0.003270963439717889,
-0.006759525276720524,
0.12665949761867523,
-0.0249644722789526,
-0.004145825747400522,
0.04152364656329155,
0.0326087586581707,
0.019319625571370125,
-0.05872373282909393,
0.017997145652770996,
0.018602589145302773,
0.022180357947945595,
0.06835069507360458,
0.0260987039655447,
-0.059317342936992645,
0.044286735355854034,
0.00319746439345181,
-0.11313364654779434,
0.018146557733416557,
-0.00002245741598017048,
-0.05020225793123245,
0.13557326793670654,
0.04076748713850975,
0.01548024732619524,
-0.029270920902490616,
0.24342355132102966,
-0.07199113070964813,
-0.08681939542293549,
-0.13965600728988647,
0.11511493474245071,
-0.023563209921121597,
0.03755274787545204,
0.016542524099349976,
-0.12659503519535065,
0.011511262506246567,
0.18531471490859985,
0.12824349105358124,
0.012459068559110165,
-0.007656481582671404,
0.05736639350652695,
-0.0007639875984750688,
-0.05985576659440994,
0.05051197111606598,
0.0664999932050705,
0.16097788512706757,
-0.09069112688302994,
0.0652846097946167,
-0.008405503816902637,
-0.0831485390663147,
-0.027498632669448853,
0.11705785244703293,
-0.022675158455967903,
0.02148384228348732,
-0.03778035193681717,
0.11204422265291214,
-0.052532415837049484,
-0.2719486355781555,
0.02952493168413639,
-0.09503202140331268,
-0.13993041217327118,
-0.02591860294342041,
0.041448429226875305,
-0.03349510580301285,
0.01577647216618061,
0.06254769116640091,
-0.045389387756586075,
0.18837277591228485,
0.025987716391682625,
-0.08679025620222092,
-0.07755549252033234,
0.05874146893620491,
-0.08695939928293228,
0.2789687216281891,
0.003863075515255332,
0.04782010242342949,
0.12108923494815826,
-0.03053574077785015,
-0.18664880096912384,
0.014769754372537136,
0.11989909410476685,
-0.09114406257867813,
0.07780203968286514,
0.18139931559562683,
-0.005561648402363062,
0.12649618089199066,
0.04705416411161423,
-0.03877115994691849,
0.03976387158036232,
-0.02721380814909935,
-0.03821522742509842,
-0.12209630757570267,
0.05661242455244064,
-0.0612691193819046,
0.15957388281822205,
0.1158948540687561,
-0.05964287370443344,
0.001120698289014399,
-0.06126941740512848,
0.06300627440214157,
0.014774397015571594,
0.12115653604269028,
0.018452486023306847,
-0.2023056596517563,
0.05087360367178917,
-0.03283824771642685,
0.08166342973709106,
-0.254973828792572,
-0.08186668157577515,
0.07622263580560684,
-0.019022729247808456,
-0.04275642707943916,
0.12311509251594543,
0.06101066991686821,
0.03676839917898178,
-0.03853875398635864,
-0.08537755906581879,
-0.01412904355674982,
0.15376435220241547,
-0.14123432338237762,
-0.029574336484074593
] |
null | null | transformers |
# Zero凉宫春日
Warm-started from Qwen_14B_base; 2k training on 150k high-quality NPC extraction samples.
epoch=2, batch_size=64, lr=2e-5
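As a hedged illustration only, the stated hyperparameters could be expressed with 🤗 Transformers' `TrainingArguments` roughly as below; only the epoch count, batch size, and learning rate come from the card, while the output directory is hypothetical and the card does not say whether batch_size=64 is per-device or the effective global batch size.

```python
# Sketch of the stated hyperparameters; everything not listed in the card is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="haruhi-npc-extract-14b",  # hypothetical name
    num_train_epochs=2,                   # epoch=2
    per_device_train_batch_size=64,       # batch_size=64 (per-device vs. global not specified)
    learning_rate=2e-5,                   # lr=2e-5
)
```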
| {} | text-generation | silk-road/Haruhi-dialogue-action-extract-14B | [
"transformers",
"pytorch",
"qwen",
"text-generation",
"custom_code",
"autotrain_compatible",
"region:us"
] | 2024-02-15T05:49:29+00:00 | [] | [] | TAGS
#transformers #pytorch #qwen #text-generation #custom_code #autotrain_compatible #region-us
|
# Zero凉宫春日
Warm-started from Qwen_14B_base; 2k training on 150k high-quality NPC extraction samples.
epoch=2, batch_size=64, lr=2e-5
| [
"# Zero凉宫春日\n\n基于Qwen_14B_base 热启,在15w高质量的NPC抽取样本上进行2k训练\n\n\nepoch=2,batch_size=64,lr=2e-5"
] | [
"TAGS\n#transformers #pytorch #qwen #text-generation #custom_code #autotrain_compatible #region-us \n",
"# Zero凉宫春日\n\n基于Qwen_14B_base 热启,在15w高质量的NPC抽取样本上进行2k训练\n\n\nepoch=2,batch_size=64,lr=2e-5"
] | [
34,
53
] | [
"passage: TAGS\n#transformers #pytorch #qwen #text-generation #custom_code #autotrain_compatible #region-us \n# Zero凉宫春日\n\n基于Qwen_14B_base 热启,在15w高质量的NPC抽取样本上进行2k训练\n\n\nepoch=2,batch_size=64,lr=2e-5"
] | [
-0.07865263521671295,
0.0754353329539299,
-0.006386524997651577,
0.059194620698690414,
0.07627595961093903,
-0.019037269055843353,
0.09536977857351303,
0.13381047546863556,
0.016178835183382034,
-0.022418223321437836,
0.13544045388698578,
0.10237189382314682,
0.016581572592258453,
0.12415535002946854,
-0.025321708992123604,
-0.18331749737262726,
0.051696497946977615,
0.0985197052359581,
-0.05751413479447365,
0.12092718482017517,
0.021558746695518494,
-0.06068424507975578,
0.07801792770624161,
-0.03376254066824913,
-0.1532871127128601,
-0.017275609076023102,
-0.012704181484878063,
-0.07669225335121155,
0.11028700321912766,
0.006765439175069332,
0.045054759830236435,
0.07789625227451324,
0.021435189992189407,
-0.09576629847288132,
0.02849048748612404,
-0.03795742616057396,
-0.06553257256746292,
0.0542481355369091,
0.008597695268690586,
0.09792535752058029,
0.045465633273124695,
0.05734817311167717,
-0.05791425332427025,
0.03756287693977356,
-0.067691370844841,
-0.02528052218258381,
0.04235510528087616,
0.08524872362613678,
0.09594137221574783,
0.08320780098438263,
-0.013657208532094955,
0.18540062010288239,
-0.16201113164424896,
0.08635792136192322,
0.04872740060091019,
-0.3729146718978882,
-0.013950088061392307,
0.08937361091375351,
0.07767686992883682,
0.10055200010538101,
-0.06604152917861938,
-0.01134408824145794,
0.06961124390363693,
0.00373943243175745,
-0.058835625648498535,
-0.06338144838809967,
-0.07468153536319733,
0.031140971928834915,
-0.08790379017591476,
0.04275515675544739,
0.1376611292362213,
-0.00704119773581624,
0.0055443705059587955,
0.05677826702594757,
-0.08620796352624893,
-0.2052992582321167,
-0.004610031843185425,
-0.030478617176413536,
-0.04217934235930443,
0.04265452176332474,
-0.09825843572616577,
-0.02984866499900818,
-0.05464388430118561,
-0.10953360050916672,
-0.1474536508321762,
0.15735846757888794,
0.04504737630486488,
0.012362509034574032,
-0.13013404607772827,
0.09820262342691422,
0.01925642229616642,
-0.08477698266506195,
-0.04144351929426193,
-0.05533479154109955,
-0.009387019090354443,
0.04011380299925804,
-0.028894413262605667,
0.06119967997074127,
0.10642850399017334,
0.13720186054706573,
0.024147342890501022,
0.06856163591146469,
0.031123528257012367,
0.04752900451421738,
-0.0624702163040638,
0.08757713437080383,
0.015583930537104607,
-0.07835346460342407,
0.07331984490156174,
0.04821273311972618,
-0.004880251828581095,
-0.06542100757360458,
-0.14943340420722961,
-0.10401590913534164,
0.0747944563627243,
0.03560056909918785,
0.014437916688621044,
0.06762023270130157,
0.015431001782417297,
-0.02965167537331581,
0.10150175541639328,
-0.03199911490082741,
-0.03545459359884262,
0.03353620320558548,
-0.042705390602350235,
0.02445109933614731,
-0.029646223410964012,
0.027383727952837944,
-0.046581510454416275,
-0.09724252671003342,
-0.07036930322647095,
-0.038144633173942566,
0.00847963709384203,
-0.03136745095252991,
0.007448804099112749,
0.029363621026277542,
0.028330037370324135,
-0.15765778720378876,
-0.1392323076725006,
0.03405055031180382,
-0.0374029204249382,
-0.006245780270546675,
-0.07485613226890564,
-0.08818363398313522,
-0.09218249469995499,
-0.006348081864416599,
-0.008681236766278744,
0.02403153106570244,
-0.011215610429644585,
0.1047767773270607,
0.12294420599937439,
0.11660944670438766,
-0.12104743719100952,
0.02024366520345211,
-0.12792977690696716,
0.023008335381746292,
0.010678027756512165,
0.06903281807899475,
-0.00629358971491456,
0.027373112738132477,
0.007224889937788248,
-0.061919085681438446,
-0.05068928748369217,
-0.015066130086779594,
0.015876639634370804,
0.1227608248591423,
-0.14506042003631592,
-0.09775587171316147,
0.1846092939376831,
-0.08183762431144714,
-0.1195620447397232,
0.1291925013065338,
0.016332384198904037,
-0.045655857771635056,
0.028781846165657043,
0.18177317082881927,
0.1112956553697586,
0.015057074837386608,
-0.01126465667039156,
0.08581171184778214,
-0.06045057997107506,
-0.14784295856952667,
0.084804967045784,
0.10613206028938293,
-0.09064482897520065,
0.06418436020612717,
-0.06346573680639267,
0.07740268111228943,
-0.08210214972496033,
-0.09456972777843475,
-0.047787703573703766,
-0.08943163603544235,
0.11666705459356308,
0.0014804191887378693,
0.13582414388656616,
-0.04199764132499695,
-0.03207695484161377,
-0.049614183604717255,
0.11025088280439377,
-0.01884697750210762,
0.010609733872115612,
-0.12631309032440186,
0.1307430863380432,
0.00980366114526987,
0.0483863390982151,
-0.08972395211458206,
0.04749986529350281,
0.041700493544340134,
0.004704855382442474,
0.02167731709778309,
0.0876958966255188,
0.05441895127296448,
0.07777601480484009,
-0.02373228408396244,
-0.07053579390048981,
-0.017061974853277206,
-0.025058457627892494,
-0.07564898580312729,
-0.05903558433055878,
-0.00849942583590746,
-0.034972839057445526,
0.09558571875095367,
-0.10669390857219696,
0.033523742109537125,
-0.08064132183790207,
0.04208970069885254,
-0.025611866265535355,
0.034640971571207047,
-0.01774514466524124,
0.08280013501644135,
-0.07089696824550629,
0.03761737793684006,
0.07557955384254456,
-0.0059934030286967754,
-0.14476272463798523,
0.09000448882579803,
-0.12877337634563446,
0.2033785879611969,
0.15051303803920746,
-0.18309356272220612,
0.05139710009098053,
-0.0028621640522032976,
-0.03162498027086258,
-0.034920141100883484,
0.05405200272798538,
0.014272564090788364,
0.13727083802223206,
-0.0130073893815279,
0.1296577900648117,
-0.05320334807038307,
-0.00804818980395794,
-0.0005803540698252618,
-0.03941025957465172,
0.022739332169294357,
0.09931069612503052,
0.1016218289732933,
-0.09533832967281342,
0.08459156006574631,
0.13239727914333344,
-0.06903988867998123,
0.184976264834404,
0.006030017044395208,
-0.021101517602801323,
-0.022486073896288872,
0.01388778816908598,
-0.03757074847817421,
0.0019032283453270793,
-0.1552557498216629,
-0.03847511112689972,
0.034579306840896606,
-0.01960092782974243,
0.05356967821717262,
-0.12132613360881805,
-0.08170285820960999,
-0.04259005934000015,
-0.050274331122636795,
-0.11350680142641068,
0.03987853229045868,
-0.013673067092895508,
0.0860058143734932,
-0.004765104968100786,
-0.047910355031490326,
0.05865103751420975,
0.00954545009881258,
-0.07665158808231354,
0.13860972225666046,
-0.0891200602054596,
-0.16076847910881042,
-0.11472266912460327,
-0.008557718247175217,
-0.06942358613014221,
0.0016857752343639731,
0.05302233248949051,
-0.16931034624576569,
0.05939287319779396,
0.0010929768905043602,
-0.0034903897903859615,
-0.029473060742020607,
0.04605189338326454,
0.05303562059998512,
0.04150395467877388,
0.0167994424700737,
-0.06907498836517334,
0.0025055529549717903,
-0.05764445289969444,
-0.11732514202594757,
0.14523541927337646,
-0.14203542470932007,
0.07378748804330826,
0.1774282306432724,
-0.014411018230021,
0.02752753347158432,
0.004708440508693457,
0.22762168943881989,
-0.027718408033251762,
0.0031702432315796614,
0.16381949186325073,
0.0035864124074578285,
0.02284831739962101,
0.04502655938267708,
-0.005449933465570211,
-0.044511716812849045,
0.02203596755862236,
-0.02226804569363594,
-0.079056017100811,
-0.15748780965805054,
-0.041894182562828064,
-0.03864574432373047,
0.10508081316947937,
-0.0074419863522052765,
0.07043340802192688,
0.10084410756826401,
0.055235423147678375,
0.03378210589289665,
0.06068260967731476,
-0.06465955823659897,
0.020370911806821823,
0.08059704303741455,
0.05320971459150314,
0.11734209209680557,
-0.029475567862391472,
-0.060331784188747406,
0.03940669074654579,
0.06753286719322205,
0.11307301372289658,
0.027636798098683357,
0.009887550957500935,
-0.00363547308370471,
0.2538372576236725,
0.16325834393501282,
0.0356534905731678,
0.027422890067100525,
-0.05254630371928215,
0.0063359555788338184,
-0.009651070460677147,
-0.05756670609116554,
0.047414686530828476,
0.04859135299921036,
-0.0399453230202198,
-0.055458687245845795,
0.10543075203895569,
0.03513028472661972,
0.08274541795253754,
0.08117391169071198,
-0.1630532145500183,
-0.05620487406849861,
-0.00038590223994106054,
-0.022570835426449776,
0.00046714043128304183,
0.10680986940860748,
0.11193970590829849,
-0.09718278050422668,
-0.011932363733649254,
-0.010085868649184704,
0.10870931297540665,
-0.01871848851442337,
0.09774568676948547,
-0.051949143409729004,
-0.04755062609910965,
0.03000771254301071,
0.073816679418087,
-0.2710966467857361,
0.17720156908035278,
-0.01912958361208439,
0.048455290496349335,
-0.0275126863270998,
-0.07611893862485886,
0.05847305431962013,
0.042099788784980774,
0.03492473438382149,
0.022723659873008728,
-0.010211985558271408,
-0.20473895967006683,
-0.12662506103515625,
0.07819351553916931,
0.061846256256103516,
0.02791847288608551,
0.04722363501787186,
-0.008351026102900505,
-0.007321825250983238,
-0.008896199986338615,
0.08072169870138168,
-0.05762365832924843,
-0.00015776137297507375,
0.02125091291964054,
0.07836738228797913,
0.05157580226659775,
-0.027754293754696846,
-0.06684747338294983,
-0.08095832169055939,
0.07098408788442612,
-0.11728387326002121,
-0.05675327405333519,
-0.034399740397930145,
0.013923866674304008,
0.0666825994849205,
-0.10926749557256699,
0.012910736724734306,
-0.03029511496424675,
-0.004452521447092295,
-0.0005023453268222511,
-0.13697008788585663,
0.05679307505488396,
-0.06365160644054413,
-0.16341370344161987,
-0.01890679821372032,
0.08466476947069168,
0.002308965427801013,
0.09822675585746765,
0.01538024377077818,
-0.061746105551719666,
-0.06659888476133347,
-0.11494731158018112,
0.038717031478881836,
-0.12444643676280975,
-0.08081857860088348,
-0.06978166848421097,
-0.04908091202378273,
0.04283853620290756,
-0.10186486691236496,
-0.018521461635828018,
0.20494616031646729,
0.2599491775035858,
-0.06075531989336014,
0.007596105802804232,
0.16900478303432465,
0.013939435593783855,
-0.21131855249404907,
-0.11876598000526428,
-0.005951092578470707,
0.0245259627699852,
-0.06990092992782593,
-0.1729758381843567,
0.05629121512174606,
0.08628419041633606,
0.009297724813222885,
0.06418445706367493,
-0.23684677481651306,
-0.0764760673046112,
0.12922295928001404,
-0.003073933068662882,
0.31239739060401917,
-0.08818074315786362,
-0.06047593057155609,
0.05310232937335968,
-0.12190509587526321,
0.07880254834890366,
-0.08927981555461884,
0.103643499314785,
-0.08908495306968689,
0.048388704657554626,
0.01781357079744339,
-0.03531131520867348,
0.10157663375139236,
0.03958515450358391,
0.01868375763297081,
-0.032545194029808044,
-0.12614156305789948,
-0.03422626480460167,
-0.002435080474242568,
-0.014691365882754326,
-0.07064316421747208,
0.05893373861908913,
-0.11112257093191147,
-0.0006549077806994319,
-0.09229554235935211,
-0.010399735532701015,
0.012531588785350323,
-0.05892707034945488,
0.006838163360953331,
-0.005466638598591089,
-0.027003316208720207,
0.033053141087293625,
0.10764811933040619,
-0.0001517115015303716,
0.09237617254257202,
0.19276170432567596,
0.01569451205432415,
-0.016001680865883827,
0.11423632502555847,
-0.040102992206811905,
-0.04414039105176926,
0.1147577092051506,
-0.0966464951634407,
0.02949601039290428,
0.1413683146238327,
0.015348941087722778,
0.020130008459091187,
0.09185352921485901,
0.012257096357643604,
0.0940336287021637,
0.08465544134378433,
-0.14255721867084503,
0.015448309481143951,
-0.07806839793920517,
-0.10121382027864456,
0.06662920117378235,
0.06881262362003326,
0.08795684576034546,
-0.051999591290950775,
-0.033530063927173615,
-0.005372729152441025,
-0.023582864552736282,
-0.07524514198303223,
0.14133943617343903,
0.08481107652187347,
0.055760305374860764,
-0.13071756064891815,
0.0505499467253685,
0.0023742225021123886,
0.016360217705368996,
0.03880375623703003,
0.12927021086215973,
-0.13226470351219177,
-0.08757776021957397,
-0.02491413801908493,
0.12384890764951706,
-0.095095194876194,
-0.05364774167537689,
-0.10087454319000244,
-0.07633703947067261,
0.06307476758956909,
0.111709363758564,
0.06296645849943161,
0.04082788527011871,
-0.02688458375632763,
-0.00318946223706007,
-0.06467097997665405,
0.02770191989839077,
-0.0012162340572103858,
0.03766748309135437,
-0.11027923226356506,
0.1459193378686905,
0.01094417367130518,
0.18594422936439514,
-0.07650048285722733,
-0.06827294826507568,
-0.12783412635326385,
0.041569069027900696,
-0.1948411911725998,
-0.060315635055303574,
-0.063222736120224,
-0.037823740392923355,
-0.02097996324300766,
-0.08896387368440628,
-0.08493152260780334,
0.005929838865995407,
-0.09204746037721634,
0.0008831597515381873,
-0.0018378263339400291,
-0.013668728061020374,
-0.02639724127948284,
0.012489337474107742,
0.06992132216691971,
-0.032787028700113297,
0.09736444801092148,
0.1236848533153534,
-0.04269237443804741,
0.10067407041788101,
0.004928484559059143,
-0.011427042074501514,
0.03954765573143959,
0.07350903004407883,
0.038253434002399445,
0.11849243193864822,
0.008761716075241566,
0.02954738400876522,
0.028127526864409447,
0.04843229427933693,
0.1561570018529892,
-0.06895586848258972,
-0.01941564865410328,
-0.10925770550966263,
-0.09841681271791458,
-0.07304501533508301,
-0.007670979481190443,
0.08320998400449753,
0.02818906493484974,
0.09316359460353851,
-0.06878906488418579,
0.059007756412029266,
-0.11146016418933868,
-0.004198893904685974,
0.01781517267227173,
-0.08508925884962082,
-0.07281196862459183,
-0.0564175508916378,
0.03367295488715172,
-0.028632137924432755,
0.1285102665424347,
-0.11848364025354385,
-0.04260699078440666,
-0.02919437550008297,
0.0034748862963169813,
-0.07000453770160675,
-0.02561821974813938,
0.2698229253292084,
0.1501278430223465,
-0.020209340378642082,
0.011038892902433872,
0.0596097968518734,
0.021842239424586296,
0.02291877195239067,
0.039918430149555206,
0.003479326143860817,
0.022571871057152748,
0.11241405457258224,
0.03654379025101662,
0.003437021980062127,
-0.14588885009288788,
-0.0365816093981266,
-0.12494803220033646,
0.0008579101413488388,
-0.007570334710180759,
0.08116375654935837,
0.1677280068397522,
-0.015079302713274956,
-0.019556965678930283,
0.020420178771018982,
-0.055768489837646484,
-0.11965715885162354,
-0.0672978013753891,
-0.10731018334627151,
-0.15464994311332703,
0.018048712983727455,
-0.0627615824341774,
-0.03498661145567894,
0.02114379219710827,
0.036583125591278076,
-0.0588981918990612,
0.09741032868623734,
0.13649682700634003,
-0.044687628746032715,
0.03480011224746704,
-0.041954733431339264,
-0.01827998459339142,
-0.013767216354608536,
-0.01761711575090885,
0.04334821179509163,
0.00022470623662229627,
0.014239705167710781,
0.028808074072003365,
-0.10156600177288055,
0.049977317452430725,
-0.0757874995470047,
-0.10588819533586502,
-0.024795543402433395,
-0.0005981458816677332,
0.007360096089541912,
0.08621398359537125,
0.00813211314380169,
-0.03485706448554993,
0.00042026580194942653,
0.14997492730617523,
-0.03870284929871559,
-0.056787002831697464,
-0.06260722130537033,
0.19191184639930725,
0.01877535879611969,
0.04757718741893768,
-0.013693033717572689,
-0.009304409846663475,
-0.0712151825428009,
0.3325658142566681,
0.18525008857250214,
-0.1213681772351265,
-0.0004543508111964911,
0.05380254238843918,
0.008141495287418365,
0.0038349193055182695,
0.09976162761449814,
0.11753061413764954,
0.1573190838098526,
-0.077023446559906,
-0.09972116351127625,
-0.10101718455553055,
-0.017292197793722153,
-0.01950651779770851,
0.06792205572128296,
0.06318635493516922,
-0.04521352797746658,
-0.10906990617513657,
0.006971453782171011,
-0.19583448767662048,
0.16362364590168,
-0.04256361722946167,
-0.1089794784784317,
-0.08651666343212128,
0.012318969704210758,
0.06281816214323044,
0.06522618234157562,
0.030753348022699356,
-0.03923677280545235,
0.042609307914972305,
-0.018255243077874184,
-0.026740478351712227,
-0.11082102358341217,
0.0030023730359971523,
0.08862365782260895,
0.021902108564972878,
0.07096923142671585,
-0.007232275791466236,
0.04944504797458649,
0.07467000931501389,
0.05102024972438812,
-0.04363697022199631,
0.16894793510437012,
-0.010323183611035347,
-0.031050462275743484,
-0.046803317964076996,
0.06565424054861069,
0.04627717286348343,
0.03610322251915932,
0.07441478967666626,
-0.04348938912153244,
0.017719173803925514,
0.02765372022986412,
-0.020844100043177605,
-0.06955701112747192,
0.023770391941070557,
-0.002571387216448784,
0.12274276465177536,
0.09295383095741272,
-0.049972113221883774,
0.00730906194075942,
-0.03650522977113724,
0.030697345733642578,
-0.003610233310610056,
-0.038893625140190125,
-0.08362787216901779,
-0.17413416504859924,
-0.02549648843705654,
0.012523682788014412,
-0.009090881794691086,
-0.15069030225276947,
-0.012324323877692223,
-0.1296248435974121,
-0.0030653264839202166,
-0.07671952247619629,
0.07212204486131668,
0.09239202737808228,
0.01939292997121811,
-0.00021652434952557087,
-0.05616987496614456,
-0.009435196407139301,
0.05819956213235855,
-0.16318973898887634,
-0.1278398483991623
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
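The card does not yet provide the quick-start snippet it refers to. Below is a minimal, hedged sketch assuming the Hub id attached to this card (`MaggieZhang/test2`) exposes standard 🤗 Transformers artifacts (config, safetensors weights, tokenizer); the task head is not documented, so a generic `AutoModel` is used and should be swapped for a task-specific Auto class once the task is known.

```python
# Hypothetical loading sketch -- the card does not state the task head or prompt
# format, so this only assumes standard Transformers artifacts exist in the repo.
from transformers import AutoModel, AutoTokenizer

repo_id = "MaggieZhang/test2"  # Hub id associated with this card

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)  # replace with a task-specific Auto class when known

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model(**inputs)
print(type(outputs).__name__)  # e.g. a BaseModelOutput-style object for a base encoder/decoder
```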
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | MaggieZhang/test2 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T05:51:12+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
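The quick-start code is not filled in on this card. A minimal sketch follows, assuming the Hub id attached to this card (`FINNUMBER/Yi-Ko-6B-Finch-SA-full`) is a causal LM usable with the standard `text-generation` pipeline (it is tagged `llama` / `text-generation`); the dtype, device placement, sampling settings, and prompt are illustrative assumptions, not documented settings.

```python
# Hedged quick-start sketch -- the repo id comes from this card's metadata; dtype,
# device_map, generation settings, and the prompt are assumptions for illustration.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="FINNUMBER/Yi-Ko-6B-Finch-SA-full",
    torch_dtype=torch.float16,  # assumption: half precision so a ~6B model fits on one GPU
    device_map="auto",          # requires `accelerate`; omit to run on CPU
)

prompt = "Hello"  # placeholder prompt; the model's expected prompt format is not documented
result = generator(prompt, max_new_tokens=64, do_sample=False)
print(result[0]["generated_text"])
```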
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | FINNUMBER/Yi-Ko-6B-Finch-SA-full | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T05:55:32+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
56,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06061961501836777,
0.15481999516487122,
-0.004844071343541145,
0.02074851468205452,
0.0983177199959755,
0.007407687604427338,
0.07119518518447876,
0.11185134947299957,
-0.023851769044995308,
0.1167980208992958,
0.031993988901376724,
0.09781743586063385,
0.11217817664146423,
0.16186554729938507,
0.0015333457849919796,
-0.22897611558437347,
0.049678247421979904,
-0.125278040766716,
-0.0294334813952446,
0.11977242678403854,
0.1422213912010193,
-0.10954539477825165,
0.0752737894654274,
-0.038042325526475906,
-0.005828251596540213,
-0.0323176346719265,
-0.06205610930919647,
-0.05266609415411949,
0.05311284959316254,
0.06794639676809311,
0.07308239489793777,
0.01171939354389906,
0.09106900542974472,
-0.2724283039569855,
0.02348201349377632,
0.0805930644273758,
-0.0006441773730330169,
0.07586129754781723,
0.04993962123990059,
-0.08749990910291672,
0.07524524629116058,
-0.060156844556331635,
0.1498761922121048,
0.07955671846866608,
-0.09018243104219437,
-0.19217631220817566,
-0.07921334356069565,
0.09916994720697403,
0.1890910118818283,
0.05953684076666832,
-0.026427440345287323,
0.11642678081989288,
-0.08593545109033585,
0.013638701289892197,
0.06446459144353867,
-0.06054406240582466,
-0.055855002254247665,
0.06904532760381699,
0.08335285633802414,
0.08567540347576141,
-0.12976622581481934,
-0.010767064057290554,
0.015032444149255753,
0.008952446281909943,
0.08948688954114914,
0.017146794125437737,
0.1335189938545227,
0.040557652711868286,
-0.13501930236816406,
-0.043155476450920105,
0.09761431813240051,
0.03665134683251381,
-0.04888195917010307,
-0.2485782504081726,
-0.023432478308677673,
-0.04339504987001419,
-0.03198111802339554,
-0.03649339824914932,
0.043764639645814896,
-0.014506848528981209,
0.07738617807626724,
-0.004502781666815281,
-0.0837155357003212,
-0.04301247000694275,
0.07241875678300858,
0.06128999963402748,
0.02571401372551918,
-0.015821760520339012,
0.0059297760017216206,
0.12327717989683151,
0.11431120336055756,
-0.126715749502182,
-0.052547648549079895,
-0.06306339055299759,
-0.08449548482894897,
-0.044861067086458206,
0.030838407576084137,
0.037995077669620514,
0.045936476439237595,
0.23867325484752655,
0.007765117567032576,
0.053257301449775696,
0.04455438256263733,
0.014407169073820114,
0.06501194834709167,
0.11008983850479126,
-0.05894824117422104,
-0.09719445556402206,
-0.028582042083144188,
0.10156717151403427,
0.007986726239323616,
-0.04139331728219986,
-0.05712985619902611,
0.07059531658887863,
0.018587570637464523,
0.12360043078660965,
0.08000938594341278,
0.003056557849049568,
-0.0755772516131401,
-0.062465377151966095,
0.17764076590538025,
-0.15825673937797546,
0.04532013460993767,
0.03055616281926632,
-0.0341108962893486,
-0.009745313785970211,
0.012105142697691917,
0.025474950671195984,
-0.021481726318597794,
0.09522198140621185,
-0.05601342022418976,
-0.034448131918907166,
-0.11389608681201935,
-0.03694311901926994,
0.030394554138183594,
0.011153047904372215,
-0.02865210548043251,
-0.03502652049064636,
-0.08865131437778473,
-0.06405586749315262,
0.09101516753435135,
-0.07148737460374832,
-0.04784895107150078,
-0.016645915806293488,
-0.07833752781152725,
0.021804187446832657,
0.01691517047584057,
0.09064167737960815,
-0.0222476739436388,
0.03985358029603958,
-0.0550384595990181,
0.061440225690603256,
0.11723454296588898,
0.027987057343125343,
-0.05787884071469307,
0.061519939452409744,
-0.2424532175064087,
0.10252492874860764,
-0.07715212553739548,
0.04971238598227501,
-0.15203025937080383,
-0.02478341944515705,
0.03986154496669769,
0.01284773275256157,
-0.008251311257481575,
0.14196595549583435,
-0.21994100511074066,
-0.030957341194152832,
0.16964265704154968,
-0.10025953501462936,
-0.08109250664710999,
0.060782887041568756,
-0.05354252830147743,
0.11210215091705322,
0.04557164013385773,
-0.02375967986881733,
0.05775221437215805,
-0.14725260436534882,
-0.011030761525034904,
-0.041942402720451355,
-0.0180682260543108,
0.16207332909107208,
0.0703711211681366,
-0.06047816202044487,
0.07456906884908676,
0.01960151270031929,
-0.014246034435927868,
-0.04887177795171738,
-0.02822130173444748,
-0.1047162413597107,
0.01184528972953558,
-0.06102835759520531,
0.018109694123268127,
-0.021768750622868538,
-0.09445013850927353,
-0.029118487611413002,
-0.17402999103069305,
-0.0031633328180760145,
0.08821269869804382,
-0.011630427092313766,
-0.021509924903512,
-0.11245372891426086,
0.009332616813480854,
0.030967719852924347,
0.0002618339203763753,
-0.13677829504013062,
-0.06033218279480934,
0.026970699429512024,
-0.16097871959209442,
0.029791243374347687,
-0.05741601809859276,
0.04530094936490059,
0.04005871340632439,
-0.03433511033654213,
-0.03489551320672035,
0.010874404571950436,
0.010431389324367046,
-0.01894843392074108,
-0.25422003865242004,
-0.01882786676287651,
-0.0234990194439888,
0.1751047968864441,
-0.22956320643424988,
0.042598169296979904,
0.07489731162786484,
0.1460893303155899,
0.007349682506173849,
-0.03550100699067116,
0.015185600146651268,
-0.07262228429317474,
-0.03268764168024063,
-0.06316669285297394,
-0.01207790058106184,
-0.038400664925575256,
-0.05820201337337494,
0.04906858503818512,
-0.1686294972896576,
-0.030321966856718063,
0.10717973858118057,
0.06342670321464539,
-0.1473218947649002,
-0.02780107781291008,
-0.04056945815682411,
-0.04624456167221069,
-0.06676914542913437,
-0.05461418256163597,
0.11812574416399002,
0.056411582976579666,
0.04860803112387657,
-0.07140495628118515,
-0.07455260306596756,
0.008036690764129162,
-0.01956399530172348,
-0.014917809516191483,
0.09334591031074524,
0.07554110884666443,
-0.12264352291822433,
0.09177418053150177,
0.09668384492397308,
0.08576478064060211,
0.10314212739467621,
-0.014663571491837502,
-0.08914592862129211,
-0.040637146681547165,
0.02245822176337242,
0.016187267377972603,
0.15129362046718597,
-0.012961224652826786,
0.055492039769887924,
0.0358695350587368,
-0.014034898020327091,
0.011105312965810299,
-0.09736533463001251,
0.02655916102230549,
0.030835967510938644,
-0.016302183270454407,
0.03745110332965851,
-0.0447014644742012,
0.019208140671253204,
0.09039704501628876,
0.040895868092775345,
0.040978945791721344,
0.010155045427381992,
-0.04354988783597946,
-0.11037563532590866,
0.1787576973438263,
-0.12389461696147919,
-0.24818050861358643,
-0.13812170922756195,
0.010281167924404144,
0.04737642779946327,
-0.010411068797111511,
0.006690691225230694,
-0.06616118550300598,
-0.1175973042845726,
-0.09878289699554443,
0.018617089837789536,
0.045352302491664886,
-0.07590975612401962,
-0.06842505931854248,
0.06414616107940674,
0.03875524550676346,
-0.13939815759658813,
0.024007495492696762,
0.04662325978279114,
-0.08205481618642807,
-0.0029386086389422417,
0.0791812464594841,
0.06965780258178711,
0.17661017179489136,
0.013885351829230785,
-0.023669935762882233,
0.026634456589818,
0.20819635689258575,
-0.1436755359172821,
0.10975687950849533,
0.13545554876327515,
-0.08767466992139816,
0.08120133727788925,
0.1998777538537979,
0.03777998685836792,
-0.10680917650461197,
0.03608465939760208,
0.028374753892421722,
-0.028325283899903297,
-0.2502254545688629,
-0.06958996504545212,
0.0019060121849179268,
-0.05172049254179001,
0.07064855098724365,
0.08791537582874298,
0.09593888372182846,
0.016860228031873703,
-0.09976044297218323,
-0.07697858661413193,
0.046900223940610886,
0.10824491083621979,
-0.00015424020239152014,
-0.015208319760859013,
0.0904119610786438,
-0.03033481352031231,
0.01743943803012371,
0.09215071052312851,
0.0030607767403125763,
0.17535938322544098,
0.051709048449993134,
0.17189906537532806,
0.07866133749485016,
0.06444311141967773,
0.02004685252904892,
0.007725914940237999,
0.021817529574036598,
0.017227526754140854,
-0.0030957073904573917,
-0.08709781616926193,
-0.0034981227945536375,
0.1202581599354744,
0.049845851957798004,
0.029173865914344788,
0.012042860500514507,
-0.030704669654369354,
0.08337877690792084,
0.1770893782377243,
0.0029054484330117702,
-0.1893385946750641,
-0.07169844210147858,
0.07795937359333038,
-0.08648337423801422,
-0.10729733109474182,
-0.029470939189195633,
0.041069481521844864,
-0.1729043871164322,
0.016882894560694695,
-0.019335895776748657,
0.10788324475288391,
-0.13190391659736633,
-0.01772487722337246,
0.05657728388905525,
0.06932812184095383,
-0.009677323512732983,
0.06694949418306351,
-0.16090403497219086,
0.11770165711641312,
0.01751571334898472,
0.06636732816696167,
-0.09608277678489685,
0.09618937969207764,
-0.007830657996237278,
0.0041499207727611065,
0.1410749852657318,
0.010120149701833725,
-0.05952107161283493,
-0.09608154743909836,
-0.10546442121267319,
-0.009841260500252247,
0.1306990385055542,
-0.14852415025234222,
0.08813067525625229,
-0.02661319263279438,
-0.044553373008966446,
0.003614129964262247,
-0.12497276812791824,
-0.13103094696998596,
-0.18366187810897827,
0.05707118660211563,
-0.12947207689285278,
0.04045100137591362,
-0.10902881622314453,
-0.045833900570869446,
-0.02098964899778366,
0.20040063560009003,
-0.23137451708316803,
-0.06714103370904922,
-0.1551055610179901,
-0.08061286807060242,
0.14446212351322174,
-0.046455029398202896,
0.08550118654966354,
0.0008278203313238919,
0.19068008661270142,
0.021319707855582237,
-0.017237508669495583,
0.1072206199169159,
-0.10052918642759323,
-0.2010865956544876,
-0.09273224323987961,
0.15895552933216095,
0.13766798377037048,
0.03809428587555885,
-0.004381525795906782,
0.03171157464385033,
-0.02098114788532257,
-0.12076930701732635,
0.020226983353495598,
0.17317426204681396,
0.08982043713331223,
0.025265544652938843,
-0.02972041629254818,
-0.11267432570457458,
-0.07061342149972916,
-0.03774050623178482,
0.024755435064435005,
0.18072067201137543,
-0.07222156971693039,
0.18405316770076752,
0.13775517046451569,
-0.05534014105796814,
-0.19904261827468872,
0.021996473893523216,
0.04293542355298996,
0.0070380112156271935,
0.0323902890086174,
-0.20307663083076477,
0.09384101629257202,
0.0008334947633557022,
-0.05131231248378754,
0.1379684954881668,
-0.1823476254940033,
-0.151598259806633,
0.06042521819472313,
0.043563615530729294,
-0.19374065101146698,
-0.12374074012041092,
-0.08848230540752411,
-0.04693066328763962,
-0.15487661957740784,
0.10312657803297043,
0.0020827590487897396,
0.008401188999414444,
0.03778626397252083,
0.02252252586185932,
0.012139533646404743,
-0.04198719933629036,
0.1914343535900116,
-0.025891713798046112,
0.03347287327051163,
-0.0790715217590332,
-0.060851071029901505,
0.062408581376075745,
-0.058187782764434814,
0.0755455270409584,
-0.025226406753063202,
0.015947066247463226,
-0.10598332434892654,
-0.048235729336738586,
-0.02852320298552513,
0.019321219995617867,
-0.09431382268667221,
-0.09348297864198685,
-0.04829427972435951,
0.09367614984512329,
0.09042316675186157,
-0.03652578964829445,
-0.03649144619703293,
-0.078715980052948,
0.038977332413196564,
0.17627815902233124,
0.18159319460391998,
0.04659178853034973,
-0.07959239184856415,
-0.001915142871439457,
-0.014336181804537773,
0.04684065282344818,
-0.22077152132987976,
0.060553863644599915,
0.04557652771472931,
0.016117896884679794,
0.11537692695856094,
-0.0208132341504097,
-0.16198977828025818,
-0.06710557639598846,
0.061360616236925125,
-0.06944561004638672,
-0.17825035750865936,
0.0039279889315366745,
0.07344977557659149,
-0.16578389704227448,
-0.037031736224889755,
0.04200848564505577,
-0.01189455483108759,
-0.0403641052544117,
0.012352054007351398,
0.08063354343175888,
0.007078902795910835,
0.07699975371360779,
0.055281639099121094,
0.09124495089054108,
-0.10227900743484497,
0.07410510629415512,
0.08149529248476028,
-0.08644098788499832,
0.030720343813300133,
0.09573426842689514,
-0.06469762325286865,
-0.0346054881811142,
0.04237886518239975,
0.08354541659355164,
0.024281201884150505,
-0.04682289808988571,
0.0023111123591661453,
-0.09734189510345459,
0.05927345156669617,
0.11483542621135712,
0.03496333956718445,
0.011234734207391739,
0.03813567012548447,
0.04486291855573654,
-0.08093374222517014,
0.11926916986703873,
0.023795632645487785,
0.020354853942990303,
-0.04112942889332771,
-0.040553025901317596,
0.035851649940013885,
-0.026020776480436325,
-0.011440055444836617,
-0.035174157470464706,
-0.0722682997584343,
-0.014069457538425922,
-0.16000694036483765,
-0.0076758842915296555,
-0.03660871088504791,
0.005114538595080376,
0.022510098293423653,
-0.03652830421924591,
0.00792311318218708,
0.012217256240546703,
-0.06868947297334671,
-0.05553458258509636,
-0.023233558982610703,
0.09422210603952408,
-0.16494666039943695,
0.0220257006585598,
0.0823851153254509,
-0.12121747434139252,
0.09289738535881042,
0.016782134771347046,
0.00412249518558383,
0.026962365955114365,
-0.1545863002538681,
0.04763968288898468,
-0.020152103155851364,
0.013473534025251865,
0.04222847521305084,
-0.21637047827243805,
-0.004404853098094463,
-0.04015503451228142,
-0.05566934496164322,
-0.008993052877485752,
-0.0319182425737381,
-0.11338426172733307,
0.09645436704158783,
0.011025024577975273,
-0.08443772792816162,
-0.02965564839541912,
0.03353232145309448,
0.07690354436635971,
-0.027447547763586044,
0.1498211771249771,
-0.004663881380110979,
0.07559948414564133,
-0.17581342160701752,
-0.02282017655670643,
-0.011197620071470737,
0.022367527708411217,
-0.021871577948331833,
-0.01622559316456318,
0.04623444378376007,
-0.02704801969230175,
0.19120801985263824,
-0.024701936170458794,
0.049393873661756516,
0.06364397704601288,
0.009232889860868454,
-0.013832193799316883,
0.11151392012834549,
0.05708572641015053,
0.024334950372576714,
0.022262847051024437,
0.003451440716162324,
-0.04008655622601509,
-0.009981024079024792,
-0.18596695363521576,
0.06803664565086365,
0.14585918188095093,
0.09060460329055786,
-0.012669353745877743,
0.0707244873046875,
-0.10161512345075607,
-0.12005364894866943,
0.10127941519021988,
-0.06415384262800217,
-0.010188822634518147,
-0.06542414426803589,
0.14027701318264008,
0.14953285455703735,
-0.1886233240365982,
0.06583356112241745,
-0.06602055579423904,
-0.0566304549574852,
-0.11457879096269608,
-0.1930263340473175,
-0.057075321674346924,
-0.050602465867996216,
-0.018466074019670486,
-0.05384097993373871,
0.06939727067947388,
0.05750798434019089,
0.01126816775649786,
0.00868057832121849,
0.08568526059389114,
-0.009656033478677273,
0.00248199631460011,
0.030120067298412323,
0.06713981181383133,
0.016768986359238625,
-0.0321255661547184,
0.0179112758487463,
-0.00597198773175478,
0.034156378358602524,
0.059282708913087845,
0.03608176112174988,
-0.028436895459890366,
0.015559280291199684,
-0.034912437200546265,
-0.11309733241796494,
0.042801856994628906,
-0.029640642926096916,
-0.0749855786561966,
0.1347348988056183,
0.026981467381119728,
0.005015076603740454,
-0.023140020668506622,
0.2503887414932251,
-0.07436972856521606,
-0.09334370493888855,
-0.14373961091041565,
0.11701542884111404,
-0.04212593287229538,
0.0635172426700592,
0.03596310690045357,
-0.10810714215040207,
0.017985546961426735,
0.1320217251777649,
0.15442703664302826,
-0.04732590913772583,
0.019251897931098938,
0.028577854856848717,
0.00439635943621397,
-0.04075566306710243,
0.05177190154790878,
0.07100846618413925,
0.14500564336776733,
-0.05157303810119629,
0.08530787378549576,
0.002609728369861841,
-0.1021018698811531,
-0.041973695158958435,
0.11415864527225494,
-0.014296893030405045,
0.017620453611016273,
-0.057136841118335724,
0.124222531914711,
-0.05874236673116684,
-0.23697422444820404,
0.06316976249217987,
-0.0765061303973198,
-0.1432730257511139,
-0.024886758998036385,
0.071670763194561,
-0.016632623970508575,
0.02605951391160488,
0.07167234271764755,
-0.0754380151629448,
0.18880942463874817,
0.03957989811897278,
-0.05233397334814072,
-0.05954399332404137,
0.0744764655828476,
-0.11850855499505997,
0.27879106998443604,
0.010482731275260448,
0.051307905465364456,
0.1042102724313736,
-0.02021743729710579,
-0.13270841538906097,
0.023401619866490364,
0.09579801559448242,
-0.08917027711868286,
0.04087764397263527,
0.21448291838169098,
-0.00629545608535409,
0.11935057491064072,
0.07611140608787537,
-0.07468950748443604,
0.047562725841999054,
-0.11468592286109924,
-0.07639975845813751,
-0.08699081838130951,
0.09244474768638611,
-0.06785612553358078,
0.14258281886577606,
0.12599852681159973,
-0.05530165135860443,
0.011584274470806122,
-0.028389399871230125,
0.045467376708984375,
0.005578654818236828,
0.100032277405262,
0.011115525849163532,
-0.18496567010879517,
0.024811718612909317,
0.016259413212537766,
0.10884406417608261,
-0.18112654983997345,
-0.09105053544044495,
0.046958595514297485,
0.0005061255069449544,
-0.06443515419960022,
0.12483241409063339,
0.057313691824674606,
0.04654949903488159,
-0.0451689288020134,
-0.026830285787582397,
-0.006042256020009518,
0.14264579117298126,
-0.10707559436559677,
-0.005129707511514425
] |
null | null | null |
# Lora of independence/インディペンデンス/独立 (Azur Lane)
## What Is This?
This is the LoRA model of waifu independence/インディペンデンス/独立 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* The dataset used for training is `stage3-p480-800` in [CyberHarem/independence_azurlane](https://huggingface.co/datasets/CyberHarem/independence_azurlane), which contains 146 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for the regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1480 steps; 40 checkpoints were saved and evaluated.
* **Trigger word is `independence_azurlane`.**
* Pruned core tags for this waifu are `breasts, red_eyes, bangs, long_hair, brown_hair, ahoge, hairband, large_breasts, hair_ornament, very_long_hair`. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like the classic LoRA**. The LoRA we provide is bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them together. The pt file will be used as an embedding, while the safetensors file will be loaded as a LoRA.
For example, if you want to use the model from step 1221, you need to download [`1221/independence_azurlane.pt`](https://huggingface.co/CyberHarem/independence_azurlane/resolve/main/1221/independence_azurlane.pt) as the embedding and [`1221/independence_azurlane.safetensors`](https://huggingface.co/CyberHarem/independence_azurlane/resolve/main/1221/independence_azurlane.safetensors) as the LoRA. By using both files together, you can generate images of the desired character.
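If you would rather script generation than use the WebUI, the sketch below shows one possible way to load the same two files with 🤗 diffusers. This is not part of the card's official workflow: it assumes the safetensors file is a standard A1111-compatible LoRA, that the pt file is an A1111-style textual-inversion embedding, and that any Stable Diffusion 1.x checkpoint in diffusers format works as the base (MeinaMix is used only because it is the preview model mentioned below).

```python
# Hedged sketch: loading the step-1221 files with diffusers.
# Assumes the files are in the A1111-compatible formats described above.
import torch
from diffusers import StableDiffusionPipeline

# Any SD 1.x checkpoint in diffusers format can serve as the base model.
pipe = StableDiffusionPipeline.from_pretrained(
    "Meina/MeinaMix_V11", torch_dtype=torch.float16
).to("cuda")

# Load the LoRA weights (.safetensors) and the embedding (.pt) downloaded for step 1221.
pipe.load_lora_weights(".", weight_name="independence_azurlane.safetensors")
pipe.load_textual_inversion("independence_azurlane.pt", token="independence_azurlane")

# The trigger word activates the character; core tags can be appended for stability.
image = pipe(
    "independence_azurlane, 1girl, red_eyes, brown_hair, best quality",
    num_inference_steps=30,
    guidance_scale=7.0,
).images[0]
image.save("independence_azurlane_preview.png")
```

Inside the A1111 WebUI itself, the equivalent is simply adding `<lora:independence_azurlane:0.8>` (with a weight of your choosing) plus the trigger word `independence_azurlane` to the prompt.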
## Which Step Should I Use?
We selected 5 good steps for you to choose from. The best one is step 1221.
1560 images (1.65 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the previews of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_1_0 | pattern_1_1 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:----------------------------------------------------------------------------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:--------------------------------------------|:--------------------------------------------|:--------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:--------------------------------|:------------------------------------|:--------------------------------|:----------------------------------|:----------------------------------------|:----------------------------------------|:----------------------------------------|:------------------------------|:----------------------------------|:----------------------------------|:--------------------------------|:------------------------------------------------|:----------------------------------|:----------------------------------|:------------------------------|:--------------------------------|:--------------------------------------|:--------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------------|:--------------------------------------|:--------------------------------------|
| 1221 | 34 | 0.957 | **0.974** | **0.834** | **0.766** | [Download](https://huggingface.co/CyberHarem/independence_azurlane/resolve/main/1221/independence_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1258 | 35 | 0.957 | 0.965 | 0.832 | 0.763 | [Download](https://huggingface.co/CyberHarem/independence_azurlane/resolve/main/1258/independence_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 999 | 28 | **0.962** | 0.971 | 0.826 | 0.756 | [Download](https://huggingface.co/CyberHarem/independence_azurlane/resolve/main/999/independence_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1295 | 36 | 0.932 | 0.952 | 0.831 | 0.743 | [Download](https://huggingface.co/CyberHarem/independence_azurlane/resolve/main/1295/independence_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1369 | 38 | 0.940 | 0.948 | 0.824 | 0.736 | [Download](https://huggingface.co/CyberHarem/independence_azurlane/resolve/main/1369/independence_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download them, via the following links:
* [Steps From 1147 to 1480](all/0.md)
* [Steps From 777 to 1110](all/1.md)
* [Steps From 407 to 740](all/2.md)
* [Steps From 37 to 370](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/independence_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/independence_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/independence_azurlane",
"license:mit",
"region:us"
] | 2024-02-15T06:00:42+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/independence_azurlane #license-mit #region-us
| Lora of independence/インディペンデンス/独立 (Azur Lane)
=============================================
What Is This?
-------------
This is the LoRA model of waifu independence/インディペンデンス/独立 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* The dataset used for training is 'stage3-p480-800' in CyberHarem/independence\_azurlane, which contains 146 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for the regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1480 steps; 40 checkpoints were saved and evaluated.
* Trigger word is 'independence\_azurlane'.
* Pruned core tags for this waifu are 'breasts, red\_eyes, bangs, long\_hair, brown\_hair, ahoge, hairband, large\_breasts, hair\_ornament, very\_long\_hair'. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like the classic LoRA. The LoRA we provide is bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them together. The pt file will be used as an embedding, while the safetensors file will be loaded as a LoRA.
For example, if you want to use the model from step 1221, you need to download '1221/independence\_azurlane.pt' as the embedding and '1221/independence\_azurlane.safetensors' as the LoRA. By using both files together, you can generate images of the desired character.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose from. The best one is step 1221.
1560 images (1.65 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the previews of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We uploaded the files for all steps. You can check the images and metrics, and download them, via the following links:
* Steps From 1147 to 1480
* Steps From 777 to 1110
* Steps From 407 to 740
* Steps From 37 to 370
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 1221, you need to download '1221/independence\\_azurlane.pt' as the embedding and '1221/independence\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 1221.\n\n\n1560 images (1.65 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1147 to 1480\n* Steps From 777 to 1110\n* Steps From 407 to 740\n* Steps From 37 to 370"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/independence_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 1221, you need to download '1221/independence\\_azurlane.pt' as the embedding and '1221/independence\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 1221.\n\n\n1560 images (1.65 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1147 to 1480\n* Steps From 777 to 1110\n* Steps From 407 to 740\n* Steps From 37 to 370"
] | [
44,
38,
471
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/independence_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.010309855453670025,
-0.004867211449891329,
-0.00463488046079874,
0.06732463836669922,
0.06054766848683357,
0.08089499920606613,
0.23592372238636017,
0.0665263831615448,
0.11952917277812958,
-0.08963511139154434,
0.10227623581886292,
0.05288374051451683,
-0.0014143205480650067,
0.006738314870744944,
-0.037673529237508774,
-0.13959276676177979,
-0.06302941590547562,
-0.032539643347263336,
-0.008859186433255672,
0.014281431213021278,
0.08796405792236328,
0.0061920941807329655,
0.08715950697660446,
-0.05054660886526108,
-0.023110050708055496,
0.05213933065533638,
-0.03178386017680168,
-0.04727649688720703,
0.024802815169095993,
0.07697548717260361,
0.12830884754657745,
0.010390733368694782,
0.04796544462442398,
-0.18312878906726837,
0.06817257404327393,
-0.01554944645613432,
-0.09773167967796326,
0.008969546295702457,
0.022015463560819626,
-0.0364546999335289,
0.10229596495628357,
0.025652114301919937,
-0.10841035097837448,
0.050276029855012894,
-0.126700758934021,
-0.04334848374128342,
-0.06557504832744598,
0.030902398750185966,
0.1343560814857483,
0.055665720254182816,
0.01785612292587757,
0.04354020953178406,
-0.05972881615161896,
0.07752417773008347,
0.10078582912683487,
-0.11877021193504333,
-0.06749384105205536,
0.07021519541740417,
-0.0033435409422963858,
0.14369475841522217,
-0.06538604944944382,
0.10665988177061081,
0.08424030244350433,
-0.04513757675886154,
-0.16105103492736816,
-0.10055404156446457,
-0.2380441427230835,
-0.005359594710171223,
0.00817587599158287,
0.01152279507368803,
0.4277080297470093,
0.04735123738646507,
0.048221711069345474,
0.07229981571435928,
-0.061749231070280075,
0.035205431282520294,
-0.10109033435583115,
0.13402500748634338,
0.04586845263838768,
0.11318958550691605,
-0.00927612092345953,
-0.12747323513031006,
-0.12364383041858673,
-0.05395792797207832,
-0.10205363482236862,
-0.0027900387067347765,
0.032397352159023285,
0.11654038727283478,
-0.19654855132102966,
0.00572309410199523,
-0.01187850721180439,
-0.12029965221881866,
0.024926064535975456,
-0.09958535432815552,
0.1892734318971634,
0.0818849727511406,
-0.02665042318403721,
0.009997260756790638,
0.2532084882259369,
0.13414762914180756,
0.16979515552520752,
0.04074045270681381,
-0.10793013125658035,
0.14266110956668854,
0.030082298442721367,
-0.07893379032611847,
-0.017488528043031693,
-0.08487973362207413,
0.14129437506198883,
-0.07853250950574875,
0.10375043004751205,
-0.046233657747507095,
-0.11463560909032822,
0.019947979599237442,
-0.10844779759645462,
0.08786311745643616,
0.05703683942556381,
-0.0004985958803445101,
-0.05187046155333519,
0.0587894432246685,
0.009932461194694042,
-0.03785839304327965,
-0.01813291199505329,
-0.03295107185840607,
-0.04650450497865677,
0.04043898358941078,
0.10108926892280579,
0.03824162483215332,
0.046345047652721405,
-0.015011386945843697,
-0.013453126884996891,
0.00010905726958299056,
-0.05127829313278198,
-0.0018088928190991282,
0.036939214915037155,
0.05877256393432617,
0.0943201407790184,
-0.150572270154953,
-0.07295848429203033,
-0.0013890041736885905,
0.06955764442682266,
0.01027449406683445,
0.1043517217040062,
-0.004379634745419025,
0.06165126711130142,
-0.00009032460366142914,
-0.027423642575740814,
0.024239432066679,
-0.09726467728614807,
0.06698624789714813,
-0.0035787003580480814,
0.0774306133389473,
-0.1913834512233734,
-0.006830722559243441,
-0.05042998865246773,
0.02320619858801365,
0.06871464848518372,
-0.002645069733262062,
-0.1036389172077179,
0.1316181719303131,
0.00839588325470686,
0.08182740956544876,
-0.10097231715917587,
0.039646174758672714,
0.011229819618165493,
0.08104444295167923,
-0.0924743264913559,
-0.0007777536520734429,
0.10433971881866455,
-0.14605039358139038,
-0.16312143206596375,
0.09545303136110306,
-0.01457848772406578,
0.029318995773792267,
0.04444429278373718,
0.16385604441165924,
0.17582479119300842,
-0.18988442420959473,
-0.03272872045636177,
0.07879182696342468,
-0.03044997528195381,
-0.07077950984239578,
-0.0058958218432962894,
0.09662382304668427,
0.020745035260915756,
0.03864022344350815,
-0.03122648596763611,
0.14946432411670685,
-0.037461090832948685,
-0.08750970661640167,
-0.033605657517910004,
-0.08178774267435074,
-0.07930295914411545,
0.05513017624616623,
-0.00594280706718564,
-0.06196002662181854,
0.0338386595249176,
-0.17540603876113892,
0.1620938628911972,
0.026839319616556168,
0.024118972942233086,
-0.0917842835187912,
0.12205977737903595,
-0.010493597015738487,
0.004666800610721111,
-0.008272520266473293,
-0.07001803070306778,
-0.11128661781549454,
0.24627862870693207,
0.09761162847280502,
0.12236840277910233,
0.048635777086019516,
-0.0501563660800457,
-0.069674551486969,
0.01987731084227562,
0.024346789345145226,
-0.04617222398519516,
0.01580800674855709,
-0.10469414293766022,
0.05584317073225975,
-0.007727624382823706,
0.045208320021629333,
0.015563827939331532,
-0.0188540518283844,
0.0633375346660614,
0.01136899832636118,
-0.013855397701263428,
0.11500205844640732,
0.04215618222951889,
-0.012605521827936172,
-0.08338289707899094,
0.01249666791409254,
0.07215307652950287,
-0.02059468813240528,
-0.06723300367593765,
0.021028870716691017,
0.028207754716277122,
0.036438193172216415,
0.20327404141426086,
-0.22559396922588348,
0.060239460319280624,
0.03151513636112213,
0.040168434381484985,
0.04220213741064072,
0.030575502663850784,
-0.017277725040912628,
0.01854212023317814,
-0.01527777686715126,
0.06925148516893387,
-0.007146178279072046,
0.04654855653643608,
-0.03620052710175514,
-0.141008660197258,
-0.02927074022591114,
-0.03174475580453873,
0.1613898128271103,
-0.1450132131576538,
0.07893595844507217,
0.22676092386245728,
-0.10399412363767624,
0.1466587781906128,
-0.010440167970955372,
-0.015194648876786232,
0.010805738158524036,
0.023658512160182,
-0.0004528554854914546,
0.11361487954854965,
-0.06921477615833282,
-0.029374049976468086,
0.020236173644661903,
-0.07745089381933212,
0.03893907740712166,
-0.12152604013681412,
-0.11246542632579803,
-0.06901295483112335,
-0.040400322526693344,
-0.02770882658660412,
0.0224403478205204,
-0.05518712103366852,
0.06953402608633041,
-0.0915471762418747,
-0.07785830646753311,
-0.024767886847257614,
-0.08294683694839478,
0.03675532341003418,
-0.008069703355431557,
-0.08040669560432434,
-0.13213877379894257,
-0.10495755076408386,
-0.1154322698712349,
-0.13801923394203186,
-0.001300366478972137,
0.05961213633418083,
-0.11957208812236786,
-0.036232076585292816,
0.018567563965916634,
-0.04362742230296135,
0.0851159542798996,
-0.08230967074632645,
0.021677756682038307,
0.07245088368654251,
-0.04283391684293747,
-0.16938979923725128,
0.000042820392991416156,
-0.057867132127285004,
-0.05113013833761215,
0.14876185357570648,
-0.14451637864112854,
0.17932763695716858,
-0.029751824215054512,
0.057055551558732986,
0.07352622598409653,
0.023696063086390495,
0.11703261733055115,
-0.11842484027147293,
0.056592442095279694,
0.1835431009531021,
0.05034862458705902,
0.06612849235534668,
0.12157849222421646,
0.07021855562925339,
-0.10682471841573715,
0.034775178879499435,
0.07012064009904861,
-0.08236195147037506,
-0.09061568230390549,
-0.06698744744062424,
-0.12003553658723831,
-0.025037793442606926,
0.027950121089816093,
0.05392436683177948,
0.036543868482112885,
0.12510544061660767,
-0.05700314790010452,
-0.014143796637654305,
0.09550642222166061,
0.035363953560590744,
0.10190386325120926,
-0.01004654262214899,
0.05396262928843498,
-0.14940351247787476,
-0.04665376991033554,
0.16421279311180115,
0.2109343707561493,
0.25593385100364685,
0.032327696681022644,
0.0679435059428215,
0.1225278303027153,
0.12275603413581848,
0.08640450984239578,
0.06972607970237732,
0.01493153441697359,
0.017313433811068535,
-0.05916218459606171,
-0.03681086376309395,
0.02337142452597618,
-0.005889919586479664,
-0.06076890975236893,
-0.1541590839624405,
0.10323119908571243,
-0.0008094976074062288,
0.09007681906223297,
0.16549360752105713,
0.06274253129959106,
-0.07003369182348251,
0.14519429206848145,
0.1058485135436058,
0.07130381464958191,
-0.06546730548143387,
0.12027416378259659,
0.05779847502708435,
-0.004417541436851025,
0.16196900606155396,
0.04029306396842003,
0.14743787050247192,
-0.02702227607369423,
-0.07295279949903488,
-0.08104260265827179,
-0.06360475718975067,
0.013500510714948177,
0.0423826240003109,
-0.22233054041862488,
0.11746465414762497,
0.05624217167496681,
-0.0007141506066545844,
-0.024772832170128822,
-0.04843436926603317,
0.1864372044801712,
0.18220694363117218,
0.05815530940890312,
0.025588495656847954,
-0.041939739137887955,
-0.012969565577805042,
-0.07882611453533173,
0.06514527648687363,
0.02789844200015068,
0.04718859866261482,
-0.03729012608528137,
-0.11203141510486603,
-0.007484610192477703,
0.0016681087436154485,
0.011663944460451603,
-0.09839457273483276,
-0.13202904164791107,
-0.04834773764014244,
0.24860064685344696,
-0.05713670328259468,
0.03690136596560478,
0.06332569569349289,
0.03548665717244148,
-0.02323327586054802,
-0.005123257637023926,
-0.045041535049676895,
-0.00861162319779396,
-0.04875323921442032,
-0.0020766265224665403,
0.011051853187382221,
-0.0541519969701767,
-0.05599302425980568,
-0.011103596538305283,
-0.10068397969007492,
-0.11499268561601639,
-0.007767431437969208,
-0.06677260994911194,
0.009346804581582546,
-0.01915638893842697,
-0.00484889280050993,
-0.08927436918020248,
-0.02646601013839245,
0.023622002452611923,
0.028168831020593643,
-0.08840209245681763,
-0.1313963532447815,
-0.027655208483338356,
-0.04924817383289337,
-0.03058318793773651,
0.04823918640613556,
-0.11133898049592972,
-0.09322572499513626,
-0.06485970318317413,
-0.05620477348566055,
0.13397039473056793,
0.24168114364147186,
-0.019940542057156563,
0.0006622413638979197,
0.1695491373538971,
-0.09597169607877731,
-0.32576996088027954,
-0.18431025743484497,
-0.18019188940525055,
-0.10566561669111252,
0.04737217351794243,
-0.08891582489013672,
0.015713846310973167,
0.09209185093641281,
-0.044197820127010345,
0.2091185301542282,
-0.18427996337413788,
-0.08777382224798203,
0.10406984388828278,
0.0724588930606842,
0.31189775466918945,
-0.2399103194475174,
0.019787732511758804,
-0.12425952404737473,
-0.013454130850732327,
0.012191155925393105,
-0.0793837383389473,
0.10511589050292969,
0.029914015904068947,
0.07057783007621765,
-0.013438073918223381,
0.008968599140644073,
0.1570047289133072,
-0.09601680934429169,
0.1443408578634262,
-0.12596380710601807,
-0.09096944332122803,
0.2360624074935913,
-0.03406412526965141,
-0.008967354893684387,
-0.2097557932138443,
-0.03805672749876976,
-0.021908914670348167,
0.03309528902173042,
-0.008295884355902672,
0.04277600720524788,
-0.007656037341803312,
-0.028277358040213585,
-0.15650121867656708,
-0.02097899839282036,
-0.04268109053373337,
0.05502619966864586,
0.23974637687206268,
-0.046792566776275635,
-0.06664177775382996,
0.02894105389714241,
0.005805331747978926,
0.06377069652080536,
0.07451076060533524,
-0.057409219443798065,
-0.03797740861773491,
0.06991363316774368,
-0.16960982978343964,
0.04446317255496979,
0.010725562460720539,
-0.018498830497264862,
0.022512808442115784,
0.007118473295122385,
0.02087150700390339,
0.12162042409181595,
0.1791168749332428,
-0.0029336120933294296,
-0.08016221225261688,
-0.011412245221436024,
0.010079063475131989,
0.11188951879739761,
-0.03146315738558769,
0.11074656248092651,
0.033145565539598465,
0.03208760917186737,
0.005144109949469566,
0.0468527227640152,
-0.09494177997112274,
-0.08835940062999725,
0.09925933182239532,
-0.0476127453148365,
-0.08569087088108063,
0.09044244140386581,
0.04419645294547081,
0.08486740291118622,
-0.012071486562490463,
0.03763391450047493,
0.013703104108572006,
-0.130183145403862,
0.016565676778554916,
0.22085295617580414,
-0.06388359516859055,
-0.08144217729568481,
-0.06620568782091141,
-0.010704252868890762,
-0.11673705279827118,
0.07347559183835983,
0.038204848766326904,
-0.020439710468053818,
0.11623448133468628,
-0.03312956169247627,
-0.03153521940112114,
0.015655361115932465,
-0.07575657963752747,
0.008189796470105648,
-0.1423446238040924,
-0.2041996866464615,
0.04460231214761734,
0.0058106170035898685,
-0.0587419793009758,
-0.09445640444755554,
-0.05814700201153755,
0.07642415165901184,
-0.18093222379684448,
0.11918671429157257,
-0.08640909940004349,
0.04955366626381874,
-0.025746481493115425,
-0.06863664090633392,
-0.10844716429710388,
-0.029481222853064537,
-0.058384962379932404,
-0.0176432766020298,
0.06746694445610046,
0.01523500308394432,
-0.12292010337114334,
-0.11134878545999527,
0.07562430948019028,
0.007428924087435007,
0.0032876546029001474,
0.018274478614330292,
-0.05698447301983833,
0.03435058146715164,
-0.22999677062034607,
-0.06205635890364647,
0.09028048068284988,
0.04382716119289398,
-0.08766262978315353,
0.11277437955141068,
0.03196726366877556,
-0.022146394476294518,
0.03243431821465492,
0.01980138011276722,
0.18235157430171967,
-0.07474029809236526,
0.03350944072008133,
-0.0990419089794159,
-0.16034922003746033,
-0.02283136174082756,
0.03457455709576607,
0.22467081248760223,
0.09720852226018906,
0.12962251901626587,
-0.045429058372974396,
0.0149925472214818,
-0.009412783198058605,
0.0800444558262825,
0.018552683293819427,
-0.10507872700691223,
-0.03387312963604927,
-0.18032206594944,
-0.07266632467508316,
-0.04801836609840393,
0.19152730703353882,
0.03496171161532402,
-0.13318805396556854,
0.002251826226711273,
0.10720162838697433,
-0.18271851539611816,
-0.011550964787602425,
0.14301446080207825,
-0.0407261997461319,
0.021723495796322823,
-0.16868562996387482,
0.029398202896118164,
0.08022274821996689,
-0.08142116665840149,
-0.00857567973434925,
0.11426393687725067,
0.02149197645485401,
-0.008985181339085102,
0.015652013942599297,
-0.02728133089840412,
0.0852532908320427,
-0.09177252650260925,
0.05960238352417946,
-0.004345738794654608,
-0.04721719026565552,
-0.09004519879817963,
0.20478412508964539,
-0.015709640458226204,
0.009543138556182384,
-0.05223223194479942,
-0.0008248529047705233,
-0.10766355693340302,
-0.10608749836683273,
-0.04790140315890312,
-0.12224508076906204,
0.06330540031194687,
-0.06598436087369919,
0.010340440087020397,
0.007815607823431492,
0.030518298968672752,
-0.07300347089767456,
0.013633543625473976,
-0.16713790595531464,
-0.05686037614941597,
0.01128922589123249,
-0.018518049269914627,
-0.024682097136974335,
-0.07905709743499756,
-0.051076341420412064,
-0.0015215784078463912,
-0.0587972030043602,
-0.06735599040985107,
0.0627574697136879,
0.042650312185287476,
0.05918879806995392,
-0.17136050760746002,
-0.10143180191516876,
-0.07558035105466843,
0.048813559114933014,
0.07776971906423569,
0.1913682520389557,
0.02597002126276493,
0.00017931737238541245,
0.05611541122198105,
0.14465296268463135,
0.015493969433009624,
-0.10541214048862457,
-0.0628078281879425,
-0.1459033042192459,
-0.13701003789901733,
-0.013670256361365318,
-0.05502524971961975,
-0.01505877636373043,
0.008574525825679302,
0.2545265853404999,
0.19925063848495483,
-0.15299147367477417,
0.04055127128958702,
-0.07396496832370758,
0.037147462368011475,
-0.04695616662502289,
0.13244450092315674,
0.06039537489414215,
0.14613327383995056,
-0.03998390957713127,
-0.030718740075826645,
-0.05262384191155434,
0.03661753609776497,
-0.12910392880439758,
0.019059551879763603,
-0.02784401923418045,
-0.06290405243635178,
-0.04847980663180351,
0.10491511970758438,
-0.10464030504226685,
0.06772790104150772,
0.2018883228302002,
-0.13452115654945374,
-0.012194832786917686,
-0.02748854272067547,
0.06991471350193024,
0.09601791203022003,
0.017275100573897362,
-0.07507847994565964,
-0.04297550022602081,
0.014828360639512539,
0.024458806961774826,
-0.15415483713150024,
-0.12402453273534775,
0.015088657848536968,
-0.10998982191085815,
0.14177289605140686,
-0.009779800660908222,
-0.01952121965587139,
0.04498887434601784,
-0.0772254541516304,
-0.0223340205848217,
0.16943880915641785,
0.029841503128409386,
0.0033418445382267237,
-0.03922562301158905,
-0.0758543536067009,
-0.09436886012554169,
0.06701312959194183,
0.09742339700460434,
0.06555572152137756,
0.020817304030060768,
0.1357278823852539,
-0.022714557126164436,
-0.021085292100906372,
0.1371835619211197,
-0.17507945001125336,
0.07748351246118546,
-0.006200611591339111,
-0.019534070044755936,
-0.05491900444030762,
-0.04189268499612808,
0.01884620636701584,
0.07708090543746948,
-0.15756480395793915,
-0.044332582503557205,
0.09511139243841171,
-0.08165894448757172,
0.1100611612200737,
0.03429492563009262,
-0.08089714497327805,
0.019837647676467896,
-0.11548236012458801,
-0.002172773936763406,
-0.09851350635290146,
0.04518737271428108,
0.20879149436950684,
-0.030718157067894936,
0.02292683534324169,
-0.13365183770656586,
0.04792918637394905,
-0.040791235864162445,
-0.04008771479129791,
-0.06437408924102783
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
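A minimal sketch, assuming the repository id given in this card's metadata and standard 🤗 transformers text-generation usage; the chat prompt, dtype, and generation settings below are illustrative assumptions, not documented recommendations.

```python
# Minimal sketch: load the model and run a single chat turn.
# The repository id comes from this card's metadata; settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Explain DPO fine-tuning in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```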
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | {"license": "apache-2.0", "library_name": "transformers"} | text-generation | sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-reversed_corrupted | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"arxiv:1910.09700",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T06:01:10+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
68,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.0535210520029068,
0.20994897186756134,
-0.005268031265586615,
0.014276806265115738,
0.0955512523651123,
0.008638777770102024,
0.06300276517868042,
0.1158338189125061,
-0.047883275896310806,
0.1243155300617218,
0.042200133204460144,
0.11444971710443497,
0.11701371520757675,
0.14681175351142883,
-0.010448015294969082,
-0.2127477526664734,
0.049479469656944275,
-0.09782613068819046,
-0.010965890251100063,
0.12522436678409576,
0.1528133600950241,
-0.1021554172039032,
0.06865602731704712,
-0.025421863421797752,
-0.027679279446601868,
-0.03849923610687256,
-0.0566946379840374,
-0.04705985262989998,
0.039500366896390915,
0.045059628784656525,
0.0707780197262764,
0.0020420614164322615,
0.08925426751375198,
-0.2961680293083191,
0.015841396525502205,
0.061627570539712906,
-0.0011538828257471323,
0.07117629796266556,
0.08686534315347672,
-0.06445532292127609,
0.11190240830183029,
-0.05954765900969505,
0.13944238424301147,
0.08088517934083939,
-0.08764896541833878,
-0.16849255561828613,
-0.08173782378435135,
0.1164049506187439,
0.17698147892951965,
0.05739709362387657,
-0.03419014438986778,
0.10745269805192947,
-0.07786449790000916,
0.018644774332642555,
0.04381900653243065,
-0.10070298612117767,
-0.06203228607773781,
0.08966916054487228,
0.10138842463493347,
0.047779373824596405,
-0.1254117786884308,
-0.02621159702539444,
0.01909841224551201,
0.020525658503174782,
0.08423653244972229,
0.01620851457118988,
0.14951850473880768,
0.03713475912809372,
-0.1349543035030365,
-0.06700814515352249,
0.10938923060894012,
0.035849303007125854,
-0.03309061378240585,
-0.23566174507141113,
-0.012542245909571648,
-0.018523117527365685,
-0.03702033683657646,
-0.04633718729019165,
0.04508320614695549,
0.00035771893453784287,
0.09011491388082504,
-0.008954458869993687,
-0.07910548150539398,
-0.040139373391866684,
0.07995790988206863,
0.0443887859582901,
0.02618550881743431,
-0.020654944702982903,
0.0196200180798769,
0.10571888089179993,
0.08120600134134293,
-0.11567401140928268,
-0.052357301115989685,
-0.059268027544021606,
-0.07263094186782837,
-0.04522480070590973,
0.034192245453596115,
0.03915010765194893,
0.07917314022779465,
0.2498505860567093,
0.02425568923354149,
0.05030221492052078,
0.034294791519641876,
0.009994457475841045,
0.0506180115044117,
0.09702284634113312,
-0.05150073766708374,
-0.13315412402153015,
-0.025374101474881172,
0.09864256531000137,
0.005786322522908449,
-0.027386901900172234,
-0.03541889786720276,
0.06029578298330307,
0.04617399722337723,
0.1090288832783699,
0.09366883337497711,
0.020180407911539078,
-0.07692110538482666,
-0.05078713595867157,
0.19401273131370544,
-0.15443174540996552,
0.033637624233961105,
0.034337762743234634,
-0.03277822583913803,
-0.04201162979006767,
0.008351285941898823,
0.03802844136953354,
-0.036022551357746124,
0.08953704684972763,
-0.0564630925655365,
-0.05063195154070854,
-0.11054704338312149,
-0.028881831094622612,
0.04553275927901268,
0.013422677293419838,
-0.030779702588915825,
-0.028852207586169243,
-0.09160611778497696,
-0.08647685497999191,
0.0933590829372406,
-0.06601894646883011,
-0.07365784049034119,
-0.03445909917354584,
-0.07572175562381744,
0.022073041647672653,
0.0197665523737669,
0.08815137296915054,
-0.02778664045035839,
0.04800492152571678,
-0.05751834064722061,
0.045487694442272186,
0.10566682368516922,
0.037697698920965195,
-0.07037637382745743,
0.0706227496266365,
-0.20493106544017792,
0.08915809541940689,
-0.0830419585108757,
0.04861335828900337,
-0.1653890162706375,
-0.026274170726537704,
0.039689481258392334,
0.01643560826778412,
-0.0033787109423428774,
0.13107116520404816,
-0.19559235870838165,
-0.017797095701098442,
0.17431269586086273,
-0.10360822826623917,
-0.08330244570970535,
0.05526921898126602,
-0.057440612465143204,
0.11731720715761185,
0.037011608481407166,
0.019470080733299255,
0.058710914105176926,
-0.10414515435695648,
-0.01242469996213913,
-0.053001441061496735,
-0.006864918861538172,
0.1263413429260254,
0.0823165625333786,
-0.0915842279791832,
0.03891290724277496,
0.018693674355745316,
-0.041496336460113525,
-0.06606454402208328,
-0.03105081617832184,
-0.10419636964797974,
0.010852072387933731,
-0.07944780588150024,
0.01409090030938387,
-0.01250049751251936,
-0.09268881380558014,
-0.028617506846785545,
-0.15793868899345398,
-0.017865799367427826,
0.08650479465723038,
-0.00420278450474143,
-0.024789050221443176,
-0.10228614509105682,
0.024746622890233994,
0.016695773229002953,
-0.008225910365581512,
-0.12663723528385162,
-0.03015286847949028,
0.02930605225265026,
-0.1463223546743393,
0.025121303275227547,
-0.06900617480278015,
0.04800525680184364,
0.013594716787338257,
-0.03318023681640625,
-0.021950239315629005,
0.013441511429846287,
0.017604129388928413,
-0.027337081730365753,
-0.22970223426818848,
-0.022300025448203087,
-0.03795382380485535,
0.16395093500614166,
-0.22768230736255646,
0.03959988057613373,
0.057571060955524445,
0.14559322595596313,
-0.002035774290561676,
-0.05482302978634834,
0.026347527280449867,
-0.06407148391008377,
-0.026518363505601883,
-0.054260484874248505,
0.004980194848030806,
-0.016927076503634453,
-0.04377713426947594,
0.025113888084888458,
-0.16963110864162445,
-0.041066963225603104,
0.1009739562869072,
0.05045260116457939,
-0.12397581338882446,
-0.04221658408641815,
-0.030096091330051422,
-0.05769895017147064,
-0.04544999822974205,
-0.059382662177085876,
0.10349436849355698,
0.056902505457401276,
0.039259765297174454,
-0.06905653327703476,
-0.07481072843074799,
-0.00414417777210474,
-0.021936526522040367,
-0.024703575298190117,
0.09668172895908356,
0.07970423996448517,
-0.1298835575580597,
0.09787239134311676,
0.08421709388494492,
0.05604126676917076,
0.08308221399784088,
-0.022293850779533386,
-0.07451046258211136,
-0.026380794122815132,
0.035747330635786057,
0.018736932426691055,
0.12649855017662048,
-0.07429172098636627,
0.038807980716228485,
0.045625898987054825,
-0.03244946897029877,
0.025344355031847954,
-0.08402993530035019,
0.01796797104179859,
0.021251529455184937,
-0.021030860021710396,
0.027411481365561485,
-0.04135683551430702,
0.013339306227862835,
0.08553782850503922,
0.05025739222764969,
0.020280251279473305,
0.020059505477547646,
-0.049937810748815536,
-0.11606193333864212,
0.1606271117925644,
-0.11436411738395691,
-0.2052009552717209,
-0.13315674662590027,
0.03165067359805107,
0.03760804608464241,
-0.016980579122900963,
-0.0016035162843763828,
-0.052950531244277954,
-0.10256816446781158,
-0.09110390394926071,
0.007514104712754488,
0.04145568981766701,
-0.09290070086717606,
-0.03897719830274582,
0.03872610628604889,
0.040360383689403534,
-0.13889461755752563,
0.016270309686660767,
0.04555486887693405,
-0.08409781754016876,
-0.009337234310805798,
0.06195974722504616,
0.08807603269815445,
0.1949266940355301,
0.014133783988654613,
-0.012460295110940933,
0.022886890918016434,
0.22538986802101135,
-0.13757912814617157,
0.10110456496477127,
0.1277187615633011,
-0.07486125081777573,
0.08228196948766708,
0.21431635320186615,
0.040862374007701874,
-0.09166773408651352,
0.025235874578356743,
0.038505490869283676,
-0.022305194288492203,
-0.25058743357658386,
-0.07316450774669647,
-0.0049200523644685745,
-0.06417204439640045,
0.08241023123264313,
0.08798143267631531,
0.09824612736701965,
0.037020351737737656,
-0.08483944833278656,
-0.09180966019630432,
0.06513748317956924,
0.10895394533872604,
-0.006382278632372618,
0.006520338822156191,
0.09023737907409668,
-0.031119242310523987,
0.018800098448991776,
0.08879689127206802,
0.017039285972714424,
0.15405668318271637,
0.04811256006360054,
0.17380255460739136,
0.08999684453010559,
0.08171714097261429,
0.0005674214335158467,
0.019576899707317352,
0.010055757127702236,
0.04351765289902687,
-0.0028601118829101324,
-0.0789370909333229,
-0.016239887103438377,
0.11934800446033478,
0.05345144495368004,
0.01708330772817135,
0.02053074724972248,
-0.03853895887732506,
0.07606158405542374,
0.19789741933345795,
-0.005303108599036932,
-0.19659237563610077,
-0.05594524368643761,
0.07944575697183609,
-0.09087561815977097,
-0.10445090383291245,
0.0012648770352825522,
0.02207600325345993,
-0.16718564927577972,
0.0425507128238678,
-0.032484039664268494,
0.11075331270694733,
-0.10687651485204697,
-0.02149091102182865,
0.06949468702077866,
0.06122921034693718,
-0.01464743074029684,
0.07286236435174942,
-0.20449048280715942,
0.1130913719534874,
0.008275190368294716,
0.07164911180734634,
-0.09718446433544159,
0.08861534297466278,
0.0005678947200067341,
-0.01952086202800274,
0.1592944860458374,
-0.005452786572277546,
-0.0763486698269844,
-0.06527400016784668,
-0.09441093355417252,
-0.007634507026523352,
0.09762918204069138,
-0.12904053926467896,
0.08223655819892883,
-0.033019669353961945,
-0.032815903425216675,
0.0006813490763306618,
-0.09557481110095978,
-0.11922643333673477,
-0.17144067585468292,
0.05302504077553749,
-0.11941733956336975,
0.03681783378124237,
-0.1064034253358841,
-0.030312927439808846,
-0.03676198795437813,
0.18239666521549225,
-0.20175118744373322,
-0.0813034176826477,
-0.13755548000335693,
-0.10123365372419357,
0.1406746506690979,
-0.043102510273456573,
0.09645568579435349,
-0.008794162422418594,
0.16338226199150085,
0.009652632288634777,
-0.01340216863900423,
0.07741600275039673,
-0.09172298014163971,
-0.2017849236726761,
-0.06434639543294907,
0.16040602326393127,
0.1114422157406807,
0.03748288005590439,
0.005051454529166222,
0.0383947417140007,
-0.028148308396339417,
-0.11146315932273865,
0.025262581184506416,
0.1439998298883438,
0.08248871564865112,
0.0029930840246379375,
-0.01808653026819229,
-0.14209192991256714,
-0.08416560292243958,
-0.04521634057164192,
0.02476728893816471,
0.16054345667362213,
-0.07303125411272049,
0.1599406898021698,
0.13895317912101746,
-0.0657365694642067,
-0.2002221941947937,
0.006290470249950886,
0.026827840134501457,
-0.010940702632069588,
0.01498403213918209,
-0.17745088040828705,
0.08298364281654358,
0.011942053213715553,
-0.05820121243596077,
0.08775683492422104,
-0.1857975721359253,
-0.13893114030361176,
0.08376453816890717,
0.05419633537530899,
-0.2044990211725235,
-0.13950414955615997,
-0.0963040292263031,
-0.039252035319805145,
-0.1592206358909607,
0.0923963189125061,
-0.002233225852251053,
0.0011471696197986603,
0.03607133403420448,
0.012751033529639244,
0.027581125497817993,
-0.05559009686112404,
0.184194415807724,
0.002587853232398629,
0.029408378526568413,
-0.0853973999619484,
-0.10160977393388748,
0.029257556423544884,
-0.04654192179441452,
0.0771842896938324,
-0.03388750180602074,
0.015159969218075275,
-0.1126389130949974,
-0.04229956865310669,
-0.05704052001237869,
0.015501349233090878,
-0.10054408013820648,
-0.0924275815486908,
-0.04786914214491844,
0.08759854733943939,
0.10720651596784592,
-0.020067252218723297,
-0.0310191810131073,
-0.08270808309316635,
0.0662069097161293,
0.22154001891613007,
0.19471275806427002,
0.0765640065073967,
-0.06753472983837128,
0.0009091472602449358,
-0.02554071880877018,
0.04510895162820816,
-0.1993175894021988,
0.054956164211034775,
0.06033334508538246,
0.018439872190356255,
0.11060556769371033,
-0.026344653218984604,
-0.14409807324409485,
-0.0706387609243393,
0.060687728226184845,
-0.05803653597831726,
-0.20370253920555115,
0.01122827548533678,
0.05618138238787651,
-0.1703799068927765,
-0.04229617491364479,
0.03580870106816292,
-0.015399953350424767,
-0.03552314639091492,
0.011711730621755123,
0.0959220603108406,
-0.007616850081831217,
0.09149198979139328,
0.07215461134910583,
0.09309251606464386,
-0.09687890857458115,
0.08689413219690323,
0.09960173070430756,
-0.061329230666160583,
0.030580563470721245,
0.09173155575990677,
-0.0525328703224659,
-0.0394037663936615,
0.05599869415163994,
0.10034316778182983,
0.01840079389512539,
-0.05585464462637901,
0.0036246594972908497,
-0.09255160391330719,
0.060889314860105515,
0.1093989685177803,
0.024813814088702202,
0.012469364330172539,
0.05441085249185562,
0.032194435596466064,
-0.08214732259511948,
0.12223849445581436,
0.056204844266176224,
0.014192699454724789,
-0.042528290301561356,
-0.014953055419027805,
0.010201917961239815,
-0.03509276360273361,
-0.005019268952310085,
-0.008046603761613369,
-0.07746861129999161,
-0.007181765045970678,
-0.1403387039899826,
0.022289810702204704,
-0.08175966888666153,
0.013787015341222286,
0.023796403780579567,
-0.026725906878709793,
0.007821079343557358,
0.0001906560646602884,
-0.07514636218547821,
-0.05624905601143837,
-0.009548153728246689,
0.10408845543861389,
-0.1651892215013504,
0.01900539919734001,
0.08338131755590439,
-0.10459206998348236,
0.09141772240400314,
-0.008385900408029556,
-0.009890591725707054,
0.007888508029282093,
-0.16260425746440887,
0.052401088178157806,
-0.027303528040647507,
0.004645921755582094,
0.004283140879124403,
-0.18384072184562683,
-0.00428239069879055,
-0.03126665577292442,
-0.06947306543588638,
-0.0062017133459448814,
-0.02631635218858719,
-0.11096397042274475,
0.09245428442955017,
0.010684546083211899,
-0.08446349948644638,
-0.02270403504371643,
0.035123277455568314,
0.090272918343544,
-0.04115687683224678,
0.14325648546218872,
-0.01693929173052311,
0.06450200825929642,
-0.16913360357284546,
-0.00916238036006689,
-0.008691014721989632,
0.020511409267783165,
-0.06107008457183838,
-0.007177166640758514,
0.048090312629938126,
-0.022019952535629272,
0.18647055327892303,
-0.028773868456482887,
0.010638263076543808,
0.060599107295274734,
0.034032776951789856,
-0.0006982669583521783,
0.09846311807632446,
0.0685117244720459,
0.010936494916677475,
0.007215898018330336,
0.011858666315674782,
-0.04749750345945358,
-0.03902202472090721,
-0.18028034269809723,
0.0655660554766655,
0.20538455247879028,
0.10305459797382355,
-0.02155209518969059,
0.06722000241279602,
-0.1133543998003006,
-0.09935101121664047,
0.139744371175766,
-0.03790305182337761,
-0.004522405564785004,
-0.07542582601308823,
0.14351864159107208,
0.14540117979049683,
-0.19453977048397064,
0.07385516166687012,
-0.07807494699954987,
-0.04793347790837288,
-0.10093287378549576,
-0.20358964800834656,
-0.06594622880220413,
-0.039517082273960114,
-0.013079931028187275,
-0.0574735552072525,
0.06760978698730469,
0.08985619992017746,
-0.004441138822585344,
-0.01519135944545269,
0.06695786863565445,
-0.034570787101984024,
-0.0002581503358669579,
0.03333301469683647,
0.056985706090927124,
0.0183477234095335,
-0.06424097716808319,
0.01065297331660986,
-0.009701683185994625,
0.04975178465247154,
0.07717985659837723,
0.03756337985396385,
-0.024198200553655624,
0.013455628417432308,
-0.029356906190514565,
-0.10353074967861176,
0.050629157572984695,
-0.021355563774704933,
-0.053891800343990326,
0.1481945961713791,
0.025117632001638412,
0.006510977167636156,
-0.009177884086966515,
0.2235785275697708,
-0.06651077419519424,
-0.10774586349725723,
-0.14945030212402344,
0.07872933149337769,
-0.052336301654577255,
0.045451220124959946,
0.052770089358091354,
-0.11167564243078232,
0.02688967064023018,
0.15125538408756256,
0.16306868195533752,
-0.03116506338119507,
0.013097112998366356,
0.027573687955737114,
0.005935588851571083,
-0.025699689984321594,
0.039286959916353226,
0.04900084063410759,
0.13694828748703003,
-0.06408719718456268,
0.07125121355056763,
0.0047598169185221195,
-0.08259867876768112,
-0.020779481157660484,
0.12264706194400787,
-0.014121388085186481,
0.0016442675841972232,
-0.05498780682682991,
0.12352734804153442,
-0.06982193142175674,
-0.20818258821964264,
0.03547022119164467,
-0.07633711397647858,
-0.13274122774600983,
-0.027499711140990257,
0.04813192039728165,
0.0004025342350360006,
0.01910916157066822,
0.07624450325965881,
-0.06742072850465775,
0.172502800822258,
0.03384549170732498,
-0.06627769768238068,
-0.050338875502347946,
0.0812113881111145,
-0.07910801470279694,
0.3027973771095276,
0.018784107640385628,
0.048531800508499146,
0.1071595773100853,
-0.01723645254969597,
-0.1239522397518158,
0.02724023349583149,
0.10326878726482391,
-0.07115544378757477,
0.0582791231572628,
0.1616062968969345,
-0.005169853568077087,
0.12930142879486084,
0.07263926416635513,
-0.07595457136631012,
0.0463075116276741,
-0.07637964189052582,
-0.07506715506315231,
-0.10420151054859161,
0.09952887147665024,
-0.08918270468711853,
0.14938555657863617,
0.12495137006044388,
-0.05796056613326073,
0.009720006957650185,
-0.025794677436351776,
0.06097874417901039,
-0.008937055245041847,
0.1251431554555893,
0.009622495621442795,
-0.18729031085968018,
0.029527664184570312,
-0.027558233588933945,
0.10235998779535294,
-0.18572887778282166,
-0.07891097664833069,
0.03731522336602211,
0.0011901403777301311,
-0.0808538869023323,
0.11897921562194824,
0.0759904533624649,
0.029184773564338684,
-0.04821867495775223,
-0.026540882885456085,
-0.010870136320590973,
0.1479586809873581,
-0.09371433407068253,
-0.004159667529165745
] |
null | null | diffusers |
# Text-to-image finetuning - yurman/mri_full_512_v2_base
This pipeline was finetuned from **stabilityai/stable-diffusion-2-base**
on the **OASIS-3** dataset for brain image generation.
Below are some example images generated with the finetuned pipeline:

## Pipeline usage
You can use the pipeline like so:
```python
from diffusers import DiffusionPipeline
import torch

# Load the finetuned pipeline with half-precision weights.
pipeline = DiffusionPipeline.from_pretrained("yurman/mri_full_512_v2_base", torch_dtype=torch.float16)
# float16 weights are intended for GPU inference; move the pipeline to a CUDA device.
pipeline = pipeline.to("cuda")

prompt = "An empty, flat black image with a MRI brain axial scan in the center"
image = pipeline(prompt).images[0]
image.save("my_image.png")
```
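The call above uses the pipeline defaults. For finer control, the standard Stable Diffusion sampling arguments can be passed as well; the values below are illustrative defaults rather than settings tied to this finetune:
```python
# Illustrative settings only; tune for your own quality/speed trade-off.
image = pipeline(
    prompt,
    num_inference_steps=50,  # more denoising steps, usually sharper results
    guidance_scale=7.5,      # how strongly generation follows the prompt
).images[0]
```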
## Training info
These are the key hyperparameters used during training:
* Epochs: 6
* Learning rate: 0.0001
* embeds rate: 0.0001
* Batch size: 8
* Gradient accumulation steps: 4
* Image resolution: 512
* Mixed-precision: None
| {"license": "creativeml-openrail-m", "tags": ["stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "diffusers"], "datasets": ["OASIS-3"], "base_model": "stabilityai/stable-diffusion-2-base", "inference": true} | text-to-image | yurman/mri_full_512_v2_base | [
"diffusers",
"tensorboard",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"dataset:OASIS-3",
"base_model:stabilityai/stable-diffusion-2-base",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2024-02-15T06:03:24+00:00 | [] | [] | TAGS
#diffusers #tensorboard #safetensors #stable-diffusion #stable-diffusion-diffusers #text-to-image #dataset-OASIS-3 #base_model-stabilityai/stable-diffusion-2-base #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
|
# Text-to-image finetuning - yurman/mri_full_512_v2_base
This pipeline was finetuned from stabilityai/stable-diffusion-2-base
on the OASIS-3 dataset for brain image generation.
Below are some example images generated with the finetuned pipeline:
!val_imgs_grid
## Pipeline usage
You can use the pipeline like so:
## Training info
These are the key hyperparameters used during training:
* Epochs: 6
* Learning rate: 0.0001
* embeds rate: 0.0001
* Batch size: 8
* Gradient accumulation steps: 4
* Image resolution: 512
* Mixed-precision: None
| [
"# Text-to-image finetuning - yurman/mri_full_512_v2_base\n\nThis pipeline was finetuned from stabilityai/stable-diffusion-2-base\non the OASIS-3 dataset for brain image generation.\nBelow are some example images generated with the finetuned pipeline: \n\n!val_imgs_grid",
"## Pipeline usage\n\nYou can use the pipeline like so:",
"## Training info\n\nThese are the key hyperparameters used during training:\n\n* Epochs: 6\n* Learning rate: 0.0001\n* embeds rate: 0.0001\n* Batch size: 8\n* Gradient accumulation steps: 4\n* Image resolution: 512\n* Mixed-precision: None"
] | [
"TAGS\n#diffusers #tensorboard #safetensors #stable-diffusion #stable-diffusion-diffusers #text-to-image #dataset-OASIS-3 #base_model-stabilityai/stable-diffusion-2-base #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n",
"# Text-to-image finetuning - yurman/mri_full_512_v2_base\n\nThis pipeline was finetuned from stabilityai/stable-diffusion-2-base\non the OASIS-3 dataset for brain image generation.\nBelow are some example images generated with the finetuned pipeline: \n\n!val_imgs_grid",
"## Pipeline usage\n\nYou can use the pipeline like so:",
"## Training info\n\nThese are the key hyperparameters used during training:\n\n* Epochs: 6\n* Learning rate: 0.0001\n* embeds rate: 0.0001\n* Batch size: 8\n* Gradient accumulation steps: 4\n* Image resolution: 512\n* Mixed-precision: None"
] | [
101,
83,
13,
63
] | [
"passage: TAGS\n#diffusers #tensorboard #safetensors #stable-diffusion #stable-diffusion-diffusers #text-to-image #dataset-OASIS-3 #base_model-stabilityai/stable-diffusion-2-base #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n# Text-to-image finetuning - yurman/mri_full_512_v2_base\n\nThis pipeline was finetuned from stabilityai/stable-diffusion-2-base\non the OASIS-3 dataset for brain image generation.\nBelow are some example images generated with the finetuned pipeline: \n\n!val_imgs_grid## Pipeline usage\n\nYou can use the pipeline like so:## Training info\n\nThese are the key hyperparameters used during training:\n\n* Epochs: 6\n* Learning rate: 0.0001\n* embeds rate: 0.0001\n* Batch size: 8\n* Gradient accumulation steps: 4\n* Image resolution: 512\n* Mixed-precision: None"
] | [
-0.1461978405714035,
0.09117377549409866,
-0.00415619695559144,
0.027250027284026146,
0.08651189506053925,
0.00032440113136544824,
0.14110121130943298,
0.1057717353105545,
-0.02083645761013031,
0.15352033078670502,
0.07539955526590347,
0.028215447440743446,
0.10343742370605469,
0.17983472347259521,
-0.06331316381692886,
-0.21374556422233582,
0.029932308942079544,
-0.06285281479358673,
-0.0622713603079319,
0.059613652527332306,
0.05693080276250839,
-0.11559788882732391,
0.029371004551649094,
-0.06794748455286026,
-0.07271092385053635,
-0.028455615043640137,
-0.006276718340814114,
-0.06994988769292831,
0.028890497982501984,
0.0298483707010746,
0.03175995871424675,
0.05391576513648033,
0.09290006756782532,
-0.19244897365570068,
0.023701366037130356,
0.0906086340546608,
0.019780080765485764,
0.05789695307612419,
0.06004913151264191,
-0.009244272485375404,
0.1274808943271637,
-0.09229104965925217,
0.044159822165966034,
0.02306874468922615,
-0.09170153737068176,
-0.14631840586662292,
-0.036422550678253174,
0.03936251997947693,
0.1588519811630249,
0.06762497872114182,
-0.04311060905456543,
0.05095510557293892,
-0.09247011691331863,
0.0672379806637764,
0.1839297115802765,
-0.18881714344024658,
-0.06383588165044785,
0.0688936784863472,
-0.0003267666616011411,
0.10460251569747925,
-0.18876558542251587,
0.013105004094541073,
0.034170251339673996,
0.011778601445257664,
0.06763282418251038,
0.01640854775905609,
0.16858018934726715,
0.0008342033252120018,
-0.08294542133808136,
0.010113432072103024,
0.09756540507078171,
0.06055249646306038,
-0.04740322008728981,
-0.15625473856925964,
-0.05262858420610428,
-0.0702129378914833,
-0.06397983431816101,
-0.009508397430181503,
0.05698772147297859,
-0.03335074335336685,
-0.03773977980017662,
-0.0354907400906086,
-0.11659955978393555,
-0.01197581272572279,
-0.008540411479771137,
0.07411278039216995,
0.03892730548977852,
0.035448864102363586,
0.04084146395325661,
0.12887822091579437,
0.011724866926670074,
-0.09546969085931778,
-0.04921381175518036,
0.002245259704068303,
-0.0936880111694336,
0.02246621996164322,
-0.007085069548338652,
-0.012439556419849396,
0.013255365192890167,
0.10350604355335236,
0.018875304609537125,
0.07164396345615387,
0.022586863487958908,
0.05581558123230934,
-0.024675974622368813,
0.07788398861885071,
-0.06693022698163986,
-0.008003946393728256,
0.01350098755210638,
0.12974140048027039,
-0.031657807528972626,
-0.01572689786553383,
-0.021507997065782547,
0.08344842493534088,
0.03389425575733185,
0.0359121635556221,
-0.02575519308447838,
-0.018567457795143127,
-0.0580914281308651,
-0.016705967485904694,
0.005578902550041676,
-0.13599741458892822,
0.04812827333807945,
0.05474163964390755,
-0.0929945781826973,
0.12676295638084412,
0.018468985334038734,
-0.006769523955881596,
-0.07642243802547455,
0.08972221612930298,
-0.08180223405361176,
-0.0027757338248193264,
-0.059196360409259796,
0.004006980452686548,
-0.00632035918533802,
-0.08181638270616531,
-0.039069872349500656,
-0.0815889984369278,
-0.18345610797405243,
-0.04468505457043648,
0.09253998845815659,
-0.0653943195939064,
0.07429571449756622,
-0.08576266467571259,
-0.07814837992191315,
0.05624065548181534,
-0.004283055197447538,
0.06210932508111,
-0.039841219782829285,
0.09694042801856995,
-0.010581782087683678,
0.024591244757175446,
0.07617348432540894,
0.016168279573321342,
-0.03729715570807457,
0.01875816099345684,
-0.10900064557790756,
0.12132799625396729,
-0.040226444602012634,
0.0013596388744190335,
-0.12629924714565277,
-0.06953225284814835,
-0.048093635588884354,
0.021204184740781784,
0.07280416041612625,
0.16428923606872559,
-0.1901220828294754,
-0.07132413238286972,
0.15661802887916565,
-0.04722870513796806,
-0.047760192304849625,
0.1084376722574234,
-0.006195242051035166,
-0.04887957498431206,
0.09578628093004227,
0.14412999153137207,
-0.06856763362884521,
-0.2243330031633377,
-0.04526088014245033,
-0.06696150451898575,
0.07282459735870361,
0.01725541055202484,
0.10483045876026154,
0.032547637820243835,
0.13437911868095398,
0.03496891260147095,
-0.09273897856473923,
0.0529242604970932,
-0.06084562465548515,
-0.059897858649492264,
-0.006560015492141247,
-0.0662289410829544,
-0.05367724597454071,
0.06053692102432251,
0.024922387674450874,
-0.03306658938527107,
-0.1350892335176468,
-0.011727727949619293,
0.10436785221099854,
-0.13292865455150604,
0.010796272195875645,
-0.07190246134996414,
0.10362253338098526,
-0.09622828662395477,
0.01568690873682499,
-0.12035279721021652,
-0.06892967224121094,
0.007694666739553213,
0.06711377948522568,
-0.008830871433019638,
0.03555302321910858,
0.09503111988306046,
0.09148181229829788,
-0.06621139496564865,
-0.07934340089559555,
-0.0009433298837393522,
-0.03540550172328949,
-0.08373089134693146,
-0.14340715110301971,
0.015911085531115532,
-0.06356766819953918,
0.10814519971609116,
-0.2853916883468628,
0.009793475270271301,
-0.0343899205327034,
0.18457111716270447,
0.08827842026948929,
-0.09778369218111038,
-0.007116090040653944,
-0.06636492908000946,
-0.01610696129500866,
-0.10676944255828857,
0.02460232377052307,
-0.019463231787085533,
-0.06291911751031876,
-0.003367799101397395,
-0.14521420001983643,
0.13312895596027374,
0.08959430456161499,
0.14414715766906738,
-0.05738912150263786,
-0.10625229775905609,
-0.09409774094820023,
-0.027971213683485985,
-0.03913549706339836,
0.013363831676542759,
0.14465674757957458,
0.030862927436828613,
0.10764169692993164,
-0.08957044035196304,
-0.010715492069721222,
0.02704048529267311,
-0.005090588238090277,
-0.04531128332018852,
0.13223347067832947,
-0.015121325850486755,
-0.08952178061008453,
0.06926663964986801,
0.1291104406118393,
-0.006578298285603523,
0.09178958088159561,
-0.06570390611886978,
-0.14558757841587067,
-0.0505279004573822,
0.03779464215040207,
0.018353551626205444,
0.16899536550045013,
0.023170052096247673,
0.024470249190926552,
0.004050104413181543,
0.028310997411608696,
0.01866406947374344,
-0.13706988096237183,
0.006339262705296278,
0.008117456920444965,
-0.03819538280367851,
0.103170245885849,
-0.0027153491973876953,
-0.04738588631153107,
0.10333748161792755,
-0.06225401535630226,
-0.04955137148499489,
-0.01797042414546013,
-0.025467896834015846,
-0.07585006207227707,
0.20283882319927216,
-0.07864416390657425,
-0.1506139636039734,
-0.0024781532119959593,
0.03612024337053299,
-0.0009786665905267,
-0.004809433128684759,
0.014091531746089458,
-0.06290903687477112,
-0.07800264656543732,
-0.10941172391176224,
0.034055210649967194,
0.05266639217734337,
0.008669587783515453,
0.02803155407309532,
0.003912085201591253,
0.03395194560289383,
-0.06309682875871658,
-0.0027288703713566065,
-0.05631083995103836,
-0.0474584735929966,
0.019295498728752136,
0.0451388917863369,
0.017698584124445915,
0.110777847468853,
0.05157819017767906,
0.02755795046687126,
-0.015216660685837269,
0.14645658433437347,
-0.10262349992990494,
0.07719491422176361,
0.08116656541824341,
0.05243765190243721,
0.041698332875967026,
0.07306470721960068,
0.033776771277189255,
-0.10769869387149811,
0.03845924884080887,
0.06468135863542557,
-0.002192603424191475,
-0.21193760633468628,
-0.01194094866514206,
-0.06045025959610939,
-0.03075212799012661,
0.0960225835442543,
0.05984914302825928,
-0.006305491551756859,
0.02049032598733902,
-0.01853429153561592,
-0.004172859713435173,
0.07723471522331238,
0.04780653119087219,
-0.07163987308740616,
-0.008553869090974331,
0.055455345660448074,
0.0011386135593056679,
-0.014463101513683796,
0.07234375178813934,
0.029873240739107132,
0.2579106390476227,
-0.04152766615152359,
0.10780306160449982,
0.012009023688733578,
0.057823602110147476,
-0.028706464916467667,
0.06443573534488678,
-0.010604542680084705,
-0.05680381506681442,
-0.012474967166781425,
-0.07221432775259018,
0.014917230233550072,
0.07457880675792694,
0.11104224622249603,
-0.0010263352887704968,
-0.10240956395864487,
0.0513555072247982,
0.05608120560646057,
0.17063240706920624,
0.11020425707101822,
-0.2742803394794464,
-0.02437528222799301,
0.052521780133247375,
-0.05248751491308212,
-0.04889928176999092,
0.05103124678134918,
0.12173975259065628,
-0.05431651324033737,
-0.019858065992593765,
-0.09249883145093918,
0.07737413048744202,
-0.051229752600193024,
-0.046857040375471115,
-0.016811218112707138,
0.12725421786308289,
-0.01265252847224474,
0.04376285523176193,
-0.19162429869174957,
0.17142052948474884,
-0.011045357212424278,
0.14367543160915375,
-0.04169746860861778,
-0.010791457258164883,
0.057298071682453156,
-0.021661851555109024,
0.17259687185287476,
-0.008129676803946495,
-0.1453765630722046,
-0.0521635003387928,
-0.18772092461585999,
0.020027726888656616,
0.08639226108789444,
-0.11357565224170685,
0.05476196110248566,
-0.014255587011575699,
-0.016068175435066223,
0.02140478976070881,
-0.08765226602554321,
-0.18725672364234924,
-0.12170787155628204,
0.02873334474861622,
0.015886828303337097,
-0.03908253833651543,
-0.10564834624528885,
-0.10004758834838867,
-0.03803171217441559,
0.13843299448490143,
-0.09914243966341019,
-0.04148174449801445,
-0.17240312695503235,
0.11244068294763565,
0.096452996134758,
-0.043326236307621,
-0.011980862356722355,
0.024245429784059525,
0.12927952408790588,
0.029347358271479607,
-0.07978010922670364,
0.09458601474761963,
-0.08010189980268478,
-0.17818792164325714,
-0.08835960179567337,
0.13419944047927856,
0.07279691100120544,
0.021926216781139374,
-0.017887040972709656,
0.017031880095601082,
0.021034812554717064,
-0.04120476543903351,
0.08882786333560944,
0.1545826643705368,
0.10523230582475662,
0.07662701606750488,
-0.030158616602420807,
-0.0637924000620842,
-0.0646946132183075,
-0.0016376591520383954,
0.0524771474301815,
0.25668370723724365,
-0.018411071971058846,
0.062339186668395996,
0.1873149871826172,
-0.08335646986961365,
-0.18500186502933502,
-0.006437734700739384,
0.06883058696985245,
0.06772983819246292,
-0.0036298225168138742,
-0.13970689475536346,
0.10300417244434357,
0.1063094437122345,
-0.027704717591404915,
0.13756467401981354,
-0.35670006275177,
-0.11382153630256653,
0.11399101465940475,
0.13001099228858948,
0.049245066940784454,
-0.1747814267873764,
-0.0009665509569458663,
-0.01394710037857294,
0.00758634926751256,
0.06562188267707825,
-0.022511299699544907,
0.12762008607387543,
-0.04810579493641853,
-0.04770834371447563,
0.024549907073378563,
-0.08963657915592194,
0.14456890523433685,
-0.01822095736861229,
0.06655498594045639,
-0.06020262837409973,
0.04563792049884796,
0.05292658507823944,
-0.08118516951799393,
0.06638450920581818,
-0.10668165981769562,
0.07144845277070999,
-0.107840396463871,
-0.043586745858192444,
-0.023999571800231934,
0.01894993521273136,
-0.016109619289636612,
-0.03031669370830059,
-0.0826287791132927,
0.11832580715417862,
0.08505924046039581,
0.02349851094186306,
0.06233054772019386,
0.011955156922340393,
-0.027660390362143517,
0.09087293595075607,
-0.006568266544491053,
0.033988192677497864,
-0.20444118976593018,
-0.09227114915847778,
0.030962355434894562,
0.024779781699180603,
-0.18328075110912323,
-0.01921163685619831,
0.18599413335323334,
0.008998318575322628,
0.20240318775177002,
0.032565779983997345,
-0.10542279481887817,
0.03614208847284317,
0.05414893850684166,
-0.07508783042430878,
-0.16780418157577515,
-0.0006581912166438997,
0.08657445758581161,
-0.08900506794452667,
0.042926836758852005,
0.11226048320531845,
-0.09664439409971237,
0.04595475643873215,
-0.009850969538092613,
0.11166000366210938,
-0.04480209946632385,
0.18971337378025055,
0.025799177587032318,
0.050315361469984055,
-0.05262118950486183,
0.16134139895439148,
0.03861284628510475,
-0.14586828649044037,
0.031069016084074974,
0.05764109268784523,
-0.04458041116595268,
-0.023677444085478783,
0.10383839160203934,
0.1257316917181015,
0.07358169555664062,
-0.020138252526521683,
-0.06434998661279678,
-0.10994957387447357,
0.08503270149230957,
-0.01859251968562603,
0.003340094583109021,
0.008498063310980797,
-0.060457538813352585,
0.01192003209143877,
-0.07986292243003845,
0.10768698900938034,
0.08987127244472504,
0.01844555139541626,
-0.13216890394687653,
-0.0436776764690876,
-0.01623418554663658,
-0.02457025833427906,
-0.03535463288426399,
-0.02869265154004097,
-0.11863009631633759,
0.020488156005740166,
-0.05729983374476433,
0.04229351505637169,
-0.03777603060007095,
-0.023211408406496048,
0.007296792697161436,
-0.05817697197198868,
-0.02201470173895359,
0.017089204862713814,
-0.07017030566930771,
-0.020787857472896576,
0.02974652871489525,
0.03541601821780205,
-0.10037009418010712,
-0.05934159830212593,
-0.027515875175595284,
-0.11936993151903152,
0.0869111567735672,
0.07854931801557541,
-0.018363915383815765,
0.028971655294299126,
-0.14227506518363953,
0.021099654957652092,
0.04723554104566574,
0.011733386665582657,
0.004796462133526802,
-0.06842370331287384,
0.010096605867147446,
-0.006653805263340473,
0.0032229211647063494,
-0.014518653973937035,
-0.004652258940041065,
-0.09650004655122757,
-0.0574130155146122,
-0.1363389939069748,
-0.030939750373363495,
-0.06469840556383133,
0.027117056772112846,
0.021853286772966385,
0.06620700657367706,
0.12704813480377197,
-0.04710359498858452,
0.06821303069591522,
-0.18296535313129425,
-0.003916231449693441,
0.061444319784641266,
-0.06429543346166611,
0.06783901154994965,
-0.03320790454745293,
0.08785706758499146,
-0.06903274357318878,
0.05647756904363632,
-0.037379104644060135,
-0.060051240026950836,
0.01140251848846674,
-0.12436740845441818,
0.0017534649232402444,
0.08843313902616501,
0.13849100470542908,
0.04000331833958626,
-0.006578373722732067,
-0.03976237773895264,
-0.0038102564867585897,
0.09372679889202118,
0.09975141286849976,
0.17194011807441711,
0.20204780995845795,
0.006687959656119347,
0.17068295180797577,
-0.020099936053156853,
-0.11380167305469513,
-0.133270263671875,
0.15226586163043976,
-0.11225187033414841,
0.09461262077093124,
-0.06144411116838455,
0.11424938589334488,
0.14269594848155975,
-0.1804649531841278,
-0.011390196159482002,
-0.055472712963819504,
-0.09641009569168091,
-0.06525634229183197,
-0.1310625970363617,
-0.06590677052736282,
-0.09590304642915726,
0.03624791279435158,
-0.08902353793382645,
0.015786368399858475,
0.03684772923588753,
0.09021714329719543,
0.01762319542467594,
0.14499783515930176,
0.018049729987978935,
-0.01221522781997919,
0.1722663938999176,
0.0416739359498024,
-0.053818654268980026,
-0.014646359719336033,
-0.039922915399074554,
0.03321676701307297,
-0.0586070790886879,
0.08138727396726608,
0.01032648328691721,
0.0033278665505349636,
0.0894501730799675,
0.02591678500175476,
-0.07034456729888916,
0.020579269155859947,
-0.01154439989477396,
0.023802196606993675,
0.08293294906616211,
0.0521116629242897,
-0.021465908735990524,
-0.05052826553583145,
0.18215586245059967,
-0.07578714936971664,
0.005693765357136726,
-0.12813818454742432,
0.08673105388879776,
-0.008068563416600227,
0.010429766960442066,
0.025021903216838837,
-0.12843775749206543,
-0.020745839923620224,
0.19802722334861755,
0.09747527539730072,
-0.10673949867486954,
-0.043081872165203094,
0.01880539581179619,
-0.019660552963614464,
-0.08986616134643555,
0.12199798971414566,
0.05584603175520897,
0.19808180630207062,
-0.0651877224445343,
-0.002130229026079178,
0.01006856095045805,
-0.09985470026731491,
-0.04225561022758484,
0.046147119253873825,
-0.031803227961063385,
0.05706290900707245,
-0.08204484730958939,
0.07303845882415771,
-0.0293502826243639,
-0.24310500919818878,
0.11663324385881424,
-0.09225606918334961,
-0.12361662834882736,
-0.05208617076277733,
0.09320470690727234,
0.0041002193465828896,
0.025326132774353027,
-0.03666946664452553,
-0.001988619100302458,
0.09437225759029388,
-0.010989697650074959,
-0.008883018977940083,
-0.025676079094409943,
0.011607211083173752,
-0.06331920623779297,
0.21239006519317627,
-0.011807693168520927,
0.08082064241170883,
0.08298841118812561,
-0.019282346591353416,
-0.111241914331913,
0.02834210731089115,
0.041329581290483475,
-0.06409768760204315,
0.05290409177541733,
0.2249639630317688,
-0.03270998224616051,
0.05565667524933815,
0.07306154072284698,
-0.0516367107629776,
-0.002007037401199341,
-0.03159858286380768,
-0.05746116489171982,
-0.08320318162441254,
0.06708262115716934,
-0.0449560284614563,
0.09550149738788605,
0.09235293418169022,
-0.024875514209270477,
-0.0048050470650196075,
-0.01298361737281084,
0.10218638926744461,
0.03871876373887062,
0.05417179316282272,
0.08386112004518509,
-0.18993687629699707,
-0.0039013130590319633,
-0.003628435079008341,
0.04113508388400078,
-0.20516657829284668,
-0.09789172559976578,
-0.025983713567256927,
-0.06887929886579514,
-0.085355244576931,
0.1405765563249588,
0.1209213137626648,
0.01695120707154274,
-0.07339349389076233,
-0.11669370532035828,
-0.01980414427816868,
0.08807753771543503,
-0.08327510952949524,
-0.03043452277779579
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-malayalam
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1600
- Wer: 42.0254
## Model description
More information needed
## Intended uses & limitations
More information needed
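Until the card is filled in, the minimal sketch below shows the obvious intended use, transcribing Malayalam speech (going by the model name) with the standard 🤗 Transformers ASR pipeline. The audio path is a placeholder, and it is assumed the checkpoint was pushed with everything the pipeline needs (feature extractor, tokenizer, generation config):
```python
import torch
from transformers import pipeline

device = 0 if torch.cuda.is_available() else -1

# Load the fine-tuned checkpoint with the standard ASR pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="Bajiyo/whisper-small-malayalam",
    device=device,
)

# "sample_malayalam.wav" is a placeholder path; the pipeline resamples
# common audio formats to the 16 kHz input Whisper expects.
result = asr("sample_malayalam.wav")
print(result["text"])
```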
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 2000
- mixed_precision_training: Native AMP
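These settings map roughly onto the standard 🤗 Transformers `Seq2SeqTrainingArguments`. The sketch below is an approximation for readers who want a comparable setup, not the exact configuration of this run; the `output_dir` is a placeholder and the Adam betas/epsilon are the library defaults, which match the values listed above:
```python
from transformers import Seq2SeqTrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-malayalam",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=2000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```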
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.0098 | 2.31 | 500 | 0.1596 | 46.6036 |
| 0.0038 | 4.63 | 1000 | 0.1561 | 44.5882 |
| 0.0007 | 6.94 | 1500 | 0.1534 | 41.9259 |
| 0.0001 | 9.26 | 2000 | 0.1600 | 42.0254 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.1+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["wer"], "base_model": "openai/whisper-small", "model-index": [{"name": "whisper-small-malayalam", "results": []}]} | automatic-speech-recognition | Bajiyo/whisper-small-malayalam | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:03:38+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us
| whisper-small-malayalam
=======================
This model is a fine-tuned version of openai/whisper-small on an unspecified dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1600
* Wer: 42.0254
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* training\_steps: 2000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.1+cu121
* Datasets 2.16.1
* Tokenizers 0.15.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 2000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 2000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
69,
130,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #base_model-openai/whisper-small #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 2000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1"
] | [
-0.12776117026805878,
0.13606151938438416,
-0.00232751970179379,
0.08276759088039398,
0.10686745494604111,
0.004357135389000177,
0.15126881003379822,
0.1297352910041809,
-0.039641160517930984,
0.08475792407989502,
0.10067412257194519,
0.08730136603116989,
0.05420996993780136,
0.16286642849445343,
-0.047568429261446,
-0.2572926878929138,
0.02560974657535553,
-0.007698305416852236,
-0.03771281614899635,
0.11808313429355621,
0.07870247215032578,
-0.12092115730047226,
0.03472428023815155,
-0.011193093843758106,
-0.11298304796218872,
-0.04072005674242973,
-0.005558764096349478,
-0.06977112591266632,
0.12164813280105591,
0.001508561777882278,
0.0945739820599556,
0.05975979566574097,
0.0954371988773346,
-0.22450073063373566,
0.014749865978956223,
0.039827730506658554,
0.036322079598903656,
0.07144131511449814,
0.04253971204161644,
0.0027556836139410734,
0.04320909455418587,
-0.06695936620235443,
0.0765455961227417,
0.023683743551373482,
-0.10102551430463791,
-0.23880819976329803,
-0.08348187059164047,
0.04910153150558472,
0.09931395202875137,
0.08137135207653046,
-0.014723594300448895,
0.1015542522072792,
-0.058033764362335205,
0.09453576803207397,
0.2462919056415558,
-0.28863003849983215,
-0.05028994381427765,
-0.03406450152397156,
0.035782407969236374,
0.0810505673289299,
-0.08982077986001968,
-0.03512892127037048,
0.03513304889202118,
0.03556611388921738,
0.10008468478918076,
-0.00017487439617980272,
-0.09512095153331757,
-0.018625877797603607,
-0.15178127586841583,
-0.06400691717863083,
0.12267617881298065,
0.02237650752067566,
-0.04485916346311569,
-0.1001952588558197,
-0.05450086668133736,
-0.19071811437606812,
-0.05234246701002121,
0.006398417521268129,
0.027814265340566635,
-0.04488717019557953,
-0.08466114848852158,
-0.0096859410405159,
-0.09187217056751251,
-0.0960383489727974,
-0.016705337911844254,
0.1620178520679474,
0.04448310285806656,
-0.0009873637463897467,
-0.004883807152509689,
0.0875563994050026,
0.020281396806240082,
-0.1526169627904892,
-0.004791418090462685,
0.03767590969800949,
-0.04482368752360344,
-0.009397639892995358,
-0.044537197798490524,
-0.01853407919406891,
0.0425664521753788,
0.15140827000141144,
-0.06329111009836197,
0.06337982416152954,
0.0010891754645854235,
0.04768025502562523,
-0.10127323120832443,
0.16938212513923645,
-0.05110226199030876,
-0.010633822530508041,
-0.0008679964230395854,
0.12068202346563339,
0.05330952629446983,
-0.03070376068353653,
-0.0881609171628952,
0.01917707733809948,
0.10380949825048447,
0.05505204573273659,
-0.023021895438432693,
0.0488714762032032,
-0.06144730746746063,
-0.00679780961945653,
-0.01280431542545557,
-0.12039536237716675,
0.006191304884850979,
0.03261272981762886,
-0.056425340473651886,
-0.07307158410549164,
0.020247865468263626,
0.016467232257127762,
-0.02423911541700363,
0.08202379196882248,
-0.061717040836811066,
0.00014885433483868837,
-0.05728435143828392,
-0.10266058146953583,
0.0325007326900959,
-0.0940694734454155,
-0.005691071972250938,
-0.08690973371267319,
-0.1273786872625351,
-0.030000701546669006,
0.03444233164191246,
-0.029796438291668892,
-0.06080435588955879,
-0.07694444060325623,
-0.09534702450037003,
0.030079741030931473,
-0.02941996417939663,
0.07157866656780243,
-0.06299062073230743,
0.11179815977811813,
0.019297774881124496,
0.08476021140813828,
0.0003673521860036999,
0.047257281839847565,
-0.074834905564785,
0.04118431732058525,
-0.15505559742450714,
0.0727795735001564,
-0.10137011110782623,
0.061665501445531845,
-0.10174312442541122,
-0.09275025874376297,
0.021723061800003052,
-0.012771951965987682,
0.10791975259780884,
0.11884893476963043,
-0.20243462920188904,
-0.07368645071983337,
0.20757870376110077,
-0.12108142673969269,
-0.11078949272632599,
0.14440561830997467,
-0.020777003839612007,
-0.002067710505798459,
0.03950083628296852,
0.24121427536010742,
0.09227573871612549,
-0.11518640071153641,
-0.007259710691869259,
-0.009225782938301563,
0.07065226137638092,
-0.03152981027960777,
0.08590222150087357,
-0.010805482976138592,
0.0286283977329731,
0.018906164914369583,
-0.03349333629012108,
0.05054205656051636,
-0.09402649849653244,
-0.0973798930644989,
-0.02709284797310829,
-0.0974915400147438,
0.03836072236299515,
0.04507994279265404,
0.032773371785879135,
-0.10289044678211212,
-0.08861078321933746,
0.0040205372497439384,
0.11348801106214523,
-0.09729371219873428,
0.0237712524831295,
-0.10101331770420074,
0.0950651541352272,
-0.020020300522446632,
-0.02210228517651558,
-0.16510428488254547,
-0.008972522802650928,
0.0375509150326252,
-0.035921357572078705,
0.022756949067115784,
-0.061707597225904465,
0.06837517023086548,
0.06630542874336243,
-0.027799366042017937,
-0.05010436847805977,
-0.027341369539499283,
-0.0007042569341138005,
-0.0890171155333519,
-0.2223653495311737,
-0.05013313516974449,
-0.04089302197098732,
0.16694603860378265,
-0.18252046406269073,
0.030897099524736404,
0.01569278910756111,
0.08996020257472992,
0.03730589151382446,
-0.03346436098217964,
0.007245006505399942,
0.08296691626310349,
-0.007572465110570192,
-0.054996609687805176,
0.05946386605501175,
0.03249238058924675,
-0.10008732974529266,
0.03159324824810028,
-0.16883356869220734,
0.10249935835599899,
0.12077748030424118,
0.008632492274045944,
-0.06959741562604904,
-0.02072242461144924,
-0.06272143125534058,
-0.03970817103981972,
-0.02121734991669655,
0.016182681545615196,
0.18449744582176208,
0.0038132851477712393,
0.127192884683609,
-0.09599384665489197,
-0.04472047835588455,
0.03512420877814293,
-0.03185386583209038,
0.003947099205106497,
0.13109475374221802,
0.0516500398516655,
-0.04765698313713074,
0.12393146753311157,
0.0869688093662262,
-0.08529891818761826,
0.14348796010017395,
-0.07124973833560944,
-0.07767350226640701,
-0.018484612926840782,
0.03498080372810364,
0.028565797954797745,
0.1251739263534546,
-0.13292928040027618,
-0.014228393323719501,
0.023239562287926674,
-0.0039087277837097645,
0.02901988849043846,
-0.22442471981048584,
-0.015249531716108322,
0.01920229010283947,
-0.08114198595285416,
-0.04149283468723297,
-0.01576867885887623,
0.015335693955421448,
0.10891745239496231,
-0.004736925940960646,
-0.08358263224363327,
0.0028838810976594687,
-0.028054434806108475,
-0.08611143380403519,
0.191089928150177,
-0.09943356364965439,
-0.16960321366786957,
-0.12002906203269958,
-0.018087785691022873,
-0.001600927673280239,
0.003044688142836094,
0.05614785850048065,
-0.07997831702232361,
-0.0161751639097929,
-0.10202135890722275,
-0.011759014800190926,
0.02705729939043522,
0.035583026707172394,
0.03934047743678093,
0.01975017599761486,
0.07577972114086151,
-0.1023569405078888,
0.000741401279810816,
-0.05416521430015564,
-0.029123565182089806,
0.04368114843964577,
0.04168003052473068,
0.09466354548931122,
0.17791910469532013,
0.0016121986554935575,
0.01590842753648758,
-0.03981652483344078,
0.18642593920230865,
-0.08224669098854065,
-0.04550119861960411,
0.1238507330417633,
-0.043916910886764526,
0.05797933042049408,
0.17969581484794617,
0.040238700807094574,
-0.09169923514127731,
0.0015821130946278572,
-0.001437186961993575,
-0.04333271458745003,
-0.2272864282131195,
-0.06237463280558586,
-0.0470476970076561,
0.04752019792795181,
0.09099925309419632,
0.02599846012890339,
-0.011428888887166977,
0.03275631368160248,
0.002531913109123707,
-0.023382266983389854,
0.00898708961904049,
0.06200132891535759,
0.09173615276813507,
0.015841780230402946,
0.12359099835157394,
-0.032029736787080765,
-0.045928020030260086,
0.022923335433006287,
0.007825127802789211,
0.24607044458389282,
-0.02426513284444809,
0.1788320094347,
0.056600864976644516,
0.15579140186309814,
0.04064067825675011,
0.06130838021636009,
-0.004092433489859104,
-0.009674654342234135,
0.011600147932767868,
-0.057945940643548965,
-0.05944449082016945,
0.02430354431271553,
-0.005880315322428942,
0.03498680144548416,
-0.12001261860132217,
0.02785523422062397,
0.031121687963604927,
0.32277581095695496,
0.07746770232915878,
-0.3063572347164154,
-0.10545191168785095,
0.002586309565231204,
-0.0681617483496666,
-0.011534875258803368,
0.04160446301102638,
0.17425832152366638,
-0.0601758249104023,
0.022895170375704765,
-0.05495338514447212,
0.07585810124874115,
-0.08056040108203888,
0.022023290395736694,
0.04668402299284935,
0.07223870605230331,
0.005508228205144405,
0.0467875637114048,
-0.23634850978851318,
0.3153095841407776,
-0.009275534190237522,
0.07079334557056427,
-0.05451441928744316,
0.0020706215873360634,
0.020034821704030037,
-0.01978863589465618,
0.10296457260847092,
-0.011194122955203056,
-0.09732365608215332,
-0.1970268338918686,
-0.12311599403619766,
0.026695014908909798,
0.13100050389766693,
-0.005815522279590368,
0.11037252843379974,
-0.023927615955471992,
-0.01223700400441885,
0.04797739535570145,
-0.08969535678625107,
-0.04724080488085747,
-0.09321824461221695,
0.01599295251071453,
0.0584191158413887,
-0.00608522305265069,
-0.0938965454697609,
-0.10931586474180222,
-0.06926843523979187,
0.11030904948711395,
-0.09343087673187256,
-0.04282788187265396,
-0.1054585725069046,
0.03843660652637482,
0.13263554871082306,
-0.08271624892950058,
0.04625808075070381,
0.02234782837331295,
0.1042531207203865,
0.00621121097356081,
-0.045284248888492584,
0.08900455385446548,
-0.08473937958478928,
-0.21063043177127838,
-0.05154825747013092,
0.17737480998039246,
0.02713116444647312,
0.07588663697242737,
-0.010107386857271194,
0.022247081622481346,
-0.001515236683189869,
-0.0741371363401413,
0.04278463125228882,
0.034607335925102234,
0.023872042074799538,
0.027738552540540695,
-0.03978165239095688,
-0.03368287906050682,
-0.07915952801704407,
-0.033340852707624435,
0.1929234117269516,
0.272349089384079,
-0.093129962682724,
0.08932136744260788,
0.09069806337356567,
-0.0500466525554657,
-0.22083960473537445,
-0.018482064828276634,
0.10201194137334824,
0.02363823913037777,
-0.013061215169727802,
-0.1820429265499115,
0.05797795578837395,
0.06793829798698425,
-0.04224466532468796,
0.07760162651538849,
-0.3306596279144287,
-0.13696162402629852,
0.11927846819162369,
0.11374948173761368,
0.06990470737218857,
-0.1405540555715561,
-0.05698799341917038,
0.002892110263928771,
-0.07919526845216751,
0.09183082729578018,
-0.07693357020616531,
0.13554322719573975,
-0.002352217212319374,
0.05971897393465042,
0.026434777304530144,
-0.06732401251792908,
0.1176833063364029,
-0.012073508463799953,
0.06110931932926178,
-0.04597484692931175,
-0.013373270630836487,
0.014255371876060963,
-0.05187907814979553,
0.028333432972431183,
-0.08011763542890549,
0.05228301137685776,
-0.06038910150527954,
-0.02926723100244999,
-0.0759902074933052,
0.012421239167451859,
-0.020985456183552742,
-0.04274914041161537,
-0.009574956260621548,
0.04402665048837662,
0.059237923473119736,
0.0009532521944493055,
0.09164077788591385,
-0.02720256894826889,
0.1456451267004013,
0.12213888019323349,
0.11636532098054886,
-0.05450991541147232,
-0.030296284705400467,
-0.0011349651031196117,
-0.032020095735788345,
0.0504484660923481,
-0.10837320983409882,
0.0332881323993206,
0.13956156373023987,
0.05031459406018257,
0.12734785676002502,
0.0657443255186081,
-0.0567045621573925,
0.027522480115294456,
0.06687374413013458,
-0.13585321605205536,
-0.14884378015995026,
0.004054730292409658,
0.033902980387210846,
-0.12945473194122314,
0.05100757256150246,
0.11218052357435226,
-0.06232759729027748,
-0.005742673762142658,
-0.013396316207945347,
0.02828485146164894,
-0.02797980234026909,
0.21075233817100525,
0.045986760407686234,
0.07662905007600784,
-0.1077532097697258,
0.08270230889320374,
0.0458989292383194,
-0.11608757823705673,
0.05038013309240341,
0.09377313405275345,
-0.08540189266204834,
-0.03548309579491615,
0.009639820083975792,
0.08665414154529572,
0.03808923810720444,
-0.06031403690576553,
-0.1341927945613861,
-0.1329856812953949,
0.0796988233923912,
0.20124214887619019,
0.05627068877220154,
0.031812649220228195,
-0.022937702015042305,
0.019090164452791214,
-0.11286208033561707,
0.11330655962228775,
0.047169677913188934,
0.051180168986320496,
-0.12595632672309875,
0.16282285749912262,
0.0030140704475343227,
0.03714451566338539,
-0.021210838109254837,
0.01083844993263483,
-0.12184755504131317,
0.02311980351805687,
-0.10724332928657532,
0.023746414110064507,
-0.04708239436149597,
0.0017052080947905779,
0.004107004031538963,
-0.04672488570213318,
-0.0589267872273922,
0.02900012955069542,
-0.1067095622420311,
-0.020412029698491096,
-0.010237016715109348,
0.04141968861222267,
-0.111182801425457,
-0.032013580203056335,
0.034888580441474915,
-0.09293919801712036,
0.11123225092887878,
0.06308282166719437,
0.000059062567743239924,
0.038175519555807114,
-0.13034243881702423,
-0.015562153421342373,
0.05442613363265991,
0.018421173095703125,
0.02736382558941841,
-0.1287379264831543,
-0.020917266607284546,
-0.00934668630361557,
0.008707784116268158,
0.011082574725151062,
0.100953608751297,
-0.11975949257612228,
0.008982295170426369,
-0.011205972172319889,
-0.03988337144255638,
-0.07106675207614899,
0.01978728175163269,
0.07658545672893524,
0.03194347769021988,
0.14324626326560974,
-0.10139519721269608,
0.05538789927959442,
-0.20698115229606628,
0.00006229267455637455,
-0.015148707665503025,
-0.08662473410367966,
-0.09971007704734802,
-0.023223401978611946,
0.08828352391719818,
-0.0610983781516552,
0.11502404510974884,
-0.040680643171072006,
0.02125297300517559,
0.02149706892669201,
-0.057692330330610275,
0.014047862961888313,
0.0512724369764328,
0.20374047756195068,
0.04185057058930397,
-0.04820476844906807,
0.05098866671323776,
-0.00016877340385690331,
0.06337558478116989,
0.09634154289960861,
0.1707257330417633,
0.1707025021314621,
0.07157432287931442,
0.07813012599945068,
0.09407215565443039,
-0.07535985112190247,
-0.1521703451871872,
0.07601912319660187,
-0.05581674352288246,
0.10074062645435333,
-0.020155617967247963,
0.24626074731349945,
0.11088575422763824,
-0.14930541813373566,
0.05682078003883362,
-0.03748087212443352,
-0.08616089075803757,
-0.11732255667448044,
-0.0733446255326271,
-0.08691525459289551,
-0.13671466708183289,
0.005790165159851313,
-0.12004352360963821,
0.018193060532212257,
0.09535615891218185,
0.027621012181043625,
0.02451455034315586,
0.12968400120735168,
0.023253990337252617,
0.038971997797489166,
0.08495576679706573,
0.011787241324782372,
-0.013660253025591373,
-0.06357067078351974,
-0.10285620391368866,
0.0588437095284462,
0.000870916002895683,
0.04852984845638275,
-0.03470080718398094,
-0.06610317528247833,
0.05155221372842789,
-0.014444954693317413,
-0.10880828648805618,
0.01934417523443699,
-0.0066382382065057755,
0.08250077813863754,
0.05407236889004707,
0.038313496857881546,
-0.009000240825116634,
-0.0043706512078642845,
0.23436997830867767,
-0.08891408890485764,
-0.12312374264001846,
-0.12945327162742615,
0.23065908253192902,
-0.012082194909453392,
-0.01689445599913597,
0.026060407981276512,
-0.07895462214946747,
-0.023772312328219414,
0.19276300072669983,
0.19459301233291626,
-0.05453220382332802,
0.01167185977101326,
0.0024626196827739477,
-0.003917592577636242,
-0.06354603171348572,
0.09196329861879349,
0.13762864470481873,
0.11806851625442505,
-0.07329854369163513,
-0.06466121226549149,
-0.03772195428609848,
-0.026697823777794838,
-0.04397518187761307,
0.06740666925907135,
0.003923838958144188,
-0.000238506865571253,
-0.052988287061452866,
0.06956528127193451,
-0.09091604501008987,
-0.09342373162508011,
0.008070841431617737,
-0.2295123189687729,
-0.1775086224079132,
-0.0155086200684309,
0.08824997395277023,
0.024300480261445045,
0.030731739476323128,
-0.006785392761230469,
0.00022156667546369135,
0.05698651447892189,
-0.015072232112288475,
-0.05882212519645691,
-0.05368027091026306,
0.0636240616440773,
-0.09004788100719452,
0.19911327958106995,
-0.03746224194765091,
0.0731646940112114,
0.1164688840508461,
0.07004940509796143,
-0.08984429389238358,
0.08028876781463623,
0.06866908073425293,
-0.1259555071592331,
0.026697641238570213,
0.17095313966274261,
-0.04910691827535629,
0.11544375866651535,
0.05367428436875343,
-0.12311545759439468,
0.00974863301962614,
-0.05989263579249382,
-0.05424314737319946,
-0.05340905487537384,
-0.025038938969373703,
-0.04809851571917534,
0.13343489170074463,
0.18343691527843475,
-0.0647825226187706,
-0.011044503189623356,
-0.03913973644375801,
-0.00647520599886775,
0.04358904808759689,
0.05149547755718231,
-0.030737904831767082,
-0.2679321765899658,
0.010025684721767902,
-0.001964470837265253,
0.011701274663209915,
-0.26075905561447144,
-0.07637383788824081,
0.012686184607446194,
-0.04885699227452278,
-0.08741646260023117,
0.08213591575622559,
0.10486840456724167,
0.04520322009921074,
-0.051071226596832275,
-0.049802426248788834,
-0.027577867731451988,
0.17391453683376312,
-0.16273991763591766,
-0.08036122471094131
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
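As a stopgap until the authors fill this in, the sketch below shows the generic 🤗 Transformers loading path for a vision-encoder-decoder checkpoint. Using `TrOCRProcessor` is an assumption based on the model name, and `sample_image.png` is a placeholder for an image containing a line of text:
```python
from PIL import Image
from transformers import TrOCRProcessor, VisionEncoderDecoderModel

# Assumption: this repository ships a TrOCR-compatible processor configuration.
processor = TrOCRProcessor.from_pretrained("Eugenememe/trocr10")
model = VisionEncoderDecoderModel.from_pretrained("Eugenememe/trocr10")

# Placeholder input image with a line of printed or handwritten text.
image = Image.open("sample_image.png").convert("RGB")
pixel_values = processor(images=image, return_tensors="pt").pixel_values

generated_ids = model.generate(pixel_values)
text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(text)
```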
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | Eugenememe/trocr10 | [
"transformers",
"safetensors",
"vision-encoder-decoder",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:04:38+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
39,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05306316912174225,
0.20225799083709717,
-0.004532716237008572,
0.02486577071249485,
0.10745619982481003,
0.005829503294080496,
0.06235508993268013,
0.11292294412851334,
-0.0020419019274413586,
0.12131396681070328,
0.02719549834728241,
0.07967612892389297,
0.12327883392572403,
0.1561002880334854,
0.003298027440905571,
-0.2451947182416916,
0.056976594030857086,
-0.09095743298530579,
0.005862638354301453,
0.11416777968406677,
0.1328127682209015,
-0.10651655495166779,
0.09236611425876617,
-0.006378492806106806,
-0.018223589286208153,
-0.01560811698436737,
-0.07464081048965454,
-0.06576249748468399,
0.05573050677776337,
0.07688132673501968,
0.0721992626786232,
0.009551521390676498,
0.07658318430185318,
-0.2803225517272949,
0.014262731187045574,
0.08472383767366409,
0.001679662149399519,
0.06746198982000351,
0.08127366751432419,
-0.06743600219488144,
0.1253480762243271,
-0.072596974670887,
0.1388878971338272,
0.07806979864835739,
-0.09000035375356674,
-0.19567735493183136,
-0.06588542461395264,
0.06967565417289734,
0.13176710903644562,
0.05219413340091705,
-0.02802334912121296,
0.13226838409900665,
-0.09170275926589966,
0.009603562764823437,
0.11835573613643646,
-0.0669531598687172,
-0.05480296164751053,
0.036625828593969345,
0.09896823018789291,
0.08901896327733994,
-0.11835762113332748,
-0.005250310990959406,
0.02971211075782776,
0.02317153476178646,
0.08707734197378159,
0.01639537699520588,
0.14948242902755737,
0.03726613521575928,
-0.14019928872585297,
-0.05744593217968941,
0.0939771831035614,
0.03824683278799057,
-0.04958764836192131,
-0.2343815118074417,
-0.031156767159700394,
-0.016557393595576286,
-0.0305255725979805,
-0.04024772346019745,
0.05501044541597366,
-0.03525097668170929,
0.0778101459145546,
-0.00980226881802082,
-0.08041384816169739,
-0.028304021805524826,
0.04802321270108223,
0.06495635211467743,
0.0185654629021883,
-0.0052207582630217075,
0.02111206203699112,
0.11669638752937317,
0.0788947269320488,
-0.13250376284122467,
-0.07449234277009964,
-0.07420225441455841,
-0.098720021545887,
-0.04206917807459831,
0.035802435129880905,
0.07063398510217667,
0.03907659277319908,
0.1947319507598877,
-0.017316637560725212,
0.0492829903960228,
0.04488114267587662,
0.006694390904158354,
0.06837808340787888,
0.11208193004131317,
-0.06981375813484192,
-0.1344674974679947,
-0.058068543672561646,
0.0876275971531868,
-0.004179352894425392,
-0.03578972443938255,
-0.050102028995752335,
0.04014993831515312,
0.030947856605052948,
0.11303774267435074,
0.08247632533311844,
0.00018158231978304684,
-0.06263403594493866,
-0.042954958975315094,
0.22251743078231812,
-0.14639760553836823,
0.04202887415885925,
0.005975429899990559,
-0.04316512867808342,
-0.004425262566655874,
0.010939684696495533,
0.012188587337732315,
-0.037748441100120544,
0.10074765980243683,
-0.07618969678878784,
-0.034445326775312424,
-0.11386469751596451,
-0.0645400807261467,
0.02648116648197174,
0.004736251663416624,
-0.021350713446736336,
-0.043291062116622925,
-0.11405764520168304,
-0.04943132400512695,
0.07177776843309402,
-0.07382809370756149,
-0.05491260066628456,
0.011223746463656425,
-0.053137388080358505,
0.004582186229526997,
0.00267049134708941,
0.1107012927532196,
-0.032886769622564316,
0.02474004216492176,
-0.046873610466718674,
0.06870805472135544,
0.10572899132966995,
0.037948183715343475,
-0.08262749016284943,
0.07297130674123764,
-0.23081135749816895,
0.10601279884576797,
-0.0860590934753418,
0.020004073157906532,
-0.14292661845684052,
-0.04098188504576683,
0.028510602191090584,
0.027961768209934235,
-0.010129709728062153,
0.1261758655309677,
-0.2058180272579193,
-0.030243845656514168,
0.14840680360794067,
-0.11589378863573074,
-0.09422067552804947,
0.06595603376626968,
-0.05322439596056938,
0.10703813284635544,
0.04694817587733269,
-0.023595338687300682,
0.07076855003833771,
-0.1317324936389923,
-0.04631568118929863,
-0.021216953173279762,
-0.014257868751883507,
0.1454293578863144,
0.06817390769720078,
-0.05359628424048424,
0.07706184685230255,
0.021635930985212326,
-0.037367887794971466,
-0.03435307368636131,
-0.03299751877784729,
-0.0921449214220047,
0.006188652943819761,
-0.06746368855237961,
0.029415255412459373,
-0.021083641797304153,
-0.09160202741622925,
-0.0303809754550457,
-0.1750536859035492,
0.03627307340502739,
0.08193549513816833,
0.006318137980997562,
-0.0197709072381258,
-0.09286735206842422,
0.01832297258079052,
-0.012543086893856525,
-0.01892191916704178,
-0.16171851754188538,
-0.04700363799929619,
0.04258821904659271,
-0.20086339116096497,
0.01967989094555378,
-0.03661181032657623,
0.04892909526824951,
0.03480081260204315,
-0.04064054414629936,
-0.008631768636405468,
0.003335281042382121,
0.015725387260317802,
-0.024876078590750694,
-0.20007483661174774,
-0.030333993956446648,
-0.02597803995013237,
0.13431528210639954,
-0.22246739268302917,
0.02835940755903721,
0.08417746424674988,
0.1434013694524765,
0.0016660679830238223,
-0.04368278756737709,
0.014076939783990383,
-0.054847393184900284,
-0.05226233974099159,
-0.06906641274690628,
-0.006392029579728842,
-0.03333830088376999,
-0.03956965357065201,
0.07096927613019943,
-0.20027990639209747,
-0.04262993112206459,
0.10826600342988968,
0.09837738424539566,
-0.14323553442955017,
-0.02474437840282917,
-0.04144697263836861,
-0.061835117638111115,
-0.09049158543348312,
-0.06381296366453171,
0.1425853669643402,
0.04930912330746651,
0.05233171954751015,
-0.08628004789352417,
-0.06071622669696808,
0.010883725248277187,
-0.00011982241994701326,
-0.04022165387868881,
0.08589175343513489,
0.08529563993215561,
-0.1104804202914238,
0.09131632000207901,
0.08476598560810089,
0.06810380518436432,
0.10764316469430923,
0.0015695258043706417,
-0.10674308240413666,
-0.02863943576812744,
0.007197246421128511,
0.014064288698136806,
0.14327655732631683,
-0.04052828997373581,
0.048501238226890564,
0.05552942305803299,
-0.026791011914610863,
0.01846246048808098,
-0.1073673665523529,
0.03181065618991852,
0.047323208302259445,
-0.009707847610116005,
0.022087840363383293,
-0.03469701483845711,
0.029233038425445557,
0.08711127936840057,
0.035028502345085144,
0.029371140524744987,
0.006271174643188715,
-0.035827167332172394,
-0.10475867241621017,
0.17433254420757294,
-0.0890800803899765,
-0.2983497679233551,
-0.1401464194059372,
-0.0031294787768274546,
0.04853705316781998,
-0.02189650572836399,
0.011501757428050041,
-0.04715869575738907,
-0.11619791388511658,
-0.10501103848218918,
0.008872180245816708,
0.04236939176917076,
-0.07721052318811417,
-0.0677536204457283,
0.049645159393548965,
0.03511786088347435,
-0.1394437551498413,
0.02193400263786316,
0.04976930096745491,
-0.03652774170041084,
-0.014805521816015244,
0.07396114617586136,
0.1029050275683403,
0.17309242486953735,
-0.006633569020777941,
-0.017617538571357727,
0.023926623165607452,
0.24285922944545746,
-0.14599764347076416,
0.10973547399044037,
0.15899382531642914,
-0.06477009505033493,
0.10309092700481415,
0.19795724749565125,
0.023503681644797325,
-0.07609159499406815,
0.03380175679922104,
0.03942976891994476,
-0.05477331578731537,
-0.22650650143623352,
-0.06247439235448837,
-0.0018889455823227763,
-0.07084330171346664,
0.0898994579911232,
0.09043484926223755,
0.10881193727254868,
0.04595636948943138,
-0.08826620131731033,
-0.06884618103504181,
0.018320003524422646,
0.10980518162250519,
-0.021245790645480156,
0.0068580652587115765,
0.08790436387062073,
-0.04739201068878174,
-0.004643888212740421,
0.10682045668363571,
0.012494737282395363,
0.19045358896255493,
0.026300430297851562,
0.1511247158050537,
0.0705496221780777,
0.02818082831799984,
0.028634218499064445,
0.01902483031153679,
0.02658020332455635,
0.008844421245157719,
-0.017161816358566284,
-0.08914987742900848,
0.024366190657019615,
0.1353050321340561,
0.07279488444328308,
0.03349636495113373,
0.02151625230908394,
-0.03378571942448616,
0.06302186101675034,
0.16916872560977936,
0.010774882510304451,
-0.22069227695465088,
-0.038822758942842484,
0.08901690691709518,
-0.07502853125333786,
-0.12661625444889069,
-0.02531552128493786,
0.041314706206321716,
-0.17944134771823883,
0.04614526778459549,
-0.01675923727452755,
0.11310327053070068,
-0.12792818248271942,
-0.027826640754938126,
0.0399215929210186,
0.08714547008275986,
-0.030904775485396385,
0.07850778102874756,
-0.16933031380176544,
0.11483496427536011,
0.012726574204862118,
0.06142592057585716,
-0.11495199054479599,
0.09937658905982971,
0.012870636768639088,
-0.004057616461068392,
0.166650652885437,
-0.0004076457116752863,
-0.07127676159143448,
-0.06420327723026276,
-0.0747542679309845,
-0.021446911618113518,
0.09447984397411346,
-0.11176029592752457,
0.08143189549446106,
-0.014278407208621502,
-0.038768354803323746,
0.002094640163704753,
-0.1095178872346878,
-0.12445959448814392,
-0.19356580078601837,
0.06069660931825638,
-0.10815826058387756,
0.003831093432381749,
-0.09925412386655807,
-0.05358343943953514,
-0.04677683115005493,
0.20124182105064392,
-0.14483362436294556,
-0.09758627414703369,
-0.15289589762687683,
-0.09554614871740341,
0.1664566993713379,
-0.04663718119263649,
0.08821606636047363,
-0.0038687095511704683,
0.22914768755435944,
0.006724040023982525,
-0.012434800155460835,
0.07520399242639542,
-0.08637748658657074,
-0.17783690989017487,
-0.07604682445526123,
0.12165174633264542,
0.12136691808700562,
0.04747939854860306,
-0.012795967981219292,
0.021366307511925697,
-0.03262670710682869,
-0.1143370196223259,
0.008061125874519348,
0.12403146922588348,
0.05908475071191788,
0.04311663657426834,
0.00427021412178874,
-0.11007828265428543,
-0.07147429138422012,
-0.03523124009370804,
0.021944832056760788,
0.18807166814804077,
-0.08221416175365448,
0.15053938329219818,
0.12907743453979492,
-0.05370299890637398,
-0.21416690945625305,
0.03436880186200142,
0.04090806469321251,
0.00472818361595273,
0.053175248205661774,
-0.1770489513874054,
0.07840683311223984,
0.024572648108005524,
-0.05139078199863434,
0.15225622057914734,
-0.1677834391593933,
-0.1536329686641693,
0.0777980238199234,
0.05417194589972496,
-0.21989625692367554,
-0.12022100389003754,
-0.08520790934562683,
-0.06769770383834839,
-0.14243635535240173,
0.08412115275859833,
0.020698823034763336,
0.00019321028958074749,
0.04856256768107414,
0.034081753343343735,
0.019389502704143524,
-0.04823823645710945,
0.21962693333625793,
-0.009151222184300423,
0.036160267889499664,
-0.07677234709262848,
-0.09645134955644608,
0.07220485061407089,
-0.054490040987730026,
0.08667207509279251,
-0.024035243317484856,
0.007558862213045359,
-0.07708337903022766,
-0.05522387474775314,
-0.051626816391944885,
0.029731709510087967,
-0.07851403951644897,
-0.1059325784444809,
-0.0698300376534462,
0.09317977726459503,
0.09290435165166855,
-0.03390717878937721,
-0.03715480491518974,
-0.09010928869247437,
0.027843475341796875,
0.20326566696166992,
0.1706404834985733,
0.05219545215368271,
-0.10166811943054199,
0.0006855534156784415,
-0.017444688826799393,
0.04213650897145271,
-0.21454355120658875,
0.04805922508239746,
0.046434495598077774,
0.023125024512410164,
0.11990387737751007,
-0.016930051147937775,
-0.16412340104579926,
-0.04671873524785042,
0.056455835700035095,
-0.036497533321380615,
-0.20693081617355347,
-0.012548294849693775,
0.052488550543785095,
-0.18057814240455627,
-0.0640311911702156,
0.01719793491065502,
-0.01455759722739458,
-0.024732910096645355,
0.013886804692447186,
0.06330747157335281,
0.027956031262874603,
0.09375539422035217,
0.05616210401058197,
0.10073219239711761,
-0.11451173573732376,
0.08603792637586594,
0.08968610316514969,
-0.09007474035024643,
0.0125290397554636,
0.0736992210149765,
-0.055889032781124115,
-0.02316245622932911,
0.020047122612595558,
0.06119868531823158,
-0.00162112049292773,
-0.061508238315582275,
-0.01975059136748314,
-0.11018196493387222,
0.06758071482181549,
0.13242226839065552,
0.04006093740463257,
-0.005321177653968334,
0.048064373433589935,
0.021657899022102356,
-0.0826604813337326,
0.11298491060733795,
0.025856876745820045,
0.038508445024490356,
-0.06642783433198929,
-0.023183433338999748,
0.045513976365327835,
0.008002778515219688,
-0.020539432764053345,
-0.02922571264207363,
-0.05385620519518852,
-0.011762911453843117,
-0.18990042805671692,
0.01677597686648369,
-0.0765504315495491,
0.004350012633949518,
0.014810539782047272,
-0.03814086318016052,
-0.020958559587597847,
0.016691848635673523,
-0.07982292026281357,
-0.05134430527687073,
-0.002734045498073101,
0.0985812395811081,
-0.13863937556743622,
0.007181720342487097,
0.08835478872060776,
-0.11878933757543564,
0.06740869581699371,
-0.024482958018779755,
-0.017330501228570938,
-0.0019446499645709991,
-0.12968114018440247,
0.04302144795656204,
0.002646980108693242,
0.01983785443007946,
0.042118169367313385,
-0.16895927488803864,
0.006406146101653576,
-0.04021742194890976,
-0.04932057484984398,
-0.01649690791964531,
-0.07501037418842316,
-0.11416008323431015,
0.11071296036243439,
0.0018048452911898494,
-0.07790763676166534,
-0.0117954695597291,
0.05156298726797104,
0.10936092585325241,
-0.037753306329250336,
0.1212289109826088,
0.0044145528227090836,
0.06474429368972778,
-0.18044467270374298,
-0.024778762832283974,
-0.015598369762301445,
0.006858370266854763,
0.02694357931613922,
-0.0140616400167346,
0.042949724942445755,
-0.013948258012533188,
0.2575705349445343,
-0.02087925560772419,
0.07440228760242462,
0.06519795209169388,
0.046770140528678894,
0.011650492437183857,
0.0869150459766388,
0.06678774207830429,
0.012648052535951138,
0.003416551509872079,
0.032283470034599304,
-0.031422972679138184,
-0.014915283769369125,
-0.1482047736644745,
0.07101717591285706,
0.1439564973115921,
0.08283385634422302,
0.012806775979697704,
0.06352550536394119,
-0.10048910230398178,
-0.10461166501045227,
0.08237126469612122,
-0.041638992726802826,
-0.0008007619762793183,
-0.058745238929986954,
0.1454916000366211,
0.15373669564723969,
-0.17184817790985107,
0.0812029168009758,
-0.0365472249686718,
-0.04692631587386131,
-0.110826775431633,
-0.16450464725494385,
-0.06572620570659637,
-0.02580863982439041,
-0.003703176509588957,
-0.05508402734994888,
0.06688077747821808,
0.11885157972574234,
-0.0016993435565382242,
-0.0014807049883529544,
0.10038914531469345,
-0.022431546822190285,
-0.02085503377020359,
0.03484790027141571,
0.04871930554509163,
0.036450520157814026,
-0.04567375034093857,
0.02037064917385578,
0.010481024160981178,
0.03885771334171295,
0.0584954097867012,
0.024686437100172043,
-0.03216709941625595,
0.015852995216846466,
-0.012883862480521202,
-0.10344603657722473,
0.023220296949148178,
-0.02661588042974472,
-0.07547900825738907,
0.12944786250591278,
0.03020857274532318,
0.016596315428614616,
-0.03272676467895508,
0.20282141864299774,
-0.07233016192913055,
-0.07228796184062958,
-0.14049077033996582,
0.11066530644893646,
-0.03675038740038872,
0.06309209764003754,
0.05746688321232796,
-0.11500853300094604,
-0.005017681512981653,
0.12902788817882538,
0.12829062342643738,
-0.03323068469762802,
0.0049867476336658,
0.026197774335741997,
0.00827173050493002,
-0.049032241106033325,
0.04763645678758621,
0.032188743352890015,
0.15468738973140717,
-0.07100655138492584,
0.07814180850982666,
-0.0005074164946563542,
-0.08921105414628983,
-0.03626048564910889,
0.1392250955104828,
0.0028979976195842028,
0.032514654099941254,
-0.06501611322164536,
0.1007646918296814,
-0.07232671231031418,
-0.22924686968326569,
0.04944430664181709,
-0.0780143067240715,
-0.1530541479587555,
-0.012732122093439102,
0.02754151076078415,
-0.013794543221592903,
0.02254127338528633,
0.061679136008024216,
-0.06457692384719849,
0.16001729667186737,
0.03501654788851738,
-0.08832933753728867,
-0.05492360517382622,
0.07016300410032272,
-0.1023297980427742,
0.29636111855506897,
0.011759842745959759,
0.032544128596782684,
0.10384590178728104,
-0.01977287232875824,
-0.1365358531475067,
0.029908113181591034,
0.09750298410654068,
-0.0958370491862297,
0.0666261613368988,
0.18576137721538544,
-0.01007252186536789,
0.10188479721546173,
0.07690271735191345,
-0.06373509019613266,
0.05595288425683975,
-0.08074036985635757,
-0.06501531600952148,
-0.0913926437497139,
0.0567726232111454,
-0.06534478068351746,
0.14360256493091583,
0.11906343698501587,
-0.03889768570661545,
-0.0050442758947610855,
-0.029765458777546883,
0.041717544198036194,
0.013438636437058449,
0.12334854900836945,
0.013119494542479515,
-0.1562117338180542,
0.029238002374768257,
0.003133314661681652,
0.10249229520559311,
-0.2173282951116562,
-0.08784149587154388,
0.04678759351372719,
-0.03227124735713005,
-0.05141454562544823,
0.1055324450135231,
0.06109399348497391,
0.05229108780622482,
-0.0470384918153286,
-0.05454113334417343,
-0.007119470275938511,
0.1505606472492218,
-0.11762657761573792,
-0.009603551588952541
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
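Until an official snippet is provided, the hedged sketch below shows a generic way to load the checkpoint with the `transformers` library as a causal language model. The repo id comes from this entry's metadata; whether the ProSparse activation code requires `trust_remote_code=True` is an assumption, as are the dtype, device placement, and prompt.

```python
# Hedged sketch for a standard causal-LM checkpoint. Whether this ProSparse
# variant needs trust_remote_code=True for custom sparse-activation modeling
# code is an assumption; drop the flag if the stock LLaMA classes load it.
# device_map="auto" additionally requires the `accelerate` package.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Raincleared/prosparse-llama-2-13b"  # repo id from this entry's metadata

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # 13B parameters; half precision keeps memory manageable
    device_map="auto",
    trust_remote_code=True,     # assumption, see note above
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```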
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-generation | Raincleared/prosparse-llama-2-13b | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T06:05:30+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
56,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06061961501836777,
0.15481999516487122,
-0.004844071343541145,
0.02074851468205452,
0.0983177199959755,
0.007407687604427338,
0.07119518518447876,
0.11185134947299957,
-0.023851769044995308,
0.1167980208992958,
0.031993988901376724,
0.09781743586063385,
0.11217817664146423,
0.16186554729938507,
0.0015333457849919796,
-0.22897611558437347,
0.049678247421979904,
-0.125278040766716,
-0.0294334813952446,
0.11977242678403854,
0.1422213912010193,
-0.10954539477825165,
0.0752737894654274,
-0.038042325526475906,
-0.005828251596540213,
-0.0323176346719265,
-0.06205610930919647,
-0.05266609415411949,
0.05311284959316254,
0.06794639676809311,
0.07308239489793777,
0.01171939354389906,
0.09106900542974472,
-0.2724283039569855,
0.02348201349377632,
0.0805930644273758,
-0.0006441773730330169,
0.07586129754781723,
0.04993962123990059,
-0.08749990910291672,
0.07524524629116058,
-0.060156844556331635,
0.1498761922121048,
0.07955671846866608,
-0.09018243104219437,
-0.19217631220817566,
-0.07921334356069565,
0.09916994720697403,
0.1890910118818283,
0.05953684076666832,
-0.026427440345287323,
0.11642678081989288,
-0.08593545109033585,
0.013638701289892197,
0.06446459144353867,
-0.06054406240582466,
-0.055855002254247665,
0.06904532760381699,
0.08335285633802414,
0.08567540347576141,
-0.12976622581481934,
-0.010767064057290554,
0.015032444149255753,
0.008952446281909943,
0.08948688954114914,
0.017146794125437737,
0.1335189938545227,
0.040557652711868286,
-0.13501930236816406,
-0.043155476450920105,
0.09761431813240051,
0.03665134683251381,
-0.04888195917010307,
-0.2485782504081726,
-0.023432478308677673,
-0.04339504987001419,
-0.03198111802339554,
-0.03649339824914932,
0.043764639645814896,
-0.014506848528981209,
0.07738617807626724,
-0.004502781666815281,
-0.0837155357003212,
-0.04301247000694275,
0.07241875678300858,
0.06128999963402748,
0.02571401372551918,
-0.015821760520339012,
0.0059297760017216206,
0.12327717989683151,
0.11431120336055756,
-0.126715749502182,
-0.052547648549079895,
-0.06306339055299759,
-0.08449548482894897,
-0.044861067086458206,
0.030838407576084137,
0.037995077669620514,
0.045936476439237595,
0.23867325484752655,
0.007765117567032576,
0.053257301449775696,
0.04455438256263733,
0.014407169073820114,
0.06501194834709167,
0.11008983850479126,
-0.05894824117422104,
-0.09719445556402206,
-0.028582042083144188,
0.10156717151403427,
0.007986726239323616,
-0.04139331728219986,
-0.05712985619902611,
0.07059531658887863,
0.018587570637464523,
0.12360043078660965,
0.08000938594341278,
0.003056557849049568,
-0.0755772516131401,
-0.062465377151966095,
0.17764076590538025,
-0.15825673937797546,
0.04532013460993767,
0.03055616281926632,
-0.0341108962893486,
-0.009745313785970211,
0.012105142697691917,
0.025474950671195984,
-0.021481726318597794,
0.09522198140621185,
-0.05601342022418976,
-0.034448131918907166,
-0.11389608681201935,
-0.03694311901926994,
0.030394554138183594,
0.011153047904372215,
-0.02865210548043251,
-0.03502652049064636,
-0.08865131437778473,
-0.06405586749315262,
0.09101516753435135,
-0.07148737460374832,
-0.04784895107150078,
-0.016645915806293488,
-0.07833752781152725,
0.021804187446832657,
0.01691517047584057,
0.09064167737960815,
-0.0222476739436388,
0.03985358029603958,
-0.0550384595990181,
0.061440225690603256,
0.11723454296588898,
0.027987057343125343,
-0.05787884071469307,
0.061519939452409744,
-0.2424532175064087,
0.10252492874860764,
-0.07715212553739548,
0.04971238598227501,
-0.15203025937080383,
-0.02478341944515705,
0.03986154496669769,
0.01284773275256157,
-0.008251311257481575,
0.14196595549583435,
-0.21994100511074066,
-0.030957341194152832,
0.16964265704154968,
-0.10025953501462936,
-0.08109250664710999,
0.060782887041568756,
-0.05354252830147743,
0.11210215091705322,
0.04557164013385773,
-0.02375967986881733,
0.05775221437215805,
-0.14725260436534882,
-0.011030761525034904,
-0.041942402720451355,
-0.0180682260543108,
0.16207332909107208,
0.0703711211681366,
-0.06047816202044487,
0.07456906884908676,
0.01960151270031929,
-0.014246034435927868,
-0.04887177795171738,
-0.02822130173444748,
-0.1047162413597107,
0.01184528972953558,
-0.06102835759520531,
0.018109694123268127,
-0.021768750622868538,
-0.09445013850927353,
-0.029118487611413002,
-0.17402999103069305,
-0.0031633328180760145,
0.08821269869804382,
-0.011630427092313766,
-0.021509924903512,
-0.11245372891426086,
0.009332616813480854,
0.030967719852924347,
0.0002618339203763753,
-0.13677829504013062,
-0.06033218279480934,
0.026970699429512024,
-0.16097871959209442,
0.029791243374347687,
-0.05741601809859276,
0.04530094936490059,
0.04005871340632439,
-0.03433511033654213,
-0.03489551320672035,
0.010874404571950436,
0.010431389324367046,
-0.01894843392074108,
-0.25422003865242004,
-0.01882786676287651,
-0.0234990194439888,
0.1751047968864441,
-0.22956320643424988,
0.042598169296979904,
0.07489731162786484,
0.1460893303155899,
0.007349682506173849,
-0.03550100699067116,
0.015185600146651268,
-0.07262228429317474,
-0.03268764168024063,
-0.06316669285297394,
-0.01207790058106184,
-0.038400664925575256,
-0.05820201337337494,
0.04906858503818512,
-0.1686294972896576,
-0.030321966856718063,
0.10717973858118057,
0.06342670321464539,
-0.1473218947649002,
-0.02780107781291008,
-0.04056945815682411,
-0.04624456167221069,
-0.06676914542913437,
-0.05461418256163597,
0.11812574416399002,
0.056411582976579666,
0.04860803112387657,
-0.07140495628118515,
-0.07455260306596756,
0.008036690764129162,
-0.01956399530172348,
-0.014917809516191483,
0.09334591031074524,
0.07554110884666443,
-0.12264352291822433,
0.09177418053150177,
0.09668384492397308,
0.08576478064060211,
0.10314212739467621,
-0.014663571491837502,
-0.08914592862129211,
-0.040637146681547165,
0.02245822176337242,
0.016187267377972603,
0.15129362046718597,
-0.012961224652826786,
0.055492039769887924,
0.0358695350587368,
-0.014034898020327091,
0.011105312965810299,
-0.09736533463001251,
0.02655916102230549,
0.030835967510938644,
-0.016302183270454407,
0.03745110332965851,
-0.0447014644742012,
0.019208140671253204,
0.09039704501628876,
0.040895868092775345,
0.040978945791721344,
0.010155045427381992,
-0.04354988783597946,
-0.11037563532590866,
0.1787576973438263,
-0.12389461696147919,
-0.24818050861358643,
-0.13812170922756195,
0.010281167924404144,
0.04737642779946327,
-0.010411068797111511,
0.006690691225230694,
-0.06616118550300598,
-0.1175973042845726,
-0.09878289699554443,
0.018617089837789536,
0.045352302491664886,
-0.07590975612401962,
-0.06842505931854248,
0.06414616107940674,
0.03875524550676346,
-0.13939815759658813,
0.024007495492696762,
0.04662325978279114,
-0.08205481618642807,
-0.0029386086389422417,
0.0791812464594841,
0.06965780258178711,
0.17661017179489136,
0.013885351829230785,
-0.023669935762882233,
0.026634456589818,
0.20819635689258575,
-0.1436755359172821,
0.10975687950849533,
0.13545554876327515,
-0.08767466992139816,
0.08120133727788925,
0.1998777538537979,
0.03777998685836792,
-0.10680917650461197,
0.03608465939760208,
0.028374753892421722,
-0.028325283899903297,
-0.2502254545688629,
-0.06958996504545212,
0.0019060121849179268,
-0.05172049254179001,
0.07064855098724365,
0.08791537582874298,
0.09593888372182846,
0.016860228031873703,
-0.09976044297218323,
-0.07697858661413193,
0.046900223940610886,
0.10824491083621979,
-0.00015424020239152014,
-0.015208319760859013,
0.0904119610786438,
-0.03033481352031231,
0.01743943803012371,
0.09215071052312851,
0.0030607767403125763,
0.17535938322544098,
0.051709048449993134,
0.17189906537532806,
0.07866133749485016,
0.06444311141967773,
0.02004685252904892,
0.007725914940237999,
0.021817529574036598,
0.017227526754140854,
-0.0030957073904573917,
-0.08709781616926193,
-0.0034981227945536375,
0.1202581599354744,
0.049845851957798004,
0.029173865914344788,
0.012042860500514507,
-0.030704669654369354,
0.08337877690792084,
0.1770893782377243,
0.0029054484330117702,
-0.1893385946750641,
-0.07169844210147858,
0.07795937359333038,
-0.08648337423801422,
-0.10729733109474182,
-0.029470939189195633,
0.041069481521844864,
-0.1729043871164322,
0.016882894560694695,
-0.019335895776748657,
0.10788324475288391,
-0.13190391659736633,
-0.01772487722337246,
0.05657728388905525,
0.06932812184095383,
-0.009677323512732983,
0.06694949418306351,
-0.16090403497219086,
0.11770165711641312,
0.01751571334898472,
0.06636732816696167,
-0.09608277678489685,
0.09618937969207764,
-0.007830657996237278,
0.0041499207727611065,
0.1410749852657318,
0.010120149701833725,
-0.05952107161283493,
-0.09608154743909836,
-0.10546442121267319,
-0.009841260500252247,
0.1306990385055542,
-0.14852415025234222,
0.08813067525625229,
-0.02661319263279438,
-0.044553373008966446,
0.003614129964262247,
-0.12497276812791824,
-0.13103094696998596,
-0.18366187810897827,
0.05707118660211563,
-0.12947207689285278,
0.04045100137591362,
-0.10902881622314453,
-0.045833900570869446,
-0.02098964899778366,
0.20040063560009003,
-0.23137451708316803,
-0.06714103370904922,
-0.1551055610179901,
-0.08061286807060242,
0.14446212351322174,
-0.046455029398202896,
0.08550118654966354,
0.0008278203313238919,
0.19068008661270142,
0.021319707855582237,
-0.017237508669495583,
0.1072206199169159,
-0.10052918642759323,
-0.2010865956544876,
-0.09273224323987961,
0.15895552933216095,
0.13766798377037048,
0.03809428587555885,
-0.004381525795906782,
0.03171157464385033,
-0.02098114788532257,
-0.12076930701732635,
0.020226983353495598,
0.17317426204681396,
0.08982043713331223,
0.025265544652938843,
-0.02972041629254818,
-0.11267432570457458,
-0.07061342149972916,
-0.03774050623178482,
0.024755435064435005,
0.18072067201137543,
-0.07222156971693039,
0.18405316770076752,
0.13775517046451569,
-0.05534014105796814,
-0.19904261827468872,
0.021996473893523216,
0.04293542355298996,
0.0070380112156271935,
0.0323902890086174,
-0.20307663083076477,
0.09384101629257202,
0.0008334947633557022,
-0.05131231248378754,
0.1379684954881668,
-0.1823476254940033,
-0.151598259806633,
0.06042521819472313,
0.043563615530729294,
-0.19374065101146698,
-0.12374074012041092,
-0.08848230540752411,
-0.04693066328763962,
-0.15487661957740784,
0.10312657803297043,
0.0020827590487897396,
0.008401188999414444,
0.03778626397252083,
0.02252252586185932,
0.012139533646404743,
-0.04198719933629036,
0.1914343535900116,
-0.025891713798046112,
0.03347287327051163,
-0.0790715217590332,
-0.060851071029901505,
0.062408581376075745,
-0.058187782764434814,
0.0755455270409584,
-0.025226406753063202,
0.015947066247463226,
-0.10598332434892654,
-0.048235729336738586,
-0.02852320298552513,
0.019321219995617867,
-0.09431382268667221,
-0.09348297864198685,
-0.04829427972435951,
0.09367614984512329,
0.09042316675186157,
-0.03652578964829445,
-0.03649144619703293,
-0.078715980052948,
0.038977332413196564,
0.17627815902233124,
0.18159319460391998,
0.04659178853034973,
-0.07959239184856415,
-0.001915142871439457,
-0.014336181804537773,
0.04684065282344818,
-0.22077152132987976,
0.060553863644599915,
0.04557652771472931,
0.016117896884679794,
0.11537692695856094,
-0.0208132341504097,
-0.16198977828025818,
-0.06710557639598846,
0.061360616236925125,
-0.06944561004638672,
-0.17825035750865936,
0.0039279889315366745,
0.07344977557659149,
-0.16578389704227448,
-0.037031736224889755,
0.04200848564505577,
-0.01189455483108759,
-0.0403641052544117,
0.012352054007351398,
0.08063354343175888,
0.007078902795910835,
0.07699975371360779,
0.055281639099121094,
0.09124495089054108,
-0.10227900743484497,
0.07410510629415512,
0.08149529248476028,
-0.08644098788499832,
0.030720343813300133,
0.09573426842689514,
-0.06469762325286865,
-0.0346054881811142,
0.04237886518239975,
0.08354541659355164,
0.024281201884150505,
-0.04682289808988571,
0.0023111123591661453,
-0.09734189510345459,
0.05927345156669617,
0.11483542621135712,
0.03496333956718445,
0.011234734207391739,
0.03813567012548447,
0.04486291855573654,
-0.08093374222517014,
0.11926916986703873,
0.023795632645487785,
0.020354853942990303,
-0.04112942889332771,
-0.040553025901317596,
0.035851649940013885,
-0.026020776480436325,
-0.011440055444836617,
-0.035174157470464706,
-0.0722682997584343,
-0.014069457538425922,
-0.16000694036483765,
-0.0076758842915296555,
-0.03660871088504791,
0.005114538595080376,
0.022510098293423653,
-0.03652830421924591,
0.00792311318218708,
0.012217256240546703,
-0.06868947297334671,
-0.05553458258509636,
-0.023233558982610703,
0.09422210603952408,
-0.16494666039943695,
0.0220257006585598,
0.0823851153254509,
-0.12121747434139252,
0.09289738535881042,
0.016782134771347046,
0.00412249518558383,
0.026962365955114365,
-0.1545863002538681,
0.04763968288898468,
-0.020152103155851364,
0.013473534025251865,
0.04222847521305084,
-0.21637047827243805,
-0.004404853098094463,
-0.04015503451228142,
-0.05566934496164322,
-0.008993052877485752,
-0.0319182425737381,
-0.11338426172733307,
0.09645436704158783,
0.011025024577975273,
-0.08443772792816162,
-0.02965564839541912,
0.03353232145309448,
0.07690354436635971,
-0.027447547763586044,
0.1498211771249771,
-0.004663881380110979,
0.07559948414564133,
-0.17581342160701752,
-0.02282017655670643,
-0.011197620071470737,
0.022367527708411217,
-0.021871577948331833,
-0.01622559316456318,
0.04623444378376007,
-0.02704801969230175,
0.19120801985263824,
-0.024701936170458794,
0.049393873661756516,
0.06364397704601288,
0.009232889860868454,
-0.013832193799316883,
0.11151392012834549,
0.05708572641015053,
0.024334950372576714,
0.022262847051024437,
0.003451440716162324,
-0.04008655622601509,
-0.009981024079024792,
-0.18596695363521576,
0.06803664565086365,
0.14585918188095093,
0.09060460329055786,
-0.012669353745877743,
0.0707244873046875,
-0.10161512345075607,
-0.12005364894866943,
0.10127941519021988,
-0.06415384262800217,
-0.010188822634518147,
-0.06542414426803589,
0.14027701318264008,
0.14953285455703735,
-0.1886233240365982,
0.06583356112241745,
-0.06602055579423904,
-0.0566304549574852,
-0.11457879096269608,
-0.1930263340473175,
-0.057075321674346924,
-0.050602465867996216,
-0.018466074019670486,
-0.05384097993373871,
0.06939727067947388,
0.05750798434019089,
0.01126816775649786,
0.00868057832121849,
0.08568526059389114,
-0.009656033478677273,
0.00248199631460011,
0.030120067298412323,
0.06713981181383133,
0.016768986359238625,
-0.0321255661547184,
0.0179112758487463,
-0.00597198773175478,
0.034156378358602524,
0.059282708913087845,
0.03608176112174988,
-0.028436895459890366,
0.015559280291199684,
-0.034912437200546265,
-0.11309733241796494,
0.042801856994628906,
-0.029640642926096916,
-0.0749855786561966,
0.1347348988056183,
0.026981467381119728,
0.005015076603740454,
-0.023140020668506622,
0.2503887414932251,
-0.07436972856521606,
-0.09334370493888855,
-0.14373961091041565,
0.11701542884111404,
-0.04212593287229538,
0.0635172426700592,
0.03596310690045357,
-0.10810714215040207,
0.017985546961426735,
0.1320217251777649,
0.15442703664302826,
-0.04732590913772583,
0.019251897931098938,
0.028577854856848717,
0.00439635943621397,
-0.04075566306710243,
0.05177190154790878,
0.07100846618413925,
0.14500564336776733,
-0.05157303810119629,
0.08530787378549576,
0.002609728369861841,
-0.1021018698811531,
-0.041973695158958435,
0.11415864527225494,
-0.014296893030405045,
0.017620453611016273,
-0.057136841118335724,
0.124222531914711,
-0.05874236673116684,
-0.23697422444820404,
0.06316976249217987,
-0.0765061303973198,
-0.1432730257511139,
-0.024886758998036385,
0.071670763194561,
-0.016632623970508575,
0.02605951391160488,
0.07167234271764755,
-0.0754380151629448,
0.18880942463874817,
0.03957989811897278,
-0.05233397334814072,
-0.05954399332404137,
0.0744764655828476,
-0.11850855499505997,
0.27879106998443604,
0.010482731275260448,
0.051307905465364456,
0.1042102724313736,
-0.02021743729710579,
-0.13270841538906097,
0.023401619866490364,
0.09579801559448242,
-0.08917027711868286,
0.04087764397263527,
0.21448291838169098,
-0.00629545608535409,
0.11935057491064072,
0.07611140608787537,
-0.07468950748443604,
0.047562725841999054,
-0.11468592286109924,
-0.07639975845813751,
-0.08699081838130951,
0.09244474768638611,
-0.06785612553358078,
0.14258281886577606,
0.12599852681159973,
-0.05530165135860443,
0.011584274470806122,
-0.028389399871230125,
0.045467376708984375,
0.005578654818236828,
0.100032277405262,
0.011115525849163532,
-0.18496567010879517,
0.024811718612909317,
0.016259413212537766,
0.10884406417608261,
-0.18112654983997345,
-0.09105053544044495,
0.046958595514297485,
0.0005061255069449544,
-0.06443515419960022,
0.12483241409063339,
0.057313691824674606,
0.04654949903488159,
-0.0451689288020134,
-0.026830285787582397,
-0.006042256020009518,
0.14264579117298126,
-0.10707559436559677,
-0.005129707511514425
] |
null | null | transformers | <!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto"
>
<img src="https://github.com/janhq/jan/assets/89722390/35daac7d-b895-487c-a6ac-6663daaad78e" alt="Jan banner"
style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<p align="center">
<a href="https://jan.ai/">Jan</a
>
- <a href="https://discord.gg/AsJ8krTT3N">Discord</a>
</p>
<!-- header end -->
# Prompt template
ChatML
```
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
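Purely as an illustration of the template above, the sketch below assembles a ChatML prompt by hand and runs it through the `transformers` library. The repo id is taken from this entry's metadata; the messages and generation settings are placeholder assumptions, and if the tokenizer ships its own chat template, `tokenizer.apply_chat_template` should be preferred over manual string formatting.

```python
# Illustration of the ChatML layout shown above. Repo id comes from this
# entry's metadata; prompts and generation settings are placeholders.
# device_map="auto" requires the `accelerate` package.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "jan-hq/stealth-finance-v1-e1"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

system_message = "You are a careful financial analysis assistant."
user_prompt = "Summarize the key risk factors in a typical 10-K filing."

# Assemble the prompt exactly as the template above specifies.
text = (
    f"<|im_start|>system\n{system_message}<|im_end|>\n"
    f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
    f"<|im_start|>assistant\n"
)

inputs = tokenizer(text, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)
new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```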
# Training details
You can read about the training process [here](https://huggingface.co/jan-hq/stealth-finance-v1-adapter).
# Run this model
You can run this model using [Jan Desktop](https://jan.ai/) on Mac, Windows, or Linux.
Jan is an open-source ChatGPT alternative that is:
- 💻 **100% offline on your machine**: Your conversations remain confidential and visible only to you.
- 🗂️ **An Open File Format**: Conversations and model settings stay on your computer and can be exported or deleted at any time.
- 🌐 **OpenAI Compatible**: Local server on port `1337` with OpenAI-compatible endpoints (see the client sketch below the screenshot)
- 🌍 **Open Source & Free**: We build in public; check out our [Github](https://github.com/janhq)

# About Jan
Jan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.
Jan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life. | {"language": ["en"], "license": "apache-2.0"} | text-generation | jan-hq/stealth-finance-v1-e1 | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T06:06:48+00:00 | [] | [
"en"
] | TAGS
#transformers #safetensors #mistral #text-generation #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div style="width: auto; margin-left: auto; margin-right: auto"
>
<img src="URL alt="Jan banner"
style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<p align="center">
<a href="URL
>
- <a href="URL
</p>
# Prompt template
ChatML
# Training detail
You can read here.
# Run this model
You can run this model using Jan Desktop on Mac, Windows, or Linux.
Jan is an open source, ChatGPT alternative that is:
- 100% offline on your machine: Your conversations remain confidential, and visible only to you.
- An Open File Format: Conversations and model settings stay on your computer and can be exported or deleted at any time.
- OpenAI Compatible: Local server on port '1337' with OpenAI compatible endpoints
- Open Source & Free: We build in public; check out our Github
!image/png
# About Jan
Jan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.
Jan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life. | [
"# Prompt template\n\nChatML",
"# Training detail\nYou can read here.",
"# Run this model\nYou can run this model using Jan Desktop on Mac, Windows, or Linux.\n\nJan is an open source, ChatGPT alternative that is:\n\n- 100% offline on your machine: Your conversations remain confidential, and visible only to you.\n- ️ \nAn Open File Format: Conversations and model settings stay on your computer and can be exported or deleted at any time.\n- OpenAI Compatible: Local server on port '1337' with OpenAI compatible endpoints\n\n- Open Source & Free: We build in public; check out our Github\n\n!image/png",
"# About Jan\nJan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.\n\nJan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life."
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Prompt template\n\nChatML",
"# Training detail\nYou can read here.",
"# Run this model\nYou can run this model using Jan Desktop on Mac, Windows, or Linux.\n\nJan is an open source, ChatGPT alternative that is:\n\n- 100% offline on your machine: Your conversations remain confidential, and visible only to you.\n- ️ \nAn Open File Format: Conversations and model settings stay on your computer and can be exported or deleted at any time.\n- OpenAI Compatible: Local server on port '1337' with OpenAI compatible endpoints\n\n- Open Source & Free: We build in public; check out our Github\n\n!image/png",
"# About Jan\nJan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.\n\nJan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life."
] | [
61,
7,
8,
126,
77
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Prompt template\n\nChatML# Training detail\nYou can read here.# Run this model\nYou can run this model using Jan Desktop on Mac, Windows, or Linux.\n\nJan is an open source, ChatGPT alternative that is:\n\n- 100% offline on your machine: Your conversations remain confidential, and visible only to you.\n- ️ \nAn Open File Format: Conversations and model settings stay on your computer and can be exported or deleted at any time.\n- OpenAI Compatible: Local server on port '1337' with OpenAI compatible endpoints\n\n- Open Source & Free: We build in public; check out our Github\n\n!image/png# About Jan\nJan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.\n\nJan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life."
] | [
-0.07681140303611755,
0.06756508350372314,
-0.0006939280428923666,
0.03379666060209274,
0.05353933572769165,
-0.024314014241099358,
0.2158588320016861,
0.0738421306014061,
0.16322718560695648,
0.004240577574819326,
-0.005759343504905701,
0.06538901478052139,
0.0909937247633934,
0.17526017129421234,
0.0024181045591831207,
-0.19522899389266968,
0.09650039672851562,
0.009287445805966854,
0.015088804066181183,
0.029494551941752434,
0.06184526905417442,
-0.02012593299150467,
0.06658489257097244,
0.042075809091329575,
0.01215453539043665,
-0.13125960528850555,
-0.016459494829177856,
-0.02403181977570057,
0.13143189251422882,
0.1076253280043602,
0.0037059527821838856,
-0.02959403768181801,
-0.009051937609910965,
-0.019702479243278503,
0.039772894233465195,
0.002767830854281783,
0.0263447817414999,
0.05547700449824333,
-0.04840300604701042,
0.04061133414506912,
0.0943908840417862,
0.04377434030175209,
0.02226267009973526,
0.06386931240558624,
-0.046147916465997696,
-0.18308575451374054,
-0.03138725459575653,
-0.1624741554260254,
0.11710204184055328,
0.09626739472150803,
0.005261968821287155,
0.1816607564687729,
-0.013768885284662247,
0.06405483931303024,
-0.09142939746379852,
-0.20069168508052826,
-0.03651733696460724,
-0.026287751272320747,
-0.00034738556132651865,
0.09840121865272522,
0.029078088700771332,
0.024136589840054512,
0.0270583163946867,
0.015380661003291607,
0.05679101496934891,
-0.019800543785095215,
-0.04033862426877022,
-0.1149202361702919,
-0.06121552363038063,
-0.0092554222792387,
0.3021276891231537,
0.00029716265271417797,
-0.09915488958358765,
-0.11383680999279022,
-0.0034650766756385565,
0.09797000885009766,
-0.006140671670436859,
0.0025578804779797792,
-0.016466932371258736,
0.04002626985311508,
0.059877462685108185,
-0.08913882821798325,
-0.13139115273952484,
-0.0039055112283676863,
-0.04216230660676956,
0.20692047476768494,
0.046107590198516846,
0.05523988977074623,
-0.07637153565883636,
0.013419588096439838,
-0.08230073004961014,
-0.05182377249002457,
-0.06665565073490143,
-0.049251191318035126,
-0.01887788251042366,
0.0075590224005281925,
-0.01655164733529091,
-0.08189211785793304,
0.05595859885215759,
0.038925651460886,
-0.00034451569081284106,
-0.016637157648801804,
-0.059646155685186386,
0.08359071612358093,
0.06768499314785004,
0.0023184027522802353,
0.002724278252571821,
-0.003460794221609831,
0.11228986829519272,
-0.06826922297477722,
0.07313135266304016,
-0.061007287353277206,
-0.09220457822084427,
0.020567692816257477,
-0.03296462818980217,
0.03786586970090866,
0.13948389887809753,
0.06386114656925201,
-0.025100981816649437,
-0.03121308982372284,
0.17826257646083832,
-0.02133895643055439,
0.0054821837693452835,
-0.011570008471608162,
-0.04437069594860077,
-0.04782629385590553,
0.00028543610824272037,
0.03153841942548752,
-0.07769597321748734,
-0.04778600111603737,
0.018535172566771507,
-0.03077293373644352,
-0.11583762615919113,
-0.07995408028364182,
0.03934161365032196,
0.05245215445756912,
0.07009922713041306,
-0.1901901215314865,
-0.13103421032428741,
0.013228297233581543,
0.10321325063705444,
-0.044488806277513504,
-0.0548778772354126,
0.0361948236823082,
-0.004202661104500294,
-0.061545275151729584,
-0.055909935384988785,
-0.04104923829436302,
-0.04770798236131668,
0.003893114160746336,
-0.03714619576931,
0.06441554427146912,
-0.1814485490322113,
0.01961084082722664,
-0.0882091224193573,
-0.04729262366890907,
-0.06594263762235641,
0.020041532814502716,
-0.0804886743426323,
0.015423474833369255,
-0.010269755497574806,
0.07352478057146072,
-0.05320410057902336,
0.03555096685886383,
-0.007573795970529318,
0.11946477741003036,
-0.1274021863937378,
0.10358059406280518,
0.1443953812122345,
-0.13151411712169647,
-0.170613095164299,
0.18433675169944763,
0.010316235944628716,
0.13974198698997498,
0.1111108586192131,
0.038033366203308105,
0.03717876225709915,
-0.06192370504140854,
0.020788243040442467,
0.007708373479545116,
-0.08021476864814758,
-0.010937826707959175,
0.06140279024839401,
0.06897037476301193,
-0.07489576190710068,
0.027053873986005783,
-0.08553909510374069,
0.07025150209665298,
0.021884877234697342,
-0.02014867402613163,
0.013651263900101185,
-0.10698289424180984,
0.023706618696451187,
-0.0375487245619297,
0.05515182018280029,
-0.026921860873699188,
-0.020443474873900414,
-0.14468321204185486,
0.08881549537181854,
-0.021602675318717957,
-0.03569444641470909,
-0.13942629098892212,
0.02468939870595932,
0.021884635090827942,
0.06860073655843735,
-0.04470111057162285,
-0.04842386022210121,
0.10468403995037079,
-0.008075709454715252,
0.103028304874897,
0.12274369597434998,
0.04468420147895813,
0.10638609528541565,
0.045249998569488525,
0.01514983456581831,
0.05343133583664894,
-0.011145982891321182,
-0.052071843296289444,
-0.1285644918680191,
0.03433498740196228,
-0.08755845576524734,
0.0660647600889206,
-0.0293598435819149,
0.047326669096946716,
0.030250785872340202,
0.003392694052308798,
0.05488155782222748,
0.03299500793218613,
0.02707800455391407,
-0.01846982352435589,
-0.06332655996084213,
-0.008314734324812889,
0.03659653291106224,
0.04100200906395912,
-0.08591874688863754,
0.2616642415523529,
-0.10691540688276291,
0.07375071197748184,
0.09439704567193985,
0.07507488876581192,
-0.0208220686763525,
0.0268145389854908,
0.00898082833737135,
-0.04482394829392433,
-0.09505759179592133,
-0.09656957536935806,
0.2644205391407013,
0.01271024253219366,
0.06725522130727768,
-0.06503228098154068,
0.01661633513867855,
-0.04366571083664894,
-0.08995179086923599,
0.07187148928642273,
-0.012563501484692097,
-0.013021577149629593,
-0.09736933559179306,
0.07603389024734497,
-0.059827595949172974,
-0.03233983367681503,
0.14061357080936432,
0.07889558374881744,
-0.023594390600919724,
-0.020981092005968094,
0.11472474038600922,
0.07644078135490417,
0.15570345520973206,
-0.23533520102500916,
-0.002703749807551503,
0.03528223931789398,
0.03718402236700058,
0.03674597665667534,
-0.09607234597206116,
0.032883789390325546,
0.00040217634523287416,
0.010743970982730389,
0.1354834884405136,
0.01789233647286892,
-0.03677859902381897,
0.11423879116773605,
0.00822271779179573,
-0.003940574359148741,
0.004735915921628475,
-0.018658967688679695,
-0.10435061901807785,
0.10062331706285477,
-0.044930350035429,
-0.24388599395751953,
-0.09815012663602829,
-0.014394042082130909,
0.01869252510368824,
0.04658729210495949,
0.12217801064252853,
0.011625543236732483,
-0.08748216181993484,
-0.07051736861467361,
0.010511253029108047,
0.1107545793056488,
-0.10146971791982651,
0.0039248354732990265,
0.011007023975253105,
-0.03774715214967728,
-0.06929490715265274,
-0.014337425120174885,
0.0796208307147026,
-0.030798520892858505,
0.09070797264575958,
0.030256126075983047,
0.07163296639919281,
-0.01402716152369976,
-0.011528491973876953,
-0.053869035094976425,
0.022041210904717445,
0.17428821325302124,
-0.09645123779773712,
0.12180288136005402,
0.1930229365825653,
-0.03449322655797005,
0.013818621635437012,
0.0982867181301117,
0.04977336898446083,
-0.10829780995845795,
0.04211343079805374,
-0.09504932910203934,
-0.08252210915088654,
-0.1503407061100006,
-0.11362927407026291,
-0.07796213775873184,
0.14186827838420868,
0.015861794352531433,
0.01326752733439207,
0.06846082210540771,
0.11547847837209702,
-0.08068251609802246,
0.09603577852249146,
0.07503781467676163,
0.08892140537500381,
-0.04914858937263489,
-0.0894864872097969,
0.053526900708675385,
-0.018627015873789787,
0.0014755072770640254,
0.14253750443458557,
0.1084214597940445,
0.18175636231899261,
0.008401251398026943,
0.13564619421958923,
0.05717433989048004,
0.06482495367527008,
0.04731069132685661,
0.058767251670360565,
-0.012171836569905281,
0.00004033596997032873,
-0.06359222531318665,
-0.03540793061256409,
-0.06920402497053146,
0.1464526355266571,
-0.07796215265989304,
-0.10501609742641449,
0.06005285307765007,
0.019586673006415367,
-0.018078884109854698,
0.14462129771709442,
-0.05518464371562004,
-0.0953865647315979,
-0.14603982865810394,
0.08057045191526413,
0.014052733778953552,
-0.08817154914140701,
0.013827070593833923,
0.18525122106075287,
-0.11042231321334839,
-0.10642094165086746,
-0.03425042703747749,
0.0695551410317421,
-0.14235398173332214,
-0.0032513849437236786,
-0.14400142431259155,
-0.0008359966450370848,
0.046267274767160416,
0.0655522421002388,
-0.14576268196105957,
0.07297029346227646,
0.021449021995067596,
0.10378078371286392,
-0.1373082846403122,
-0.007146124262362719,
0.06552132219076157,
0.09476790577173233,
0.030064307153224945,
0.027065007016062737,
-0.14918667078018188,
-0.039133645594120026,
-0.1421053558588028,
0.05439886078238487,
-0.07048080861568451,
0.0019067395478487015,
-0.016296131536364555,
0.010420622304081917,
0.02904859371483326,
-0.05423598736524582,
-0.06729473173618317,
-0.07174213975667953,
-0.13707290589809418,
0.029949579387903214,
0.1502457857131958,
0.1725359559059143,
-0.032038427889347076,
0.02392515353858471,
0.03295310214161873,
0.19667282700538635,
-0.007898577488958836,
-0.08089771866798401,
-0.0696067065000534,
-0.0375584252178669,
-0.037621043622493744,
-0.05696547403931618,
-0.03612015023827553,
-0.028226034715771675,
0.10146227478981018,
-0.021280672401189804,
-0.036645375192165375,
0.008464078418910503,
-0.14691419899463654,
-0.07326851040124893,
-0.04916854202747345,
-0.0721619576215744,
0.10388830304145813,
0.07710272073745728,
0.029894521459937096,
-0.06493687629699707,
-0.048635855317115784,
-0.08317278325557709,
-0.035741548985242844,
0.22313626110553741,
-0.017062028869986534,
-0.01485071424394846,
-0.14699779450893402,
-0.17158986628055573,
-0.08824507147073746,
-0.1047639325261116,
0.14775659143924713,
0.15762144327163696,
-0.07315891236066818,
0.026770757511258125,
0.22303779423236847,
0.0030894705560058355,
-0.25067803263664246,
-0.06996467709541321,
-0.009727626107633114,
0.019488949328660965,
0.004076317418366671,
-0.15682294964790344,
0.08869718015193939,
0.025644075125455856,
-0.05529232323169708,
0.14296571910381317,
-0.09984864294528961,
-0.06204034760594368,
0.06650055199861526,
0.1251816600561142,
0.2024236023426056,
-0.06264311820268631,
0.020710106939077377,
0.023237023502588272,
-0.04053022712469101,
0.07453186810016632,
-0.08800256252288818,
0.09542742371559143,
0.026411443948745728,
0.0727459192276001,
0.026015745475888252,
-0.033005379140377045,
0.0656304880976677,
-0.14928090572357178,
0.047584790736436844,
-0.07195010036230087,
0.10879573225975037,
-0.08789397776126862,
-0.045398399233818054,
0.22605778276920319,
-0.1412811130285263,
-0.0004468305269256234,
-0.012264292687177658,
-0.020442521199584007,
-0.010760260745882988,
0.054776377975940704,
0.041398514062166214,
-0.09666845947504044,
-0.05707412585616112,
0.038049448281526566,
0.05512924864888191,
0.06753067672252655,
0.061161864548921585,
-0.001450320240110159,
0.035292066633701324,
0.1540302038192749,
0.0962006077170372,
-0.20787525177001953,
0.10055209696292877,
-0.03613168001174927,
-0.014431845396757126,
0.11088903248310089,
-0.07971390336751938,
0.004180321004241705,
0.037343207746744156,
-0.04864373803138733,
0.0673292949795723,
-0.02415730059146881,
-0.13192616403102875,
0.08340145647525787,
0.011663307435810566,
-0.03354427590966225,
-0.2135341614484787,
-0.019517773762345314,
0.35077598690986633,
0.06449490040540695,
0.07703890651464462,
0.14949971437454224,
-0.10674389451742172,
-0.015345446765422821,
-0.0263030044734478,
0.04784006252884865,
0.03241993859410286,
0.047257475554943085,
-0.1261490285396576,
0.017693644389510155,
-0.06404983252286911,
-0.0008934470824897289,
0.03192136436700821,
-0.07922276854515076,
0.0728074461221695,
0.06177205219864845,
-0.0999259427189827,
-0.13874295353889465,
-0.01695255935192108,
-0.0006924749468453228,
0.10240531712770462,
-0.09569700807332993,
-0.0018224046798422933,
-0.06856123358011246,
-0.024218183010816574,
0.06631594896316528,
0.02587166428565979,
-0.02565598487854004,
0.042808279395103455,
0.005154852289706469,
-0.03531435877084732,
0.04535859078168869,
-0.08244118839502335,
0.03354357182979584,
-0.012784730643033981,
-0.057118501514196396,
0.07515416294336319,
-0.042131565511226654,
-0.029601717367768288,
-0.07688344269990921,
-0.09133709222078323,
-0.03496897965669632,
-0.07392234355211258,
0.028848964720964432,
-0.08902151882648468,
-0.02109540067613125,
-0.005569164175540209,
-0.07331597059965134,
0.01815742999315262,
0.0879179984331131,
-0.042520735412836075,
0.04130181297659874,
-0.0767301619052887,
0.04646358639001846,
0.003180885687470436,
-0.02340696007013321,
0.05394599959254265,
-0.03176020458340645,
0.07451724261045456,
0.011708726175129414,
-0.08964808285236359,
-0.05237753316760063,
-0.15416638553142548,
0.07910042256116867,
0.020274007692933083,
0.055198393762111664,
0.05651544779539108,
-0.13596764206886292,
-0.07000590860843658,
0.0634571835398674,
0.03313300386071205,
-0.027047518640756607,
0.2911251485347748,
-0.053845975548028946,
0.053126320242881775,
0.055269673466682434,
-0.0422976128757,
-0.06304522603750229,
-0.010955456644296646,
0.033299077302217484,
0.013011972419917583,
0.1342339962720871,
-0.004181988071650267,
0.06145955249667168,
-0.02129344828426838,
0.03752285614609718,
0.047832805663347244,
-0.02965349145233631,
-0.0029445234686136246,
-0.04285648092627525,
-0.019696809351444244,
-0.02306954935193062,
0.14375723898410797,
-0.05277728661894798,
-0.05793493241071701,
0.022065529599785805,
-0.05238110572099686,
-0.04736948385834694,
-0.054763056337833405,
-0.08132181316614151,
-0.08714096248149872,
0.021512692794203758,
-0.10144943743944168,
0.06183617562055588,
-0.009120562113821507,
-0.12030807137489319,
0.051948849111795425,
0.05001719295978546,
0.0008073636563494802,
0.022188479080796242,
0.008470566011965275,
0.03173105791211128,
-0.1813925951719284,
-0.12966732680797577,
-0.06832082569599152,
0.030915923416614532,
-0.06553175300359726,
0.2178872674703598,
0.15312860906124115,
-0.027806764468550682,
0.07932761311531067,
-0.027026696130633354,
-0.020006412640213966,
-0.040140945464372635,
-0.21057026088237762,
-0.04085313156247139,
-0.09047935903072357,
-0.03561926260590553,
-0.07616900652647018,
-0.003562093945220113,
-0.02742239646613598,
0.020565928891301155,
-0.04401977360248566,
0.1423402726650238,
-0.030940629541873932,
-0.03380466252565384,
-0.03797909617424011,
-0.0270924624055624,
0.005993240047246218,
-0.02450915053486824,
-0.06480153650045395,
-0.09655575454235077,
0.1079925149679184,
0.06354276835918427,
0.08001533895730972,
0.010781842283904552,
-0.010424645617604256,
-0.00843704491853714,
-0.03123893402516842,
-0.011034640483558178,
-0.018522420898079872,
0.020580319687724113,
0.16259942948818207,
0.05875848978757858,
-0.06770854443311691,
0.06057402119040489,
0.22058597207069397,
-0.05233307182788849,
-0.09303994476795197,
-0.17323271930217743,
0.08758021891117096,
-0.09209302812814713,
-0.08784201741218567,
-0.019725676625967026,
-0.0046431864611804485,
-0.027945848181843758,
0.2755805253982544,
0.09561637043952942,
-0.0052883862517774105,
-0.0067318812943995,
-0.11829669773578644,
-0.005622375290840864,
-0.0917392149567604,
0.14780044555664062,
0.06800264865159988,
0.3175709843635559,
-0.018780434504151344,
0.06695996224880219,
-0.01514813955873251,
0.035733114928007126,
-0.18325243890285492,
0.006338717881590128,
-0.09942106902599335,
0.0031535334419459105,
-0.045999057590961456,
0.054402779787778854,
-0.03470440208911896,
-0.08684291690587997,
-0.12197787314653397,
-0.06519010663032532,
-0.047558240592479706,
0.027922268956899643,
0.09403908997774124,
-0.048708464950323105,
0.10236216336488724,
-0.008467433974146843,
0.03593525290489197,
0.027995949611067772,
-0.017410505563020706,
-0.02897481992840767,
0.06427732855081558,
0.030172940343618393,
-0.14144033193588257,
0.17653438448905945,
-0.011730731464922428,
0.05649147927761078,
0.06817910075187683,
-0.017991576343774796,
-0.1393590271472931,
0.07920657098293304,
-0.04300708696246147,
-0.08880871534347534,
0.03326786682009697,
0.06103873997926712,
-0.07049312442541122,
0.16507264971733093,
0.11851660162210464,
-0.1318807452917099,
-0.013981647789478302,
0.009474736638367176,
0.0093081621453166,
-0.02255072072148323,
0.04803973436355591,
-0.08879075944423676,
0.11980599910020828,
0.05603799968957901,
-0.04602073132991791,
-0.10265649855136871,
-0.03892719745635986,
0.036714307963848114,
-0.03214174136519432,
-0.04926566407084465,
-0.08786514401435852,
-0.23425346612930298,
-0.04532405734062195,
0.06252992898225784,
0.06335967779159546,
-0.1557452231645584,
-0.042135171592235565,
-0.09186793118715286,
-0.03105689026415348,
0.031806010752916336,
-0.038894813507795334,
0.16510605812072754,
-0.02241860330104828,
-0.043308403342962265,
-0.1801118403673172,
0.01891512982547283,
0.04576770216226578,
-0.04818006604909897,
-0.09417697787284851
] |
null | null | null |
# Lora of aurora/オーロラ/欧若拉 (Azur Lane)
## What Is This?
This is the LoRA model of waifu aurora/オーロラ/欧若拉 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* Dataset used for training is the `stage3-p480-800` in [CyberHarem/aurora_azurlane](https://huggingface.co/datasets/CyberHarem/aurora_azurlane), which contains 222 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 2240 steps, 40 checkpoints were saved and evaluated.
* **Trigger word is `aurora_azurlane`.**
* Pruned core tags for this waifu are `blonde_hair, long_hair, green_eyes, breasts, bangs, large_breasts, very_long_hair, medium_breasts, ribbon`. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like a classic LoRA**. The LoRA files we provide are bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 672, you need to download [`672/aurora_azurlane.pt`](https://huggingface.co/CyberHarem/aurora_azurlane/resolve/main/672/aurora_azurlane.pt) as the embedding and [`672/aurora_azurlane.safetensors`](https://huggingface.co/CyberHarem/aurora_azurlane/resolve/main/672/aurora_azurlane.safetensors) for loading Lora. By using both files together, you can generate images for the desired characters.
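The two files for a chosen step can also be fetched programmatically. Here is a minimal sketch with `huggingface_hub`, using the step-672 paths linked above; the WebUI folder names in the comment are assumptions based on a typical A1111 layout.
```python
# Hedged sketch: download the step-672 embedding (.pt) and LoRA (.safetensors) files
# from this repository with huggingface_hub.
from huggingface_hub import hf_hub_download

repo_id = "CyberHarem/aurora_azurlane"
for filename in ("672/aurora_azurlane.pt", "672/aurora_azurlane.safetensors"):
    local_path = hf_hub_download(repo_id=repo_id, filename=filename)
    print("downloaded to", local_path)

# Then copy the .pt file into the WebUI `embeddings` folder and the .safetensors file
# into `models/Lora` (folder names assumed from a typical A1111 installation).
```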
## Which Step Should I Use?
We selected 5 good steps for you to choose. The best one is step 672.
1720 images (1.95 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the preview of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_1_0 | pattern_1_1 | pattern_2 | pattern_3_0 | pattern_3_1 | pattern_3_2 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:----------------------------------------------------------------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:--------------------------------------------|:--------------------------------------------|:--------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:--------------------------------|:------------------------------------|:--------------------------------|:----------------------------------|:----------------------------------------|:----------------------------------------|:----------------------------------------|:------------------------------|:----------------------------------|:----------------------------------|:--------------------------------|:------------------------------------------------|:----------------------------------|:----------------------------------|:------------------------------|:--------------------------------|:--------------------------------------|:--------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------------|:--------------------------------------|:--------------------------------------|
| 672 | 13 | 0.875 | 0.962 | **0.840** | **0.751** | [Download](https://huggingface.co/CyberHarem/aurora_azurlane/resolve/main/672/aurora_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1736 | 32 | **0.877** | 0.905 | 0.832 | 0.740 | [Download](https://huggingface.co/CyberHarem/aurora_azurlane/resolve/main/1736/aurora_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 1120 | 21 | 0.850 | 0.909 | 0.835 | 0.720 | [Download](https://huggingface.co/CyberHarem/aurora_azurlane/resolve/main/1120/aurora_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 392 | 8 | 0.804 | **0.973** | 0.838 | 0.677 | [Download](https://huggingface.co/CyberHarem/aurora_azurlane/resolve/main/392/aurora_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 560 | 11 | 0.808 | 0.926 | 0.831 | 0.672 | [Download](https://huggingface.co/CyberHarem/aurora_azurlane/resolve/main/560/aurora_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download them, via the following links:
* [Steps From 1736 to 2240](all/0.md)
* [Steps From 1176 to 1680](all/1.md)
* [Steps From 616 to 1120](all/2.md)
* [Steps From 56 to 560](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/aurora_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/aurora_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/aurora_azurlane",
"license:mit",
"region:us"
] | 2024-02-15T06:14:09+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/aurora_azurlane #license-mit #region-us
| Lora of aurora/オーロラ/欧若拉 (Azur Lane)
===================================
What Is This?
-------------
This is the LoRA model of waifu aurora/オーロラ/欧若拉 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* Dataset used for training is the 'stage3-p480-800' in CyberHarem/aurora\_azurlane, which contains 222 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 2240 steps, 40 checkpoints were saved and evaluated.
* Trigger word is 'aurora\_azurlane'.
* Pruned core tags for this waifu are 'blonde\_hair, long\_hair, green\_eyes, breasts, bangs, large\_breasts, very\_long\_hair, medium\_breasts, ribbon'. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like a classic LoRA. The LoRA files we provide are bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 672, you need to download '672/aurora\_azurlane.pt' as the embedding and '672/aurora\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose. The best one is step 672.
1720 images (1.95 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the preview of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We uploaded the files for all steps. You can check the images and metrics, and download them, via the following links:
* Steps From 1736 to 2240
* Steps From 1176 to 1680
* Steps From 616 to 1120
* Steps From 56 to 560
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 672, you need to download '672/aurora\\_azurlane.pt' as the embedding and '672/aurora\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 672.\n\n\n1720 images (1.95 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1736 to 2240\n* Steps From 1176 to 1680\n* Steps From 616 to 1120\n* Steps From 56 to 560"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/aurora_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 672, you need to download '672/aurora\\_azurlane.pt' as the embedding and '672/aurora\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 672.\n\n\n1720 images (1.95 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1736 to 2240\n* Steps From 1176 to 1680\n* Steps From 616 to 1120\n* Steps From 56 to 560"
] | [
45,
38,
474
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/aurora_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.0066404095850884914,
-0.0051877908408641815,
-0.0043930537067353725,
0.08440637588500977,
0.07018383592367172,
0.07893764972686768,
0.2295033186674118,
0.07600772380828857,
0.12120365351438522,
-0.06877975165843964,
0.09159611165523529,
0.05764662101864815,
-0.008095423690974712,
0.03465268388390541,
-0.03870026394724846,
-0.1607835441827774,
-0.0653410255908966,
-0.029284419491887093,
0.0055097490549087524,
0.018143780529499054,
0.07005178183317184,
0.0063675325363874435,
0.09883936494588852,
-0.055903539061546326,
-0.03794039040803909,
0.04399107024073601,
-0.030944567173719406,
-0.04658420756459236,
0.03778892382979393,
0.08606310188770294,
0.1199517548084259,
0.012487800791859627,
0.05727054178714752,
-0.16536931693553925,
0.06611859053373337,
-0.015187189914286137,
-0.10701726377010345,
-0.0028900448232889175,
0.017678778618574142,
-0.032613880932331085,
0.13922224938869476,
0.04320421814918518,
-0.08992527425289154,
0.04243355616927147,
-0.1306297332048416,
-0.030938338488340378,
-0.052346836775541306,
0.04174409806728363,
0.13838158547878265,
0.06125669553875923,
0.02336091548204422,
0.07380679249763489,
-0.04406145215034485,
0.07641468197107315,
0.10089273750782013,
-0.1377004235982895,
-0.0658743754029274,
0.10707703977823257,
0.022675571963191032,
0.14081890881061554,
-0.0930330902338028,
0.10055609047412872,
0.06909182667732239,
-0.0402040034532547,
-0.14090992510318756,
-0.09295165538787842,
-0.22208601236343384,
-0.010246324352920055,
0.01106152031570673,
0.02068142034113407,
0.38589364290237427,
0.0597744919359684,
0.03341582790017128,
0.060735005885362625,
-0.07507481426000595,
0.023644309490919113,
-0.10422932356595993,
0.13190989196300507,
0.05047275125980377,
0.09619855135679245,
-0.03952701762318611,
-0.1093960702419281,
-0.11540422588586807,
-0.06683624535799026,
-0.07207158952951431,
-0.01124113705009222,
0.020978111773729324,
0.12625983357429504,
-0.20060628652572632,
0.008660426363348961,
-0.047769639641046524,
-0.13439582288265228,
0.02471759170293808,
-0.10369910299777985,
0.16671493649482727,
0.07577252388000488,
-0.024977311491966248,
-0.0057203820906579494,
0.23933830857276917,
0.1449543982744217,
0.1936039924621582,
0.053635500371456146,
-0.11120853573083878,
0.13146178424358368,
0.03751388192176819,
-0.07889729738235474,
-0.00766773009672761,
-0.09680195152759552,
0.14837191998958588,
-0.05374765023589134,
0.1230064183473587,
-0.0728483498096466,
-0.10607805103063583,
0.020844582468271255,
-0.10892648994922638,
0.06403372436761856,
0.03686847910284996,
-0.0034239268861711025,
-0.052569542080163956,
0.0506235770881176,
0.04111502319574356,
-0.03003847785294056,
-0.003142894245684147,
-0.012989241629838943,
-0.042500220239162445,
0.05780065432190895,
0.11240547895431519,
0.03651462867856026,
0.06716223806142807,
0.00523278908804059,
-0.020712893456220627,
-0.0036100333090871572,
-0.0446857213973999,
0.017809433862566948,
0.044435594230890274,
0.062195174396038055,
0.08740806579589844,
-0.16229218244552612,
-0.06927665323019028,
-0.01400425098836422,
0.062049273401498795,
-0.0026521955151110888,
0.09484176337718964,
0.002748031634837389,
0.050683312118053436,
0.01989617571234703,
-0.01934436336159706,
0.032291218638420105,
-0.10356481373310089,
0.08455660939216614,
-0.012231200933456421,
0.09378176927566528,
-0.20879006385803223,
-0.009893640875816345,
-0.051802437752485275,
0.01146552711725235,
0.0375375896692276,
-0.002234159968793392,
-0.1075909361243248,
0.13307730853557587,
-0.002608611946925521,
0.06618504971265793,
-0.09337085485458374,
0.049021944403648376,
0.0270830187946558,
0.07745103538036346,
-0.09391260892152786,
0.013132873922586441,
0.10611627250909805,
-0.1441396027803421,
-0.16705778241157532,
0.08743446320295334,
-0.025221191346645355,
0.01402070838958025,
0.03800720348954201,
0.15049850940704346,
0.16808249056339264,
-0.1788496971130371,
-0.015986792743206024,
0.04816781356930733,
-0.022407639771699905,
-0.09363768249750137,
-0.01403103768825531,
0.12198161333799362,
0.0061571188271045685,
0.03307165950536728,
-0.03469137102365494,
0.12541431188583374,
-0.024423593655228615,
-0.08627086132764816,
-0.03229416906833649,
-0.07293594628572464,
-0.0903417244553566,
0.05462000146508217,
-0.007994663901627064,
-0.0586208738386631,
0.010067730210721493,
-0.16699960827827454,
0.15415062010288239,
0.01982997916638851,
0.03005949966609478,
-0.06724969297647476,
0.10989398509263992,
0.016530122607946396,
0.009080063551664352,
0.005308354739099741,
-0.052865169942379,
-0.10316148400306702,
0.2292247861623764,
0.08294935524463654,
0.07768344134092331,
0.052923936396837234,
-0.054327089339494705,
-0.07671013474464417,
0.015299203805625439,
0.003933304455131292,
-0.04674992710351944,
0.023629646748304367,
-0.10968322306871414,
0.05438155680894852,
-0.020153328776359558,
0.027186596766114235,
-0.012178624980151653,
-0.025331644341349602,
0.07487739622592926,
0.010748786851763725,
-0.014395567588508129,
0.08124983310699463,
0.05458930507302284,
-0.022923115640878677,
-0.06698694080114365,
0.00735883042216301,
0.06953408569097519,
0.0012581831542775035,
-0.0931084081530571,
0.029503444209694862,
-0.0006529832608066499,
0.04050687700510025,
0.20077089965343475,
-0.2243201583623886,
0.037936411798000336,
0.021015102043747902,
0.04666108265519142,
0.04090425744652748,
0.0025260192342102528,
-0.02607201226055622,
0.04475117847323418,
-0.0169870313256979,
0.06921390444040298,
-0.009125644341111183,
0.0667753592133522,
-0.02633359283208847,
-0.1338217407464981,
-0.009501038119196892,
-0.034447427839040756,
0.16300734877586365,
-0.13554392755031586,
0.05356477200984955,
0.16501697897911072,
-0.11795911937952042,
0.12405181676149368,
0.008008288219571114,
-0.005284988787025213,
0.012272018007934093,
0.021984605118632317,
0.010864974930882454,
0.10688819736242294,
-0.08367131650447845,
-0.03190162405371666,
0.025036025792360306,
-0.09746143966913223,
0.027663258835673332,
-0.1284598410129547,
-0.11477965861558914,
-0.067366823554039,
-0.02118486911058426,
-0.02588600106537342,
0.01986554078757763,
-0.05426337197422981,
0.07660531252622604,
-0.09003379195928574,
-0.08762862533330917,
-0.02743448130786419,
-0.08894629031419754,
0.019911110401153564,
0.02089606039226055,
-0.051597677171230316,
-0.14979374408721924,
-0.14103539288043976,
-0.08169949054718018,
-0.13951042294502258,
-0.006896475329995155,
0.06952261924743652,
-0.10882339626550674,
-0.04706212878227234,
0.009943445213139057,
-0.03756813704967499,
0.10087708383798599,
-0.06517314165830612,
0.01139238104224205,
0.05265609174966812,
-0.03555021807551384,
-0.1688108891248703,
0.002915332792326808,
-0.06615696102380753,
-0.04621700197458267,
0.16645313799381256,
-0.13380828499794006,
0.18643592298030853,
-0.01722656935453415,
0.054486848413944244,
0.05172910541296005,
0.03777937963604927,
0.11582895368337631,
-0.1216781884431839,
0.0764135792851448,
0.1823400855064392,
0.040454670786857605,
0.07816402614116669,
0.12188993394374847,
0.09014041721820831,
-0.11986136436462402,
0.041617706418037415,
0.07103968411684036,
-0.10668677091598511,
-0.10035353899002075,
-0.0557953417301178,
-0.10325076431035995,
-0.06309285759925842,
0.046522047370672226,
0.062343936413526535,
0.05798453465104103,
0.12667231261730194,
-0.054027654230594635,
-0.0020594652742147446,
0.08266782015562057,
0.05046483874320984,
0.09214723855257034,
0.02101968415081501,
0.06135634332895279,
-0.14892365038394928,
-0.050917722284793854,
0.15742278099060059,
0.20912477374076843,
0.2224976271390915,
0.02126910164952278,
0.044226400554180145,
0.11455879360437393,
0.059083860367536545,
0.10369456559419632,
0.05211291462182999,
0.00700940890237689,
0.016827747225761414,
-0.07309099286794662,
-0.05145245045423508,
0.01643032394349575,
0.012831583619117737,
-0.06649258732795715,
-0.14038622379302979,
0.10366832464933395,
-0.004166069440543652,
0.07236998528242111,
0.14004017412662506,
0.03860270231962204,
-0.1050068661570549,
0.1624314934015274,
0.0958532989025116,
0.07774955034255981,
-0.06915519386529922,
0.136123925447464,
0.05847305431962013,
-0.0005795996985398233,
0.16693733632564545,
0.022299541160464287,
0.14608033001422882,
-0.034489937126636505,
-0.07382660359144211,
-0.06630589812994003,
-0.06070077791810036,
0.008522119373083115,
0.02645445056259632,
-0.21548360586166382,
0.09404055029153824,
0.050618160516023636,
0.00795887503772974,
0.0030511850491166115,
-0.04607527330517769,
0.18476127088069916,
0.1517925262451172,
0.07204488664865494,
0.02152695134282112,
-0.022388242185115814,
-0.008543679490685463,
-0.09375330805778503,
0.05614056438207626,
0.009665453806519508,
0.07412679493427277,
-0.032765354961156845,
-0.08719710260629654,
-0.01904440112411976,
-0.0018288420978933573,
0.030912313610315323,
-0.07022466510534286,
-0.11084810644388199,
-0.04164683818817139,
0.2544795274734497,
-0.06993535161018372,
0.04363253340125084,
0.047266095876693726,
0.037416961044073105,
-0.06259690225124359,
0.05841624736785889,
-0.036368224769830704,
-0.012067153118550777,
-0.033923547714948654,
0.007203870918601751,
0.007309881970286369,
-0.03917136788368225,
-0.05273039638996124,
-0.02717188559472561,
-0.09912869334220886,
-0.10170748829841614,
0.0069986367598176,
-0.03621990606188774,
-0.0037680137902498245,
-0.025228697806596756,
0.02138379029929638,
-0.10596208274364471,
-0.02367166243493557,
0.01450631394982338,
0.04790418595075607,
-0.08018355071544647,
-0.1376054733991623,
0.003856360912322998,
-0.02567167580127716,
-0.0645199790596962,
0.02860262244939804,
-0.09112638980150223,
-0.07676363736391068,
-0.0499812476336956,
-0.034871820360422134,
0.12673848867416382,
0.22057384252548218,
-0.025644678622484207,
0.0026540544349700212,
0.1422295868396759,
-0.10053588449954987,
-0.3187329173088074,
-0.1504444181919098,
-0.16090205311775208,
-0.10665734857320786,
0.025266069918870926,
-0.08454634994268417,
0.040659524500370026,
0.08790749311447144,
-0.039682257920503616,
0.21101120114326477,
-0.19202949106693268,
-0.09741199016571045,
0.07686934620141983,
0.09887290745973587,
0.2975088953971863,
-0.26654699444770813,
0.009210392832756042,
-0.11333532631397247,
-0.03910277411341667,
0.0079735042527318,
-0.06594642251729965,
0.11617919057607651,
0.021340465173125267,
0.06473693996667862,
-0.0011510021286085248,
-0.010286962613463402,
0.14859691262245178,
-0.08040381222963333,
0.13771191239356995,
-0.11200707405805588,
-0.08131370693445206,
0.21239221096038818,
-0.03480982780456543,
0.009202363900840282,
-0.22005844116210938,
-0.035131655633449554,
-0.030327215790748596,
0.03366852179169655,
-0.008655661717057228,
0.04605237394571304,
-0.00948355719447136,
-0.008424483239650726,
-0.137702077627182,
-0.031214997172355652,
-0.028714196756482124,
0.056333623826503754,
0.24045391380786896,
-0.05771780386567116,
-0.06339824944734573,
0.023715438321232796,
-0.0029596746899187565,
0.09409995377063751,
0.0018575823633000255,
-0.05730385705828667,
-0.04798927903175354,
0.09893568605184555,
-0.2290959358215332,
0.05411957576870918,
0.009745930321514606,
-0.005967032629996538,
0.009027842432260513,
0.01823478378355503,
0.026201631873846054,
0.12090741842985153,
0.18502579629421234,
-0.00430719880387187,
-0.03928283229470253,
-0.018757062032818794,
0.05618998780846596,
0.1361180543899536,
-0.017368126660585403,
0.10484329611063004,
0.010951234959065914,
0.0367736779153347,
0.011618265882134438,
0.05232799053192139,
-0.07151210308074951,
-0.08353835344314575,
0.10222509503364563,
-0.042420644313097,
-0.08513245731592178,
0.08724964410066605,
0.05529533699154854,
0.06007430702447891,
0.006403757259249687,
0.05314498767256737,
0.031699635088443756,
-0.12719085812568665,
0.028935477137565613,
0.2093593329191208,
-0.08640164136886597,
-0.058842308819293976,
-0.07688455283641815,
0.008611329831182957,
-0.11536198109388351,
0.08540961146354675,
0.0329488068819046,
-0.02976820431649685,
0.10960213840007782,
-0.04380180686712265,
-0.025782732293009758,
0.0068136644549667835,
-0.07742748409509659,
0.05124124884605408,
-0.15226279199123383,
-0.20597341656684875,
0.052645985037088394,
-0.02083883062005043,
-0.06795119494199753,
-0.09214089810848236,
-0.08217161893844604,
0.06935269385576248,
-0.16917581856250763,
0.14814221858978271,
-0.05609135329723358,
0.06196874752640724,
-0.04729916527867317,
-0.04502153396606445,
-0.106157585978508,
-0.018287787213921547,
-0.05287642776966095,
-0.028172684833407402,
0.05072696879506111,
0.006205116398632526,
-0.1201135516166687,
-0.11704768985509872,
0.0594620443880558,
-0.012765520252287388,
-0.0029357147868722677,
0.010047468356788158,
-0.07562391459941864,
0.008253855630755424,
-0.21976163983345032,
-0.06086036562919617,
0.09407226741313934,
0.046048179268836975,
-0.09781178832054138,
0.1373436599969864,
0.05076111480593681,
-0.02880348637700081,
0.03418358042836189,
0.005882805213332176,
0.18175525963306427,
-0.08257468044757843,
0.028851956129074097,
-0.12246029078960419,
-0.17205767333507538,
-0.023478075861930847,
0.022964851930737495,
0.2245684713125229,
0.09089282155036926,
0.12049352377653122,
-0.053434886038303375,
0.025265831500291824,
-0.022026540711522102,
0.06855760514736176,
0.01910245232284069,
-0.09388991445302963,
-0.05016225576400757,
-0.1689678132534027,
-0.06541828066110611,
-0.06248897314071655,
0.17435729503631592,
0.03456180542707443,
-0.1419733464717865,
-0.0010620587272569537,
0.12736764550209045,
-0.15240268409252167,
-0.009416159242391586,
0.16970956325531006,
-0.03420129790902138,
0.025905264541506767,
-0.13399288058280945,
0.02929013967514038,
0.09323931485414505,
-0.028092041611671448,
-0.004417199641466141,
0.12275195866823196,
0.025133181363344193,
0.010441603139042854,
0.04217039421200752,
-0.014002706855535507,
0.07463636249303818,
-0.08565284311771393,
0.061758819967508316,
0.0041969455778598785,
-0.045871179550886154,
-0.10829641669988632,
0.1767314225435257,
-0.010625455528497696,
0.023649942129850388,
-0.06156279146671295,
0.006676509976387024,
-0.09349555522203445,
-0.08437367528676987,
-0.06821206957101822,
-0.14292852580547333,
0.0725918635725975,
-0.05992141366004944,
-0.004405423998832703,
0.0033159975428134203,
0.013324238359928131,
-0.07778559625148773,
0.010594774968922138,
-0.1785813719034195,
-0.03576565533876419,
0.02070745825767517,
-0.0075464434921741486,
-0.02767929807305336,
-0.030203117057681084,
-0.026007669046521187,
0.027722684666514397,
-0.055491212755441666,
-0.07157033681869507,
0.06506551802158356,
0.08361711353063583,
0.06338736414909363,
-0.15451735258102417,
-0.11028187721967697,
-0.06437351554632187,
0.022757796570658684,
0.07556755840778351,
0.16580374538898468,
0.036113861948251724,
-0.003584435675293207,
0.04198712483048439,
0.15008816123008728,
0.011368653737008572,
-0.06280392408370972,
-0.06637237966060638,
-0.131106436252594,
-0.1442723572254181,
-0.01652619242668152,
-0.07064882665872574,
-0.036868225783109665,
0.010374272242188454,
0.22277067601680756,
0.17999999225139618,
-0.1462002396583557,
0.03642845153808594,
-0.06299988925457001,
0.037896353751420975,
-0.030101897194981575,
0.1655295044183731,
0.06226310506463051,
0.1673624962568283,
-0.028926854953169823,
-0.05248301103711128,
-0.07025778293609619,
0.015535043552517891,
-0.09091237187385559,
0.03751303255558014,
-0.01694670133292675,
-0.0695224329829216,
-0.07192013412714005,
0.10310591012239456,
-0.10567346960306168,
0.07487434893846512,
0.19269421696662903,
-0.15649451315402985,
-0.02093985490500927,
-0.0411105751991272,
0.057331494987010956,
0.10483813285827637,
0.012590856291353703,
-0.06491051614284515,
-0.02217160165309906,
0.0004674353695008904,
0.030253123492002487,
-0.17653846740722656,
-0.1108119934797287,
-0.006746888160705566,
-0.1487722247838974,
0.14059622585773468,
-0.006493106484413147,
-0.00607804162427783,
0.03572896495461464,
-0.053714752197265625,
-0.005261267069727182,
0.17369835078716278,
0.01524812076240778,
-0.04690871760249138,
-0.025284312665462494,
-0.0499524250626564,
-0.09894249588251114,
0.07158307731151581,
0.09393160045146942,
0.05934105068445206,
-0.013407640159130096,
0.16218514740467072,
-0.01890563778579235,
-0.0434802770614624,
0.12451985478401184,
-0.17505766451358795,
0.08088690042495728,
0.0006054530385881662,
-0.01996057666838169,
-0.07031264156103134,
-0.038261715322732925,
0.036115922033786774,
0.07944396138191223,
-0.16468767821788788,
-0.0401373989880085,
0.057523831725120544,
-0.09845137596130371,
0.05529101565480232,
0.04645804315805435,
-0.10194427520036697,
0.014966584742069244,
-0.11810795962810516,
-0.0033295617904514074,
-0.09864190220832825,
0.03521040827035904,
0.2004079967737198,
-0.029824113473296165,
0.008264056406915188,
-0.14202801883220673,
0.060094498097896576,
-0.020892726257443428,
-0.041521381586790085,
-0.07314686477184296
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-300m-england-0215-parallel-12-23-dim128-iceberg
This model is a fine-tuned version of [vitouphy/wav2vec2-xls-r-300m-english](https://huggingface.co/vitouphy/wav2vec2-xls-r-300m-english) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2860
- Wer: 0.1938
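For a quick check, the fine-tuned checkpoint can be loaded with the `transformers` ASR pipeline. This is a minimal sketch: the audio path is a placeholder, and 16 kHz mono input is assumed, as is usual for wav2vec2 models.
```python
# Hedged sketch: transcribe an audio file with the fine-tuned checkpoint.
# "sample.wav" is a placeholder; wav2vec2 models expect 16 kHz mono audio.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Lin25/wav2vec2-300m-england-0215-parallel-12-23-dim128-iceberg",
)
print(asr("sample.wav")["text"])
```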
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` mapping is sketched after this list):
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1227
- num_epochs: 15
- mixed_precision_training: Native AMP
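For reference, the values above map roughly onto `transformers.TrainingArguments` as sketched below. This is an illustrative reconstruction, not the original training script; the output directory is a placeholder and a single training device is assumed.
```python
# Hedged sketch: approximate TrainingArguments matching the hyperparameters listed above.
# Illustrative only -- the original training script is not published here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-300m-england",   # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,        # 16 x 2 = effective train batch size 32 on one device
    warmup_steps=1227,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    fp16=True,                            # "Native AMP" mixed precision
    seed=42,                              # Adam betas/epsilon stay at the defaults (0.9, 0.999, 1e-08)
)
```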
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.4151 | 1.0 | 1227 | 0.2565 | 0.2610 |
| 0.2618 | 2.0 | 2454 | 0.2270 | 0.2213 |
| 0.2219 | 3.0 | 3681 | 0.2205 | 0.2126 |
| 0.1934 | 4.0 | 4908 | 0.2104 | 0.2017 |
| 0.1705 | 5.0 | 6135 | 0.2132 | 0.1990 |
| 0.1515 | 6.0 | 7362 | 0.2149 | 0.1972 |
| 0.1354 | 7.0 | 8589 | 0.2144 | 0.1963 |
| 0.1212 | 8.0 | 9816 | 0.2207 | 0.1946 |
| 0.108 | 9.0 | 11043 | 0.2281 | 0.1921 |
| 0.0962 | 10.0 | 12270 | 0.2357 | 0.1927 |
| 0.0854 | 11.0 | 13497 | 0.2426 | 0.1925 |
| 0.0758 | 12.0 | 14724 | 0.2573 | 0.1925 |
| 0.0678 | 13.0 | 15951 | 0.2667 | 0.1948 |
| 0.0613 | 14.0 | 17178 | 0.2748 | 0.1936 |
| 0.0559 | 15.0 | 18405 | 0.2860 | 0.1938 |
### Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.1.0
- Datasets 2.14.7
- Tokenizers 0.15.0
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["wer"], "base_model": "vitouphy/wav2vec2-xls-r-300m-english", "model-index": [{"name": "wav2vec2-300m-england-0215-parallel-12-23-dim128-iceberg", "results": []}]} | automatic-speech-recognition | Lin25/wav2vec2-300m-england-0215-parallel-12-23-dim128-iceberg | [
"transformers",
"tensorboard",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:vitouphy/wav2vec2-xls-r-300m-english",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:16:13+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #base_model-vitouphy/wav2vec2-xls-r-300m-english #license-apache-2.0 #endpoints_compatible #region-us
| wav2vec2-300m-england-0215-parallel-12-23-dim128-iceberg
========================================================
This model is a fine-tuned version of vitouphy/wav2vec2-xls-r-300m-english on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2860
* Wer: 0.1938
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.001
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1227
* num\_epochs: 15
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.36.0.dev0
* Pytorch 2.1.0
* Datasets 2.14.7
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1227\n* num\\_epochs: 15\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.0\n* Datasets 2.14.7\n* Tokenizers 0.15.0"
] | [
"TAGS\n#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #base_model-vitouphy/wav2vec2-xls-r-300m-english #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1227\n* num\\_epochs: 15\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.0\n* Datasets 2.14.7\n* Tokenizers 0.15.0"
] | [
80,
159,
4,
37
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #base_model-vitouphy/wav2vec2-xls-r-300m-english #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1227\n* num\\_epochs: 15\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.0\n* Datasets 2.14.7\n* Tokenizers 0.15.0"
] | [
-0.11125040054321289,
0.11672348529100418,
-0.0033811433240771294,
0.045987311750650406,
0.0869944840669632,
0.023578835651278496,
0.10811027884483337,
0.14709368348121643,
-0.05466161668300629,
0.12874917685985565,
0.11158871650695801,
0.08080225437879562,
0.07414627075195312,
0.14294543862342834,
-0.025912530720233917,
-0.305961936712265,
0.030796058475971222,
-0.014198211953043938,
-0.10813137888908386,
0.10028233379125595,
0.0772201344370842,
-0.10852054506540298,
0.03244476765394211,
0.007960456423461437,
-0.09120530635118484,
-0.010434444062411785,
-0.03236847743391991,
-0.06939660757780075,
0.10821015387773514,
0.049945250153541565,
0.06922361999750137,
0.0335264727473259,
0.0785842165350914,
-0.27644598484039307,
0.013893542811274529,
0.04608464241027832,
0.020465349778532982,
0.07049182057380676,
0.09868130832910538,
-0.002262058900669217,
0.11742134392261505,
-0.09644583612680435,
0.07574785500764847,
0.04160989075899124,
-0.08741267025470734,
-0.2978936731815338,
-0.07163486629724503,
0.049757760018110275,
0.1351420283317566,
0.07821270078420639,
-0.03006582334637642,
0.07637427747249603,
-0.05031337961554527,
0.08184508234262466,
0.22600287199020386,
-0.2668791711330414,
-0.06862615793943405,
-0.008786828257143497,
0.05933660641312599,
0.05306391417980194,
-0.12230260670185089,
-0.020721660926938057,
0.015240314416587353,
0.024181192740797997,
0.09113454818725586,
0.009061560966074467,
0.07613759487867355,
0.012335472740232944,
-0.15266117453575134,
-0.03176775947213173,
0.11056479811668396,
0.095452681183815,
-0.012929768301546574,
-0.11783576756715775,
-0.04470670223236084,
-0.1561776101589203,
-0.06498768925666809,
-0.027628613635897636,
0.020282777026295662,
-0.032295551151037216,
-0.08206674456596375,
0.020397046580910683,
-0.05925620347261429,
-0.07732632011175156,
0.01864779181778431,
0.15803363919258118,
0.05416860431432724,
-0.041521523147821426,
0.025962864980101585,
0.07600732892751694,
0.04159146547317505,
-0.1555069386959076,
-0.004245550837367773,
0.030715597793459892,
-0.10222557187080383,
-0.015560079365968704,
-0.014195778407156467,
-0.011810754425823689,
0.036347661167383194,
0.14454613626003265,
-0.032829079777002335,
0.10252435505390167,
0.02540670335292816,
0.008117973804473877,
-0.09029565751552582,
0.14343410730361938,
-0.056808747351169586,
-0.0788479745388031,
-0.05110646039247513,
0.11395905166864395,
0.018090544268488884,
-0.014324644580483437,
-0.07521529495716095,
0.028577063232660294,
0.10473518818616867,
0.04292221739888191,
-0.011000865139067173,
0.014989365823566914,
-0.0651741623878479,
-0.022809365764260292,
0.025968166068196297,
-0.10803902894258499,
0.06000052019953728,
0.040110256522893906,
-0.03503705933690071,
-0.0008621293818578124,
-0.005477202124893665,
0.022808076813817024,
-0.005996016785502434,
0.12165001779794693,
-0.07304990291595459,
-0.01723290979862213,
-0.049489233642816544,
-0.09294670820236206,
0.03428088128566742,
-0.024542540311813354,
-0.0041365716606378555,
-0.07884515821933746,
-0.08151698857545853,
-0.0469217449426651,
0.05687825754284859,
-0.05205889791250229,
-0.051181238144636154,
-0.07554778456687927,
-0.058581627905368805,
0.06968175619840622,
-0.004353370517492294,
0.10547435283660889,
-0.05417579784989357,
0.09682296216487885,
0.014752404764294624,
0.06561492383480072,
0.057240501046180725,
0.05602185055613518,
-0.038896575570106506,
0.04524527117609978,
-0.17173196375370026,
0.07040172815322876,
-0.10060130804777145,
0.049345217645168304,
-0.15981373190879822,
-0.09319637715816498,
-0.027423175051808357,
-0.0012027116026729345,
0.08426200598478317,
0.11689338833093643,
-0.16840489208698273,
-0.10421961545944214,
0.18465708196163177,
-0.0909581184387207,
-0.09801614284515381,
0.14860936999320984,
-0.01434240397065878,
-0.04052523896098137,
0.027301248162984848,
0.18243998289108276,
0.09684263914823532,
-0.09944082051515579,
-0.011331168934702873,
-0.05930353328585625,
0.11809193342924118,
0.043047741055488586,
0.1095237135887146,
-0.04611702263355255,
0.00961561594158411,
-0.0029860869981348515,
-0.015063534490764141,
0.05863656476140022,
-0.07467639446258545,
-0.08342790603637695,
-0.024205202236771584,
-0.0644688829779625,
0.02705649472773075,
0.05022626370191574,
0.02939620241522789,
-0.08530709147453308,
-0.13840974867343903,
0.022961078211665154,
0.10921836644411087,
-0.09773162752389908,
0.028020154684782028,
-0.0712176039814949,
0.06666459143161774,
-0.03005686216056347,
0.002873169956728816,
-0.13900317251682281,
-0.000015585135770379566,
0.038551487028598785,
-0.0573650524020195,
0.018939698114991188,
-0.019958794116973877,
0.08005693554878235,
0.06013508886098862,
-0.05887673422694206,
-0.0691106989979744,
-0.04400309920310974,
0.011235828511416912,
-0.06980215758085251,
-0.24094487726688385,
-0.04771483317017555,
-0.04340505972504616,
0.14378952980041504,
-0.21928226947784424,
0.010043175891041756,
0.014206111431121826,
0.14648500084877014,
0.03944495692849159,
-0.04733646288514137,
-0.004966219887137413,
0.06074922904372215,
-0.025406215339899063,
-0.06428752839565277,
0.0336940735578537,
-0.013091953471302986,
-0.1263117492198944,
-0.007255719043314457,
-0.15121617913246155,
0.10352122783660889,
0.10123386979103088,
0.036324720829725266,
-0.0790107399225235,
-0.08090367913246155,
-0.05477483198046684,
-0.05292755737900734,
-0.028096064925193787,
-0.0051421960815787315,
0.14049707353115082,
0.025039857253432274,
0.09691198915243149,
-0.07023152709007263,
-0.03498105704784393,
0.043977975845336914,
0.022579096257686615,
-0.048477720469236374,
0.14696146547794342,
0.07238748669624329,
-0.08072729408740997,
0.09973517060279846,
0.1407405436038971,
-0.04913105443120003,
0.1341710090637207,
-0.06222176551818848,
-0.09396787732839584,
-0.0397845022380352,
0.03257042169570923,
0.0333406962454319,
0.09257335215806961,
-0.12805911898612976,
-0.002397094154730439,
0.013216383755207062,
0.026987634599208832,
0.005504322238266468,
-0.17448042333126068,
-0.0045450590550899506,
0.05027681589126587,
-0.06214132532477379,
0.012126506306231022,
-0.00021859222033526748,
-0.01789042539894581,
0.07760372757911682,
0.01912335492670536,
-0.07102897763252258,
-0.017593802884221077,
-0.013495332561433315,
-0.09277325123548508,
0.18030861020088196,
-0.1181328296661377,
-0.14045171439647675,
-0.11743523925542831,
-0.022359393537044525,
-0.013525797054171562,
-0.01459397841244936,
0.06278937309980392,
-0.10672096163034439,
-0.03901626914739609,
-0.07621166855096817,
0.024412043392658234,
-0.06418348103761673,
0.055267393589019775,
0.026166774332523346,
-0.0008972569485194981,
0.04716913402080536,
-0.08978444337844849,
0.018452132120728493,
-0.013465750962495804,
0.0003981892659794539,
0.007437915541231632,
0.015986446291208267,
0.09541906416416168,
0.16047482192516327,
0.04378354176878929,
0.027006877586245537,
-0.047626905143260956,
0.17678505182266235,
-0.09706307202577591,
0.003348992671817541,
0.10117466747760773,
0.001243483042344451,
0.05716513842344284,
0.1680709570646286,
0.05188771337270737,
-0.08012495934963226,
0.018629450350999832,
0.026830771937966347,
-0.0033633983694016933,
-0.222286194562912,
-0.0359543114900589,
-0.06156989187002182,
-0.003729772986844182,
0.11867345869541168,
0.05199667438864708,
-0.018736062571406364,
0.02625175565481186,
-0.013278947211802006,
-0.00813223235309124,
0.01262789499014616,
0.08093065768480301,
0.09925718605518341,
0.04648677259683609,
0.1158515140414238,
-0.01756645180284977,
-0.03697226569056511,
0.035859644412994385,
-0.006883406080305576,
0.22252388298511505,
0.036939773708581924,
0.1493314951658249,
0.03414954990148544,
0.14336740970611572,
0.016296442598104477,
0.042121097445487976,
0.0157496128231287,
-0.021932706236839294,
0.003282074350863695,
-0.0651535913348198,
-0.016937630251049995,
0.06866899877786636,
0.09735434502363205,
0.029784347862005234,
-0.11202400922775269,
0.01682434044778347,
0.03217422217130661,
0.2945035696029663,
0.08316489309072495,
-0.27940231561660767,
-0.08671722561120987,
0.0204121433198452,
-0.08735847473144531,
-0.02657165937125683,
0.03436672315001488,
0.1010785847902298,
-0.05706968531012535,
0.08239482343196869,
-0.07319074124097824,
0.07758994400501251,
-0.04643344506621361,
-0.0006271661841310561,
0.04690684378147125,
0.09241478890180588,
-0.007870204746723175,
0.05141476169228554,
-0.23599368333816528,
0.29592713713645935,
0.003536512842401862,
0.06310877203941345,
-0.039130616933107376,
0.03602297976613045,
0.0326574333012104,
-0.016925692558288574,
0.09052696824073792,
-0.01860683411359787,
-0.16294428706169128,
-0.15743038058280945,
-0.09956714510917664,
0.025134671479463577,
0.12404318898916245,
-0.0635005310177803,
0.10142835974693298,
-0.031083907932043076,
-0.034998029470443726,
0.06044420227408409,
-0.034075699746608734,
-0.11849040538072586,
-0.12366949766874313,
0.025100789964199066,
0.034773170948028564,
0.054279182106256485,
-0.08732069283723831,
-0.11676585674285889,
-0.09481672942638397,
0.15284651517868042,
-0.09852614998817444,
0.005955438129603863,
-0.13632310926914215,
0.06862016767263412,
0.15827256441116333,
-0.08503198623657227,
0.05339088663458824,
-0.0013354896800592542,
0.11797936260700226,
-0.006339826621115208,
-0.025495173409581184,
0.12359072268009186,
-0.08477987349033356,
-0.19829009473323822,
-0.06870461255311966,
0.16444018483161926,
0.02945098839700222,
0.06429765373468399,
-0.025037497282028198,
0.044366706162691116,
-0.012563238851726055,
-0.07999780774116516,
0.07907303422689438,
0.05079150199890137,
0.019085370004177094,
0.028251394629478455,
-0.03262144699692726,
-0.03337010368704796,
-0.06204165518283844,
-0.06481772661209106,
0.1339120864868164,
0.3063141703605652,
-0.09767036139965057,
0.05047191306948662,
0.07842813432216644,
-0.039860162883996964,
-0.13847042620182037,
-0.02091336064040661,
0.10563898831605911,
0.02901599369943142,
0.022672023624181747,
-0.18818485736846924,
0.04134274646639824,
0.07624213397502899,
-0.02252543717622757,
0.05498311668634415,
-0.29598891735076904,
-0.1359994113445282,
0.10732582956552505,
0.1010739728808403,
-0.014688246883451939,
-0.1675737053155899,
-0.0734938457608223,
-0.008448641747236252,
-0.08441881835460663,
0.044224586337804794,
-0.01439726073294878,
0.11891723424196243,
-0.006518296431750059,
0.012773225083947182,
0.0130988834425807,
-0.05390370264649391,
0.14871634542942047,
-0.016644228249788284,
0.03722798824310303,
-0.012664098292589188,
0.021789591759443283,
-0.04666031524538994,
-0.06835343688726425,
0.005614324007183313,
-0.10638050734996796,
0.031019579619169235,
-0.10401832312345505,
-0.03363680839538574,
-0.060881108045578,
0.021275276318192482,
-0.039881542325019836,
-0.0365155003964901,
-0.0372975617647171,
0.047665562480688095,
0.07259248197078705,
-0.008285160176455975,
0.14420445263385773,
-0.034324727952480316,
0.1598251760005951,
0.11561811715364456,
0.08665712177753448,
-0.004496111534535885,
-0.07492593675851822,
-0.011240532621741295,
-0.031892675906419754,
0.04502181336283684,
-0.127507746219635,
0.02625088021159172,
0.14511233568191528,
0.03109927475452423,
0.1593732386827469,
0.05232463777065277,
-0.08425019681453705,
0.005058830138295889,
0.07195340842008591,
-0.09023431688547134,
-0.18087659776210785,
-0.020864415913820267,
0.04831898957490921,
-0.14672359824180603,
0.009041533805429935,
0.10776457190513611,
-0.0424765907227993,
-0.006876726634800434,
0.0167338028550148,
0.037779927253723145,
-0.024207651615142822,
0.21213935315608978,
0.0224633626639843,
0.07641582936048508,
-0.08183883875608444,
0.06725231558084488,
0.058127835392951965,
-0.1770939975976944,
0.04478456825017929,
0.10081841051578522,
-0.06280083954334259,
-0.02085166983306408,
0.03469080477952957,
0.09337472915649414,
0.025374911725521088,
-0.044561974704265594,
-0.10955788195133209,
-0.13676294684410095,
0.09175597131252289,
0.10210075974464417,
0.02787865325808525,
0.010772627778351307,
-0.031357720494270325,
0.04090035706758499,
-0.0816037505865097,
0.12178315222263336,
0.07580644637346268,
0.07321812957525253,
-0.1455857753753662,
0.0967322289943695,
0.00719171529635787,
-0.009250449016690254,
-0.0007969120051711798,
0.010978120379149914,
-0.12529562413692474,
-0.003296090755611658,
-0.09597703814506531,
-0.015769891440868378,
-0.08599425852298737,
-0.005111309699714184,
0.010616015642881393,
-0.06839397549629211,
-0.04972195625305176,
0.006150325760245323,
-0.09891363978385925,
-0.04687775298953056,
-0.020919417962431908,
0.07255452871322632,
-0.10807473957538605,
-0.018001988530158997,
0.033562470227479935,
-0.1094459593296051,
0.09295301139354706,
0.03350798785686493,
0.02765616960823536,
0.02122689038515091,
-0.09182094037532806,
0.02297062985599041,
0.03987521678209305,
-0.011650007218122482,
0.01706613041460514,
-0.19513703882694244,
-0.013845182955265045,
-0.029460366815328598,
0.01434008777141571,
-0.0015070561785250902,
0.04135845601558685,
-0.11995894461870193,
-0.009474464692175388,
-0.07134617120027542,
-0.07497116178274155,
-0.049713414162397385,
0.034178540110588074,
0.07949082553386688,
0.003429786767810583,
0.15369179844856262,
-0.0924302190542221,
0.053954094648361206,
-0.21988928318023682,
0.005673393607139587,
-0.027769101783633232,
-0.06270740926265717,
-0.061077870428562164,
-0.027901874855160713,
0.07232556492090225,
-0.05565764382481575,
0.06962648779153824,
-0.07028694450855255,
0.04276060312986374,
0.04117397591471672,
-0.11610512435436249,
0.01450312603265047,
0.036350857466459274,
0.2045416533946991,
0.05654139816761017,
-0.028793111443519592,
0.04828225076198578,
0.00022618239745497704,
0.06620636582374573,
0.1312987506389618,
0.13795816898345947,
0.1696651726961136,
0.03255227953195572,
0.09734862297773361,
0.06919356435537338,
-0.11050422489643097,
-0.15144914388656616,
0.13202013075351715,
-0.04040680453181267,
0.124272920191288,
-0.007921814918518066,
0.19983242452144623,
0.13132187724113464,
-0.18843404948711395,
0.033824753016233444,
-0.02642347849905491,
-0.08210574835538864,
-0.11421271413564682,
-0.06264319270849228,
-0.0965409204363823,
-0.19737447798252106,
0.006751309148967266,
-0.09239188581705093,
0.04660925641655922,
0.007165633141994476,
0.04909483715891838,
0.0463617704808712,
0.11157841980457306,
0.05220404267311096,
0.012820740230381489,
0.09357165545225143,
0.025030486285686493,
-0.024494051933288574,
-0.024410495534539223,
-0.0938718169927597,
0.03027375601232052,
-0.046962134540081024,
0.04408808797597885,
-0.039630550891160965,
-0.09237006306648254,
0.07307836413383484,
0.011753041297197342,
-0.09992513060569763,
0.02207903563976288,
-0.0037799591664224863,
0.04856259748339653,
0.10026407986879349,
0.03396597504615784,
-0.027066316455602646,
-0.01291949488222599,
0.2043222188949585,
-0.09973455220460892,
-0.04780346900224686,
-0.12354208528995514,
0.22355403006076813,
0.0014740725746378303,
0.008994778618216515,
0.008190159685909748,
-0.08053392916917801,
-0.0037458522710949183,
0.1454310119152069,
0.1308739334344864,
0.004041510168462992,
-0.008346261456608772,
0.03932690620422363,
-0.010614803992211819,
-0.03454037383198738,
0.051121290773153305,
0.11498218774795532,
0.07574617117643356,
-0.04807744547724724,
-0.04578683525323868,
-0.04537874087691307,
-0.05596432462334633,
-0.03515719994902611,
0.05985892564058304,
0.02628074772655964,
-0.015889842063188553,
-0.009110546670854092,
0.11077287048101425,
-0.04039786010980606,
-0.12654802203178406,
0.03300660848617554,
-0.1848394125699997,
-0.1717325896024704,
-0.026880457997322083,
0.08475376665592194,
0.02648530900478363,
0.037198133766651154,
0.005862687714397907,
-0.03610939159989357,
0.1003790870308876,
0.006596104241907597,
-0.059488695114851,
-0.09675329178571701,
0.07472053915262222,
-0.063965804874897,
0.1627553254365921,
-0.030987979844212532,
0.018767327070236206,
0.12893763184547424,
0.07960448414087296,
-0.08066672831773758,
0.04606111720204353,
0.08777092397212982,
-0.10834519565105438,
0.059624332934617996,
0.16848692297935486,
-0.03972490876913071,
0.1559506207704544,
0.06128307804465294,
-0.10178174823522568,
0.023131251335144043,
-0.09848055243492126,
-0.06894071400165558,
-0.05116487294435501,
0.025825733318924904,
-0.04022854194045067,
0.15300045907497406,
0.18271569907665253,
-0.059765152633190155,
-0.025192473083734512,
-0.03417597711086273,
0.02346028946340084,
0.038164496421813965,
0.13918429613113403,
-0.02874426729977131,
-0.2712148427963257,
0.026619045063853264,
0.0069289556704461575,
0.03121339902281761,
-0.23717741668224335,
-0.11571343243122101,
0.025267720222473145,
-0.04123814404010773,
-0.077186219394207,
0.11900442093610764,
0.08290667086839676,
0.034316230565309525,
-0.06246182695031166,
-0.14270710945129395,
-0.024620123207569122,
0.17496679723262787,
-0.17576231062412262,
-0.05104701220989227
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fugumt-en-ja-finetuned-en-to-ja-gt-2797388
This model is a fine-tuned version of [staka/fugumt-en-ja](https://huggingface.co/staka/fugumt-en-ja) on an unspecified (None) dataset.
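Assuming the checkpoint is publicly available on the Hub, a minimal English-to-Japanese inference sketch looks like the following (illustrative only; the example sentence is a placeholder):

```python
# Hypothetical usage sketch, not part of the original card.
from transformers import pipeline

translator = pipeline(
    "translation",
    model="VJ11/fugumt-en-ja-finetuned-en-to-ja-gt-2797388",
)
print(translator("This is a test sentence.")[0]["translation_text"])
```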
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
- mixed_precision_training: Native AMP
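Since the evaluation below reports BLEU and generation length, the run presumably used the seq2seq variant of the trainer; the sketch below shows roughly equivalent arguments (the output directory name and `predict_with_generate=True` are assumptions, not details stated in this card).

```python
# Illustrative sketch of the configuration listed above; output_dir and
# predict_with_generate are assumptions, not taken from the training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="fugumt-en-ja-finetuned-en-to-ja-gt-2797388",
    learning_rate=2e-4,              # learning_rate: 0.0002
    per_device_train_batch_size=32,  # train_batch_size: 32
    per_device_eval_batch_size=32,   # eval_batch_size: 32
    seed=42,
    num_train_epochs=1.0,
    lr_scheduler_type="linear",
    fp16=True,                       # mixed_precision_training: Native AMP
    predict_with_generate=True,      # needed so BLEU / Gen Len can be reported
)
```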
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:----:|:-------:|
| No log | 1.0 | 1 | 57.0593 | 0.0 | 147.0 |
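The BLEU score in this kind of auto-generated card is typically produced by sacreBLEU inside `compute_metrics`; the exact implementation is not stated here, so the sketch below is an assumption about tooling, with placeholder strings.

```python
# Minimal sacreBLEU sketch; strings are placeholders. For Japanese references,
# a Japanese-aware tokenizer (e.g. tokenize="ja-mecab") is usually preferable.
import evaluate

bleu_metric = evaluate.load("sacrebleu")
result = bleu_metric.compute(
    predictions=["hello there general kenobi"],
    references=[["hello there general kenobi"]],
)
print(result["score"])
```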
### Framework versions
- Transformers 4.36.0
- Pytorch 2.1.2+cu118
- Datasets 2.17.0
- Tokenizers 0.15.0
| {"license": "cc-by-sa-4.0", "tags": ["generated_from_trainer"], "base_model": "staka/fugumt-en-ja", "model-index": [{"name": "fugumt-en-ja-finetuned-en-to-ja-gt-2797388", "results": []}]} | text2text-generation | VJ11/fugumt-en-ja-finetuned-en-to-ja-gt-2797388 | [
"transformers",
"safetensors",
"marian",
"text2text-generation",
"generated_from_trainer",
"base_model:staka/fugumt-en-ja",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:16:14+00:00 | [] | [] | TAGS
#transformers #safetensors #marian #text2text-generation #generated_from_trainer #base_model-staka/fugumt-en-ja #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #region-us
| fugumt-en-ja-finetuned-en-to-ja-gt-2797388
==========================================
This model is a fine-tuned version of staka/fugumt-en-ja on an unspecified (None) dataset.
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0002
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 1.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.36.0
* Pytorch 2.1.2+cu118
* Datasets 2.17.0
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.36.0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
"TAGS\n#transformers #safetensors #marian #text2text-generation #generated_from_trainer #base_model-staka/fugumt-en-ja #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.36.0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
73,
112,
4,
35
] | [
"passage: TAGS\n#transformers #safetensors #marian #text2text-generation #generated_from_trainer #base_model-staka/fugumt-en-ja #license-cc-by-sa-4.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.36.0\n* Pytorch 2.1.2+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.0"
] | [
-0.109549380838871,
0.08634161204099655,
-0.004076956771314144,
0.0788763239979744,
0.09986869245767593,
-0.013713160529732704,
0.17603884637355804,
0.12576411664485931,
-0.12896797060966492,
0.07280908524990082,
0.1423172950744629,
0.121566541492939,
0.03263706713914871,
0.20522017776966095,
-0.08621995151042938,
-0.22018279135227203,
0.07271997630596161,
0.028616953641176224,
-0.015883231535553932,
0.11527091264724731,
0.09156036376953125,
-0.13142387568950653,
0.09368538111448288,
0.02125915326178074,
-0.1797032356262207,
-0.0024621242191642523,
0.024590788409113884,
-0.08657349646091461,
0.09746451675891876,
0.026925813406705856,
0.11077630519866943,
0.06416014581918716,
0.05912061408162117,
-0.16239091753959656,
0.01404434721916914,
0.04608418419957161,
-0.0009011523216031492,
0.08493868261575699,
0.042395688593387604,
-0.03143414109945297,
0.09287287294864655,
-0.08976548910140991,
0.06329850107431412,
0.023328585550189018,
-0.14703130722045898,
-0.24603575468063354,
-0.1199176162481308,
0.02916790544986725,
0.0836026519536972,
0.06282907724380493,
-0.010142233222723007,
0.17149990797042847,
-0.024086328223347664,
0.09756221622228622,
0.25667503476142883,
-0.3206805884838104,
-0.05909990146756172,
0.015087505802512169,
0.050148289650678635,
0.09090408682823181,
-0.08020114153623581,
-0.007454784587025642,
0.04950781911611557,
0.0198239553719759,
0.14708605408668518,
-0.02363906055688858,
0.004094479139894247,
-0.028340114280581474,
-0.13826844096183777,
-0.055516839027404785,
0.1612769365310669,
0.0322035551071167,
-0.07934785634279251,
-0.04840843752026558,
-0.06657155603170395,
-0.1521712988615036,
-0.06751769781112671,
-0.021056141704320908,
0.05252346768975258,
-0.03126245737075806,
-0.08627504855394363,
-0.008383694104850292,
-0.07022726535797119,
-0.07064184546470642,
-0.04388999938964844,
0.19748292863368988,
0.05360015481710434,
0.011684865690767765,
-0.05109282210469246,
0.059554580599069595,
-0.08592534065246582,
-0.1559145152568817,
-0.03808272257447243,
0.0045478069223463535,
0.027150453999638557,
-0.05563732236623764,
-0.044580817222595215,
-0.11676476150751114,
0.023321574553847313,
0.16196297109127045,
-0.1412765234708786,
0.07643764466047287,
-0.03398670628666878,
0.028759701177477837,
-0.0949520692229271,
0.15180358290672302,
-0.011129985563457012,
0.021343177184462547,
0.04012347012758255,
0.07653454691171646,
0.0733109638094902,
-0.022044990211725235,
-0.09689535945653915,
0.031978510320186615,
0.10274198651313782,
0.030922990292310715,
-0.07183653861284256,
0.06897809356451035,
-0.04173703491687775,
0.005812701769173145,
0.05380665138363838,
-0.10101024061441422,
0.03546294569969177,
-0.014097277075052261,
-0.047898683696985245,
-0.04328037425875664,
0.015565465204417706,
0.006398017052561045,
-0.008375058881938457,
0.0914883092045784,
-0.07961050420999527,
0.018357453867793083,
-0.08076320588588715,
-0.14064131677150726,
0.021540328860282898,
-0.0846552699804306,
0.011365522630512714,
-0.11939600110054016,
-0.1460295170545578,
-0.005219209007918835,
0.03788020834326744,
-0.036792315542697906,
-0.009539445862174034,
-0.05042225494980812,
-0.10197847336530685,
0.039920780807733536,
-0.01849980652332306,
0.0198188629001379,
-0.07411214709281921,
0.08602607250213623,
0.0668666884303093,
0.07762552052736282,
-0.049358028918504715,
0.023021074011921883,
-0.09112710505723953,
0.04850383102893829,
-0.2575241029262543,
0.04845558851957321,
-0.06516704708337784,
0.08110835403203964,
-0.08873060345649719,
-0.08545377850532532,
0.011386482045054436,
-0.012320799753069878,
0.11302077025175095,
0.11971649527549744,
-0.15848781168460846,
-0.05801837891340256,
0.2334241420030594,
-0.13214527070522308,
-0.15023112297058105,
0.1259530931711197,
-0.04650888592004776,
0.022738587111234665,
0.0662810280919075,
0.20765037834644318,
0.044553570449352264,
-0.10967934876680374,
-0.0006817260291427374,
-0.0697900652885437,
0.05098840594291687,
-0.028529876843094826,
0.06892714649438858,
0.010000268928706646,
0.04190206527709961,
0.005424306262284517,
-0.008893883787095547,
0.01583048887550831,
-0.08838052302598953,
-0.07239169627428055,
-0.05620413273572922,
-0.07928373664617538,
0.02425960823893547,
0.016953974962234497,
0.060444436967372894,
-0.15277227759361267,
-0.09058552980422974,
0.05437403544783592,
0.06785181164741516,
-0.07029116153717041,
0.04937198758125305,
-0.12327703088521957,
0.12009931355714798,
-0.06527296453714371,
-0.0017746109515428543,
-0.15407729148864746,
-0.01511270459741354,
0.033812884241342545,
-0.012151767499744892,
0.02204320766031742,
-0.08559238165616989,
0.07894246280193329,
0.09330788254737854,
-0.048654038459062576,
-0.034311890602111816,
-0.022814618423581123,
0.01518251746892929,
-0.10271935909986496,
-0.20297180116176605,
-0.014757199212908745,
-0.04582229629158974,
0.09242498129606247,
-0.1721060872077942,
0.0586501844227314,
0.07775621861219406,
0.10605873167514801,
0.04746879264712334,
-0.025261258706450462,
-0.007735078688710928,
0.07857153564691544,
-0.03884238377213478,
-0.070843406021595,
0.04690573364496231,
0.03532393276691437,
-0.08137454837560654,
0.006392189767211676,
-0.1788332611322403,
0.1849871426820755,
0.14106276631355286,
-0.005939517170190811,
-0.07527709007263184,
-0.0057843513786792755,
-0.028100840747356415,
-0.010134510695934296,
-0.027795866131782532,
0.0319383442401886,
0.12946805357933044,
0.0026243804022669792,
0.15423588454723358,
-0.1083451360464096,
-0.022192388772964478,
0.05696636065840721,
-0.05592306703329086,
0.0006266269483603537,
0.08231671899557114,
0.002299324609339237,
-0.1267135888338089,
0.12827962636947632,
0.16192877292633057,
-0.062095459550619125,
0.10978944599628448,
-0.06366796046495438,
-0.0540812611579895,
-0.040299057960510254,
0.019543275237083435,
0.04293983057141304,
0.10484492033720016,
-0.09030988067388535,
-0.023826150223612785,
0.009452194906771183,
0.03286977484822273,
-0.008932686410844326,
-0.1919577717781067,
-0.0021390675101429224,
0.05119498074054718,
-0.05709533020853996,
-0.04905989393591881,
-0.011583327315747738,
-0.0003847151529043913,
0.09750581532716751,
0.008092127740383148,
-0.05970063805580139,
0.0263813603669405,
0.012816482223570347,
-0.06842141598463058,
0.18978753685951233,
-0.10154075175523758,
-0.12696877121925354,
-0.11798095703125,
-0.09233435243368149,
-0.05511673912405968,
0.025307031348347664,
0.09266649931669235,
-0.0714619979262352,
-0.05624764412641525,
-0.11538522690534592,
-0.027541304007172585,
0.015630919486284256,
0.02612394653260708,
0.04015670716762543,
-0.015986811369657516,
0.08339893817901611,
-0.10063992440700531,
-0.02855123020708561,
-0.015682034194469452,
-0.005319640506058931,
0.0693984180688858,
0.016993913799524307,
0.12546473741531372,
0.09993527829647064,
-0.05302532762289047,
0.028779910877346992,
-0.04113107547163963,
0.22211900353431702,
-0.0624224916100502,
-0.029476910829544067,
0.14103379845619202,
-0.005213168915361166,
0.07354270666837692,
0.12381409853696823,
0.043085116893053055,
-0.11609166860580444,
-0.002000221284106374,
-0.004728948697447777,
-0.04850992187857628,
-0.2028171569108963,
-0.013290869072079659,
-0.03826527297496796,
-0.009972969070076942,
0.09156639873981476,
0.029482703655958176,
0.014920924790203571,
0.06737872213125229,
0.004448238760232925,
0.04709508642554283,
0.019668912515044212,
0.1295996755361557,
0.1050853505730629,
0.06428256630897522,
0.14475327730178833,
-0.05939607322216034,
-0.026659497991204262,
0.04314113035798073,
-0.012876763008534908,
0.19854874908924103,
0.0243044625967741,
0.16123367846012115,
0.06348647177219391,
0.15603524446487427,
0.047100264579057693,
0.053956177085638046,
-0.0000014698742916152696,
-0.02792070433497429,
-0.014789547771215439,
-0.056072577834129333,
-0.05245848000049591,
0.026836363598704338,
-0.1250763088464737,
0.04519668221473694,
-0.12826696038246155,
0.03963642939925194,
0.05489661172032356,
0.2665242850780487,
0.030986947938799858,
-0.3763653635978699,
-0.11027362197637558,
0.021457290276885033,
-0.018549149855971336,
-0.049972016364336014,
0.01805134117603302,
0.0958365872502327,
-0.04220796376466751,
0.08980781584978104,
-0.07307444512844086,
0.09510943293571472,
-0.020985707640647888,
0.032244857400655746,
0.0026497789658606052,
0.09662161767482758,
-0.020644191652536392,
0.026432635262608528,
-0.2910000681877136,
0.28037065267562866,
0.035340700298547745,
0.09010095149278641,
-0.02630828134715557,
0.01192170288413763,
0.021774785593152046,
0.11544357240200043,
0.06950648128986359,
-0.01746978797018528,
-0.1653428077697754,
-0.1712970733642578,
-0.07079339027404785,
0.01892450638115406,
0.1149851530790329,
0.02759339101612568,
0.13045401871204376,
-0.014751997776329517,
0.00969430897384882,
0.04635201767086983,
-0.0180842112749815,
-0.09511742740869522,
-0.0802672803401947,
0.004442716483026743,
0.06501376628875732,
0.03603939339518547,
-0.0914599671959877,
-0.08173611760139465,
-0.07846798002719879,
0.17629672586917877,
0.005717677995562553,
-0.043936118483543396,
-0.12179894745349884,
0.014278246089816093,
0.051353711634874344,
-0.08055342733860016,
0.04986703395843506,
-0.008148100227117538,
0.13137765228748322,
-0.006300840061157942,
-0.05166798457503319,
0.12064749747514725,
-0.05676049739122391,
-0.16997332870960236,
-0.04667025804519653,
0.09983891248703003,
0.0062033082358539104,
0.041792478412389755,
0.01200586836785078,
0.04690169543027878,
0.0021329140290617943,
-0.07320833951234818,
0.024668103083968163,
0.007041577715426683,
0.07854042947292328,
-0.025981685146689415,
-0.022026002407073975,
0.00027508256607688963,
-0.06447799503803253,
-0.024113522842526436,
0.14123940467834473,
0.3210431635379791,
-0.08594565093517303,
0.04271699860692024,
0.06559441983699799,
-0.05221902206540108,
-0.1682087481021881,
0.022715508937835693,
0.02523184008896351,
0.0014123099390417337,
-0.007576923351734877,
-0.14475779235363007,
0.01385661493986845,
0.1000000536441803,
-0.023261262103915215,
0.05309107527136803,
-0.2713581323623657,
-0.13092702627182007,
0.09664015471935272,
0.16116057336330414,
0.1296878606081009,
-0.17766664922237396,
-0.05259346589446068,
-0.042589761316776276,
-0.1293024867773056,
0.08868963271379471,
-0.1389530897140503,
0.10175517946481705,
-0.017743753269314766,
0.0732472687959671,
0.008241765201091766,
-0.05384201556444168,
0.13451659679412842,
-0.031012754887342453,
0.11606857925653458,
-0.07343154400587082,
0.0457175150513649,
0.12655577063560486,
-0.09472612291574478,
0.043814677745103836,
-0.09729816019535065,
0.04160584509372711,
-0.05479217693209648,
-0.017153732478618622,
-0.040698908269405365,
0.011147872544825077,
-0.026216717436909676,
-0.027161721140146255,
-0.056100644171237946,
-0.0007322424789890647,
0.05865808203816414,
-0.029804987832903862,
0.20388808846473694,
-0.002463567303493619,
0.16772960126399994,
0.15723907947540283,
0.10143917053937912,
-0.13967865705490112,
0.023142127320170403,
0.026535939425230026,
-0.039595820009708405,
0.05467430502176285,
-0.16784711182117462,
0.05256245657801628,
0.10977554321289062,
0.004010908771306276,
0.12240318208932877,
0.061318933963775635,
-0.04988161846995354,
0.043112240731716156,
0.06688358634710312,
-0.16815882921218872,
-0.10023367404937744,
0.01870260201394558,
0.0697217807173729,
-0.08816707879304886,
0.08649035543203354,
0.138588085770607,
-0.07154121994972229,
-0.010621891357004642,
-0.01313859038054943,
0.010334404185414314,
-0.019889095798134804,
0.17145030200481415,
0.019000619649887085,
0.058495715260505676,
-0.0981062725186348,
0.0796615332365036,
0.03652137890458107,
-0.09848761558532715,
0.04740111902356148,
0.07741566002368927,
-0.1109120175242424,
-0.027814486995339394,
0.042548809200525284,
0.17026923596858978,
-0.062450628727674484,
-0.060884445905685425,
-0.1635848581790924,
-0.1378677934408188,
0.06631433218717575,
0.23423394560813904,
0.06640759110450745,
0.0040295738726854324,
0.0035551388282328844,
0.006521844305098057,
-0.12562528252601624,
0.08450761437416077,
0.04150496795773506,
0.09906858950853348,
-0.1415991485118866,
0.14087754487991333,
-0.014199084602296352,
-0.0011174631072208285,
-0.018148677423596382,
0.030200036242604256,
-0.11333561688661575,
0.004742758348584175,
-0.1590854525566101,
0.024080883711576462,
-0.057505059987306595,
0.0011785528622567654,
-0.016150232404470444,
-0.041803110390901566,
-0.06725766509771347,
0.02569063939154148,
-0.09866863489151001,
-0.017447475343942642,
0.01841697283089161,
0.029298728331923485,
-0.12266731262207031,
-0.02523050643503666,
0.000967424304690212,
-0.08589482307434082,
0.053918272256851196,
0.04920043423771858,
-0.0026395157910883427,
0.02850060723721981,
-0.08554742485284805,
0.014146072790026665,
0.07543151080608368,
-0.01360387820750475,
0.05274930223822594,
-0.1265517771244049,
-0.010075429454445839,
0.021016204729676247,
0.012925205752253532,
0.028077730908989906,
0.11079735308885574,
-0.10165798664093018,
0.00199695210903883,
0.00034517221502028406,
-0.049917448312044144,
-0.05227769911289215,
0.0469706729054451,
0.12764570116996765,
0.0006670634029433131,
0.1806318759918213,
-0.10138469934463501,
-0.0009313741466030478,
-0.17363138496875763,
-0.004276372957974672,
0.0066092852503061295,
-0.14974461495876312,
-0.11790579557418823,
-0.0445142537355423,
0.06217196211218834,
-0.06382552534341812,
0.12722249329090118,
-0.01057734340429306,
0.0213694516569376,
0.05502377450466156,
-0.07717543095350266,
0.0012363095302134752,
0.025332974269986153,
0.19331222772598267,
0.03405154123902321,
-0.04970671981573105,
0.05416271835565567,
0.03114348091185093,
0.09561121463775635,
0.06995322555303574,
0.17362165451049805,
0.15569289028644562,
0.014037341810762882,
0.11352842301130295,
0.031563565135002136,
-0.010710980743169785,
-0.13985009491443634,
0.044987551867961884,
-0.03836360201239586,
0.091668039560318,
-0.0028549782000482082,
0.16481809318065643,
0.16960878670215607,
-0.1428680270910263,
0.03290875256061554,
-0.037943921983242035,
-0.07476767897605896,
-0.12184956669807434,
-0.0709783285856247,
-0.11605778336524963,
-0.16563914716243744,
0.00010571542952675372,
-0.12127641588449478,
0.04325145483016968,
0.04249975457787514,
0.0017974239308387041,
0.009379319846630096,
0.15128159523010254,
0.006113054696470499,
0.020691871643066406,
0.0554196760058403,
-0.009922612458467484,
-0.045169372111558914,
-0.024141855537891388,
-0.10614495724439621,
0.045190077275037766,
-0.027592414990067482,
0.03834335505962372,
0.0041022999212145805,
0.0027720010839402676,
0.04512326046824455,
-0.026150887832045555,
-0.11601778864860535,
0.021165508776903152,
0.04419781640172005,
0.06892053037881851,
0.01967736892402172,
0.014408508315682411,
-0.017185471951961517,
0.0033935862593352795,
0.19979120790958405,
-0.06498362869024277,
-0.058596257120370865,
-0.10427387803792953,
0.24932803213596344,
0.041530027985572815,
-0.0007031856221146882,
0.013195001520216465,
-0.07306493818759918,
0.024488164111971855,
0.16285981237888336,
0.14590196311473846,
-0.02190670743584633,
0.005437089130282402,
-0.053267840296030045,
-0.011584545485675335,
-0.028929390013217926,
0.0911676362156868,
0.10364536195993423,
-0.021762043237686157,
-0.055624932050704956,
-0.03225233405828476,
-0.061399150639772415,
0.0000013288407672007452,
-0.06001097336411476,
0.06590066850185394,
-0.0004060969513375312,
0.000621404149569571,
-0.04290314391255379,
0.05228147283196449,
-0.02451332099735737,
-0.0534508116543293,
0.04623504728078842,
-0.18626075983047485,
-0.12894050776958466,
-0.0012511239619925618,
0.058517634868621826,
-0.004647298716008663,
0.053820088505744934,
-0.010788948275148869,
-0.005290128290653229,
0.06719592958688736,
-0.013692948967218399,
-0.04410204663872719,
-0.09162798523902893,
0.07031019777059555,
-0.1498631238937378,
0.20230461657047272,
-0.01935337483882904,
0.018622996285557747,
0.13939039409160614,
0.03585110977292061,
-0.11183435469865799,
0.09522552043199539,
0.0408007837831974,
-0.06876807659864426,
0.009847227483987808,
0.12362338602542877,
-0.044455599039793015,
0.11117193847894669,
0.05321313813328743,
-0.13119888305664062,
-0.004913767799735069,
-0.07700568437576294,
-0.06386776268482208,
-0.02327396720647812,
-0.03794924542307854,
-0.052384909242391586,
0.1114044338464737,
0.16931754350662231,
-0.03982008993625641,
0.014331136830151081,
-0.04557311534881592,
0.04555520415306091,
0.08904209733009338,
-0.009220893494784832,
-0.02961750328540802,
-0.27472832798957825,
0.014837583526968956,
0.12212037295103073,
0.0016105625545606017,
-0.28874891996383667,
-0.0881153792142868,
-0.009156299754977226,
-0.023206530138850212,
-0.09588512778282166,
0.0978325679898262,
0.11282595992088318,
0.049316249787807465,
-0.07143978774547577,
-0.09642211347818375,
-0.06192188709974289,
0.18924622237682343,
-0.112298883497715,
-0.07824075222015381
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
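Pending the missing details above, the following is a purely hypothetical loading sketch: the card does not document the architecture or task, so the `AutoModel`/`AutoTokenizer` classes and the assumption that the repository holds a standalone Transformers checkpoint may not apply.

```python
# Purely hypothetical sketch; the architecture and task of this checkpoint
# are not documented, so these Auto classes are assumptions.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("MaggieZhang/test3")
model = AutoModel.from_pretrained("MaggieZhang/test3")
```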
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | MaggieZhang/test3 | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:16:29+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
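Because this section is left empty, the snippet below is only a hedged sketch: it assumes the repository named in the metadata (`p1atdev/tokenizer_test_1`) ships a standard 🤗 Transformers tokenizer that `AutoTokenizer` can resolve from its config files, which the card itself does not confirm.

```python
# Hedged sketch, not an official example: assumes the repo hosts a standard
# Hugging Face tokenizer resolvable by AutoTokenizer from its config files.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("p1atdev/tokenizer_test_1")

# Round-trip a sample string to inspect token ids and special tokens.
encoded = tokenizer("Hello, world!")
print(encoded.input_ids)
print(tokenizer.decode(encoded.input_ids))
```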
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | p1atdev/tokenizer_test_1 | [
"transformers",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:19:02+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
26,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.08389580249786377,
0.19830818474292755,
-0.0013316317927092314,
0.02313883788883686,
0.11396584659814835,
0.01961737498641014,
0.053626976907253265,
0.14538456499576569,
0.0060051376931369305,
0.10656800121068954,
0.066679947078228,
0.09131570905447006,
0.09678101539611816,
0.20042605698108673,
0.04371999576687813,
-0.17659740149974823,
0.010636410675942898,
-0.06930278241634369,
-0.010073255747556686,
0.11651819199323654,
0.141214057803154,
-0.10151198506355286,
0.07627976685762405,
-0.03319970890879631,
-0.02870541252195835,
-0.0070160143077373505,
-0.07769215852022171,
-0.05755697935819626,
0.07573003321886063,
0.054863471537828445,
0.04207949340343475,
-0.0008347301045432687,
0.08447454124689102,
-0.2674994468688965,
0.013753628358244896,
0.07452993094921112,
0.010659529827535152,
0.05990942195057869,
0.07833302766084671,
-0.04036625102162361,
0.12881849706172943,
-0.06320446729660034,
0.13035163283348083,
0.0906217098236084,
-0.0681561604142189,
-0.24378153681755066,
-0.08239314705133438,
0.06505522131919861,
0.12533815205097198,
0.07694927603006363,
-0.02823091857135296,
0.16422191262245178,
-0.07247646898031235,
0.019290022552013397,
0.09481704235076904,
-0.1151006743311882,
-0.060644298791885376,
0.08318385481834412,
0.14101974666118622,
0.10340547561645508,
-0.1255619376897812,
-0.012289565056562424,
0.04275871813297272,
0.045979104936122894,
0.07389909774065018,
0.011339850723743439,
0.1143413558602333,
0.05629947781562805,
-0.13526225090026855,
-0.05700986459851265,
0.14547574520111084,
0.023872992023825645,
-0.057064127177000046,
-0.2138909548521042,
-0.002902575535699725,
-0.07730814069509506,
-0.011685127392411232,
-0.06846728920936584,
0.0291305985301733,
-0.01194276288151741,
0.060226380825042725,
-0.0496203787624836,
-0.09797755628824234,
-0.046314824372529984,
0.1015089675784111,
0.054820988327264786,
0.011354796588420868,
-0.01489334274083376,
0.03576440364122391,
0.13432876765727997,
0.04213530570268631,
-0.10012737661600113,
-0.07065672427415848,
-0.0701170489192009,
-0.09620913118124008,
-0.03947552293539047,
0.04272124543786049,
0.020167991518974304,
0.042202774435281754,
0.2283228635787964,
0.024096308276057243,
0.05459817871451378,
0.029667891561985016,
0.0026177873369306326,
0.03211980313062668,
0.1073630079627037,
-0.041210614144802094,
-0.188126802444458,
-0.03292805701494217,
0.0931866466999054,
-0.009821015410125256,
-0.028658604249358177,
-0.033444397151470184,
0.035014089196920395,
0.08379437029361725,
0.11821532249450684,
0.08875755965709686,
-0.012828069739043713,
-0.037612639367580414,
-0.03493109717965126,
0.2115669697523117,
-0.14141373336315155,
0.045799970626831055,
-0.022097334265708923,
-0.018195297569036484,
-0.06905751675367355,
0.030103791505098343,
0.01831657998263836,
-0.003142025787383318,
0.06966056674718857,
-0.061253178864717484,
-0.05794486775994301,
-0.11518853157758713,
-0.045523155480623245,
0.04711875319480896,
-0.024105608463287354,
-0.024469668045639992,
-0.07765042781829834,
-0.11219723522663116,
-0.06417357176542282,
0.06612563133239746,
-0.04156653955578804,
-0.03974827378988266,
0.005308232270181179,
-0.07131324708461761,
0.008387917652726173,
0.008993842639029026,
0.12122467905282974,
-0.030063031241297722,
0.05833350867033005,
-0.002476902212947607,
0.05916252359747887,
0.10643328726291656,
0.03227818012237549,
-0.08492200076580048,
0.057466037571430206,
-0.20633617043495178,
0.08371785283088684,
-0.11420095711946487,
0.034276340156793594,
-0.17048145830631256,
-0.024183684960007668,
0.008447963744401932,
0.023597201332449913,
0.023726604878902435,
0.1338067352771759,
-0.2097422182559967,
-0.016196569427847862,
0.14133213460445404,
-0.09649793803691864,
-0.12422871589660645,
0.07990546524524689,
-0.03459475561976433,
0.1747698187828064,
0.038475677371025085,
-0.019652999937534332,
0.09909367561340332,
-0.15559963881969452,
-0.05852397903800011,
-0.026064254343509674,
-0.008927824907004833,
0.08823978155851364,
0.07542291283607483,
-0.05844951793551445,
0.02285866066813469,
0.02562655322253704,
-0.04727208614349365,
-0.0268824752420187,
-0.05256075784564018,
-0.10127434879541397,
-0.023140445351600647,
-0.09642518311738968,
0.026515161618590355,
0.000058677000197349116,
-0.07310442626476288,
-0.028560271486639977,
-0.17347893118858337,
-0.02563360333442688,
0.10103316605091095,
0.004820956848561764,
-0.007559072691947222,
-0.08540112525224686,
0.022149885073304176,
-0.05362366884946823,
-0.006164622958749533,
-0.16996455192565918,
-0.03558015450835228,
0.051895126700401306,
-0.14917676150798798,
0.015460150316357613,
-0.07327745854854584,
0.07047311216592789,
0.02098717913031578,
-0.05859505757689476,
-0.03108096309006214,
0.0007694467785768211,
0.004292082041501999,
-0.06229274719953537,
-0.1903683841228485,
-0.058886781334877014,
-0.041500482708215714,
0.15720732510089874,
-0.24841000139713287,
0.0300158578902483,
0.03247617185115814,
0.13185922801494598,
0.007058668415993452,
-0.06344027817249298,
0.02096918225288391,
-0.04676475748419762,
-0.050621338188648224,
-0.06898977607488632,
-0.009901339188218117,
-0.014539826661348343,
-0.031393732875585556,
0.012980648316442966,
-0.14970256388187408,
-0.060514215379953384,
0.09452559798955917,
0.11224991828203201,
-0.14555825293064117,
0.00204002158716321,
-0.0460561066865921,
-0.07002599537372589,
-0.07487804442644119,
-0.0761631652712822,
0.07739497721195221,
0.044650159776210785,
0.049250341951847076,
-0.06317461282014847,
-0.06234706938266754,
0.023210179060697556,
0.005524294450879097,
-0.019023682922124863,
0.0948529988527298,
0.074309803545475,
-0.09122881293296814,
0.07973480224609375,
0.08461450785398483,
0.04414684325456619,
0.086973637342453,
0.005991141777485609,
-0.11396963149309158,
-0.03062884695827961,
0.037754856050014496,
0.024159027263522148,
0.15351562201976776,
-0.08692087233066559,
0.030462130904197693,
0.052177220582962036,
-0.03854219615459442,
0.03157065063714981,
-0.0923321321606636,
0.025362705811858177,
0.021495236083865166,
-0.006555700208991766,
0.05864228308200836,
-0.018769768998026848,
-0.01403577346354723,
0.06336429715156555,
0.05677810311317444,
0.044270504266023636,
0.02595379762351513,
-0.02093072421848774,
-0.1278371512889862,
0.16537296772003174,
-0.09028079360723495,
-0.2540280222892761,
-0.17074446380138397,
0.015454737469553947,
0.03706491366028786,
-0.021728800609707832,
0.039588842540979385,
-0.06286025792360306,
-0.10237989574670792,
-0.09417891502380371,
0.0029635571409016848,
0.023925531655550003,
-0.058347854763269424,
-0.0817074254155159,
0.060779985040426254,
0.04047083482146263,
-0.13689260184764862,
0.0349188968539238,
0.06170675903558731,
-0.03042641654610634,
0.0018567070364952087,
0.07321398705244064,
0.12743599712848663,
0.14838241040706635,
-0.006730219814926386,
-0.012446845881640911,
0.035035960376262665,
0.229813352227211,
-0.1490442156791687,
0.10630457103252411,
0.14053207635879517,
-0.021705523133277893,
0.06635113060474396,
0.1461038440465927,
0.023231739178299904,
-0.07546708732843399,
0.04147516191005707,
0.04027445614337921,
-0.04228919371962547,
-0.2589097023010254,
-0.05694316700100899,
-0.00946022942662239,
-0.07043391466140747,
0.09718906134366989,
0.09238530695438385,
0.11972260475158691,
0.0337289460003376,
-0.05568677559494972,
-0.025771914049983025,
-0.003401360474526882,
0.114128477871418,
-0.027640055865049362,
-0.004564122296869755,
0.07965842634439468,
-0.05878787487745285,
0.011684526689350605,
0.09941446036100388,
0.019347423687577248,
0.17601320147514343,
0.02533329278230667,
0.10681075602769852,
0.06725578010082245,
0.09347675740718842,
-0.0015635732561349869,
0.034774236381053925,
0.05337131395936012,
0.022044572979211807,
0.010453542694449425,
-0.09408048540353775,
-0.012431944720447063,
0.13713060319423676,
0.019816776737570763,
0.009031654335558414,
0.008926562033593655,
-0.01010479498654604,
0.03131420537829399,
0.20501568913459778,
0.0009575071162544191,
-0.22537250816822052,
-0.09500737488269806,
0.059459153562784195,
-0.06931101530790329,
-0.143676295876503,
-0.02094252221286297,
0.030270220711827278,
-0.17292405664920807,
0.016790566965937614,
-0.0316389761865139,
0.09112390875816345,
-0.07145322859287262,
-0.028050832450389862,
0.06891903281211853,
0.07569212466478348,
-0.012108199298381805,
0.07973295450210571,
-0.19069278240203857,
0.12254468351602554,
0.03037673607468605,
0.08605273067951202,
-0.11708726733922958,
0.07849059253931046,
-0.0019813794642686844,
-0.014807495288550854,
0.17999744415283203,
-0.014062200672924519,
-0.0586031936109066,
-0.08878950774669647,
-0.08704045414924622,
-0.011727320961654186,
0.10361312329769135,
-0.09322915226221085,
0.09586969763040543,
-0.02775636687874794,
-0.03705112263560295,
0.012418309226632118,
-0.10469507426023483,
-0.1636953055858612,
-0.18679304420948029,
0.06244563311338425,
-0.07802703976631165,
0.012347841635346413,
-0.11227322369813919,
-0.06334327906370163,
-0.01575082167983055,
0.23160123825073242,
-0.16648635268211365,
-0.07049825042486191,
-0.1498587429523468,
-0.03997112438082695,
0.17463743686676025,
-0.042160745710134506,
0.06849376112222672,
-0.021383514627814293,
0.1873992383480072,
-0.008081548847258091,
-0.013158116489648819,
0.06569221615791321,
-0.09637628495693207,
-0.16879262030124664,
-0.05748843029141426,
0.14160962402820587,
0.10863390564918518,
0.05731578543782234,
-0.0038195757661014795,
0.013171887956559658,
-0.03383830562233925,
-0.09896382689476013,
0.013824623078107834,
0.13817466795444489,
0.0034514935687184334,
0.00682973163202405,
-0.03995988517999649,
-0.07027145475149155,
-0.05825701728463173,
-0.07912654429674149,
0.057147104293107986,
0.187900573015213,
-0.09512355923652649,
0.1602867990732193,
0.12431421875953674,
-0.06468851119279861,
-0.2306901067495346,
0.03996593505144119,
0.04701630026102066,
0.007666614837944508,
0.022401191294193268,
-0.19138796627521515,
0.09788824617862701,
0.0009011493530124426,
-0.06807263940572739,
0.14616990089416504,
-0.16564498841762543,
-0.1461436152458191,
0.08002161979675293,
0.025075770914554596,
-0.22560662031173706,
-0.14821304380893707,
-0.1037549376487732,
-0.03735695406794548,
-0.13707835972309113,
0.048581719398498535,
0.02614329755306244,
0.019834673032164574,
0.025222565978765488,
0.005338077899068594,
0.029657263308763504,
-0.07272187620401382,
0.1870686560869217,
-0.020297454670071602,
0.0072362530045211315,
-0.050640691071748734,
-0.04617878794670105,
0.09227550774812698,
-0.06150037795305252,
0.11741586774587631,
0.018679620698094368,
0.018796883523464203,
-0.1431548148393631,
-0.049209367483854294,
-0.060803934931755066,
0.04456847906112671,
-0.07284719496965408,
-0.09393193572759628,
-0.04137463867664337,
0.08888561278581619,
0.07211937010288239,
-0.032792408019304276,
-0.0027768779546022415,
-0.07569456845521927,
0.09405932575464249,
0.184477761387825,
0.17357055842876434,
0.009977072477340698,
-0.07020942866802216,
0.024555526673793793,
-0.042279548943042755,
0.03349342197179794,
-0.24652716517448425,
0.03456863760948181,
0.066053606569767,
0.03803660348057747,
0.08509242534637451,
-0.016836483031511307,
-0.1781480610370636,
-0.04086102172732353,
0.08498652279376984,
-0.06206206604838371,
-0.19876568019390106,
-0.02703288197517395,
0.08424776047468185,
-0.20383712649345398,
-0.032998621463775635,
0.041543323546648026,
-0.03834589570760727,
-0.02396267279982567,
-0.002415500348433852,
0.06396626681089401,
-0.008327016606926918,
0.12156640738248825,
0.06747189164161682,
0.10266115516424179,
-0.09284433722496033,
0.08920657634735107,
0.10416955500841141,
-0.09140542894601822,
0.03545991703867912,
0.10264154523611069,
-0.05670900270342827,
-0.04460543021559715,
0.033935222774744034,
0.05925208330154419,
-0.028357384726405144,
-0.06409841030836105,
-0.000502707262057811,
-0.0359574519097805,
0.04993389546871185,
0.08058220148086548,
0.036113787442445755,
-0.01202210783958435,
0.06544706225395203,
0.028145326301455498,
-0.11693570017814636,
0.10949387401342392,
0.04405685141682625,
0.04509059712290764,
-0.07182393968105316,
-0.012280966155230999,
0.015999672934412956,
0.032540347427129745,
-0.019734015688300133,
-0.014576527290046215,
-0.03146412968635559,
-0.007561005651950836,
-0.1553635597229004,
-0.02064543403685093,
-0.06516171246767044,
0.006067827809602022,
0.022207623347640038,
-0.03830232471227646,
-0.012014663778245449,
0.01381110493093729,
-0.07979435473680496,
-0.07571027427911758,
-0.01700955256819725,
0.08539021760225296,
-0.1381402313709259,
0.006627439055591822,
0.07182712107896805,
-0.10980239510536194,
0.07347989827394485,
-0.0048679932951927185,
0.017079560086131096,
0.010923396795988083,
-0.11654401570558548,
0.04386281594634056,
-0.005810429807752371,
0.01551580335944891,
0.022556742653250694,
-0.171111062169075,
0.011553828604519367,
-0.038553636521101,
-0.03114982508122921,
0.011926400475203991,
-0.025060230866074562,
-0.11875922232866287,
0.08676479011774063,
-0.028097305446863174,
-0.037512701004743576,
-0.03292486071586609,
0.06296087801456451,
0.08736220002174377,
-0.011740099638700485,
0.09667140990495682,
-0.025766119360923767,
0.04818311333656311,
-0.1756584197282791,
-0.01910574547946453,
-0.050167568027973175,
0.02537350542843342,
-0.01759655587375164,
-0.0070639788173139095,
0.055272240191698074,
-0.004191063344478607,
0.20991376042366028,
-0.03921036794781685,
0.1548677533864975,
0.05199402943253517,
-0.009925156831741333,
0.010884369723498821,
0.05032730847597122,
0.06423956155776978,
0.031145188957452774,
0.00853167474269867,
0.04660189896821976,
-0.004552975296974182,
-0.020357951521873474,
-0.13699717819690704,
0.02791593410074711,
0.16117429733276367,
0.061918217688798904,
0.0392887257039547,
0.03704594820737839,
-0.1422400325536728,
-0.09538721293210983,
0.10306388139724731,
-0.0331864058971405,
0.014331420883536339,
-0.08317886292934418,
0.17621558904647827,
0.12328410148620605,
-0.1574767529964447,
0.0577850341796875,
-0.07234696298837662,
-0.05066767707467079,
-0.1024852767586708,
-0.11832084506750107,
-0.06293155997991562,
-0.06027044355869293,
-0.004747506696730852,
-0.042489297688007355,
0.05734556168317795,
0.026751231402158737,
-0.003270963439717889,
-0.006759525276720524,
0.12665949761867523,
-0.0249644722789526,
-0.004145825747400522,
0.04152364656329155,
0.0326087586581707,
0.019319625571370125,
-0.05872373282909393,
0.017997145652770996,
0.018602589145302773,
0.022180357947945595,
0.06835069507360458,
0.0260987039655447,
-0.059317342936992645,
0.044286735355854034,
0.00319746439345181,
-0.11313364654779434,
0.018146557733416557,
-0.00002245741598017048,
-0.05020225793123245,
0.13557326793670654,
0.04076748713850975,
0.01548024732619524,
-0.029270920902490616,
0.24342355132102966,
-0.07199113070964813,
-0.08681939542293549,
-0.13965600728988647,
0.11511493474245071,
-0.023563209921121597,
0.03755274787545204,
0.016542524099349976,
-0.12659503519535065,
0.011511262506246567,
0.18531471490859985,
0.12824349105358124,
0.012459068559110165,
-0.007656481582671404,
0.05736639350652695,
-0.0007639875984750688,
-0.05985576659440994,
0.05051197111606598,
0.0664999932050705,
0.16097788512706757,
-0.09069112688302994,
0.0652846097946167,
-0.008405503816902637,
-0.0831485390663147,
-0.027498632669448853,
0.11705785244703293,
-0.022675158455967903,
0.02148384228348732,
-0.03778035193681717,
0.11204422265291214,
-0.052532415837049484,
-0.2719486355781555,
0.02952493168413639,
-0.09503202140331268,
-0.13993041217327118,
-0.02591860294342041,
0.041448429226875305,
-0.03349510580301285,
0.01577647216618061,
0.06254769116640091,
-0.045389387756586075,
0.18837277591228485,
0.025987716391682625,
-0.08679025620222092,
-0.07755549252033234,
0.05874146893620491,
-0.08695939928293228,
0.2789687216281891,
0.003863075515255332,
0.04782010242342949,
0.12108923494815826,
-0.03053574077785015,
-0.18664880096912384,
0.014769754372537136,
0.11989909410476685,
-0.09114406257867813,
0.07780203968286514,
0.18139931559562683,
-0.005561648402363062,
0.12649618089199066,
0.04705416411161423,
-0.03877115994691849,
0.03976387158036232,
-0.02721380814909935,
-0.03821522742509842,
-0.12209630757570267,
0.05661242455244064,
-0.0612691193819046,
0.15957388281822205,
0.1158948540687561,
-0.05964287370443344,
0.001120698289014399,
-0.06126941740512848,
0.06300627440214157,
0.014774397015571594,
0.12115653604269028,
0.018452486023306847,
-0.2023056596517563,
0.05087360367178917,
-0.03283824771642685,
0.08166342973709106,
-0.254973828792572,
-0.08186668157577515,
0.07622263580560684,
-0.019022729247808456,
-0.04275642707943916,
0.12311509251594543,
0.06101066991686821,
0.03676839917898178,
-0.03853875398635864,
-0.08537755906581879,
-0.01412904355674982,
0.15376435220241547,
-0.14123432338237762,
-0.029574336484074593
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-medium-kannada
This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the kannada_asr_corpus dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0534
- Wer: 99.9323
## Model description
More information needed
## Intended uses & limitations
More information needed
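The card gives no usage snippet, so the following is only a hedged sketch of typical inference with this checkpoint (`kuladeep2112/whisper-medium-kannada`, per the repository metadata) through the standard 🤗 Transformers speech-recognition pipeline; the audio filename is a placeholder.

```python
# Hedged sketch: assumes the fine-tuned checkpoint loads like any Whisper model
# via the automatic-speech-recognition pipeline. "sample_kn.wav" is a placeholder.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="kuladeep2112/whisper-medium-kannada",
    chunk_length_s=30,  # Whisper operates on 30-second audio windows
)

result = asr("sample_kn.wav")
print(result["text"])
```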
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 250
- training_steps: 4000
- mixed_precision_training: Native AMP
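For readers who want to reproduce this configuration, the sketch below maps the listed values onto 🤗 Transformers `Seq2SeqTrainingArguments`. It is an assumed reconstruction, not the original training script: the output directory is a placeholder, and the Adam betas/epsilon above are the library defaults.

```python
# Hedged reconstruction of the listed hyperparameters; output_dir and the
# surrounding model/dataset wiring are placeholders, not the original script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-kannada",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,          # effective train batch size 64
    warmup_steps=250,
    max_steps=4000,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                              # "Native AMP" mixed precision
)
```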
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.0985 | 0.25 | 1000 | 0.0963 | 100.0 |
| 0.0784 | 0.5 | 2000 | 0.0802 | 100.0 |
| 0.0662 | 1.08 | 3000 | 0.0553 | 100.0 |
| 0.0525 | 1.33 | 4000 | 0.0534 | 99.9323 |
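The Wer column above reports word error rate as a percentage. As a hedged illustration (not the evaluation script used here), such a score is commonly computed with the 🤗 `evaluate` library's `wer` metric; the prediction and reference strings below are placeholders.

```python
# Hedged sketch of the metric only: evaluate's "wer" returns a fraction,
# which is usually multiplied by 100 to match the percentages in the table.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["placeholder transcription"]   # model outputs (placeholders)
references = ["placeholder reference text"]   # ground-truth transcripts

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```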
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["kannada_asr_corpus"], "metrics": ["wer"], "base_model": "openai/whisper-medium", "model-index": [{"name": "whisper-medium-kannada", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "kannada_asr_corpus", "type": "kannada_asr_corpus", "config": "default", "split": "train", "args": "default"}, "metrics": [{"type": "wer", "value": 99.93226844273862, "name": "Wer"}]}]}]} | automatic-speech-recognition | kuladeep2112/whisper-medium-kannada | [
"transformers",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:kannada_asr_corpus",
"base_model:openai/whisper-medium",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:22:50+00:00 | [] | [] | TAGS
#transformers #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-kannada_asr_corpus #base_model-openai/whisper-medium #license-apache-2.0 #model-index #endpoints_compatible #region-us
| whisper-medium-kannada
======================
This model is a fine-tuned version of openai/whisper-medium on the kannada\_asr\_corpus dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0534
* Wer: 99.9323
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 64
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 250
* training\_steps: 4000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 250\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-kannada_asr_corpus #base_model-openai/whisper-medium #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 250\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
81,
158,
4,
33
] | [
"passage: TAGS\n#transformers #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-kannada_asr_corpus #base_model-openai/whisper-medium #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 250\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.1275096982717514,
0.13348306715488434,
-0.0027007327880710363,
0.048677485436201096,
0.09105069935321808,
0.009613631293177605,
0.10843708366155624,
0.15850025415420532,
-0.045830801129341125,
0.10314776748418808,
0.11271096020936966,
0.09970007091760635,
0.07091601192951202,
0.21538977324962616,
-0.026736529543995857,
-0.30723053216934204,
0.03405148163437843,
-0.02527162991464138,
-0.0838092491030693,
0.11348918080329895,
0.09203150868415833,
-0.11112475395202637,
0.026374576613307,
-0.0034576256293803453,
-0.08419811725616455,
-0.03579263016581535,
-0.01935519278049469,
-0.07041464000940323,
0.10793694108724594,
0.011709138751029968,
0.061522338539361954,
0.0480586476624012,
0.08778628706932068,
-0.2684251666069031,
0.006741981487721205,
0.03678826242685318,
0.04485606774687767,
0.06807902455329895,
0.06774845719337463,
-0.03321661800146103,
0.07720056921243668,
-0.09882409870624542,
0.09093590825796127,
0.046318165957927704,
-0.11311519891023636,
-0.3230960965156555,
-0.08736996352672577,
0.04315349459648132,
0.14561288058757782,
0.058507226407527924,
-0.03337792307138443,
0.08234763890504837,
-0.05199037119746208,
0.07809814810752869,
0.203688845038414,
-0.29196327924728394,
-0.05694706365466118,
-0.004009878728538752,
0.04460613429546356,
0.06684690713882446,
-0.10837048292160034,
-0.020885176956653595,
0.018137849867343903,
0.013920905999839306,
0.13626648485660553,
0.004291328135877848,
0.07569686323404312,
-0.009279744699597359,
-0.1356704831123352,
-0.05215707793831825,
0.11601343750953674,
0.07626309245824814,
-0.03349484130740166,
-0.15996387600898743,
-0.024032624438405037,
-0.1807558685541153,
-0.06581548601388931,
-0.006536992732435465,
0.028070176020264626,
-0.030166475102305412,
-0.10664554685354233,
0.011992759071290493,
-0.04597875475883484,
-0.07806281000375748,
0.052021291106939316,
0.14788229763507843,
0.042436663061380386,
-0.043956249952316284,
0.034639567136764526,
0.09238127619028091,
0.02324719727039337,
-0.15393057465553284,
-0.010205351747572422,
0.027235180139541626,
-0.10807874798774719,
-0.026338106021285057,
-0.0019644906278699636,
0.007861930876970291,
0.029305413365364075,
0.1704373061656952,
-0.011751092970371246,
0.0996483713388443,
0.0365123525261879,
0.02348070777952671,
-0.09347588568925858,
0.1713821291923523,
-0.03046261891722679,
-0.10264288634061813,
-0.044441089034080505,
0.13492447137832642,
0.023018386214971542,
-0.020366208627820015,
-0.07277437299489975,
0.03196139633655548,
0.07681664079427719,
0.0557808093726635,
-0.013939584605395794,
0.038193464279174805,
-0.08756531774997711,
-0.011729586869478226,
0.022152701392769814,
-0.11096780747175217,
0.04912590608000755,
0.012401261366903782,
-0.04694158583879471,
-0.07171617448329926,
-0.02246389165520668,
0.028989603742957115,
-0.0038978278171271086,
0.08412271738052368,
-0.04902441054582596,
-0.017663951963186264,
-0.0693165585398674,
-0.08836439996957779,
0.015438768081367016,
-0.05345522239804268,
-0.007839430123567581,
-0.06417407095432281,
-0.10191245377063751,
-0.05111874267458916,
0.06409380584955215,
-0.054651714861392975,
-0.05960690230131149,
-0.09538087993860245,
-0.062417615205049515,
0.056604523211717606,
-0.02053448185324669,
0.094655342400074,
-0.06127360835671425,
0.08982723206281662,
0.019246036186814308,
0.08423799276351929,
0.07983899116516113,
0.043889619410037994,
-0.05212659761309624,
0.06299225986003876,
-0.2140078842639923,
0.10489640384912491,
-0.09659545868635178,
0.07658056914806366,
-0.15554489195346832,
-0.08637557178735733,
-0.0038333148695528507,
0.0011556154349818826,
0.09121459722518921,
0.12927919626235962,
-0.18994836509227753,
-0.08450302481651306,
0.2107839584350586,
-0.08179764449596405,
-0.10050404816865921,
0.14360886812210083,
-0.027978496626019478,
0.018915541470050812,
0.03883861377835274,
0.21840332448482513,
0.08213675767183304,
-0.08362532407045364,
0.025855910032987595,
-0.05511412397027016,
0.08386225998401642,
0.03402011841535568,
0.0917309820652008,
-0.06029227003455162,
0.02810131199657917,
0.0036336411722004414,
-0.019350312650203705,
0.057882245630025864,
-0.06772918254137039,
-0.08609268069267273,
-0.01580861583352089,
-0.07610434293746948,
0.02977355383336544,
0.03240806236863136,
0.017869122326374054,
-0.10836383700370789,
-0.127637580037117,
-0.03369670361280441,
0.0998045951128006,
-0.09899692237377167,
0.020835254341363907,
-0.09459608793258667,
0.05375315994024277,
-0.008984745480120182,
0.0025166328996419907,
-0.12780919671058655,
0.024633392691612244,
0.04970058053731918,
-0.03656656667590141,
-0.0037520613986998796,
-0.06727074831724167,
0.08318185061216354,
0.03973756730556488,
-0.06860480457544327,
-0.08145243674516678,
-0.016898535192012787,
0.0028863493353128433,
-0.07773182541131973,
-0.22643430531024933,
-0.05884728953242302,
-0.04690957069396973,
0.16496606171131134,
-0.21799345314502716,
0.022697055712342262,
0.038063645362854004,
0.11582286655902863,
0.05257851257920265,
-0.051237862557172775,
0.02130991220474243,
0.062199339270591736,
-0.010664205066859722,
-0.09235307574272156,
0.04041025787591934,
0.012305361218750477,
-0.1535896509885788,
0.014208576641976833,
-0.15314239263534546,
0.0757749006152153,
0.08807773888111115,
0.06328587234020233,
-0.10320084542036057,
-0.10281140357255936,
-0.048189278692007065,
-0.039303094148635864,
-0.024330560117959976,
0.0237869992852211,
0.17581897974014282,
0.023091383278369904,
0.10886838287115097,
-0.06923247128725052,
-0.03516723960638046,
0.031162384897470474,
-0.011546200141310692,
-0.008680524304509163,
0.14593636989593506,
-0.024943560361862183,
-0.07376987487077713,
0.09753990173339844,
0.10369832068681717,
-0.03471674397587776,
0.13325436413288116,
-0.06707767397165298,
-0.07226331532001495,
-0.043331943452358246,
0.05652396008372307,
0.048493996262550354,
0.10610328614711761,
-0.1129843071103096,
-0.02765953540802002,
0.020635778084397316,
0.022640319541096687,
-0.012763839215040207,
-0.17683663964271545,
-0.01171488594263792,
0.04594207927584648,
-0.06649748980998993,
-0.012958868406713009,
-0.02473466657102108,
-0.02164413593709469,
0.07725534588098526,
0.02620888128876686,
-0.06004045903682709,
0.0049216123297810555,
-0.024456564337015152,
-0.08200042694807053,
0.1823573261499405,
-0.10687272250652313,
-0.15050624310970306,
-0.10744359344244003,
-0.011071189306676388,
0.020923646166920662,
-0.0023338557220995426,
0.042304765433073044,
-0.10700592398643494,
-0.030195750296115875,
-0.08598430454730988,
0.005807993467897177,
-0.02016798034310341,
0.024917514994740486,
0.02007146179676056,
0.02207774668931961,
0.06905069202184677,
-0.07464494556188583,
0.001581310760229826,
-0.01105289626866579,
-0.008190636523067951,
0.04093183949589729,
0.002069327514618635,
0.09197811782360077,
0.16231028735637665,
0.03729104995727539,
0.034982793033123016,
-0.05084718018770218,
0.1622098833322525,
-0.12438883632421494,
-0.001331666368059814,
0.08806955814361572,
-0.0011872411705553532,
0.05277247354388237,
0.18172799050807953,
0.03537735715508461,
-0.10209736973047256,
0.025638092309236526,
-0.003553498536348343,
-0.027001934126019478,
-0.19311568140983582,
-0.024989664554595947,
-0.05793339014053345,
-0.012419136241078377,
0.12530678510665894,
0.028219424188137054,
-0.01736602559685707,
0.03649403527379036,
-0.011180478148162365,
-0.033568356186151505,
0.04166271910071373,
0.09119927883148193,
0.037167344242334366,
0.034690503031015396,
0.11228223890066147,
-0.019132889807224274,
-0.03637617826461792,
0.016906369477510452,
0.032366082072257996,
0.23612463474273682,
0.015332325361669064,
0.20612585544586182,
0.03685837984085083,
0.1576104760169983,
0.027852747589349747,
0.027399685233831406,
0.028273139148950577,
-0.02454540878534317,
0.0005577695555984974,
-0.05072861537337303,
-0.047672852873802185,
0.06460056453943253,
0.07035515457391739,
0.029322005808353424,
-0.11267058551311493,
0.025488324463367462,
0.049097102135419846,
0.35855016112327576,
0.08626380562782288,
-0.3054425120353699,
-0.08667647838592529,
0.025734657421708107,
-0.07926278561353683,
-0.03824763372540474,
0.027517147362232208,
0.12816379964351654,
-0.08405940234661102,
0.06840573996305466,
-0.06992442905902863,
0.07300203293561935,
-0.10110719501972198,
-0.0016768989153206348,
0.03383198752999306,
0.07801701128482819,
-0.011591379530727863,
0.040449108928442,
-0.26092368364334106,
0.3204907476902008,
0.0012558861635625362,
0.0895908921957016,
-0.03769509121775627,
0.02702665515244007,
0.022007150575518608,
-0.023658927530050278,
0.12065714597702026,
-0.006579928565770388,
-0.18015733361244202,
-0.19649267196655273,
-0.09911621361970901,
0.00888044573366642,
0.13298265635967255,
-0.0610555075109005,
0.11315447837114334,
-0.025950895622372627,
-0.027007175609469414,
0.041864145547151566,
-0.08615195751190186,
-0.07878335565328598,
-0.11698562651872635,
0.008377445861697197,
0.035681113600730896,
0.06925401836633682,
-0.10911468416452408,
-0.07621020078659058,
-0.028795229271054268,
0.1572316288948059,
-0.09425243735313416,
-0.021388646215200424,
-0.14541509747505188,
0.037322867661714554,
0.1367528736591339,
-0.06989315897226334,
0.054135359823703766,
0.020069044083356857,
0.1269204318523407,
-0.000020347910322016105,
0.009754599072039127,
0.12142336368560791,
-0.07704915851354599,
-0.21287408471107483,
-0.06988812237977982,
0.1672135442495346,
0.04091717675328255,
0.06937050074338913,
-0.00667523592710495,
0.050283871591091156,
0.011148476041853428,
-0.06968910992145538,
0.08830195665359497,
0.046685464680194855,
0.030568795278668404,
0.017212241888046265,
-0.017278265208005905,
0.012390556745231152,
-0.08306027948856354,
-0.060349833220243454,
0.14514797925949097,
0.31616470217704773,
-0.10588645190000534,
0.0747881755232811,
0.07107527554035187,
-0.04108569398522377,
-0.15316906571388245,
0.005797655321657658,
0.10476049780845642,
0.04058382660150528,
-0.0004434279107954353,
-0.19028641283512115,
0.0077436622232198715,
0.07395132631063461,
-0.03908730298280716,
0.026794113218784332,
-0.31883296370506287,
-0.13156867027282715,
0.09688249975442886,
0.08706851303577423,
-0.029680848121643066,
-0.14847098290920258,
-0.06933454424142838,
-0.020940328016877174,
-0.03974959999322891,
0.009482844732701778,
-0.004808086436241865,
0.12194040417671204,
0.004512654151767492,
0.01582258567214012,
0.030214371159672737,
-0.0580761656165123,
0.13901568949222565,
-0.009984651580452919,
0.06306730955839157,
-0.020017288625240326,
0.027649717405438423,
-0.02052605152130127,
-0.09223867207765579,
0.02088412269949913,
-0.09056884795427322,
0.026868553832173347,
-0.10359323024749756,
-0.036307964473962784,
-0.06751451641321182,
0.014059877954423428,
-0.023531418293714523,
-0.04750675708055496,
-0.021395113319158554,
0.0635729506611824,
0.09034724533557892,
-0.005856858100742102,
0.1299840658903122,
-0.060666270554065704,
0.14623340964317322,
0.1016252264380455,
0.11574042588472366,
-0.023985307663679123,
-0.07466034591197968,
0.010114284232258797,
-0.0263082105666399,
0.03893252834677696,
-0.13712455332279205,
0.03511985391378403,
0.12965969741344452,
0.034229882061481476,
0.15703874826431274,
0.05179687216877937,
-0.06825397163629532,
0.017442896962165833,
0.07381865382194519,
-0.06985737383365631,
-0.1707974672317505,
0.00343041168525815,
0.05685652419924736,
-0.13891127705574036,
0.007477696985006332,
0.09668488800525665,
-0.03945357725024223,
-0.008441541343927383,
0.0038846100214868784,
0.04125707596540451,
-0.0257041547447443,
0.19852052628993988,
0.025093287229537964,
0.09259282797574997,
-0.09159747511148453,
0.0879126563668251,
0.0503619983792305,
-0.13984768092632294,
0.06057018041610718,
0.07609810680150986,
-0.08650491386651993,
-0.011652081273496151,
0.028412243351340294,
0.07023175805807114,
0.06878887116909027,
-0.035545166581869125,
-0.11062897741794586,
-0.1325419694185257,
0.06799711287021637,
0.07868747413158417,
0.034154631197452545,
0.012414442375302315,
-0.01790332794189453,
0.025609595701098442,
-0.10018803924322128,
0.10053566098213196,
0.07853934168815613,
0.06110148876905441,
-0.12208729982376099,
0.1341249793767929,
-0.0010048208059743047,
-0.017801648005843163,
-0.0017974033253267407,
-0.0038615036755800247,
-0.13038352131843567,
0.02370567061007023,
-0.06900704652070999,
-0.02031044289469719,
-0.08922229707241058,
-0.0031541278585791588,
0.0013720651622861624,
-0.040026724338531494,
-0.032884303480386734,
0.004163734149187803,
-0.11563922464847565,
-0.04992600902915001,
-0.013868022710084915,
0.06532225757837296,
-0.07854656875133514,
-0.03618329018354416,
0.033995334059000015,
-0.11798401921987534,
0.08986124396324158,
0.02103089913725853,
0.013295995071530342,
-0.0027931062504649162,
-0.07573489099740982,
0.01562191266566515,
0.04385320469737053,
-0.02283228188753128,
0.028892118483781815,
-0.183930441737175,
-0.027080273255705833,
-0.031451329588890076,
-0.0034080352634191513,
0.011254970915615559,
0.05032474175095558,
-0.12293460220098495,
0.0012963013723492622,
-0.03514360263943672,
-0.06736189872026443,
-0.0639907643198967,
0.058705884963274,
0.08599507063627243,
-0.0000384849809051957,
0.14465521275997162,
-0.09428747743368149,
0.042623598128557205,
-0.2034994512796402,
-0.013871093280613422,
-0.0197198074311018,
-0.07667583227157593,
-0.07603919506072998,
-0.022262560203671455,
0.09251055121421814,
-0.06460874527692795,
0.09092672914266586,
-0.06144385784864426,
0.02727862261235714,
0.030978532508015633,
-0.13288940489292145,
0.04195922985672951,
0.05630849301815033,
0.18861231207847595,
0.0521354004740715,
-0.04904589056968689,
0.08663451671600342,
0.01798955909907818,
0.04834125190973282,
0.138493150472641,
0.11868548393249512,
0.16639965772628784,
0.08010277152061462,
0.0881543904542923,
0.03580806404352188,
-0.10402771830558777,
-0.1499415934085846,
0.14374583959579468,
-0.04407569766044617,
0.12216494977474213,
-0.01570468582212925,
0.20345185697078705,
0.1250915229320526,
-0.2010853886604309,
0.057988885790109634,
-0.03318198397755623,
-0.09042524546384811,
-0.10166239738464355,
-0.11511647701263428,
-0.09192056208848953,
-0.1834019273519516,
0.008205764926970005,
-0.09206237643957138,
0.031033948063850403,
0.0457971952855587,
0.027958566322922707,
0.05189649388194084,
0.10909731686115265,
0.043926067650318146,
0.03240948170423508,
0.11887533962726593,
0.01564677245914936,
-0.01701360009610653,
-0.039698176085948944,
-0.10696093738079071,
0.042119815945625305,
-0.041194140911102295,
0.05048804730176926,
-0.03942227363586426,
-0.09338562190532684,
0.051442768424749374,
0.013044062070548534,
-0.10968687385320663,
0.020907022058963776,
-0.000416091934312135,
0.06472839415073395,
0.06154155731201172,
0.04935343191027641,
-0.03300577774643898,
-0.018990419805049896,
0.2417086809873581,
-0.09919832646846771,
-0.05194881185889244,
-0.14255914092063904,
0.2595909535884857,
-0.0023674846161156893,
0.005381824914366007,
0.0016574374167248607,
-0.076865054666996,
0.011314604431390762,
0.1466200202703476,
0.15019027888774872,
-0.0027308070566505194,
-0.010483680292963982,
-0.0030027679167687893,
-0.015128235332667828,
-0.044535085558891296,
0.0804857611656189,
0.10616856813430786,
0.02595842070877552,
-0.05534833297133446,
-0.023034757003188133,
-0.026881439611315727,
-0.07843007892370224,
-0.03867025673389435,
0.08594061434268951,
0.023541809991002083,
0.009748279117047787,
-0.03969279304146767,
0.11923951655626297,
-0.04423230513930321,
-0.1097494512796402,
0.040384430438280106,
-0.19069479405879974,
-0.16972710192203522,
-0.025573911145329475,
0.028954073786735535,
0.02658168412744999,
0.05243859812617302,
0.013483808375895023,
-0.01564403623342514,
0.07951018214225769,
0.00014833472960162908,
-0.027869779616594315,
-0.09670557826757431,
0.06862052530050278,
-0.0998314842581749,
0.187282994389534,
-0.036832328885793686,
-0.002026426373049617,
0.1353757381439209,
0.04999709501862526,
-0.09293240308761597,
0.06335913389921188,
0.07152226567268372,
-0.09017585963010788,
0.0520346499979496,
0.18372651934623718,
-0.04170181602239609,
0.1473436802625656,
0.06719497591257095,
-0.10280849039554596,
0.03464289754629135,
-0.12381436675786972,
-0.054569464176893234,
-0.07254020124673843,
0.0029318041633814573,
-0.02789541706442833,
0.145534485578537,
0.20760226249694824,
-0.07284080982208252,
-0.00436463812366128,
-0.03212164714932442,
0.015767129138112068,
0.055908314883708954,
0.10139824450016022,
-0.03972490504384041,
-0.26558420062065125,
0.014018281362950802,
0.04502610117197037,
0.005759414751082659,
-0.2217930108308792,
-0.11270415037870407,
0.010519299656152725,
-0.04770124331116676,
-0.05594097077846527,
0.12455617636442184,
0.07317313551902771,
0.04960041865706444,
-0.06396662443876266,
-0.12350470572710037,
-0.016709519550204277,
0.18684422969818115,
-0.16125203669071198,
-0.05088454484939575
] |
null | null | null |
# Lora of golden_hind/ゴールデン・ハインド/金鹿号 (Azur Lane)
## What Is This?
This is the LoRA model of waifu golden_hind/ゴールデン・ハインド/金鹿号 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* The dataset used for training is `stage3-p480-800` in [CyberHarem/golden_hind_azurlane](https://huggingface.co/datasets/CyberHarem/golden_hind_azurlane), which contains 180 images (see the download sketch after this list).
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1800 steps; 40 checkpoints were saved and evaluated.
* **Trigger word is `golden_hind_azurlane`.**
* Pruned core tags for this waifu are `breasts, long_hair, horns, black_hair, large_breasts, blue_eyes, bangs, very_long_hair, mole, mole_under_mouth`. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
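Below is a minimal, hedged sketch of pulling that training pack locally with the `huggingface_hub` Python library. The exact archive layout inside the dataset repository is not documented here, so the `stage3-p480-800` filename pattern is an assumption.

```python
# Sketch only: fetch the stage3-p480-800 pack from the training dataset repo.
# The allow_patterns glob is an assumption about the file naming; adjust it to
# whatever the dataset repository actually contains.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="CyberHarem/golden_hind_azurlane",
    repo_type="dataset",
    allow_patterns=["*stage3-p480-800*"],
)
print(local_dir)  # path of the downloaded snapshot
```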
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like a classic LoRA.** The LoRA files we provide are bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them together: the pt file is used as an embedding, while the safetensors file is loaded as the LoRA.
For example, if you want to use the model from step 630, download [`630/golden_hind_azurlane.pt`](https://huggingface.co/CyberHarem/golden_hind_azurlane/resolve/main/630/golden_hind_azurlane.pt) as the embedding and [`630/golden_hind_azurlane.safetensors`](https://huggingface.co/CyberHarem/golden_hind_azurlane/resolve/main/630/golden_hind_azurlane.safetensors) as the LoRA. By using both files together, you can generate images of the desired character.
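If you prefer to script the download, here is a minimal sketch using the `huggingface_hub` Python library. The repository id and file names come from the links above; the A1111 folder locations and the `<lora:...>` prompt syntax in the comments are the usual WebUI conventions rather than something stated in this card.

```python
# Sketch: download the step-630 embedding and LoRA weights.
from huggingface_hub import hf_hub_download

repo_id = "CyberHarem/golden_hind_azurlane"
step = 630

# Textual-inversion embedding -> usually placed in the WebUI `embeddings/` folder.
pt_path = hf_hub_download(repo_id, f"{step}/golden_hind_azurlane.pt")

# LoRA weights -> usually placed in `models/Lora/`.
lora_path = hf_hub_download(repo_id, f"{step}/golden_hind_azurlane.safetensors")

print(pt_path, lora_path)

# In the prompt you would then combine the trigger word with the LoRA, e.g.:
#   golden_hind_azurlane, <lora:golden_hind_azurlane:0.8>, ...
```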
## Which Step Should I Use?
We selected 5 good steps for you to choose from. The best one is step 630.
1520 images (1.62 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the previews of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_0_2 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:-------------------------------------------------------------------------------------------------------------|:---------------------------------------------|:---------------------------------------------|:---------------------------------------------|:-------------------------------------------|:-------------------------------------------|:-------------------------------------------|:---------------------------------------------|:---------------------------------------------|:-----------------------------------------|:-----------------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-------------------------------|:-----------------------------------|:-------------------------------|:---------------------------------|:---------------------------------------|:---------------------------------------|:---------------------------------------|:-----------------------------|:---------------------------------|:---------------------------------|:-------------------------------|:-----------------------------------------------|:---------------------------------|:---------------------------------|:-----------------------------|:-------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------|:-----------------------------------------|:-----------------------------------------|:-------------------------------------|:-------------------------------------|
| 630 | 14 | **0.974** | **0.967** | 0.843 | **0.846** | [Download](https://huggingface.co/CyberHarem/golden_hind_azurlane/resolve/main/630/golden_hind_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 675 | 15 | 0.948 | 0.941 | 0.839 | 0.792 | [Download](https://huggingface.co/CyberHarem/golden_hind_azurlane/resolve/main/675/golden_hind_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 135 | 3 | 0.929 | 0.950 | 0.842 | 0.758 | [Download](https://huggingface.co/CyberHarem/golden_hind_azurlane/resolve/main/135/golden_hind_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 315 | 7 | 0.921 | 0.940 | **0.849** | 0.747 | [Download](https://huggingface.co/CyberHarem/golden_hind_azurlane/resolve/main/315/golden_hind_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 720 | 16 | 0.926 | 0.921 | 0.833 | 0.745 | [Download](https://huggingface.co/CyberHarem/golden_hind_azurlane/resolve/main/720/golden_hind_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download the files, via the following links:
* [Steps From 1395 to 1800](all/0.md)
* [Steps From 945 to 1350](all/1.md)
* [Steps From 495 to 900](all/2.md)
* [Steps From 45 to 450](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/golden_hind_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/golden_hind_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/golden_hind_azurlane",
"license:mit",
"region:us"
] | 2024-02-15T06:24:07+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/golden_hind_azurlane #license-mit #region-us
| Lora of golden\_hind/ゴールデン・ハインド/金鹿号 (Azur Lane)
===============================================
What Is This?
-------------
This is the LoRA model of waifu golden\_hind/ゴールデン・ハインド/金鹿号 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* Dataset used for training is the 'stage3-p480-800' in CyberHarem/golden\_hind\_azurlane, which contains 180 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1800 steps, 40 checkpoints were saved and evaluated.
* Trigger word is 'golden\_hind\_azurlane'.
* Pruned core tags for this waifu are 'breasts, long\_hair, horns, black\_hair, large\_breasts, blue\_eyes, bangs, very\_long\_hair, mole, mole\_under\_mouth'. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 630, you need to download '630/golden\_hind\_azurlane.pt' as the embedding and '630/golden\_hind\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose. The best one is step 630.
1520 images (1.62 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the preview of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who finds the generated image content offensive to their values.
All Steps
---------
We uploaded the files in all steps. you can check the images, metrics and download them in the following links:
* Steps From 1395 to 1800
* Steps From 945 to 1350
* Steps From 495 to 900
* Steps From 45 to 450
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 630, you need to download '630/golden\\_hind\\_azurlane.pt' as the embedding and '630/golden\\_hind\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 630.\n\n\n1520 images (1.62 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1395 to 1800\n* Steps From 945 to 1350\n* Steps From 495 to 900\n* Steps From 45 to 450"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/golden_hind_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 630, you need to download '630/golden\\_hind\\_azurlane.pt' as the embedding and '630/golden\\_hind\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 630.\n\n\n1520 images (1.62 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 1395 to 1800\n* Steps From 945 to 1350\n* Steps From 495 to 900\n* Steps From 45 to 450"
] | [
46,
38,
475
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/golden_hind_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.016010846942663193,
0.0020527541637420654,
-0.003806306282058358,
0.08732005953788757,
0.05978981405496597,
0.08441121876239777,
0.21343685686588287,
0.08035757392644882,
0.11906734108924866,
-0.07196316123008728,
0.09441251307725906,
0.05275062099099159,
-0.004352047573775053,
0.013745632953941822,
-0.026742594316601753,
-0.16207990050315857,
-0.06755559146404266,
-0.032387908548116684,
-0.0025004507042467594,
0.01624543033540249,
0.07020026445388794,
0.010187194682657719,
0.09258466213941574,
-0.05913853645324707,
-0.04399615153670311,
0.05221742391586304,
-0.028312290087342262,
-0.04162602126598358,
0.042335763573646545,
0.07899237424135208,
0.10900235921144485,
0.005891330074518919,
0.05392460897564888,
-0.1483071744441986,
0.0705866888165474,
-0.008490859530866146,
-0.10633272677659988,
-0.003630643943324685,
0.019186459481716156,
-0.02238639071583748,
0.12513059377670288,
0.039473626762628555,
-0.11068954318761826,
0.036349937319755554,
-0.13370786607265472,
-0.05755864456295967,
-0.05501668155193329,
0.05890608951449394,
0.1304236799478531,
0.05312900245189667,
0.020416857674717903,
0.05220677703619003,
-0.042757946997880936,
0.07762427628040314,
0.09905199706554413,
-0.15136706829071045,
-0.07170448452234268,
0.09731251746416092,
0.015571272000670433,
0.1365639716386795,
-0.08805406093597412,
0.10568460822105408,
0.06431996822357178,
-0.05240856483578682,
-0.14529520273208618,
-0.10103294998407364,
-0.21930812299251556,
-0.005464521236717701,
0.010275354608893394,
0.019834984093904495,
0.3829509913921356,
0.05968146398663521,
0.0366605781018734,
0.07293779402971268,
-0.08253011107444763,
0.03579338267445564,
-0.09532047063112259,
0.13473691046237946,
0.044583722949028015,
0.10097341984510422,
-0.03804386779665947,
-0.10892467200756073,
-0.1080857589840889,
-0.07050585001707077,
-0.086370088160038,
-0.021025950089097023,
0.026530439034104347,
0.11495253443717957,
-0.20294919610023499,
0.018666096031665802,
-0.06840627640485764,
-0.12464533001184464,
0.01792873442173004,
-0.09130080044269562,
0.16746312379837036,
0.07573504000902176,
-0.004784995224326849,
0.04941178485751152,
0.2508838474750519,
0.12904483079910278,
0.20865100622177124,
0.06359145045280457,
-0.12061071395874023,
0.13263128697872162,
0.019024275243282318,
-0.09622473269701004,
-0.012708118185400963,
-0.1196262389421463,
0.16034041345119476,
-0.06144902482628822,
0.11864109337329865,
-0.060726068913936615,
-0.1094055101275444,
0.010359467007219791,
-0.09526260197162628,
0.06789366900920868,
0.04680035263299942,
0.002226051175966859,
-0.046029724180698395,
0.061458148062229156,
0.051475346088409424,
-0.03490424156188965,
-0.00516486307606101,
-0.017946407198905945,
-0.053852688521146774,
0.021877678111195564,
0.09678889065980911,
0.036480244249105453,
0.06510599702596664,
-0.007585705723613501,
-0.02089286595582962,
-0.0043410626240074635,
-0.03672049194574356,
0.012126642279326916,
0.04715004935860634,
0.051697466522455215,
0.09446641802787781,
-0.15925078094005585,
-0.08514987677335739,
-0.005371299106627703,
0.056187987327575684,
0.0060679856687784195,
0.09512705355882645,
-0.005062571261078119,
0.05027034506201744,
0.009150517173111439,
-0.026789994910359383,
0.043797243386507034,
-0.10292492061853409,
0.0846489891409874,
-0.004533617757260799,
0.09189565479755402,
-0.20632566511631012,
-0.013844072818756104,
-0.04587451368570328,
0.01139635406434536,
0.06339532136917114,
-0.006675529759377241,
-0.09242656081914902,
0.12576061487197876,
-0.006437544710934162,
0.0711899921298027,
-0.09889796376228333,
0.04552362114191055,
0.021285468712449074,
0.07680460810661316,
-0.10703937709331512,
0.005222370382398367,
0.13673385977745056,
-0.13656722009181976,
-0.17508138716220856,
0.09545242786407471,
-0.026407646015286446,
0.029241057112812996,
0.05328860133886337,
0.17492912709712982,
0.15757955610752106,
-0.19043420255184174,
-0.01754678040742874,
0.04266832768917084,
-0.029706494882702827,
-0.08272825926542282,
-0.012223338708281517,
0.10882451385259628,
0.0021236659958958626,
0.02774345874786377,
-0.03339042142033577,
0.1250404417514801,
-0.030731799080967903,
-0.08040434122085571,
-0.02791200391948223,
-0.07697734981775284,
-0.0700104758143425,
0.04783967509865761,
-0.0028534226585179567,
-0.05776042118668556,
0.016240738332271576,
-0.16732917726039886,
0.15843918919563293,
0.023086536675691605,
0.028537841513752937,
-0.08127302676439285,
0.11234942078590393,
0.0037812923546880484,
0.014133668504655361,
0.007517085876315832,
-0.051110733300447464,
-0.10842549800872803,
0.2283586859703064,
0.09010683000087738,
0.08679762482643127,
0.06396854668855667,
-0.042284123599529266,
-0.07942207902669907,
0.006709078326821327,
0.020398767665028572,
-0.04444122314453125,
0.015510347671806812,
-0.09745007008314133,
0.05064048990607262,
-0.010331918485462666,
0.04861745610833168,
-0.011082697659730911,
-0.031124603003263474,
0.08920837938785553,
0.017016908153891563,
-0.017782872542738914,
0.09299108386039734,
0.044773515313863754,
-0.025332767516374588,
-0.06168698891997337,
-0.0012134629068896174,
0.07495040446519852,
-0.0056616091169416904,
-0.10311978310346603,
0.01656114123761654,
0.005605289712548256,
0.05678023397922516,
0.1947821080684662,
-0.21677392721176147,
0.0514533556997776,
0.020388279110193253,
0.04909873381257057,
0.03493908420205116,
0.0031168332789093256,
-0.03802002966403961,
0.048195675015449524,
-0.022576039656996727,
0.07129326462745667,
-0.014269529841840267,
0.06357547640800476,
-0.027435582131147385,
-0.1302884966135025,
-0.01241636835038662,
-0.03301355615258217,
0.12858019769191742,
-0.17881637811660767,
0.06005457416176796,
0.18139857053756714,
-0.10972151160240173,
0.1384318619966507,
0.007804323919117451,
0.0036891044583171606,
0.007815389893949032,
0.02610425464808941,
0.005391411948949099,
0.1021995097398758,
-0.07641755789518356,
-0.029716117307543755,
0.022033708170056343,
-0.082099050283432,
0.028486989438533783,
-0.12790542840957642,
-0.10994577407836914,
-0.07556324452161789,
-0.030428998172283173,
-0.04014597460627556,
0.03926683962345123,
-0.05863439664244652,
0.07969918847084045,
-0.08181598037481308,
-0.06631708145141602,
-0.016716333106160164,
-0.08668816834688187,
0.027630649507045746,
0.010305306874215603,
-0.061461035162210464,
-0.12168080359697342,
-0.11241435259580612,
-0.09282884746789932,
-0.13765276968479156,
-0.004467134363949299,
0.07494860887527466,
-0.11419594287872314,
-0.03250180184841156,
0.01667274720966816,
-0.06167022883892059,
0.08973503857851028,
-0.06379655748605728,
0.01993468590080738,
0.04485238716006279,
-0.0320458747446537,
-0.16100449860095978,
-0.0027832211926579475,
-0.06448211520910263,
-0.06339030712842941,
0.15697871148586273,
-0.139485165476799,
0.17862609028816223,
-0.031474657356739044,
0.05907105281949043,
0.06703952699899673,
0.028786562383174896,
0.11844611167907715,
-0.11846122145652771,
0.07983040064573288,
0.1820507049560547,
0.04526187852025032,
0.08086048811674118,
0.12316396087408066,
0.07729746401309967,
-0.12117564678192139,
0.03750637546181679,
0.0760761946439743,
-0.10127793997526169,
-0.07480161637067795,
-0.049643754959106445,
-0.10845077782869339,
-0.06509962677955627,
0.04999110847711563,
0.06961507350206375,
0.0414314791560173,
0.11645123362541199,
-0.05161052569746971,
0.007522175088524818,
0.0945267379283905,
0.047774601727724075,
0.08265736699104309,
0.018308259546756744,
0.05155739560723305,
-0.14894741773605347,
-0.0408049002289772,
0.16528967022895813,
0.21738332509994507,
0.22206531465053558,
0.02570408396422863,
0.041175052523612976,
0.12354382872581482,
0.10129351913928986,
0.10570025444030762,
0.04086241126060486,
0.011639007367193699,
0.010913249105215073,
-0.07225989550352097,
-0.052297238260507584,
0.02037615329027176,
0.0042352317832410336,
-0.06737148016691208,
-0.13605590164661407,
0.11091388016939163,
-0.005674474406987429,
0.08363893628120422,
0.1251242309808731,
0.03587332367897034,
-0.10142987221479416,
0.1527673751115799,
0.10859913378953934,
0.08203303813934326,
-0.07206574082374573,
0.12907131016254425,
0.045101676136255264,
-0.004594497382640839,
0.174014151096344,
0.026616839691996574,
0.14920955896377563,
-0.05029040575027466,
-0.06584667414426804,
-0.07093071937561035,
-0.06106432527303696,
0.011837735772132874,
0.030319130048155785,
-0.21482259035110474,
0.10441947728395462,
0.0591052882373333,
0.016034331172704697,
-0.007867339998483658,
-0.05448533594608307,
0.18027815222740173,
0.1529260277748108,
0.08127675950527191,
0.0235983207821846,
-0.03614741936326027,
-0.006950922776013613,
-0.0781884491443634,
0.05758615955710411,
0.015135660767555237,
0.06194814667105675,
-0.04467140883207321,
-0.09716600924730301,
-0.017385179176926613,
-0.007399844937026501,
0.024577951058745384,
-0.08534074574708939,
-0.10696377605199814,
-0.0514569915831089,
0.2576749622821808,
-0.046172428876161575,
0.040919218212366104,
0.05807363986968994,
0.03712170198559761,
-0.06032147631049156,
0.022015416994690895,
-0.019687380641698837,
-0.012538725510239601,
-0.020745771005749702,
-0.0021163339260965586,
0.001981212291866541,
-0.04633915424346924,
-0.05468297377228737,
-0.03393387049436569,
-0.099286749958992,
-0.10066840052604675,
0.00994668249040842,
-0.042559172958135605,
0.009880141355097294,
-0.02739298902451992,
0.022518547251820564,
-0.08312324434518814,
-0.03078608587384224,
0.023533940315246582,
0.038477495312690735,
-0.07902087271213531,
-0.13617004454135895,
0.007659755647182465,
-0.016721177846193314,
-0.05763256549835205,
0.01291731558740139,
-0.09021136164665222,
-0.0552632100880146,
-0.04270855709910393,
-0.04194347932934761,
0.11195922642946243,
0.23319286108016968,
-0.020238153636455536,
0.0015312951290979981,
0.15812014043331146,
-0.09996736794710159,
-0.31746265292167664,
-0.16421836614608765,
-0.16675186157226562,
-0.096836619079113,
0.02901277132332325,
-0.07135339826345444,
0.029147189110517502,
0.0790749043226242,
-0.04533673822879791,
0.20043177902698517,
-0.1902584582567215,
-0.0977402925491333,
0.0721953809261322,
0.09513344615697861,
0.32737675309181213,
-0.24413299560546875,
0.008762606419622898,
-0.11986450850963593,
-0.04077169671654701,
-0.0167737677693367,
-0.11493799835443497,
0.11127486824989319,
0.03149588406085968,
0.07573679089546204,
-0.011283617466688156,
-0.0036679087206721306,
0.1448751538991928,
-0.0788092315196991,
0.140796959400177,
-0.12390682846307755,
-0.09406223148107529,
0.21991105377674103,
-0.029935332015156746,
-0.015508250333368778,
-0.1948079913854599,
-0.03625674173235893,
-0.011409266851842403,
0.0397164560854435,
-0.007162947673350573,
0.04773751646280289,
-0.006912910845130682,
-0.014149699360132217,
-0.13143253326416016,
-0.0210314579308033,
-0.038619883358478546,
0.050963178277015686,
0.23729419708251953,
-0.053577255457639694,
-0.04940636083483696,
0.01089057419449091,
-0.016300000250339508,
0.0930975154042244,
0.027552539482712746,
-0.05081614479422569,
-0.04118217155337334,
0.09796931594610214,
-0.21806415915489197,
0.05475684255361557,
0.010897922329604626,
-0.005145539529621601,
0.013933985494077206,
0.01058834046125412,
0.018530821427702904,
0.1336948573589325,
0.18371859192848206,
-0.014407542534172535,
-0.0723068043589592,
-0.01262158714234829,
0.014410690404474735,
0.14414271712303162,
-0.021124618127942085,
0.10747426003217697,
0.024046486243605614,
0.0334915928542614,
0.012662466615438461,
0.05230298265814781,
-0.08781468123197556,
-0.08456539362668991,
0.09128469973802567,
-0.04403217136859894,
-0.08332356810569763,
0.0946090891957283,
0.047315970063209534,
0.05511573329567909,
0.0010967841371893883,
0.04872378334403038,
0.01422041654586792,
-0.12562181055545807,
0.00804492924362421,
0.2226129174232483,
-0.0626583993434906,
-0.07005340605974197,
-0.06483227759599686,
0.01599705219268799,
-0.12613771855831146,
0.053463105112314224,
0.029032904654741287,
-0.02165025658905506,
0.10459401458501816,
-0.036675356328487396,
-0.02581886015832424,
0.0039024814032018185,
-0.0698353722691536,
0.02796059660613537,
-0.15935726463794708,
-0.20723122358322144,
0.05101076513528824,
-0.0065590995363891125,
-0.06266960501670837,
-0.09336545318365097,
-0.08486393094062805,
0.0713638886809349,
-0.1587381660938263,
0.13595274090766907,
-0.06921479105949402,
0.06250789016485214,
-0.04251555725932121,
-0.05116792023181915,
-0.10239202529191971,
-0.01622684672474861,
-0.05314668267965317,
-0.020978419110178947,
0.06458986550569534,
0.012196352705359459,
-0.11411735415458679,
-0.11247199028730392,
0.06404892355203629,
0.006098063196986914,
0.009168585762381554,
0.015716552734375,
-0.06607003509998322,
0.03331490978598595,
-0.21136817336082458,
-0.06107746437191963,
0.09030088037252426,
0.032045189291238785,
-0.08473128080368042,
0.13277912139892578,
0.041615646332502365,
-0.03230203688144684,
0.03151657432317734,
0.006570856552571058,
0.17273518443107605,
-0.07739940285682678,
0.01859726756811142,
-0.12257595360279083,
-0.15602776408195496,
-0.03288901969790459,
0.03708202391862869,
0.23334918916225433,
0.09291570633649826,
0.10830769687891006,
-0.05082167685031891,
0.011577489785850048,
-0.015558910556137562,
0.06685539335012436,
0.017880642786622047,
-0.1149391457438469,
-0.03948833420872688,
-0.17617462575435638,
-0.06327077746391296,
-0.05827583000063896,
0.16512678563594818,
0.018947718665003777,
-0.1339876353740692,
-0.004802306182682514,
0.11565471440553665,
-0.18479269742965698,
-0.017498578876256943,
0.1731644868850708,
-0.04804692417383194,
0.01727042719721794,
-0.1391468346118927,
0.030487002804875374,
0.07540257275104523,
-0.015615417622029781,
-0.007855192758142948,
0.10527315735816956,
0.02578088454902172,
-0.004804267082363367,
0.02798248454928398,
-0.023068005219101906,
0.05957156792283058,
-0.0755988284945488,
0.06729631125926971,
-0.006831823382526636,
-0.05365807190537453,
-0.11938292533159256,
0.19803762435913086,
-0.011064344085752964,
0.006808004807680845,
-0.056063853204250336,
0.005480976775288582,
-0.10590100288391113,
-0.11047477275133133,
-0.06894011795520782,
-0.1281527876853943,
0.07426001876592636,
-0.055341687053442,
0.016469117254018784,
-0.00801782961934805,
0.014501276426017284,
-0.07457813620567322,
0.02003440260887146,
-0.15841582417488098,
-0.04646911472082138,
0.02042073756456375,
-0.012900776229798794,
-0.035831037908792496,
-0.029478715732693672,
-0.030824802815914154,
0.0137202562764287,
-0.0769764706492424,
-0.07348605245351791,
0.06049356609582901,
0.07359541207551956,
0.059189990162849426,
-0.16917358338832855,
-0.10308679938316345,
-0.07520425319671631,
0.044833049178123474,
0.07877656817436218,
0.1617361307144165,
0.03423074260354042,
-0.019251136109232903,
0.04096095636487007,
0.1616082638502121,
0.013741887174546719,
-0.06865373253822327,
-0.04451701417565346,
-0.13441452383995056,
-0.13770678639411926,
-0.023678278550505638,
-0.06174512952566147,
-0.017071617767214775,
0.03011595457792282,
0.22686724364757538,
0.19256438314914703,
-0.14185410737991333,
0.042851075530052185,
-0.07216431945562363,
0.039377953857183456,
-0.029162252321839333,
0.15616661310195923,
0.054836466908454895,
0.1556675136089325,
-0.035283662378787994,
-0.039505962282419205,
-0.05847438424825668,
0.02143804542720318,
-0.09745988994836807,
0.044905323535203934,
-0.027081657201051712,
-0.05933954939246178,
-0.06067001447081566,
0.09953336417675018,
-0.10581623762845993,
0.04853913560509682,
0.1816282421350479,
-0.14154520630836487,
-0.016155878081917763,
-0.036345187574625015,
0.05549594387412071,
0.11635244637727737,
0.01799251325428486,
-0.07278501242399216,
-0.026062656193971634,
0.012651916593313217,
0.032649025321006775,
-0.17428980767726898,
-0.12153374403715134,
-0.004116432275623083,
-0.11412610858678818,
0.13514520227909088,
-0.01406407356262207,
0.0028556613251566887,
0.044478632509708405,
-0.06274335086345673,
-0.015404569916427135,
0.17159999907016754,
0.018157504498958588,
-0.0345403328537941,
-0.03457310423254967,
-0.06015650928020477,
-0.0969616174697876,
0.04786156490445137,
0.09629631042480469,
0.07820894569158554,
-0.01035409327596426,
0.14610527455806732,
-0.014852532185614109,
-0.04934791475534439,
0.1358058899641037,
-0.16705940663814545,
0.09800796210765839,
-0.01700759120285511,
-0.022046592086553574,
-0.061816636472940445,
-0.04588945955038071,
0.03131365776062012,
0.08204490691423416,
-0.1558639109134674,
-0.03997422382235527,
0.06914986670017242,
-0.08706995099782944,
0.0734136700630188,
0.03228848800063133,
-0.07705873250961304,
0.004976954311132431,
-0.12038981169462204,
-0.003036697395145893,
-0.0977974683046341,
0.052708350121974945,
0.20263148844242096,
-0.02970154955983162,
0.013620906509459019,
-0.1569422334432602,
0.04965246841311455,
-0.027857057750225067,
-0.043853700160980225,
-0.07887666672468185
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
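A minimal sketch, assuming the repository hosts a causal language model loadable with the standard 🤗 Transformers Auto classes; the prompt shown is purely illustrative, and if the repo actually contains a PEFT/LoRA adapter it would instead need to be loaded on top of its base model with the `peft` library.

```python
# Sketch only -- the model card does not yet document the intended usage.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aryachakraborty/NL-TO-SQL-Tiny-Llama"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative prompt; the real prompt/output format is not specified here.
prompt = "Translate to SQL: list the names of all customers from Berlin."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```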
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | aryachakraborty/NL-TO-SQL-Tiny-Llama | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:24:44+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
31,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.06646376848220825,
0.2168014943599701,
-0.00225935154594481,
0.023818302899599075,
0.1271018385887146,
-0.001635765191167593,
0.04218708351254463,
0.13324736058712006,
-0.020175931975245476,
0.11144465953111649,
0.046588581055402756,
0.09377603232860565,
0.09928803145885468,
0.18404334783554077,
0.04859916493296623,
-0.2059975117444992,
0.007056170143187046,
-0.09090408682823181,
0.014076028019189835,
0.1116579994559288,
0.13719257712364197,
-0.10291384905576706,
0.08272874355316162,
-0.04045208916068077,
-0.02019004337489605,
0.00012576708104461432,
-0.09259183704853058,
-0.07032395154237747,
0.06885425746440887,
0.06264153122901917,
0.051234472543001175,
0.001456156256608665,
0.09140396863222122,
-0.2864592671394348,
0.017265573143959045,
0.08406311273574829,
0.0027674848679453135,
0.06290827691555023,
0.07236549258232117,
-0.07389893382787704,
0.11328595131635666,
-0.08021481335163116,
0.13019037246704102,
0.08625296503305435,
-0.062064990401268005,
-0.23071379959583282,
-0.07525765895843506,
0.0963398814201355,
0.12251301854848862,
0.06215599179267883,
-0.022921854630112648,
0.15455181896686554,
-0.06248689442873001,
0.012971068732440472,
0.1294165402650833,
-0.11526761949062347,
-0.05572471022605896,
0.061741601675748825,
0.11775490641593933,
0.10740239918231964,
-0.14110268652439117,
-0.0017287094378843904,
0.04900608956813812,
0.029121357947587967,
0.08589313924312592,
0.022661056369543076,
0.12003941088914871,
0.04652795568108559,
-0.13695219159126282,
-0.04037507623434067,
0.12011898308992386,
0.038862764835357666,
-0.06446044892072678,
-0.2168138176202774,
-0.006778308190405369,
-0.0601806715130806,
-0.014732478186488152,
-0.07019448280334473,
0.039128515869379044,
-0.02470310963690281,
0.07317749410867691,
-0.04465159401297569,
-0.1063927412033081,
-0.0421026237308979,
0.0892222449183464,
0.07748593389987946,
0.011527054943144321,
-0.02519804798066616,
0.04627908393740654,
0.13455867767333984,
0.05402068421244621,
-0.10399353504180908,
-0.07017925381660461,
-0.06942764669656754,
-0.09420394152402878,
-0.04035796597599983,
0.056760527193546295,
0.031942449510097504,
0.02665667235851288,
0.22703726589679718,
0.016653569415211678,
0.04155244305729866,
0.0224777739495039,
0.01032855175435543,
0.043662428855895996,
0.0955500528216362,
-0.05303520709276199,
-0.15660029649734497,
-0.04072032496333122,
0.09077946096658707,
-0.0027527001220732927,
-0.036689214408397675,
-0.03966725245118141,
0.03849169611930847,
0.06843466311693192,
0.13122352957725525,
0.07552056759595871,
-0.017929591238498688,
-0.04813180863857269,
-0.030096933245658875,
0.23523783683776855,
-0.1493375599384308,
0.04426715523004532,
-0.02271856553852558,
-0.01804111897945404,
-0.03908449783921242,
0.03597262129187584,
0.022118929773569107,
-0.000004518366949923802,
0.09706240892410278,
-0.058981191366910934,
-0.05378659814596176,
-0.10168042778968811,
-0.03272576630115509,
0.04088849574327469,
-0.013975566253066063,
-0.010589460842311382,
-0.09025166928768158,
-0.09490354359149933,
-0.04766594246029854,
0.05537205561995506,
-0.05123869329690933,
-0.03770573064684868,
0.009465423412621021,
-0.08151785284280777,
-0.005444355774670839,
-0.005417742300778627,
0.10699385404586792,
-0.03222226724028587,
0.04445803165435791,
-0.027600755915045738,
0.05225523188710213,
0.09919606149196625,
0.031576547771692276,
-0.0773419588804245,
0.0561848059296608,
-0.22559374570846558,
0.07503069192171097,
-0.11481974273920059,
0.04335082694888115,
-0.1704932004213333,
-0.042439818382263184,
0.005444696638733149,
0.0139949731528759,
0.013206101022660732,
0.12720820307731628,
-0.19255615770816803,
-0.01654396951198578,
0.13260798156261444,
-0.09212633967399597,
-0.118110790848732,
0.07884611934423447,
-0.029701577499508858,
0.1624738723039627,
0.04682036489248276,
-0.027025915682315826,
0.09224298596382141,
-0.16434773802757263,
-0.07092688232660294,
-0.00949116237461567,
-0.01727987825870514,
0.12109188735485077,
0.07512219995260239,
-0.05991523340344429,
0.046571120619773865,
0.02832140028476715,
-0.038078423589468,
-0.04424772411584854,
-0.050857074558734894,
-0.10884185880422592,
-0.01070026308298111,
-0.08987759798765182,
0.04065500199794769,
-0.01250192429870367,
-0.07916021347045898,
-0.029885273426771164,
-0.18612512946128845,
-0.0030564051121473312,
0.10038342326879501,
0.0035033065360039473,
-0.005652366206049919,
-0.08666291832923889,
0.026358824223279953,
-0.03112892620265484,
-0.008404186926782131,
-0.16764774918556213,
-0.04399421438574791,
0.046902090311050415,
-0.16094985604286194,
0.020117372274398804,
-0.06413903087377548,
0.06334125250577927,
0.03641495108604431,
-0.05590536445379257,
-0.0248766727745533,
-0.01730942726135254,
0.011945613659918308,
-0.05083848536014557,
-0.18994836509227753,
-0.056277405470609665,
-0.037882111966609955,
0.149809330701828,
-0.25956398248672485,
0.032966937869787216,
0.051140617579221725,
0.14649195969104767,
0.00406361510977149,
-0.05115427449345589,
0.01429014839231968,
-0.05360214412212372,
-0.054652128368616104,
-0.06746816635131836,
-0.006135428790003061,
-0.027576493099331856,
-0.05147203803062439,
0.019243421033024788,
-0.1755700707435608,
-0.021410830318927765,
0.09424154460430145,
0.12876708805561066,
-0.1486445665359497,
-0.018640631809830666,
-0.048725154250860214,
-0.06339836865663528,
-0.0715010017156601,
-0.07038594037294388,
0.10712739825248718,
0.0513901449739933,
0.04796046018600464,
-0.07435787469148636,
-0.07092321664094925,
0.02726263552904129,
0.006906150374561548,
-0.03382374346256256,
0.08727246522903442,
0.05199531093239784,
-0.09209315478801727,
0.0756213590502739,
0.1092359870672226,
0.07177663594484329,
0.09363535046577454,
0.01574566215276718,
-0.11756632477045059,
-0.028492970392107964,
0.036266472190618515,
0.02740776725113392,
0.1465986967086792,
-0.05952361226081848,
0.04016614332795143,
0.04494241625070572,
-0.04170418903231621,
0.022319864481687546,
-0.08787637203931808,
0.024075502529740334,
0.025203049182891846,
-0.0034381982404738665,
0.06284574419260025,
-0.02525499276816845,
-0.0050758360885083675,
0.07016654312610626,
0.047779910266399384,
0.04621000960469246,
0.009655474685132504,
-0.01720241829752922,
-0.1047825813293457,
0.16950392723083496,
-0.0951867327094078,
-0.269941508769989,
-0.17632324993610382,
0.026197833940386772,
0.04035249724984169,
-0.022378476336598396,
0.031619444489479065,
-0.07056326419115067,
-0.10630585998296738,
-0.1060405746102333,
-0.002429972169920802,
0.01714223250746727,
-0.06364088505506516,
-0.0741225928068161,
0.07348573952913284,
0.04382912442088127,
-0.14902326464653015,
0.038552410900592804,
0.055694397538900375,
-0.057955220341682434,
-0.0233661737293005,
0.09118817001581192,
0.12397737801074982,
0.14583967626094818,
-0.021366750821471214,
-0.028626007959246635,
0.029004426673054695,
0.19620531797409058,
-0.13469526171684265,
0.10371150821447372,
0.13814030587673187,
-0.04545360431075096,
0.08360563963651657,
0.1560150384902954,
0.029186224564909935,
-0.08317049592733383,
0.05044832453131676,
0.04082648828625679,
-0.043159641325473785,
-0.2666129767894745,
-0.0534592866897583,
0.012832709588110447,
-0.06255637854337692,
0.09786593168973923,
0.10183793306350708,
0.11542957276105881,
0.034910861402750015,
-0.07166364789009094,
-0.043925940990448,
-0.0058974819257855415,
0.11737963557243347,
-0.05490213260054588,
-0.012639665976166725,
0.07686592638492584,
-0.05086168646812439,
0.005355054512619972,
0.10266812145709991,
0.02973790094256401,
0.17442677915096283,
0.020399179309606552,
0.11231429129838943,
0.06195578724145889,
0.08633565157651901,
0.0007386076031252742,
0.02951662428677082,
0.05147615820169449,
0.017203815281391144,
-0.002300140680745244,
-0.10421168059110641,
-0.006156572140753269,
0.1449710875749588,
0.028103826567530632,
0.029669636860489845,
-0.0018948549404740334,
-0.005003341939300299,
0.05121048167347908,
0.1746254414319992,
-0.011592294089496136,
-0.22072425484657288,
-0.0845772922039032,
0.06936841458082199,
-0.06218599155545235,
-0.12968985736370087,
-0.026130788028240204,
0.045467354357242584,
-0.17519839107990265,
0.026703642681241035,
-0.027433741837739944,
0.0919293761253357,
-0.09345759451389313,
-0.02221956104040146,
0.03687324374914169,
0.084866963326931,
-0.014529162086546421,
0.08703910559415817,
-0.14498743414878845,
0.11886418610811234,
0.02978132851421833,
0.09024628251791,
-0.11081171780824661,
0.07909037172794342,
-0.007550720125436783,
0.009180475026369095,
0.19379350543022156,
-0.011335089802742004,
-0.03514958545565605,
-0.08774717897176743,
-0.11210042238235474,
-0.013537433929741383,
0.12687496840953827,
-0.1243172138929367,
0.08773399889469147,
-0.015198243781924248,
-0.044079482555389404,
0.00937260314822197,
-0.12100647389888763,
-0.17273177206516266,
-0.19628387689590454,
0.05585884302854538,
-0.09575839340686798,
0.025643249973654747,
-0.11914430558681488,
-0.07089093327522278,
-0.02952558360993862,
0.241120383143425,
-0.1745356321334839,
-0.06510113179683685,
-0.1468164622783661,
-0.046294767409563065,
0.1662203073501587,
-0.04437198117375374,
0.0718095526099205,
-0.0208172257989645,
0.20345525443553925,
0.005988610442727804,
-0.004939318168908358,
0.06724198162555695,
-0.08892562240362167,
-0.16873881220817566,
-0.06771010160446167,
0.1510489284992218,
0.11680185794830322,
0.04907919466495514,
-0.002248800592496991,
0.0011772146681323647,
-0.016943959519267082,
-0.1137804463505745,
-0.0033210667315870523,
0.16037839651107788,
0.03878779336810112,
0.025986969470977783,
-0.05243593826889992,
-0.08797456324100494,
-0.06899320334196091,
-0.06853509694337845,
0.06221301481127739,
0.19590823352336884,
-0.10376439243555069,
0.1700313836336136,
0.147536963224411,
-0.07305635511875153,
-0.23175598680973053,
0.035342130810022354,
0.04983805492520332,
0.0014306638622656465,
0.04886869341135025,
-0.18252557516098022,
0.10521943867206573,
0.019543392583727837,
-0.05505957826972008,
0.13485197722911835,
-0.1557481735944748,
-0.1552847921848297,
0.0722852572798729,
0.03904085233807564,
-0.22423844039440155,
-0.1354004591703415,
-0.09622503817081451,
-0.05825018882751465,
-0.14065024256706238,
0.06054598465561867,
-0.002136280992999673,
0.015948504209518433,
0.03500790148973465,
-0.0015643214574083686,
0.027123261243104935,
-0.058935679495334625,
0.18609118461608887,
-0.004065449349582195,
0.020676052197813988,
-0.060264769941568375,
-0.0478842556476593,
0.09839435666799545,
-0.06130504235625267,
0.12208222597837448,
0.004057085141539574,
0.01594383642077446,
-0.10362856835126877,
-0.048314861953258514,
-0.04328322783112526,
0.05154227837920189,
-0.07548051327466965,
-0.10070807486772537,
-0.043625857681035995,
0.08841723203659058,
0.07005169242620468,
-0.03383097052574158,
0.00549331633374095,
-0.07189501076936722,
0.10019614547491074,
0.17795267701148987,
0.17573626339435577,
0.009926567785441875,
-0.07241068035364151,
0.01677953451871872,
-0.04142116755247116,
0.044231921434402466,
-0.2513144314289093,
0.03756171092391014,
0.06098250672221184,
0.029438555240631104,
0.09217222779989243,
-0.020435843616724014,
-0.1820858269929886,
-0.04050002992153168,
0.08094815909862518,
-0.05452597141265869,
-0.22617179155349731,
-0.019085140898823738,
0.0954197570681572,
-0.2020406424999237,
-0.007372708059847355,
0.03995226323604584,
-0.048725228756666183,
-0.023169852793216705,
0.00010950004070764408,
0.06317184865474701,
0.002471912419423461,
0.09773622453212738,
0.0735151618719101,
0.09715340286493301,
-0.08337292820215225,
0.10562895983457565,
0.10150538384914398,
-0.09572599828243256,
0.03605884686112404,
0.06754924356937408,
-0.05300498008728027,
-0.043293699622154236,
0.03665391728281975,
0.033023297786712646,
0.005234600510448217,
-0.060321882367134094,
0.013913018628954887,
-0.036497246474027634,
0.044923391193151474,
0.08326134830713272,
0.03754979372024536,
-0.013354414142668247,
0.06462216377258301,
0.03401726484298706,
-0.10898099094629288,
0.10366570204496384,
0.01731540448963642,
0.04105307161808014,
-0.08384523540735245,
-0.019968897104263306,
0.035425446927547455,
0.030576206743717194,
-0.01765924133360386,
-0.02306121215224266,
-0.02860277332365513,
-0.01614218018949032,
-0.14299540221691132,
-0.023106401786208153,
-0.07243485748767853,
0.006181265693157911,
0.014656842686235905,
-0.031884219497442245,
-0.011233693920075893,
0.02475680410861969,
-0.06979699432849884,
-0.07426341623067856,
-0.006949664559215307,
0.09833318740129471,
-0.15115703642368317,
0.008848577737808228,
0.06907843053340912,
-0.11088496446609497,
0.08190931379795074,
-0.008411259390413761,
0.016245156526565552,
0.022527478635311127,
-0.15448406338691711,
0.05601610988378525,
0.0008648968650959432,
0.01916889287531376,
0.025886621326208115,
-0.16471809148788452,
0.004104440100491047,
-0.04661374166607857,
-0.02149827405810356,
-0.00004464812809601426,
-0.02647159807384014,
-0.12325995415449142,
0.06858719140291214,
-0.015622655861079693,
-0.035931166261434555,
-0.02701525390148163,
0.0539589487016201,
0.07888586074113846,
-0.027474910020828247,
0.10445091128349304,
-0.008690856397151947,
0.04941811040043831,
-0.16801609098911285,
-0.02470702864229679,
-0.04982255399227142,
0.019377702847123146,
0.009884213097393513,
-0.007693959400057793,
0.04183054715394974,
-0.00976533442735672,
0.21883612871170044,
-0.05075952783226967,
0.1607085019350052,
0.05847611650824547,
-0.017352959141135216,
-0.0007513365126214921,
0.06180921941995621,
0.05997028574347496,
0.04658793285489082,
0.009480604901909828,
0.023740366101264954,
-0.022450892254710197,
-0.006695089396089315,
-0.15932634472846985,
0.01890849508345127,
0.14999441802501678,
0.06301083415746689,
0.024745315313339233,
0.05866100639104843,
-0.12775006890296936,
-0.12135478109121323,
0.09311001747846603,
-0.026755332946777344,
0.00928465835750103,
-0.08245618641376495,
0.1358020007610321,
0.14980104565620422,
-0.14000412821769714,
0.05256148427724838,
-0.06134212389588356,
-0.05217423290014267,
-0.10388828068971634,
-0.12032219022512436,
-0.05887215584516525,
-0.053666237741708755,
0.002330566756427288,
-0.03760887682437897,
0.054546963423490524,
0.03344334661960602,
-0.009351172484457493,
-0.00022941511997487396,
0.13597318530082703,
-0.019751882180571556,
-0.0028988157864660025,
0.048313532024621964,
0.03693558648228645,
0.02373051457107067,
-0.05275435373187065,
0.02940409444272518,
0.02539868652820587,
0.032232340425252914,
0.06546790152788162,
0.033412106335163116,
-0.047448933124542236,
0.03804153576493263,
-0.0025254099164158106,
-0.11207924783229828,
0.019641218706965446,
-0.00460948096588254,
-0.0742158442735672,
0.1268945336341858,
0.0407399944961071,
0.010224059224128723,
-0.03741471841931343,
0.24361543357372284,
-0.06653323769569397,
-0.06378097087144852,
-0.13251738250255585,
0.10491154342889786,
-0.0027236645109951496,
0.06476365029811859,
0.023412218317389488,
-0.1284150779247284,
0.005243356805294752,
0.13858191668987274,
0.12181595712900162,
0.0045748427510261536,
0.009228081442415714,
0.0518609918653965,
0.0025186820421367884,
-0.06998204439878464,
0.054019294679164886,
0.06992026418447495,
0.12919506430625916,
-0.07847554981708527,
0.07680778950452805,
0.0006860480643808842,
-0.08370215445756912,
-0.02947772853076458,
0.11312682181596756,
-0.0409729965031147,
0.03491825982928276,
-0.047444481402635574,
0.10916327685117722,
-0.05787910893559456,
-0.29412412643432617,
0.02350960113108158,
-0.09588567912578583,
-0.15202060341835022,
-0.018367812037467957,
0.05944539234042168,
-0.02624768204987049,
0.018029648810625076,
0.06971040368080139,
-0.06011629104614258,
0.20098382234573364,
0.0335683599114418,
-0.07864278554916382,
-0.0664360448718071,
0.04837050288915634,
-0.06564252078533173,
0.2949807047843933,
0.008418165147304535,
0.02863333560526371,
0.10770907253026962,
-0.03253700211644173,
-0.18271861970424652,
0.010723991319537163,
0.1133992001414299,
-0.08056149631738663,
0.08200647681951523,
0.19000613689422607,
-0.012578671798110008,
0.1209007054567337,
0.05294662341475487,
-0.047376248985528946,
0.04217283055186272,
-0.03389401361346245,
-0.051268599927425385,
-0.10752558708190918,
0.058453381061553955,
-0.05909625440835953,
0.15447644889354706,
0.10152646154165268,
-0.05671518296003342,
-0.004550917539745569,
-0.05555408447980881,
0.04875178262591362,
0.01804669201374054,
0.12263146042823792,
0.02951994352042675,
-0.1865430772304535,
0.032826557755470276,
-0.01144319772720337,
0.10186848044395447,
-0.25588861107826233,
-0.08421015739440918,
0.08833149075508118,
-0.011924264021217823,
-0.05105875805020332,
0.10560628771781921,
0.057650718837976456,
0.04243382066488266,
-0.043439045548439026,
-0.10480839014053345,
-0.02186836116015911,
0.14663739502429962,
-0.1469624787569046,
-0.025013303384184837
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# nslPOS_4
This model is a fine-tuned version of [naver-clova-ix/donut-base](https://huggingface.co/naver-clova-ix/donut-base) on the imagefolder dataset.
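Since the card itself gives no usage snippet, the following is a minimal, hedged inference sketch that assumes the checkpoint is used like other Donut-style document-parsing models. The repository id is taken from this record; the image path and task prompt are placeholders that may need to change for this fine-tune.

```python
from PIL import Image
import torch
from transformers import DonutProcessor, VisionEncoderDecoderModel

# Repository id taken from this record; everything else here is illustrative.
repo_id = "saniasinghania/nslPOS_4"

processor = DonutProcessor.from_pretrained(repo_id)
model = VisionEncoderDecoderModel.from_pretrained(repo_id)
model.eval()

image = Image.open("receipt.png").convert("RGB")  # placeholder document image
pixel_values = processor(image, return_tensors="pt").pixel_values

# Donut-style models are usually prompted with a task start token;
# "<s_cord-v2>" is only a guess here, since the fine-tune may define its own token.
task_prompt = "<s_cord-v2>"
decoder_input_ids = processor.tokenizer(
    task_prompt, add_special_tokens=False, return_tensors="pt"
).input_ids

with torch.no_grad():
    generated_ids = model.generate(
        pixel_values, decoder_input_ids=decoder_input_ids, max_length=512
    )

print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```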
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
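For readers who want to reproduce a comparable setup, the list above maps roughly onto 🤗 `Seq2SeqTrainingArguments` as sketched below. This is an illustration rather than the exact script used for this run, and the output directory is a placeholder.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the hyperparameters listed above; not the original training script.
training_args = Seq2SeqTrainingArguments(
    output_dir="nslPOS_4",            # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                   # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                        # "Native AMP" mixed precision
)
```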
### Training results
### Framework versions
- Transformers 4.36.0
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.1
| {"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["imagefolder"], "base_model": "naver-clova-ix/donut-base", "model-index": [{"name": "nslPOS_4", "results": []}]} | null | saniasinghania/nslPOS_4 | [
"transformers",
"tensorboard",
"safetensors",
"vision-encoder-decoder",
"generated_from_trainer",
"dataset:imagefolder",
"base_model:naver-clova-ix/donut-base",
"license:mit",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:29:20+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #vision-encoder-decoder #generated_from_trainer #dataset-imagefolder #base_model-naver-clova-ix/donut-base #license-mit #endpoints_compatible #region-us
|
# nslPOS_4
This model is a fine-tuned version of naver-clova-ix/donut-base on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.36.0
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.1
| [
"# nslPOS_4\n\nThis model is a fine-tuned version of naver-clova-ix/donut-base on the imagefolder dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.36.0\n- Pytorch 2.1.2+cu118\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #vision-encoder-decoder #generated_from_trainer #dataset-imagefolder #base_model-naver-clova-ix/donut-base #license-mit #endpoints_compatible #region-us \n",
"# nslPOS_4\n\nThis model is a fine-tuned version of naver-clova-ix/donut-base on the imagefolder dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.36.0\n- Pytorch 2.1.2+cu118\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
70,
36,
6,
12,
8,
3,
103,
4,
35
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #vision-encoder-decoder #generated_from_trainer #dataset-imagefolder #base_model-naver-clova-ix/donut-base #license-mit #endpoints_compatible #region-us \n# nslPOS_4\n\nThis model is a fine-tuned version of naver-clova-ix/donut-base on the imagefolder dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.36.0\n- Pytorch 2.1.2+cu118\n- Datasets 2.16.1\n- Tokenizers 0.15.1"
] | [
-0.07126634567975998,
0.1511370688676834,
-0.0017771217972040176,
0.0721704512834549,
0.13298819959163666,
0.013270774856209755,
0.09129809588193893,
0.11206058412790298,
-0.10346683859825134,
0.0852675512433052,
0.06438711285591125,
0.028564216569066048,
0.06269112974405289,
0.17436984181404114,
-0.03465913236141205,
-0.26474422216415405,
0.035458359867334366,
0.0017736338777467608,
-0.031828317791223526,
0.0859876349568367,
0.06799163669347763,
-0.0982813835144043,
0.06256813555955887,
-0.02142825350165367,
-0.1505414992570877,
0.021207334473729134,
-0.05057842656970024,
-0.06120442971587181,
0.05711758881807327,
-0.004729032050818205,
0.10587626695632935,
0.025520190596580505,
0.14235541224479675,
-0.2272893786430359,
0.0007424707291647792,
0.08999575674533844,
0.031199093908071518,
0.0625639334321022,
0.07832488417625427,
0.020140372216701508,
0.09825586527585983,
-0.19951754808425903,
0.11061092466115952,
0.01617409847676754,
-0.06451039016246796,
-0.14576967060565948,
-0.07745508849620819,
0.08785568922758102,
0.12044724076986313,
0.10267217457294464,
0.0036743790842592716,
0.12683433294296265,
-0.05317807197570801,
0.07912872731685638,
0.1549428105354309,
-0.2151094675064087,
-0.07338925451040268,
0.07664170116186142,
0.028731826692819595,
0.07804500311613083,
-0.10657282173633575,
0.011544996872544289,
0.05658780783414841,
0.026124032214283943,
0.07360091805458069,
-0.010760895907878876,
-0.06654801964759827,
-0.03229809179902077,
-0.14854511618614197,
-0.03954889252781868,
0.14281226694583893,
0.07576636224985123,
-0.03999854996800423,
-0.09298215061426163,
-0.0477268286049366,
-0.0887521505355835,
-0.02834826149046421,
-0.053196050226688385,
0.043454620987176895,
-0.04361506178975105,
-0.07099121809005737,
-0.07209721207618713,
-0.08568950742483139,
-0.04294443503022194,
0.009336407296359539,
0.04483025521039963,
0.04433799162507057,
0.014658019877970219,
-0.009822352789342403,
0.09888032078742981,
-0.015845157206058502,
-0.11175679415464401,
-0.02806839905679226,
-0.0034879930317401886,
-0.097760871052742,
-0.07012806832790375,
-0.002946543740108609,
-0.02126610279083252,
-0.012199767865240574,
0.09938451647758484,
-0.06677011400461197,
0.08819235116243362,
-0.0427863746881485,
0.006557576823979616,
-0.026244450360536575,
0.11548971384763718,
-0.028856704011559486,
-0.0027024345472455025,
-0.011730852536857128,
0.0767483040690422,
0.018032429739832878,
-0.013344245962798595,
-0.07947682589292526,
-0.00959586538374424,
0.09691500663757324,
0.05781856179237366,
-0.03602193295955658,
0.028577962890267372,
-0.015140345320105553,
-0.02785307914018631,
0.00812491588294506,
-0.12220357358455658,
0.07199795544147491,
-0.011984213255345821,
-0.06498508900403976,
0.015430419705808163,
0.050214484333992004,
0.0004707063199020922,
-0.03725772723555565,
0.09037090837955475,
-0.0520368255674839,
0.013091225177049637,
-0.09057250618934631,
-0.04681580886244774,
0.04698457196354866,
-0.06142985075712204,
-0.026944318786263466,
-0.07203304022550583,
-0.20161066949367523,
-0.04909422621130943,
0.04477802291512489,
-0.050417400896549225,
-0.022340793162584305,
-0.05886397138237953,
-0.07329413294792175,
0.00649329973384738,
0.0015177208697423339,
0.09859493374824524,
-0.04536478593945503,
0.061357077211141586,
-0.014891673810780048,
0.02481911890208721,
0.027001867070794106,
0.029328562319278717,
-0.08340264856815338,
0.03603716194629669,
-0.1518612653017044,
0.07821398973464966,
-0.0893450677394867,
0.02770807407796383,
-0.11744097620248795,
-0.11231347918510437,
-0.04529400169849396,
-0.03367230296134949,
0.02822164073586464,
0.11797040700912476,
-0.1861235797405243,
-0.006558099761605263,
0.13497132062911987,
-0.10511468350887299,
-0.05367368832230568,
0.10610891878604889,
-0.0410151444375515,
-0.007870794273912907,
0.05849818140268326,
0.16339151561260223,
0.09638573974370956,
-0.14350248873233795,
-0.012557854875922203,
-0.03134417161345482,
0.042296648025512695,
0.010758745484054089,
0.053571414202451706,
-0.006422363221645355,
0.035978011786937714,
-0.0020895195193588734,
-0.07477200031280518,
-0.0032937652431428432,
-0.08029714226722717,
-0.08800475299358368,
-0.057081080973148346,
-0.05996663495898247,
0.008018792606890202,
0.04163552075624466,
0.027919156476855278,
-0.07458556443452835,
-0.10479182749986649,
0.09564313292503357,
0.1112123653292656,
-0.07895027101039886,
0.015089045278728008,
-0.06361217796802521,
0.008323154412209988,
-0.06186657026410103,
-0.031424202024936676,
-0.16400721669197083,
-0.14662505686283112,
0.02628440596163273,
-0.09439844638109207,
0.04464850574731827,
-0.013392234221100807,
0.05579228699207306,
0.08369441330432892,
-0.04020606726408005,
-0.03954602777957916,
-0.0770481750369072,
0.019439762458205223,
-0.07048434764146805,
-0.17617249488830566,
-0.046921227127313614,
-0.02338971570134163,
0.1512717455625534,
-0.26211073994636536,
0.017167361453175545,
0.005328815430402756,
0.1658509373664856,
0.032598406076431274,
-0.06559668481349945,
0.006589087191969156,
0.05450914427638054,
0.008487804792821407,
-0.11694125086069107,
0.03533684462308884,
-0.00243041361682117,
-0.05326830968260765,
-0.06921076029539108,
-0.1606738418340683,
0.012544648721814156,
0.08451184630393982,
0.05090920627117157,
-0.0970149040222168,
-0.0188522357493639,
-0.06061773747205734,
-0.04854605346918106,
-0.10830662399530411,
0.014695977792143822,
0.1363033503293991,
0.0009712844039313495,
0.10300760716199875,
-0.04263082146644592,
-0.044841740280389786,
0.01853683404624462,
0.0056610130704939365,
-0.06636615842580795,
0.06642285734415054,
0.08408044278621674,
-0.14471721649169922,
0.10360042750835419,
0.09036174416542053,
-0.03673746436834335,
0.1488906890153885,
-0.05294542759656906,
-0.0912943109869957,
-0.019106470048427582,
0.03865915164351463,
-0.0014448537258431315,
0.12998823821544647,
-0.06327113509178162,
0.010875394567847252,
0.015679622069001198,
0.014429948292672634,
0.036480702459812164,
-0.17510011792182922,
0.00931174959987402,
0.03454800695180893,
-0.04428375884890556,
0.035841409116983414,
-0.04252145066857338,
0.032338712364435196,
0.07522818446159363,
0.00906752422451973,
-0.024425651878118515,
0.02620110474526882,
-0.0054536438547074795,
-0.08931805193424225,
0.15154634416103363,
-0.10593095421791077,
-0.15429462492465973,
-0.11991724371910095,
0.07027287036180496,
-0.044459693133831024,
-0.024882351979613304,
0.021121732890605927,
-0.07707672566175461,
-0.06740976125001907,
-0.1195640042424202,
-0.049704790115356445,
-0.05241001397371292,
-0.01946384645998478,
0.06448844075202942,
0.01989908330142498,
0.06821335852146149,
-0.09824623167514801,
-0.0007470613927580416,
-0.008233016356825829,
-0.08681976795196533,
0.005320308264344931,
0.039065901190042496,
0.11672434210777283,
0.12838168442249298,
-0.033301740884780884,
0.031025102362036705,
-0.03604898229241371,
0.1619246006011963,
-0.07897260040044785,
0.015713363885879517,
0.09963749349117279,
0.014826187863945961,
0.05654655769467354,
0.1282500922679901,
0.018057851120829582,
-0.08730117231607437,
0.03017589822411537,
0.07568959146738052,
-0.029047908261418343,
-0.1986476629972458,
-0.05991799011826515,
-0.0337030328810215,
-0.05381035804748535,
0.10631623864173889,
0.057655591517686844,
0.03141896054148674,
0.041915412992239,
-0.02558494545519352,
0.0331994891166687,
0.006999373435974121,
0.09675367176532745,
0.11285403370857239,
0.03890335559844971,
0.07479555159807205,
-0.045129984617233276,
-0.04426067695021629,
0.055686626583337784,
0.00328540219925344,
0.273103803396225,
-0.0029853573068976402,
0.06246119737625122,
0.025445159524679184,
0.12758949398994446,
-0.0063015492632985115,
0.005714844912290573,
0.04884686321020126,
0.001949654659256339,
0.020151503384113312,
-0.07459013164043427,
0.001061819028109312,
0.034769244492053986,
-0.007618976756930351,
0.021997720003128052,
-0.08754000067710876,
0.045831549912691116,
0.018103662878274918,
0.22962023317813873,
0.03884986415505409,
-0.3197900354862213,
-0.04934169352054596,
0.023235145956277847,
-0.0028796757105737925,
-0.05807582288980484,
-0.0035246622283011675,
0.13553597033023834,
-0.12856389582157135,
0.08768579363822937,
-0.06966158002614975,
0.08378690481185913,
-0.0525212436914444,
-0.008288259617984295,
0.05100997909903526,
0.10114260762929916,
0.00032174948137253523,
0.08301479369401932,
-0.1775239109992981,
0.20703190565109253,
0.011378586292266846,
0.12980400025844574,
-0.05408859625458717,
0.031588200479745865,
0.02118833363056183,
0.0904247984290123,
0.13350528478622437,
-0.009618761949241161,
-0.14225028455257416,
-0.18403330445289612,
-0.09472628682851791,
0.027925802394747734,
0.1268332302570343,
-0.0048329466953873634,
0.07449879497289658,
-0.023957043886184692,
-0.009562167339026928,
0.04374157637357712,
-0.04585445672273636,
-0.21952873468399048,
-0.12903910875320435,
0.007021309342235327,
0.02811707928776741,
-0.017715785652399063,
-0.07595329731702805,
-0.10488343983888626,
-0.03857961669564247,
0.17983871698379517,
0.015540073625743389,
-0.02434227056801319,
-0.1478445827960968,
0.09243486076593399,
0.10353481769561768,
-0.047796327620744705,
0.01954834721982479,
0.027648981660604477,
0.1520397663116455,
0.033073991537094116,
-0.10156503319740295,
0.048753682523965836,
-0.06511078774929047,
-0.13687437772750854,
-0.050091616809368134,
0.11658476293087006,
0.04785590618848801,
0.03513908013701439,
0.0064377146773040295,
0.0351213663816452,
0.06109826639294624,
-0.08438087999820709,
0.028441250324249268,
0.10500755161046982,
0.08300139009952545,
0.07384256273508072,
-0.07245675474405289,
-0.022760989144444466,
-0.036686621606349945,
-0.006367791444063187,
0.11320946365594864,
0.170592799782753,
-0.07019810378551483,
0.05633307248353958,
0.043074119836091995,
-0.08878814429044724,
-0.16894878447055817,
0.10054878145456314,
0.08611596375703812,
-0.0002455142675898969,
0.03832262381911278,
-0.15503711998462677,
0.10869898647069931,
0.13536599278450012,
-0.025059066712856293,
0.03913579136133194,
-0.32246121764183044,
-0.11936741322278976,
0.07487679272890091,
0.13571949303150177,
-0.03576304763555527,
-0.14468742907047272,
-0.035312455147504807,
-0.015696724876761436,
-0.14540506899356842,
0.13389292359352112,
-0.0878172218799591,
0.09449252486228943,
0.012095318175852299,
0.06490962952375412,
0.02060527540743351,
-0.04031946137547493,
0.14160244166851044,
0.015406801365315914,
0.08634387701749802,
-0.07404179126024246,
0.06050962582230568,
0.08503218740224838,
-0.08698884397745132,
0.07377472519874573,
0.011340810917317867,
0.04872085154056549,
-0.12086021155118942,
-0.01910601556301117,
-0.04939631372690201,
0.06435323506593704,
-0.047840360552072525,
-0.04858887940645218,
-0.0461287684738636,
0.07087527960538864,
0.07505948841571808,
-0.031309135258197784,
0.08150090277194977,
0.027695123106241226,
0.08751264959573746,
0.08167771250009537,
0.07312921434640884,
0.006999209523200989,
-0.10351389646530151,
-0.024436090141534805,
-0.030416011810302734,
0.05727747827768326,
-0.09671574085950851,
0.017514871433377266,
0.13416674733161926,
0.03140401840209961,
0.1485300362110138,
0.033793799579143524,
-0.055969320237636566,
-0.004586558789014816,
0.03401539474725723,
-0.0995999351143837,
-0.178104430437088,
-0.0018926123157143593,
-0.031562693417072296,
-0.09586763381958008,
0.005927957594394684,
0.09043579548597336,
-0.08526507765054703,
0.012037385255098343,
-0.01939995214343071,
0.03742954879999161,
-0.01339030172675848,
0.17019855976104736,
0.026099814102053642,
0.05317162349820137,
-0.0699673518538475,
0.13392698764801025,
0.0884447991847992,
-0.13255347311496735,
0.060858894139528275,
0.09539788216352463,
-0.07173565030097961,
-0.02770424447953701,
0.11438381671905518,
0.1868877410888672,
-0.017829593271017075,
-0.056400369852781296,
-0.05611175671219826,
-0.09698376059532166,
0.04680792987346649,
0.09736693650484085,
0.0335286408662796,
-0.021929603070020676,
-0.018819347023963928,
0.02859104797244072,
-0.159132182598114,
0.0828440934419632,
0.0460762120783329,
0.08631531149148941,
-0.17364083230495453,
0.10477206856012344,
0.013970978558063507,
0.028644954785704613,
-0.020309388637542725,
0.008754940703511238,
-0.08236990869045258,
-0.014575605280697346,
-0.0664018914103508,
0.018914608284831047,
-0.047357991337776184,
-0.00002940464037237689,
-0.004146974999457598,
-0.030190275982022285,
-0.04250187799334526,
0.05713812634348869,
-0.05433008819818497,
-0.0732080414891243,
0.009771053679287434,
0.058501020073890686,
-0.13966569304466248,
-0.012157212011516094,
0.006408181972801685,
-0.08797647058963776,
0.02933187037706375,
0.03402358666062355,
0.020582007244229317,
0.013132618740200996,
-0.15701259672641754,
-0.019257528707385063,
0.07075710594654083,
0.01793646067380905,
0.041325658559799194,
-0.08355901390314102,
-0.014424869790673256,
0.006484223995357752,
0.03427572920918465,
0.01996041275560856,
0.06281763315200806,
-0.1154588982462883,
-0.04802443087100983,
-0.08890458196401596,
-0.05187303572893143,
-0.04670478776097298,
0.0556323379278183,
0.09733033925294876,
0.026385489851236343,
0.1559898406267166,
-0.1028478741645813,
0.04955015704035759,
-0.18982908129692078,
-0.045705396682024,
0.0013917713658884168,
-0.019628819078207016,
-0.006674737203866243,
-0.014207734726369381,
0.07581713795661926,
-0.05206172168254852,
0.0794609859585762,
0.007252972107380629,
0.09597057849168777,
0.04573051258921623,
-0.057571686804294586,
-0.021473856642842293,
0.029760882258415222,
0.18590635061264038,
0.055171046406030655,
-0.03154606372117996,
0.11032155156135559,
-0.013808210380375385,
0.07964761555194855,
0.06777366995811462,
0.18161515891551971,
0.1747174859046936,
-0.055642616003751755,
0.07037156075239182,
0.08476809412240982,
-0.08051933348178864,
-0.14668945968151093,
0.11854524910449982,
-0.02649150975048542,
0.12966443598270416,
-0.05271491780877113,
0.09820369631052017,
0.10560854524374008,
-0.18733838200569153,
0.026756515726447105,
-0.03470036759972572,
-0.10753823071718216,
-0.09511920809745789,
-0.11421137303113937,
-0.09044688940048218,
-0.1141243502497673,
0.034930936992168427,
-0.10483906418085098,
0.03113827109336853,
0.09179217368364334,
0.024120425805449486,
0.01624342054128647,
0.16936750710010529,
-0.04868360981345177,
0.0242754053324461,
0.06873733550310135,
0.01281543169170618,
-0.015092793852090836,
-0.056942958384752274,
-0.05560847744345665,
0.06398076564073563,
0.007959398441016674,
0.07938896119594574,
-0.03521265089511871,
0.020283062011003494,
0.05154643580317497,
-0.014311222359538078,
-0.07183445990085602,
0.0316319614648819,
0.019275236874818802,
0.03512745350599289,
0.04448378458619118,
0.05865022540092468,
-0.02800207957625389,
-0.039158474653959274,
0.25666704773902893,
-0.06672230362892151,
-0.09116598218679428,
-0.1206478476524353,
0.1525537222623825,
0.029298558831214905,
-0.017702890560030937,
0.061342231929302216,
-0.1298312097787857,
0.011754957027733326,
0.11197221279144287,
0.12305109202861786,
-0.012320440262556076,
-0.010361592285335064,
-0.02506941184401512,
-0.017494304105639458,
-0.045593224465847015,
0.11360468715429306,
0.09163422137498856,
0.033612318336963654,
-0.05227641761302948,
-0.014779775403439999,
-0.0029807654209434986,
-0.05428609624505043,
-0.11253209412097931,
0.02801051363348961,
0.023484965786337852,
0.0020366886164993048,
-0.03597169741988182,
0.10714583098888397,
0.005153328645974398,
-0.1312796175479889,
0.09084925800561905,
-0.1484723836183548,
-0.154714435338974,
-0.006414408329874277,
0.12062493711709976,
-0.03440583497285843,
0.01856311410665512,
-0.03174496442079544,
-0.008973312564194202,
0.09818246960639954,
-0.004672185517847538,
-0.0645095482468605,
-0.09472137689590454,
0.02278456836938858,
-0.08375178277492523,
0.25302040576934814,
0.013154061511158943,
0.06486246734857559,
0.09139924496412277,
0.01899431087076664,
-0.13899384438991547,
0.056220974773168564,
0.049941521137952805,
-0.04796217009425163,
0.03595680370926857,
0.1859310269355774,
-0.06690230220556259,
0.09674959629774094,
0.04978775978088379,
-0.1563376635313034,
-0.022630823776125908,
-0.03788357600569725,
-0.00040338310645893216,
-0.06137802451848984,
0.0046977452002465725,
-0.08145027607679367,
0.1494605392217636,
0.16719120740890503,
-0.016209689900279045,
0.020584259182214737,
-0.0709981694817543,
0.035671718418598175,
0.04458382725715637,
0.09918687492609024,
0.0008360821520909667,
-0.17323575913906097,
0.012190126813948154,
-0.050439685583114624,
0.03940243646502495,
-0.2185802012681961,
-0.1200273260474205,
0.01509584579616785,
-0.05348867177963257,
-0.06905730068683624,
0.11410875618457794,
0.10178577154874802,
0.020001471042633057,
-0.0472550131380558,
-0.15083155035972595,
-0.024263883009552956,
0.14733558893203735,
-0.137531578540802,
-0.030954672023653984
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
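Since the section above is still a placeholder, here is a hedged loading sketch based only on this record's `vision-encoder-decoder` tag and repository id; the processor class, image path, and generation settings are assumptions.

```python
from PIL import Image
from transformers import AutoProcessor, VisionEncoderDecoderModel

# Repository id taken from this record; the rest is illustrative.
repo_id = "juliansmidek/donut_cord_v2"

processor = AutoProcessor.from_pretrained(repo_id)
model = VisionEncoderDecoderModel.from_pretrained(repo_id)

image = Image.open("document.png").convert("RGB")  # placeholder input image
pixel_values = processor(images=image, return_tensors="pt").pixel_values

generated_ids = model.generate(pixel_values, max_length=512)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```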
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
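As a rough illustration of what that calculator does with the fields above, the estimate is essentially hours used × hardware power draw × data center overhead × grid carbon intensity. Every number in the sketch below is a made-up placeholder, not a measurement for this model.

```python
# Back-of-the-envelope CO2eq estimate in the spirit of the ML Impact calculator.
# All values are placeholders to be replaced with the fields listed above.
hours_used = 10.0          # "Hours used"
hardware_power_kw = 0.3    # approximate draw of the "Hardware Type", in kW
pue = 1.1                  # data center power usage effectiveness (overhead)
carbon_intensity = 0.4     # kg CO2eq per kWh for the "Compute Region"

energy_kwh = hours_used * hardware_power_kw * pue
emissions_kg = energy_kwh * carbon_intensity
print(f"Estimated emissions: {emissions_kg:.2f} kg CO2eq")
```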
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | juliansmidek/donut_cord_v2 | [
"transformers",
"safetensors",
"vision-encoder-decoder",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:31:18+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
39,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05306316912174225,
0.20225799083709717,
-0.004532716237008572,
0.02486577071249485,
0.10745619982481003,
0.005829503294080496,
0.06235508993268013,
0.11292294412851334,
-0.0020419019274413586,
0.12131396681070328,
0.02719549834728241,
0.07967612892389297,
0.12327883392572403,
0.1561002880334854,
0.003298027440905571,
-0.2451947182416916,
0.056976594030857086,
-0.09095743298530579,
0.005862638354301453,
0.11416777968406677,
0.1328127682209015,
-0.10651655495166779,
0.09236611425876617,
-0.006378492806106806,
-0.018223589286208153,
-0.01560811698436737,
-0.07464081048965454,
-0.06576249748468399,
0.05573050677776337,
0.07688132673501968,
0.0721992626786232,
0.009551521390676498,
0.07658318430185318,
-0.2803225517272949,
0.014262731187045574,
0.08472383767366409,
0.001679662149399519,
0.06746198982000351,
0.08127366751432419,
-0.06743600219488144,
0.1253480762243271,
-0.072596974670887,
0.1388878971338272,
0.07806979864835739,
-0.09000035375356674,
-0.19567735493183136,
-0.06588542461395264,
0.06967565417289734,
0.13176710903644562,
0.05219413340091705,
-0.02802334912121296,
0.13226838409900665,
-0.09170275926589966,
0.009603562764823437,
0.11835573613643646,
-0.0669531598687172,
-0.05480296164751053,
0.036625828593969345,
0.09896823018789291,
0.08901896327733994,
-0.11835762113332748,
-0.005250310990959406,
0.02971211075782776,
0.02317153476178646,
0.08707734197378159,
0.01639537699520588,
0.14948242902755737,
0.03726613521575928,
-0.14019928872585297,
-0.05744593217968941,
0.0939771831035614,
0.03824683278799057,
-0.04958764836192131,
-0.2343815118074417,
-0.031156767159700394,
-0.016557393595576286,
-0.0305255725979805,
-0.04024772346019745,
0.05501044541597366,
-0.03525097668170929,
0.0778101459145546,
-0.00980226881802082,
-0.08041384816169739,
-0.028304021805524826,
0.04802321270108223,
0.06495635211467743,
0.0185654629021883,
-0.0052207582630217075,
0.02111206203699112,
0.11669638752937317,
0.0788947269320488,
-0.13250376284122467,
-0.07449234277009964,
-0.07420225441455841,
-0.098720021545887,
-0.04206917807459831,
0.035802435129880905,
0.07063398510217667,
0.03907659277319908,
0.1947319507598877,
-0.017316637560725212,
0.0492829903960228,
0.04488114267587662,
0.006694390904158354,
0.06837808340787888,
0.11208193004131317,
-0.06981375813484192,
-0.1344674974679947,
-0.058068543672561646,
0.0876275971531868,
-0.004179352894425392,
-0.03578972443938255,
-0.050102028995752335,
0.04014993831515312,
0.030947856605052948,
0.11303774267435074,
0.08247632533311844,
0.00018158231978304684,
-0.06263403594493866,
-0.042954958975315094,
0.22251743078231812,
-0.14639760553836823,
0.04202887415885925,
0.005975429899990559,
-0.04316512867808342,
-0.004425262566655874,
0.010939684696495533,
0.012188587337732315,
-0.037748441100120544,
0.10074765980243683,
-0.07618969678878784,
-0.034445326775312424,
-0.11386469751596451,
-0.0645400807261467,
0.02648116648197174,
0.004736251663416624,
-0.021350713446736336,
-0.043291062116622925,
-0.11405764520168304,
-0.04943132400512695,
0.07177776843309402,
-0.07382809370756149,
-0.05491260066628456,
0.011223746463656425,
-0.053137388080358505,
0.004582186229526997,
0.00267049134708941,
0.1107012927532196,
-0.032886769622564316,
0.02474004216492176,
-0.046873610466718674,
0.06870805472135544,
0.10572899132966995,
0.037948183715343475,
-0.08262749016284943,
0.07297130674123764,
-0.23081135749816895,
0.10601279884576797,
-0.0860590934753418,
0.020004073157906532,
-0.14292661845684052,
-0.04098188504576683,
0.028510602191090584,
0.027961768209934235,
-0.010129709728062153,
0.1261758655309677,
-0.2058180272579193,
-0.030243845656514168,
0.14840680360794067,
-0.11589378863573074,
-0.09422067552804947,
0.06595603376626968,
-0.05322439596056938,
0.10703813284635544,
0.04694817587733269,
-0.023595338687300682,
0.07076855003833771,
-0.1317324936389923,
-0.04631568118929863,
-0.021216953173279762,
-0.014257868751883507,
0.1454293578863144,
0.06817390769720078,
-0.05359628424048424,
0.07706184685230255,
0.021635930985212326,
-0.037367887794971466,
-0.03435307368636131,
-0.03299751877784729,
-0.0921449214220047,
0.006188652943819761,
-0.06746368855237961,
0.029415255412459373,
-0.021083641797304153,
-0.09160202741622925,
-0.0303809754550457,
-0.1750536859035492,
0.03627307340502739,
0.08193549513816833,
0.006318137980997562,
-0.0197709072381258,
-0.09286735206842422,
0.01832297258079052,
-0.012543086893856525,
-0.01892191916704178,
-0.16171851754188538,
-0.04700363799929619,
0.04258821904659271,
-0.20086339116096497,
0.01967989094555378,
-0.03661181032657623,
0.04892909526824951,
0.03480081260204315,
-0.04064054414629936,
-0.008631768636405468,
0.003335281042382121,
0.015725387260317802,
-0.024876078590750694,
-0.20007483661174774,
-0.030333993956446648,
-0.02597803995013237,
0.13431528210639954,
-0.22246739268302917,
0.02835940755903721,
0.08417746424674988,
0.1434013694524765,
0.0016660679830238223,
-0.04368278756737709,
0.014076939783990383,
-0.054847393184900284,
-0.05226233974099159,
-0.06906641274690628,
-0.006392029579728842,
-0.03333830088376999,
-0.03956965357065201,
0.07096927613019943,
-0.20027990639209747,
-0.04262993112206459,
0.10826600342988968,
0.09837738424539566,
-0.14323553442955017,
-0.02474437840282917,
-0.04144697263836861,
-0.061835117638111115,
-0.09049158543348312,
-0.06381296366453171,
0.1425853669643402,
0.04930912330746651,
0.05233171954751015,
-0.08628004789352417,
-0.06071622669696808,
0.010883725248277187,
-0.00011982241994701326,
-0.04022165387868881,
0.08589175343513489,
0.08529563993215561,
-0.1104804202914238,
0.09131632000207901,
0.08476598560810089,
0.06810380518436432,
0.10764316469430923,
0.0015695258043706417,
-0.10674308240413666,
-0.02863943576812744,
0.007197246421128511,
0.014064288698136806,
0.14327655732631683,
-0.04052828997373581,
0.048501238226890564,
0.05552942305803299,
-0.026791011914610863,
0.01846246048808098,
-0.1073673665523529,
0.03181065618991852,
0.047323208302259445,
-0.009707847610116005,
0.022087840363383293,
-0.03469701483845711,
0.029233038425445557,
0.08711127936840057,
0.035028502345085144,
0.029371140524744987,
0.006271174643188715,
-0.035827167332172394,
-0.10475867241621017,
0.17433254420757294,
-0.0890800803899765,
-0.2983497679233551,
-0.1401464194059372,
-0.0031294787768274546,
0.04853705316781998,
-0.02189650572836399,
0.011501757428050041,
-0.04715869575738907,
-0.11619791388511658,
-0.10501103848218918,
0.008872180245816708,
0.04236939176917076,
-0.07721052318811417,
-0.0677536204457283,
0.049645159393548965,
0.03511786088347435,
-0.1394437551498413,
0.02193400263786316,
0.04976930096745491,
-0.03652774170041084,
-0.014805521816015244,
0.07396114617586136,
0.1029050275683403,
0.17309242486953735,
-0.006633569020777941,
-0.017617538571357727,
0.023926623165607452,
0.24285922944545746,
-0.14599764347076416,
0.10973547399044037,
0.15899382531642914,
-0.06477009505033493,
0.10309092700481415,
0.19795724749565125,
0.023503681644797325,
-0.07609159499406815,
0.03380175679922104,
0.03942976891994476,
-0.05477331578731537,
-0.22650650143623352,
-0.06247439235448837,
-0.0018889455823227763,
-0.07084330171346664,
0.0898994579911232,
0.09043484926223755,
0.10881193727254868,
0.04595636948943138,
-0.08826620131731033,
-0.06884618103504181,
0.018320003524422646,
0.10980518162250519,
-0.021245790645480156,
0.0068580652587115765,
0.08790436387062073,
-0.04739201068878174,
-0.004643888212740421,
0.10682045668363571,
0.012494737282395363,
0.19045358896255493,
0.026300430297851562,
0.1511247158050537,
0.0705496221780777,
0.02818082831799984,
0.028634218499064445,
0.01902483031153679,
0.02658020332455635,
0.008844421245157719,
-0.017161816358566284,
-0.08914987742900848,
0.024366190657019615,
0.1353050321340561,
0.07279488444328308,
0.03349636495113373,
0.02151625230908394,
-0.03378571942448616,
0.06302186101675034,
0.16916872560977936,
0.010774882510304451,
-0.22069227695465088,
-0.038822758942842484,
0.08901690691709518,
-0.07502853125333786,
-0.12661625444889069,
-0.02531552128493786,
0.041314706206321716,
-0.17944134771823883,
0.04614526778459549,
-0.01675923727452755,
0.11310327053070068,
-0.12792818248271942,
-0.027826640754938126,
0.0399215929210186,
0.08714547008275986,
-0.030904775485396385,
0.07850778102874756,
-0.16933031380176544,
0.11483496427536011,
0.012726574204862118,
0.06142592057585716,
-0.11495199054479599,
0.09937658905982971,
0.012870636768639088,
-0.004057616461068392,
0.166650652885437,
-0.0004076457116752863,
-0.07127676159143448,
-0.06420327723026276,
-0.0747542679309845,
-0.021446911618113518,
0.09447984397411346,
-0.11176029592752457,
0.08143189549446106,
-0.014278407208621502,
-0.038768354803323746,
0.002094640163704753,
-0.1095178872346878,
-0.12445959448814392,
-0.19356580078601837,
0.06069660931825638,
-0.10815826058387756,
0.003831093432381749,
-0.09925412386655807,
-0.05358343943953514,
-0.04677683115005493,
0.20124182105064392,
-0.14483362436294556,
-0.09758627414703369,
-0.15289589762687683,
-0.09554614871740341,
0.1664566993713379,
-0.04663718119263649,
0.08821606636047363,
-0.0038687095511704683,
0.22914768755435944,
0.006724040023982525,
-0.012434800155460835,
0.07520399242639542,
-0.08637748658657074,
-0.17783690989017487,
-0.07604682445526123,
0.12165174633264542,
0.12136691808700562,
0.04747939854860306,
-0.012795967981219292,
0.021366307511925697,
-0.03262670710682869,
-0.1143370196223259,
0.008061125874519348,
0.12403146922588348,
0.05908475071191788,
0.04311663657426834,
0.00427021412178874,
-0.11007828265428543,
-0.07147429138422012,
-0.03523124009370804,
0.021944832056760788,
0.18807166814804077,
-0.08221416175365448,
0.15053938329219818,
0.12907743453979492,
-0.05370299890637398,
-0.21416690945625305,
0.03436880186200142,
0.04090806469321251,
0.00472818361595273,
0.053175248205661774,
-0.1770489513874054,
0.07840683311223984,
0.024572648108005524,
-0.05139078199863434,
0.15225622057914734,
-0.1677834391593933,
-0.1536329686641693,
0.0777980238199234,
0.05417194589972496,
-0.21989625692367554,
-0.12022100389003754,
-0.08520790934562683,
-0.06769770383834839,
-0.14243635535240173,
0.08412115275859833,
0.020698823034763336,
0.00019321028958074749,
0.04856256768107414,
0.034081753343343735,
0.019389502704143524,
-0.04823823645710945,
0.21962693333625793,
-0.009151222184300423,
0.036160267889499664,
-0.07677234709262848,
-0.09645134955644608,
0.07220485061407089,
-0.054490040987730026,
0.08667207509279251,
-0.024035243317484856,
0.007558862213045359,
-0.07708337903022766,
-0.05522387474775314,
-0.051626816391944885,
0.029731709510087967,
-0.07851403951644897,
-0.1059325784444809,
-0.0698300376534462,
0.09317977726459503,
0.09290435165166855,
-0.03390717878937721,
-0.03715480491518974,
-0.09010928869247437,
0.027843475341796875,
0.20326566696166992,
0.1706404834985733,
0.05219545215368271,
-0.10166811943054199,
0.0006855534156784415,
-0.017444688826799393,
0.04213650897145271,
-0.21454355120658875,
0.04805922508239746,
0.046434495598077774,
0.023125024512410164,
0.11990387737751007,
-0.016930051147937775,
-0.16412340104579926,
-0.04671873524785042,
0.056455835700035095,
-0.036497533321380615,
-0.20693081617355347,
-0.012548294849693775,
0.052488550543785095,
-0.18057814240455627,
-0.0640311911702156,
0.01719793491065502,
-0.01455759722739458,
-0.024732910096645355,
0.013886804692447186,
0.06330747157335281,
0.027956031262874603,
0.09375539422035217,
0.05616210401058197,
0.10073219239711761,
-0.11451173573732376,
0.08603792637586594,
0.08968610316514969,
-0.09007474035024643,
0.0125290397554636,
0.0736992210149765,
-0.055889032781124115,
-0.02316245622932911,
0.020047122612595558,
0.06119868531823158,
-0.00162112049292773,
-0.061508238315582275,
-0.01975059136748314,
-0.11018196493387222,
0.06758071482181549,
0.13242226839065552,
0.04006093740463257,
-0.005321177653968334,
0.048064373433589935,
0.021657899022102356,
-0.0826604813337326,
0.11298491060733795,
0.025856876745820045,
0.038508445024490356,
-0.06642783433198929,
-0.023183433338999748,
0.045513976365327835,
0.008002778515219688,
-0.020539432764053345,
-0.02922571264207363,
-0.05385620519518852,
-0.011762911453843117,
-0.18990042805671692,
0.01677597686648369,
-0.0765504315495491,
0.004350012633949518,
0.014810539782047272,
-0.03814086318016052,
-0.020958559587597847,
0.016691848635673523,
-0.07982292026281357,
-0.05134430527687073,
-0.002734045498073101,
0.0985812395811081,
-0.13863937556743622,
0.007181720342487097,
0.08835478872060776,
-0.11878933757543564,
0.06740869581699371,
-0.024482958018779755,
-0.017330501228570938,
-0.0019446499645709991,
-0.12968114018440247,
0.04302144795656204,
0.002646980108693242,
0.01983785443007946,
0.042118169367313385,
-0.16895927488803864,
0.006406146101653576,
-0.04021742194890976,
-0.04932057484984398,
-0.01649690791964531,
-0.07501037418842316,
-0.11416008323431015,
0.11071296036243439,
0.0018048452911898494,
-0.07790763676166534,
-0.0117954695597291,
0.05156298726797104,
0.10936092585325241,
-0.037753306329250336,
0.1212289109826088,
0.0044145528227090836,
0.06474429368972778,
-0.18044467270374298,
-0.024778762832283974,
-0.015598369762301445,
0.006858370266854763,
0.02694357931613922,
-0.0140616400167346,
0.042949724942445755,
-0.013948258012533188,
0.2575705349445343,
-0.02087925560772419,
0.07440228760242462,
0.06519795209169388,
0.046770140528678894,
0.011650492437183857,
0.0869150459766388,
0.06678774207830429,
0.012648052535951138,
0.003416551509872079,
0.032283470034599304,
-0.031422972679138184,
-0.014915283769369125,
-0.1482047736644745,
0.07101717591285706,
0.1439564973115921,
0.08283385634422302,
0.012806775979697704,
0.06352550536394119,
-0.10048910230398178,
-0.10461166501045227,
0.08237126469612122,
-0.041638992726802826,
-0.0008007619762793183,
-0.058745238929986954,
0.1454916000366211,
0.15373669564723969,
-0.17184817790985107,
0.0812029168009758,
-0.0365472249686718,
-0.04692631587386131,
-0.110826775431633,
-0.16450464725494385,
-0.06572620570659637,
-0.02580863982439041,
-0.003703176509588957,
-0.05508402734994888,
0.06688077747821808,
0.11885157972574234,
-0.0016993435565382242,
-0.0014807049883529544,
0.10038914531469345,
-0.022431546822190285,
-0.02085503377020359,
0.03484790027141571,
0.04871930554509163,
0.036450520157814026,
-0.04567375034093857,
0.02037064917385578,
0.010481024160981178,
0.03885771334171295,
0.0584954097867012,
0.024686437100172043,
-0.03216709941625595,
0.015852995216846466,
-0.012883862480521202,
-0.10344603657722473,
0.023220296949148178,
-0.02661588042974472,
-0.07547900825738907,
0.12944786250591278,
0.03020857274532318,
0.016596315428614616,
-0.03272676467895508,
0.20282141864299774,
-0.07233016192913055,
-0.07228796184062958,
-0.14049077033996582,
0.11066530644893646,
-0.03675038740038872,
0.06309209764003754,
0.05746688321232796,
-0.11500853300094604,
-0.005017681512981653,
0.12902788817882538,
0.12829062342643738,
-0.03323068469762802,
0.0049867476336658,
0.026197774335741997,
0.00827173050493002,
-0.049032241106033325,
0.04763645678758621,
0.032188743352890015,
0.15468738973140717,
-0.07100655138492584,
0.07814180850982666,
-0.0005074164946563542,
-0.08921105414628983,
-0.03626048564910889,
0.1392250955104828,
0.0028979976195842028,
0.032514654099941254,
-0.06501611322164536,
0.1007646918296814,
-0.07232671231031418,
-0.22924686968326569,
0.04944430664181709,
-0.0780143067240715,
-0.1530541479587555,
-0.012732122093439102,
0.02754151076078415,
-0.013794543221592903,
0.02254127338528633,
0.061679136008024216,
-0.06457692384719849,
0.16001729667186737,
0.03501654788851738,
-0.08832933753728867,
-0.05492360517382622,
0.07016300410032272,
-0.1023297980427742,
0.29636111855506897,
0.011759842745959759,
0.032544128596782684,
0.10384590178728104,
-0.01977287232875824,
-0.1365358531475067,
0.029908113181591034,
0.09750298410654068,
-0.0958370491862297,
0.0666261613368988,
0.18576137721538544,
-0.01007252186536789,
0.10188479721546173,
0.07690271735191345,
-0.06373509019613266,
0.05595288425683975,
-0.08074036985635757,
-0.06501531600952148,
-0.0913926437497139,
0.0567726232111454,
-0.06534478068351746,
0.14360256493091583,
0.11906343698501587,
-0.03889768570661545,
-0.0050442758947610855,
-0.029765458777546883,
0.041717544198036194,
0.013438636437058449,
0.12334854900836945,
0.013119494542479515,
-0.1562117338180542,
0.029238002374768257,
0.003133314661681652,
0.10249229520559311,
-0.2173282951116562,
-0.08784149587154388,
0.04678759351372719,
-0.03227124735713005,
-0.05141454562544823,
0.1055324450135231,
0.06109399348497391,
0.05229108780622482,
-0.0470384918153286,
-0.05454113334417343,
-0.007119470275938511,
0.1505606472492218,
-0.11762657761573792,
-0.009603551588952541
] |
null | null | diffusers | # Kotori Minami
<Gallery />
## Model description
This model was trained to generate high quality images based on SIFAS cards.
To achieve better quality, you should use hako-mikan's Regional Prompter along with Latent Mode, which modifies the way Stable Diffusion isolates the LoRA, resulting in a significant improvement.
## Trigger words
You should use `id_kotori_minami` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/theidoldaily/kotori-minami/tree/main) them in the Files & versions tab.
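As a rough, unofficial sketch (it does not reproduce the recommended Regional Prompter / Latent Mode workflow), the LoRA should also load with plain `diffusers`. The repository id, base model, and trigger word below are taken from this card; the prompts and sampler settings are only illustrative.

```python
# Unofficial sketch: load this LoRA on the listed SDXL base model with diffusers.
# Note: this does not reproduce the Regional Prompter / Latent Mode workflow above.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "cagliostrolab/animagine-xl-3.0",   # base model listed in the card metadata
    torch_dtype=torch.float16,
).to("cuda")
pipe.load_lora_weights("theidoldaily/kotori-minami")  # Safetensors LoRA weights in this repo

image = pipe(
    prompt="id_kotori_minami, masterpiece, high quality, looking at viewer",  # trigger word first
    negative_prompt="bad anatomy, deformed hands, unproportioned eyes",
    num_inference_steps=28,
    guidance_scale=7.0,
).images[0]
image.save("kotori_minami.png")
```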
| {"license": "mit", "tags": ["text-to-image", "stable-diffusion", "lora", "diffusers", "template:sd-lora"], "widget": [{"text": "masterpiece, high quality, defined pupil, looking at viewer, rounded pupil, defined iris, (soft iris:1.2),", "parameters": {"negative_prompt": "bad_anatomy, deformation, amputation, deformity, deformed_nipples, duplicated_torso, deformed_torso, long_torso, large_torso, unproportioned_torso, (deformed_pussy:1.2), (deformed_hands:1.2), unproportioned_eyes, unproportioned_head, small_head, duplicated_nose, big_nose, fusioned_clothes, fusioned_arms, undefined_limbs, divided_pussy, red_pussy, duplicated_pussy, deformed_anus, deformed_pussy,"}, "output": {"url": "images/00000-2136358392.png"}}], "base_model": "cagliostrolab/animagine-xl-3.0", "instance_prompt": "id_kotori_minami"} | text-to-image | theidoldaily/kotori-minami | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"base_model:cagliostrolab/animagine-xl-3.0",
"license:mit",
"region:us"
] | 2024-02-15T06:36:24+00:00 | [] | [] | TAGS
#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #license-mit #region-us
| # Kotori Minami
<Gallery />
## Model description
This model was trained to generate high quality images based on SIFAS cards.
To achieve better quality, you should use hako-mikan's Regional Prompter along with Latent Mode, which modifies the way Stable Diffusion isolates the LoRA, resulting in a significant improvement.
## Trigger words
You should use 'id_kotori_minami' to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
Download them in the Files & versions tab.
| [
"# Kotori Minami\n\n<Gallery />",
"## Model description \n\nThis model was trained to generate high quality images based on SIFAS cards.\n\nTo achieve better quality, you should be using hako-mikan's regional prompter, along with Latent Mode, which modifies the way Stable Diffusion isolates the LoRA resulting in a significant improvement.",
"## Trigger words\n\nYou should use 'id_kotori_minami' to trigger the image generation.",
"## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
"TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #license-mit #region-us \n",
"# Kotori Minami\n\n<Gallery />",
"## Model description \n\nThis model was trained to generate high quality images based on SIFAS cards.\n\nTo achieve better quality, you should be using hako-mikan's regional prompter, along with Latent Mode, which modifies the way Stable Diffusion isolates the LoRA resulting in a significant improvement.",
"## Trigger words\n\nYou should use 'id_kotori_minami' to trigger the image generation.",
"## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
56,
10,
68,
22,
28
] | [
"passage: TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #license-mit #region-us \n# Kotori Minami\n\n<Gallery />## Model description \n\nThis model was trained to generate high quality images based on SIFAS cards.\n\nTo achieve better quality, you should be using hako-mikan's regional prompter, along with Latent Mode, which modifies the way Stable Diffusion isolates the LoRA resulting in a significant improvement.## Trigger words\n\nYou should use 'id_kotori_minami' to trigger the image generation.## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
-0.05163584649562836,
-0.07573968917131424,
-0.00045647239312529564,
0.024810565635561943,
0.11018609255552292,
0.031790584325790405,
0.2311900109052658,
0.07226501405239105,
0.08441995829343796,
0.050685737282037735,
0.008185750804841518,
0.07106003910303116,
0.06534890830516815,
0.27383413910865784,
-0.05426190048456192,
-0.3002275824546814,
0.11717737466096878,
-0.03633011132478714,
-0.06097111478447914,
0.01918051205575466,
0.07222383469343185,
-0.05378446355462074,
0.11218191683292389,
-0.029072945937514305,
-0.07387037575244904,
0.01757972314953804,
-0.01350870169699192,
-0.03991985321044922,
0.011505204252898693,
0.06459221243858337,
0.008548851124942303,
0.08590734004974365,
0.07483437657356262,
-0.16194023191928864,
0.06761380285024643,
0.003925464116036892,
-0.012574823573231697,
-0.01472535915672779,
-0.020116982981562614,
-0.034021925181150436,
0.16639022529125214,
-0.1168469712138176,
-0.09570658206939697,
-0.030489861965179443,
-0.05302324518561363,
-0.05324282869696617,
-0.024029739201068878,
-0.020454084500670433,
0.13956402242183685,
0.0019366026390343904,
0.010331626050174236,
-0.042068250477313995,
-0.028882797807455063,
0.027865484356880188,
0.1461385190486908,
-0.09713448584079742,
-0.07030654698610306,
0.19746993482112885,
0.008530541323125362,
0.13760791718959808,
-0.01667359471321106,
0.1471322774887085,
0.11858513951301575,
-0.08359391987323761,
0.0006633254233747721,
-0.05216911435127258,
0.06868696212768555,
-0.038453493267297745,
-0.054312363266944885,
-0.004908165894448757,
0.29825425148010254,
0.0413191094994545,
-0.03534558042883873,
-0.12120743840932846,
-0.04719192907214165,
0.12459149956703186,
-0.08119365572929382,
-0.020444678142666817,
0.0034095700830221176,
0.01404627412557602,
0.03913724794983864,
-0.07762350887060165,
-0.07090755552053452,
-0.10216059535741806,
-0.035683196038007736,
0.22507357597351074,
0.05230597034096718,
0.06323747336864471,
-0.03521812707185745,
0.06822294741868973,
-0.14200913906097412,
-0.15413711965084076,
0.005030037835240364,
-0.0729125440120697,
0.06472162902355194,
0.07142990827560425,
0.0164885725826025,
-0.11982808262109756,
0.11802525818347931,
-0.00802398007363081,
0.046104300767183304,
-0.010122823528945446,
0.02854761853814125,
0.0859222337603569,
0.055449530482292175,
-0.012803709134459496,
-0.03485510125756264,
-0.066863052546978,
0.04905302822589874,
0.05990239977836609,
0.07847431302070618,
-0.08694981038570404,
-0.11716391146183014,
-0.060246776789426804,
-0.0992385596036911,
0.026951150968670845,
-0.04226401820778847,
0.05599189177155495,
-0.0057707177475094795,
-0.008364323526620865,
0.08871607482433319,
-0.04750971123576164,
-0.06235519051551819,
-0.091416135430336,
-0.03565521538257599,
0.21306926012039185,
0.01446057204157114,
-0.0026739889290183783,
0.04307841882109642,
0.09625314176082611,
-0.016313549131155014,
-0.02815847471356392,
-0.056628480553627014,
0.002386330394074321,
-0.007700692862272263,
-0.1307041049003601,
0.027310341596603394,
-0.15536810457706451,
-0.27088725566864014,
-0.002890367992222309,
0.0741625726222992,
-0.02091171406209469,
0.04297611489892006,
-0.023959195241332054,
-0.004257721360772848,
0.010583944618701935,
0.007820074446499348,
-0.03242155537009239,
-0.09379946440458298,
0.058951329439878464,
0.007004439365118742,
0.11707884818315506,
-0.13846425712108612,
0.008423696272075176,
-0.07033435255289078,
0.08195103704929352,
-0.15130308270454407,
0.04821740835905075,
-0.07731161266565323,
0.015163303352892399,
-0.05791972577571869,
0.0017703634221106768,
-0.14158281683921814,
0.08399150520563126,
-0.014972871169447899,
0.17962951958179474,
-0.18249447643756866,
-0.02725936286151409,
0.0420415922999382,
-0.2217414379119873,
-0.049714408814907074,
0.09439104050397873,
-0.02079308032989502,
0.18274658918380737,
0.08037613332271576,
0.15464051067829132,
0.09866004437208176,
-0.1581461876630783,
0.02394019439816475,
0.03351809084415436,
-0.07295996695756912,
-0.07038036733865738,
0.08332756906747818,
0.1151774451136589,
-0.001318225055001676,
0.10682173818349838,
-0.09371627867221832,
0.06278745085000992,
-0.0533226914703846,
-0.05956771597266197,
-0.04528045281767845,
-0.07975669950246811,
-0.028099270537495613,
0.0022830767557024956,
0.03333938866853714,
0.03049689717590809,
0.005679404363036156,
-0.034377019852399826,
0.1468864381313324,
-0.0590798445045948,
0.004438573028892279,
-0.034811340272426605,
0.2284562736749649,
-0.15795300900936127,
0.025820884853601456,
-0.016470899805426598,
-0.037650685757398605,
-0.01659674197435379,
0.033730849623680115,
0.04389767348766327,
0.0036777008790522814,
0.0511852391064167,
0.026495391502976418,
-0.06438631564378738,
0.01585760898888111,
0.08373727649450302,
-0.020959069952368736,
0.01979009434580803,
-0.10926015675067902,
0.02500625140964985,
-0.030307071283459663,
0.13639037311077118,
-0.11022244393825531,
0.004026057664304972,
0.03548174351453781,
0.06982932984828949,
0.011736760847270489,
0.07352085411548615,
0.0401037223637104,
-0.02233237400650978,
-0.0576818473637104,
-0.013853778131306171,
0.03163832053542137,
-0.008628549054265022,
-0.02248666249215603,
0.1351316273212433,
-0.09457770735025406,
0.056781429797410965,
0.11806505173444748,
0.078113853931427,
0.004143811762332916,
-0.1790522038936615,
0.029802117496728897,
-0.007720254361629486,
-0.08571762591600418,
0.004395969677716494,
-0.07702188193798065,
-0.02390764094889164,
0.051198866218328476,
-0.07916784286499023,
0.11969589442014694,
0.06394004076719284,
-0.07076918333768845,
-0.07794450968503952,
0.05654120445251465,
0.1471485048532486,
-0.030261728912591934,
0.08990880101919174,
0.1035669669508934,
-0.08476635813713074,
0.19226202368736267,
-0.030483076348900795,
-0.12667015194892883,
0.011987070553004742,
0.06808175891637802,
0.012949294410645962,
0.17608867585659027,
0.09897517412900925,
-0.024316895753145218,
0.036202240735292435,
-0.026855867356061935,
0.02454795129597187,
-0.08975208550691605,
-0.06112591549754143,
0.016244322061538696,
-0.06971366703510284,
0.07415114343166351,
0.12992064654827118,
-0.07306645810604095,
0.08558542281389236,
-0.05238635838031769,
-0.03397514298558235,
-0.002418635180220008,
-0.006983622908592224,
0.02696836367249489,
0.08419714868068695,
0.03100396879017353,
-0.09899411350488663,
-0.17572897672653198,
-0.020382247865200043,
-0.11480464041233063,
-0.004844039212912321,
0.05418253317475319,
-0.06991008669137955,
-0.0598493330180645,
-0.08547444641590118,
0.008442265912890434,
0.06827928125858307,
-0.06312718987464905,
-0.0718010738492012,
-0.027187207713723183,
-0.085276760160923,
-0.07127989828586578,
-0.04426298663020134,
-0.05902799218893051,
0.02424694038927555,
0.0905972495675087,
-0.12298863381147385,
0.20402874052524567,
0.07317738980054855,
0.009667688980698586,
0.04529454559087753,
-0.002832820639014244,
0.10164105892181396,
-0.09160193800926208,
0.05722125619649887,
0.2598828673362732,
0.050365619361400604,
0.05354143679141998,
0.1389196813106537,
-0.004934694617986679,
-0.09857945889234543,
0.03376346081495285,
0.012391019612550735,
-0.16171352565288544,
-0.05798526108264923,
-0.09027153253555298,
-0.08045093715190887,
-0.014851541258394718,
-0.006325002294033766,
0.02936812862753868,
0.07428531348705292,
0.19236986339092255,
-0.005320195574313402,
0.010940775275230408,
0.06562256067991257,
0.08488371223211288,
0.021126048639416695,
0.03646573796868324,
0.053976234048604965,
-0.07099558413028717,
-0.078202024102211,
0.14929361641407013,
0.08730657398700714,
0.1931995004415512,
-0.0624769851565361,
0.07098646461963654,
0.09122321009635925,
0.058723580092191696,
0.12126022577285767,
0.059880904853343964,
0.010348649695515633,
-0.009768994525074959,
-0.07166140526533127,
-0.10410086065530777,
0.0035084534902125597,
0.0969112366437912,
-0.09843847900629044,
-0.013059306889772415,
-0.00821844395250082,
0.11431950330734253,
0.06098851189017296,
0.12006333470344543,
-0.01683388277888298,
-0.28768956661224365,
0.06237581744790077,
0.07459110021591187,
0.12298430502414703,
-0.05889378860592842,
0.03589760884642601,
0.10188301652669907,
-0.038003068417310715,
0.026255672797560692,
-0.03836934268474579,
0.08149724453687668,
-0.0774916335940361,
-0.04202275350689888,
-0.10089223086833954,
0.14431104063987732,
-0.03403884544968605,
0.048764463514089584,
-0.09960738569498062,
0.12375746667385101,
-0.009684794582426548,
0.025178762152791023,
-0.06296297907829285,
-0.03599165380001068,
0.1263110488653183,
0.10810386389493942,
0.16818708181381226,
0.002055190270766616,
-0.09129536151885986,
-0.12090688943862915,
-0.12582415342330933,
0.05289848521351814,
0.06953692436218262,
-0.025112206116318703,
0.021399881690740585,
-0.04832075908780098,
0.0005095481174066663,
-0.01926439069211483,
0.07303367555141449,
-0.087908074259758,
-0.12763842940330505,
0.022338464856147766,
0.1386624276638031,
0.040799472481012344,
-0.0032086654100567102,
-0.04667752981185913,
-0.07465563714504242,
0.05529573932290077,
0.24617163836956024,
-0.07503480464220047,
-0.13241854310035706,
-0.08901096135377884,
0.05162594094872475,
-0.0573011115193367,
-0.016107069328427315,
-0.033368419855833054,
0.11495061218738556,
-0.0546966977417469,
-0.13004392385482788,
-0.03429345041513443,
-0.0245579332113266,
-0.0054007223807275295,
0.00025406968779861927,
0.03306322172284126,
-0.016692420467734337,
-0.06830701977014542,
0.029695210978388786,
-0.018349096179008484,
0.05584316700696945,
-0.0772680938243866,
-0.015149153769016266,
0.16565240919589996,
0.014237858355045319,
0.0507538840174675,
-0.07548931241035461,
-0.016224199905991554,
-0.06470753997564316,
-0.036467064172029495,
-0.017257068306207657,
0.21550774574279785,
-0.009518928825855255,
0.0013539438368752599,
0.13545037806034088,
-0.043099209666252136,
-0.22027674317359924,
-0.03149905428290367,
-0.09204363077878952,
0.01578325591981411,
-0.033378537744283676,
-0.02959250658750534,
0.11849065124988556,
0.12203577905893326,
-0.053329214453697205,
0.17720413208007812,
-0.25533753633499146,
-0.098531574010849,
0.020662596449255943,
0.17260482907295227,
0.4094379246234894,
-0.23350748419761658,
-0.017226800322532654,
-0.13915914297103882,
-0.14344890415668488,
0.020977972075343132,
-0.0599590428173542,
0.07091855257749557,
-0.002763291820883751,
-0.01018325425684452,
-0.004985653329640627,
-0.023208266124129295,
0.20866911113262177,
-0.055752579122781754,
0.11886615306138992,
-0.10620976239442825,
-0.023140178993344307,
0.07622750103473663,
-0.10333538800477982,
0.10715401917695999,
-0.1535756140947342,
0.02312305197119713,
-0.05094720050692558,
-0.05668475851416588,
0.01173700112849474,
0.053619544953107834,
0.03590690717101097,
-0.0885419100522995,
-0.0725712701678276,
0.06019041687250137,
0.020808663219213486,
0.015943439677357674,
0.06131155043840408,
-0.07509756833314896,
-0.03238634392619133,
0.007577588316053152,
-0.015741387382149696,
0.03929676488041878,
0.006717074196785688,
-0.07309980690479279,
-0.05270591378211975,
0.10660286247730255,
-0.17583055794239044,
0.0007221027044579387,
0.08547833561897278,
0.013293802738189697,
0.10815798491239548,
0.005381327122449875,
-0.0036051326896995306,
0.09113646298646927,
0.1298469454050064,
-0.07658658921718597,
-0.07982081919908524,
-0.04368916526436806,
-0.04604226350784302,
0.1253252625465393,
-0.0031362210866063833,
0.10669101774692535,
-0.08704124391078949,
0.05475940555334091,
0.0007845754735171795,
0.018361130729317665,
-0.023639870807528496,
0.028462324291467667,
0.01022995077073574,
-0.035453859716653824,
-0.08311549574136734,
0.11064624041318893,
-0.03071858175098896,
-0.007307632360607386,
-0.03141319006681442,
-0.03379155322909355,
-0.1011100560426712,
-0.047317005693912506,
-0.004236637614667416,
0.09092937409877777,
-0.06282371282577515,
-0.10155324637889862,
-0.08688477426767349,
-0.08355233818292618,
-0.0659882202744484,
0.08987986296415329,
0.0890507623553276,
-0.07562524825334549,
0.028966601938009262,
0.021406350657343864,
-0.111512690782547,
0.03402918204665184,
0.08144862204790115,
0.08179003745317459,
-0.17180293798446655,
-0.10622044652700424,
-0.013701856136322021,
0.03453774377703667,
-0.10075794905424118,
-0.07350925356149673,
-0.04498044773936272,
0.01627953350543976,
-0.1119912639260292,
0.11884219944477081,
-0.1264093518257141,
-0.031801097095012665,
-0.03002874180674553,
-0.03351391851902008,
-0.04065172001719475,
-0.0051209875382483006,
-0.062483880668878555,
0.03743641451001167,
0.0022082417272031307,
0.04483848065137863,
-0.03957526385784149,
-0.059848301112651825,
0.02631278522312641,
-0.06438271701335907,
-0.014912002719938755,
0.049001894891262054,
-0.0501137338578701,
0.020470736548304558,
-0.2328924834728241,
0.0016904022777453065,
0.11111394315958023,
0.013818645849823952,
-0.031124094501137733,
0.02221357636153698,
0.03891255706548691,
0.03981030359864235,
0.02374054677784443,
-0.031228242442011833,
-0.08919990807771683,
-0.07105109840631485,
0.08766523748636246,
-0.0867474228143692,
-0.024094685912132263,
-0.015528092160820961,
0.02233118563890457,
0.1987730860710144,
0.11543314158916473,
0.11111979186534882,
-0.03316175937652588,
-0.008160386234521866,
-0.039528314024209976,
0.02042935974895954,
0.038415879011154175,
-0.1462358683347702,
-0.04956347867846489,
-0.11997396498918533,
-0.014218415133655071,
-0.026635117828845978,
0.1734514832496643,
0.0606902651488781,
-0.10320599377155304,
-0.05610913410782814,
-0.0047699324786663055,
0.11926815658807755,
0.006101896520704031,
0.23910629749298096,
0.10157491266727448,
0.10379114001989365,
-0.0816425234079361,
0.06945227831602097,
0.09218142181634903,
0.056914813816547394,
0.0017504425486549735,
0.1406209021806717,
-0.0017172388033941388,
0.11576250195503235,
0.007170394994318485,
0.0551561638712883,
0.05482897907495499,
-0.001974832732230425,
-0.13344989717006683,
0.02821853943169117,
0.01435825601220131,
-0.022964157164096832,
0.22289876639842987,
-0.1082395389676094,
0.004849615506827831,
0.0584055557847023,
-0.01742175593972206,
-0.12633948028087616,
-0.3175275921821594,
-0.10039880126714706,
-0.18057173490524292,
0.06508098542690277,
-0.05013575404882431,
0.00942350272089243,
0.2102217972278595,
0.02647441253066063,
0.028933148831129074,
0.09661515057086945,
-0.09872530400753021,
-0.019249415025115013,
0.11668160557746887,
-0.05919424444437027,
-0.08889193087816238,
0.00953751988708973,
-0.10267508029937744,
0.053228508681058884,
-0.08601713925600052,
-0.03310757130384445,
0.03914441168308258,
0.0720287412405014,
0.026112766936421394,
-0.07155175507068634,
-0.04212731868028641,
-0.04218888282775879,
0.013933929614722729,
0.00640986580401659,
0.06319771707057953,
0.060772836208343506,
-0.042417772114276886,
0.026590991765260696,
0.15625141561031342,
0.024220356717705727,
-0.07204824686050415,
-0.10395651310682297,
0.0348854698240757,
-0.07034555077552795,
0.04088462144136429,
-0.04696940258145332,
-0.06369397789239883,
0.011934609152376652,
0.22275099158287048,
0.13436941802501678,
-0.06266627460718155,
0.005491918884217739,
-0.0836242288351059,
0.012641264125704765,
-0.046119675040245056,
0.12174885720014572,
-0.013045373372733593,
0.14948198199272156,
-0.04046594724059105,
0.009332765825092793,
-0.09525345265865326,
-0.06078793480992317,
-0.03222828358411789,
0.011966736987233162,
-0.01166930515319109,
-0.08334355056285858,
-0.06364012509584427,
0.07937484234571457,
-0.1403389275074005,
0.01168032456189394,
0.0726979523897171,
-0.035708144307136536,
-0.029344648122787476,
-0.06615160405635834,
-0.05342608690261841,
0.05540146678686142,
-0.013915793038904667,
-0.14731380343437195,
0.041695259511470795,
-0.07725627720355988,
0.0334085114300251,
-0.0960867628455162,
-0.058769553899765015,
-0.008228796534240246,
0.025567082688212395,
0.09803064912557602,
0.011825619265437126,
0.003112867008894682,
-0.008896704763174057,
-0.03615863248705864,
-0.019734537228941917,
0.1417798548936844,
-0.013830525800585747,
-0.0990389809012413,
-0.02313319779932499,
0.06799151003360748,
-0.08997901529073715,
0.009982803836464882,
0.05680109187960625,
-0.03591528907418251,
0.03685731068253517,
0.0944923684000969,
-0.03156176581978798,
-0.07798516005277634,
0.044549863785505295,
-0.14373870193958282,
0.11875883489847183,
0.07777499407529831,
0.038441698998212814,
-0.021235331892967224,
-0.025600705295801163,
0.1289009004831314,
0.049171414226293564,
-0.08929643034934998,
0.0235564224421978,
-0.03506174311041832,
-0.11362463235855103,
-0.005681334994733334,
0.016464581713080406,
-0.15292027592658997,
-0.03998103737831116,
-0.16462978720664978,
-0.029915599152445793,
-0.005830750335007906,
0.0727151408791542,
0.23977260291576385,
-0.004731283988803625,
-0.021478505805134773,
-0.236857607960701,
0.0407448448240757,
0.08983242511749268,
-0.06565111130475998,
-0.056971076875925064
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# train_2024-02-15-06-06-50
This model is a fine-tuned version of [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) on the 3_line dataset.
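This repository ships a PEFT LoRA adapter rather than merged weights (the library tag is `peft`), so a minimal, unofficial loading sketch might look like the following. The adapter id `h2m/Convex-Workshop-8x7B-Adapter` and the base model are taken from this repository's listing; the prompt is only illustrative, and the base model is large, so `device_map="auto"` assumes sufficient GPU/CPU memory.

```python
# Unofficial sketch: attach this LoRA adapter to the Mixtral base model with PEFT.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
adapter_id = "h2m/Convex-Workshop-8x7B-Adapter"  # repository id of this adapter

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)

prompt = "[INST] Summarize what a LoRA adapter is. [/INST]"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```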
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 3.0
- mixed_precision_training: Native AMP
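For reference, a hedged sketch of roughly equivalent `transformers.TrainingArguments` is shown below; only the values listed above come from this card, while the output directory and exact optimizer name are placeholders/assumptions.

```python
# Rough equivalent of the listed hyperparameters; paths and the exact optimizer are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="train_2024-02-15-06-06-50",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,            # total train batch size = 4
    num_train_epochs=3.0,
    lr_scheduler_type="cosine",
    seed=42,
    optim="adamw_torch",                      # Adam with betas=(0.9, 0.999), eps=1e-8
    fp16=True,                                # "Native AMP" (assumes fp16 rather than bf16)
)
```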
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2 | {"license": "other", "library_name": "peft", "tags": ["llama-factory", "lora", "generated_from_trainer"], "base_model": "mistralai/Mixtral-8x7B-Instruct-v0.1", "model-index": [{"name": "train_2024-02-15-06-06-50", "results": []}]} | null | h2m/Convex-Workshop-8x7B-Adapter | [
"peft",
"safetensors",
"llama-factory",
"lora",
"generated_from_trainer",
"base_model:mistralai/Mixtral-8x7B-Instruct-v0.1",
"license:other",
"region:us"
] | 2024-02-15T06:37:01+00:00 | [] | [] | TAGS
#peft #safetensors #llama-factory #lora #generated_from_trainer #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #license-other #region-us
|
# train_2024-02-15-06-06-50
This model is a fine-tuned version of mistralai/Mixtral-8x7B-Instruct-v0.1 on the 3_line dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2 | [
"# train_2024-02-15-06-06-50\n\nThis model is a fine-tuned version of mistralai/Mixtral-8x7B-Instruct-v0.1 on the 3_line dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 3.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
"TAGS\n#peft #safetensors #llama-factory #lora #generated_from_trainer #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #license-other #region-us \n",
"# train_2024-02-15-06-06-50\n\nThis model is a fine-tuned version of mistralai/Mixtral-8x7B-Instruct-v0.1 on the 3_line dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 3.0\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
57,
45,
6,
12,
8,
3,
127,
4,
39
] | [
"passage: TAGS\n#peft #safetensors #llama-factory #lora #generated_from_trainer #base_model-mistralai/Mixtral-8x7B-Instruct-v0.1 #license-other #region-us \n# train_2024-02-15-06-06-50\n\nThis model is a fine-tuned version of mistralai/Mixtral-8x7B-Instruct-v0.1 on the 3_line dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- num_epochs: 3.0\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2"
] | [
-0.10139178484678268,
0.1024806797504425,
-0.0035732684191316366,
0.08296290785074234,
0.14241546392440796,
0.030307289212942123,
0.13967062532901764,
0.11378397792577744,
-0.0816819965839386,
0.0768822431564331,
0.03530485928058624,
0.04391425475478172,
0.054548367857933044,
0.13669618964195251,
-0.03243299946188927,
-0.2368771880865097,
0.01282660011202097,
-0.027036495506763458,
-0.09829887002706528,
0.109576016664505,
0.11461202800273895,
-0.10690803825855255,
0.056754425168037415,
0.02740250714123249,
-0.1605583280324936,
0.021485809236764908,
-0.0015571924159303308,
-0.04436402767896652,
0.11270391196012497,
0.0180494524538517,
0.15495143830776215,
0.007115351501852274,
0.15378530323505402,
-0.204085111618042,
-0.00034925088402815163,
0.08883196860551834,
0.04376406967639923,
0.08927015215158463,
0.07939790189266205,
0.012761996127665043,
0.05882435664534569,
-0.09317784011363983,
0.10664862394332886,
0.009557745419442654,
-0.0999196469783783,
-0.1724216789007187,
-0.11524244397878647,
0.07383520156145096,
0.10788093507289886,
0.09093191474676132,
0.017834315076470375,
0.12591278553009033,
-0.0731879472732544,
0.07250966131687164,
0.23970091342926025,
-0.24972200393676758,
-0.08059234172105789,
0.05539728328585625,
0.05784483626484871,
0.06642423570156097,
-0.09562516957521439,
-0.035405952483415604,
0.05443381145596504,
0.013783429749310017,
0.0936053916811943,
0.005601452197879553,
-0.03998667746782303,
-0.017955558374524117,
-0.1310296356678009,
-0.024553537368774414,
0.12540316581726074,
0.023358670994639397,
-0.040644802153110504,
-0.07247254997491837,
-0.05703619867563248,
-0.12632450461387634,
-0.02288147434592247,
-0.056338854134082794,
0.035638321191072464,
-0.03901489078998566,
-0.022417081519961357,
-0.03111180104315281,
-0.0803355723619461,
-0.06834319978952408,
0.02120528742671013,
0.1295914202928543,
0.03999846428632736,
0.01467658206820488,
-0.0415501743555069,
0.11612788587808609,
-0.0033073194790631533,
-0.13959285616874695,
0.010971666313707829,
-0.001490244292654097,
-0.07541771233081818,
-0.07329666614532471,
-0.04083319380879402,
-0.009766430594027042,
-0.026045430451631546,
0.14904430508613586,
-0.08747795224189758,
0.06646917760372162,
-0.004996289033442736,
0.010024534538388252,
-0.05406099930405617,
0.1412295252084732,
-0.04739348217844963,
-0.023666763678193092,
-0.00833688024431467,
0.12552005052566528,
0.013656971044838428,
-0.004360760096460581,
-0.06771087646484375,
-0.02247733063995838,
0.0587642528116703,
0.04979006573557854,
-0.05194835737347603,
0.0032177744433283806,
-0.061642635613679886,
-0.03217540681362152,
0.04671431705355644,
-0.13789427280426025,
0.054746728390455246,
-0.004205751698464155,
-0.07637172192335129,
-0.0159396193921566,
0.014988873153924942,
0.03723686560988426,
0.00016409177624154836,
0.1304347962141037,
-0.08710986375808716,
0.020228197798132896,
-0.10333040356636047,
-0.06582368910312653,
0.007345589809119701,
-0.04357778653502464,
-0.01955353282392025,
-0.05931128188967705,
-0.19484862685203552,
-0.03902922570705414,
0.053593799471855164,
-0.07682500034570694,
-0.035811860114336014,
-0.007089339662343264,
-0.07007965445518494,
0.021103665232658386,
-0.00637491699308157,
0.12389054894447327,
-0.042073458433151245,
0.07960231602191925,
-0.004600113723427057,
0.03323550149798393,
0.028007568791508675,
0.032502539455890656,
-0.06734955310821533,
0.03328047692775726,
-0.17793723940849304,
0.04152389243245125,
-0.08337705582380295,
0.01535361260175705,
-0.10851497203111649,
-0.09643656015396118,
-0.009803005494177341,
-0.02356005273759365,
0.08125097304582596,
0.09486334025859833,
-0.1909017264842987,
-0.031921934336423874,
0.16363868117332458,
-0.1109413355588913,
-0.0675470158457756,
0.07766174525022507,
-0.05169576033949852,
0.04357943683862686,
0.043300285935401917,
0.1216098815202713,
0.1129806637763977,
-0.17109414935112,
0.00279886182397604,
-0.02265637181699276,
0.09217318147420883,
0.03925952687859535,
0.05851375311613083,
-0.01247226633131504,
0.04818414896726608,
-0.001661177957430482,
-0.06753361970186234,
-0.004047540482133627,
-0.07721717655658722,
-0.0841878354549408,
-0.04504431039094925,
-0.07347438484430313,
0.05355454608798027,
0.038639411330223083,
0.03383605182170868,
-0.056010790169239044,
-0.10442696511745453,
0.10976095497608185,
0.13373352587223053,
-0.0444929301738739,
0.017385339364409447,
-0.06048094853758812,
0.04725645110011101,
-0.006691309157758951,
-0.053922198712825775,
-0.2073984593153,
-0.10460442304611206,
0.03591163456439972,
-0.06191711127758026,
-0.008783877827227116,
0.02695087343454361,
0.06282687187194824,
0.08187932521104813,
-0.059400659054517746,
-0.016150103881955147,
-0.11553353071212769,
-0.003974083345383406,
-0.09626013040542603,
-0.20935118198394775,
-0.052597776055336,
-0.029803864657878876,
0.18041761219501495,
-0.21814021468162537,
0.014255179092288017,
-0.025095731019973755,
0.17179790139198303,
0.038533374667167664,
-0.06124338135123253,
-0.041300319135189056,
0.06079854071140289,
0.00017696755821816623,
-0.08512065559625626,
0.030781446024775505,
-0.005731642711907625,
-0.035812657326459885,
-0.05789542943239212,
-0.14014530181884766,
0.08333031833171844,
0.08069340139627457,
0.057129934430122375,
-0.10544278472661972,
-0.03130199760198593,
-0.059832628816366196,
-0.040295056998729706,
-0.09211081266403198,
-0.014597858302295208,
0.15948787331581116,
0.014981932006776333,
0.13152369856834412,
-0.08011670410633087,
-0.07278203964233398,
-0.0016584628028795123,
-0.007352450862526894,
0.0006569346878677607,
0.05814383924007416,
0.0801776796579361,
-0.09429322928190231,
0.09811292588710785,
0.11636441200971603,
-0.05960654094815254,
0.12772439420223236,
-0.05680954456329346,
-0.09146623313426971,
-0.03328730911016464,
0.010839411057531834,
-0.0031315030064433813,
0.1279059499502182,
-0.03606172651052475,
0.025440724566578865,
0.018891263753175735,
0.015648892149329185,
0.02566670998930931,
-0.19841791689395905,
-0.019317051395773888,
0.032826412469148636,
-0.03630628064274788,
-0.00533332908526063,
-0.036387041211128235,
0.0376688577234745,
0.09872499108314514,
0.00685000279918313,
-0.04925175756216049,
-0.0032871845178306103,
-0.015827836468815804,
-0.08724216371774673,
0.17201973497867584,
-0.11327534168958664,
-0.12244896590709686,
-0.12524788081645966,
0.02977718412876129,
-0.008020611479878426,
-0.03200695663690567,
-0.0014572180807590485,
-0.07740426808595657,
-0.05594780296087265,
-0.08512171357870102,
-0.03614300116896629,
-0.013180630281567574,
-0.019361549988389015,
0.1014341488480568,
0.016601452603936195,
0.08441627770662308,
-0.12999866902828217,
0.013342278078198433,
-0.022497843950986862,
-0.055996134877204895,
-0.005506490357220173,
0.06425373256206512,
0.07872578501701355,
0.13944542407989502,
-0.015377144329249859,
0.012961667031049728,
-0.028991708531975746,
0.23684506118297577,
-0.09356744587421417,
-0.008299414999783039,
0.12884533405303955,
-0.023407861590385437,
0.06800347566604614,
0.14136531949043274,
0.05516236275434494,
-0.0879286378622055,
0.02532569319009781,
0.0598115473985672,
-0.01458875834941864,
-0.2437955141067505,
-0.03757747635245323,
-0.020286260172724724,
-0.09111244231462479,
0.08717634528875351,
0.02800346538424492,
0.02540329098701477,
0.032944805920124054,
-0.018336255103349686,
0.04111580550670624,
0.01412996742874384,
0.07794816792011261,
0.06196024641394615,
0.04063018411397934,
0.11233151704072952,
-0.018589740619063377,
-0.0330134816467762,
0.04726256802678108,
-0.004048147704452276,
0.26874661445617676,
-0.006502606440335512,
0.06428244709968567,
0.036763790994882584,
0.12722766399383545,
-0.013198771513998508,
0.049028750509023666,
0.009454621002078056,
-0.020050829276442528,
0.003369732992723584,
-0.07049154490232468,
-0.014448776841163635,
0.03397535905241966,
-0.04581405967473984,
0.05563216656446457,
-0.062346577644348145,
0.02808702364563942,
0.010700175538659096,
0.26505669951438904,
0.0347675122320652,
-0.28153377771377563,
-0.05305866152048111,
0.01162202563136816,
-0.027047179639339447,
-0.04043220728635788,
-0.008706997148692608,
0.12289641797542572,
-0.11660398542881012,
0.061707284301519394,
-0.07944034785032272,
0.07929528504610062,
-0.007929989136755466,
-0.0044966284185647964,
0.09899504482746124,
0.09941587597131729,
-0.0021567516960203648,
0.0490884929895401,
-0.18600642681121826,
0.24477581679821014,
0.01503038126975298,
0.10193435847759247,
-0.040687356144189835,
0.05120671167969704,
0.012049905955791473,
0.06129999831318855,
0.07102006673812866,
-0.011388540267944336,
-0.05366498604416847,
-0.21220746636390686,
-0.08158722519874573,
0.013366887345910072,
0.13212722539901733,
-0.04390464723110199,
0.09311124682426453,
-0.05142385885119438,
-0.003449223702773452,
0.05073382705450058,
-0.04542473331093788,
-0.16942830383777618,
-0.11798368394374847,
0.043114788830280304,
-0.015360211953520775,
-0.0424160398542881,
-0.09750635176897049,
-0.11012425273656845,
-0.051871951669454575,
0.13420216739177704,
-0.016716931015253067,
-0.040454547852277756,
-0.1398247927427292,
0.07145264744758606,
0.14283907413482666,
-0.05013569816946983,
0.03807777911424637,
0.04024602845311165,
0.12299493700265884,
0.01478759478777647,
-0.05431833490729332,
0.05141865834593773,
-0.07263343781232834,
-0.19266432523727417,
-0.07341230660676956,
0.13820363581180573,
0.059649791568517685,
0.05131211131811142,
0.0014221563469618559,
0.04200390726327896,
0.04119652882218361,
-0.08427900820970535,
0.007012075278908014,
0.08521109074354172,
0.0790710598230362,
0.03622395545244217,
-0.08129357546567917,
0.006916605401784182,
-0.04083826765418053,
-0.03260860592126846,
0.07963395863771439,
0.22234933078289032,
-0.0910959541797638,
0.0582430399954319,
0.03256155177950859,
-0.08999372273683548,
-0.14008881151676178,
0.05984736606478691,
0.14648757874965668,
0.014846443198621273,
0.0791969895362854,
-0.16396653652191162,
0.09642981737852097,
0.1388493776321411,
-0.0371873565018177,
0.050413940101861954,
-0.3398210406303406,
-0.13820622861385345,
0.03963906690478325,
0.11129472404718399,
-0.00979209691286087,
-0.13416604697704315,
-0.04205065593123436,
-0.017773427069187164,
-0.1537267118692398,
0.09002953767776489,
-0.0813404768705368,
0.08279230445623398,
0.0026855224277824163,
0.0533517561852932,
0.02508641965687275,
-0.04120977967977524,
0.1606193482875824,
0.014954675920307636,
0.09655697643756866,
-0.041445162147283554,
0.0070151761174201965,
0.07701174914836884,
-0.057789161801338196,
0.011734851635992527,
0.013274304568767548,
0.04713784158229828,
-0.11457788199186325,
-0.008703331463038921,
-0.07100232690572739,
0.028701024129986763,
-0.06863653659820557,
-0.047553807497024536,
-0.046729862689971924,
0.06754674017429352,
0.03864029422402382,
-0.04112789034843445,
0.06554210931062698,
0.03549595922231674,
0.15416091680526733,
0.11074360460042953,
0.0761416032910347,
0.019919199869036674,
-0.09711520373821259,
-0.0015027946792542934,
-0.015099531970918179,
0.05931427329778671,
-0.11425583064556122,
0.03515619412064552,
0.11929723620414734,
0.041824884712696075,
0.13734358549118042,
0.03591405227780342,
-0.06523769348859787,
-0.0002622238243930042,
0.05930878594517708,
-0.10068982094526291,
-0.10385674983263016,
0.0014129327610135078,
0.01465526595711708,
-0.12858924269676208,
-0.0040804301388561726,
0.12390440702438354,
-0.05340399593114853,
-0.01770748570561409,
-0.0053145443089306355,
0.01813342422246933,
-0.014061552472412586,
0.21008123457431793,
0.03149419650435448,
0.065910205245018,
-0.06786539405584335,
0.09074507653713226,
0.07613096386194229,
-0.04773615673184395,
0.019457492977380753,
0.05347376689314842,
-0.08296632021665573,
0.00022522335348185152,
0.049135129898786545,
0.13067726790905,
-0.0273919515311718,
-0.03263266384601593,
-0.10665366053581238,
-0.09583234041929245,
0.040174055844545364,
0.13634055852890015,
0.04615969583392143,
-0.014484350569546223,
-0.017385005950927734,
0.03215264156460762,
-0.12223056703805923,
0.10126448422670364,
0.05427616089582443,
0.08751081675291061,
-0.1502392590045929,
0.13025066256523132,
0.013307388871908188,
-0.007219149265438318,
-0.008970025926828384,
0.040134839713573456,
-0.09159837663173676,
-0.018324827775359154,
-0.1077609732747078,
-0.005851004272699356,
-0.020399823784828186,
-0.005130284931510687,
-0.016737081110477448,
-0.05783465504646301,
-0.03464166074991226,
0.038340695202350616,
-0.07544369995594025,
-0.047967493534088135,
0.000993550755083561,
0.03948046639561653,
-0.14454853534698486,
-0.0123441768810153,
0.04386261850595474,
-0.0946883112192154,
0.06728459894657135,
0.04484066739678383,
0.050931334495544434,
0.03485030680894852,
-0.11219064891338348,
-0.0035072071477770805,
0.03621092066168785,
0.023263374343514442,
0.0588911771774292,
-0.11800678819417953,
-0.008898959495127201,
-0.0422983281314373,
0.04618706926703453,
0.023575887084007263,
0.03302483260631561,
-0.1331530213356018,
-0.0022154010366648436,
-0.036755964159965515,
-0.07759794592857361,
-0.04404109716415405,
0.031253378838300705,
0.11829259991645813,
0.01565854623913765,
0.15513485670089722,
-0.07981887459754944,
0.06251031905412674,
-0.22101126611232758,
-0.03135722130537033,
0.00016060120833572,
-0.00896070059388876,
-0.07330122590065002,
-0.03812383860349655,
0.06478086113929749,
-0.06546586751937866,
0.09010046720504761,
0.005457092542201281,
0.10452164709568024,
0.04605022817850113,
-0.06542005389928818,
-0.014731541275978088,
0.02530233934521675,
0.14009326696395874,
0.07213353365659714,
-0.0053770579397678375,
0.08683137595653534,
-0.02253635972738266,
0.05150921642780304,
0.015800604596734047,
0.19396932423114777,
0.1676269918680191,
-0.02741841785609722,
0.04269121587276459,
0.06747902929782867,
-0.1276049166917801,
-0.14017948508262634,
0.08258289843797684,
-0.05607416108250618,
0.08400728553533554,
-0.04996129125356674,
0.15047535300254822,
0.1040610745549202,
-0.1974334716796875,
0.04167212173342705,
-0.056592442095279694,
-0.09817297011613846,
-0.12746258080005646,
-0.01200153212994337,
-0.07699718326330185,
-0.1218395009636879,
0.018163572996854782,
-0.12492561340332031,
0.0548434779047966,
0.11545804888010025,
0.002878704108297825,
0.02418457716703415,
0.13461780548095703,
-0.01113048568367958,
0.010660023428499699,
0.04539446905255318,
0.04550711438059807,
0.022456981241703033,
-0.08489416539669037,
-0.06314574927091599,
0.02818903513252735,
0.007418330293148756,
0.056098125874996185,
-0.05239308997988701,
-0.01682615652680397,
0.010481441393494606,
0.013691700994968414,
-0.06708406656980515,
0.040271371603012085,
0.00341481133364141,
0.03847089037299156,
0.01583458110690117,
0.03994518145918846,
0.038835447281599045,
-0.04841800406575203,
0.2776872515678406,
-0.07288072258234024,
-0.06880571693181992,
-0.14187096059322357,
0.19770169258117676,
0.028596725314855576,
0.005652984604239464,
0.07099730521440506,
-0.11182112991809845,
-0.01578626222908497,
0.13392162322998047,
0.12091466039419174,
-0.07454998046159744,
-0.012250351719558239,
0.0005379045614972711,
-0.01748272404074669,
-0.059986937791109085,
0.11626813560724258,
0.10894223302602768,
0.044438108801841736,
-0.052766792476177216,
-0.004584937822073698,
-0.019206339493393898,
-0.028145331889390945,
-0.062270719558000565,
0.05548746511340141,
0.008822161704301834,
0.010885782539844513,
-0.03419298306107521,
0.08013516664505005,
0.0491829439997673,
-0.1754041463136673,
0.06517565995454788,
-0.1910201609134674,
-0.1992443948984146,
0.002730388194322586,
0.09480132162570953,
-0.0296873077750206,
0.05702326074242592,
-0.01188772451132536,
-0.03645680099725723,
0.1455559879541397,
-0.014384924434125423,
-0.03232832998037338,
-0.13051359355449677,
0.08599301427602768,
-0.09508705884218216,
0.21791905164718628,
-0.010219988413155079,
0.0854673981666565,
0.08784922957420349,
0.02221294865012169,
-0.10771988332271576,
0.008283560164272785,
0.0843709260225296,
-0.0967072919011116,
-0.0063253589905798435,
0.154868021607399,
-0.04864281415939331,
0.10117177665233612,
0.05802848935127258,
-0.1490904539823532,
0.003261302364990115,
-0.016234269365668297,
-0.03286512568593025,
-0.07244256883859634,
0.000736905843950808,
-0.0664508119225502,
0.1518411487340927,
0.21950635313987732,
-0.03201168775558472,
0.024164125323295593,
-0.04416850209236145,
0.039054859429597855,
0.04889632388949394,
0.11461993306875229,
-0.03524595499038696,
-0.23990298807621002,
0.023765338584780693,
0.016538580879569054,
0.021042833104729652,
-0.19946609437465668,
-0.09676317125558853,
0.059379566460847855,
-0.06772096455097198,
-0.07468792051076889,
0.10201340168714523,
0.05071400851011276,
0.03986351937055588,
-0.023195266723632812,
-0.1465356945991516,
-0.04917529970407486,
0.1524914801120758,
-0.15282373130321503,
-0.03435434028506279
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
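Until the authors provide official instructions, a minimal sketch is given below under the assumption that this repository is a PEFT adapter for the base model named in its metadata (`NousResearch/llama-2-7b-chat-hf`); the prompt and generation settings are only illustrative.

```python
# Unofficial placeholder sketch: load the base chat model and attach this PEFT adapter.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/llama-2-7b-chat-hf"  # base_model from the repository metadata
adapter_id = "GSalimp/ChatUFOPTreinado"      # this repository

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)

inputs = tokenizer("Hello! Briefly introduce yourself.", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```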
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"library_name": "peft", "base_model": "NousResearch/llama-2-7b-chat-hf"} | null | GSalimp/ChatUFOPTreinado | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:NousResearch/llama-2-7b-chat-hf",
"region:us"
] | 2024-02-15T06:37:28+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/llama-2-7b-chat-hf #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/llama-2-7b-chat-hf #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
43,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/llama-2-7b-chat-hf #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.1182713732123375,
0.20605283975601196,
-0.002885909052565694,
0.025528445839881897,
0.07780413329601288,
0.01534308586269617,
0.05651530623435974,
0.13322071731090546,
0.033434174954891205,
0.11681489646434784,
0.06960669159889221,
0.11800691485404968,
0.1150616705417633,
0.21967867016792297,
0.003384095150977373,
-0.16528919339179993,
0.01977493241429329,
-0.07272271811962128,
0.01721583679318428,
0.11825031787157059,
0.14115279912948608,
-0.09906499832868576,
0.07677090167999268,
-0.020376496016979218,
0.002085930434986949,
-0.027533182874321938,
-0.06786970794200897,
-0.010711569339036942,
0.05443205311894417,
0.03205583617091179,
0.056387003511190414,
-0.01105571910738945,
0.0850101038813591,
-0.2699846923351288,
0.01892087794840336,
0.04313221201300621,
-0.0003035950940102339,
0.08347462117671967,
0.09693489968776703,
-0.04515960440039635,
0.12342850118875504,
-0.022153450176119804,
0.1334221065044403,
0.09028904885053635,
-0.0952930599451065,
-0.2340102195739746,
-0.06347289681434631,
0.07943075895309448,
0.18779218196868896,
0.08598335832357407,
-0.043026991188526154,
0.12365785986185074,
-0.06345905363559723,
0.02279195375740528,
0.0676071047782898,
-0.10427715629339218,
-0.06358727067708969,
0.0635727196931839,
0.13042697310447693,
0.07722453773021698,
-0.12583956122398376,
-0.036777839064598083,
0.035800136625766754,
0.04602229595184326,
0.0581551194190979,
0.006705643609166145,
0.1484973281621933,
0.028741590678691864,
-0.1451803594827652,
-0.04968540370464325,
0.13717946410179138,
0.010393635369837284,
-0.03750089928507805,
-0.21629458665847778,
-0.004606952425092459,
-0.09520990401506424,
-0.0382964201271534,
-0.04776408150792122,
0.036868900060653687,
0.010364466346800327,
0.13369497656822205,
-0.05042941868305206,
-0.09208183735609055,
-0.015096059069037437,
0.10996752232313156,
0.06212526559829712,
0.02062259614467621,
-0.0195420254021883,
0.008153927512466908,
0.12230360507965088,
0.0676325112581253,
-0.13369221985340118,
-0.06309580057859421,
-0.06740275770425797,
-0.03399163857102394,
-0.0246161837130785,
0.03987646475434303,
0.017038490623235703,
0.06289330124855042,
0.2714812755584717,
-0.04003416374325752,
0.06364351511001587,
0.04111989960074425,
0.02255065366625786,
0.030048247426748276,
0.10513897240161896,
-0.03285462409257889,
-0.16402380168437958,
-0.00665299640968442,
0.10125549137592316,
0.0030582118779420853,
-0.03408040851354599,
-0.05679953843355179,
0.03299674019217491,
0.03614059463143349,
0.11764265596866608,
0.10936488211154938,
-0.027982190251350403,
-0.07459105551242828,
-0.05576953664422035,
0.19054386019706726,
-0.15673096477985382,
0.043454647064208984,
0.030578123405575752,
0.001023058663122356,
-0.061113812029361725,
0.008617413230240345,
0.01789192110300064,
-0.03379528224468231,
0.07333704829216003,
-0.06758376210927963,
-0.04032308980822563,
-0.12070638686418533,
-0.03044939413666725,
0.0362834669649601,
0.009123535826802254,
-0.044990796595811844,
-0.042633622884750366,
-0.07179179042577744,
-0.11010327190160751,
0.10839002579450607,
-0.05423647165298462,
-0.05902193859219551,
-0.02857949398458004,
-0.08325549215078354,
0.018796207383275032,
0.034835055470466614,
0.06896884739398956,
-0.026200484484434128,
0.04591298848390579,
-0.010548440739512444,
0.06778675317764282,
0.06936480849981308,
0.030754953622817993,
-0.08208310604095459,
0.0653846338391304,
-0.1963820457458496,
0.07294302433729172,
-0.07990950345993042,
0.04360390082001686,
-0.15957826375961304,
-0.0038924869149923325,
-0.0019611686002463102,
0.029599850997328758,
0.04238700494170189,
0.1614408642053604,
-0.21272744238376617,
-0.030876295641064644,
0.1687087118625641,
-0.10763195902109146,
-0.13279667496681213,
0.0406498908996582,
-0.036999259144067764,
0.18255393207073212,
0.028227349743247032,
0.029523150995373726,
0.08913213759660721,
-0.16037945449352264,
-0.022294489666819572,
-0.01812099665403366,
0.010333359241485596,
0.06734508275985718,
0.08116967976093292,
-0.09610126912593842,
-0.0006990813417360187,
0.010612556710839272,
-0.0612914003431797,
-0.01802164874970913,
-0.0408414751291275,
-0.10424500703811646,
0.0045899092219769955,
-0.08709751814603806,
0.010447245091199875,
0.005453047808259726,
-0.0942273736000061,
-0.007612002547830343,
-0.15214625000953674,
-0.057793427258729935,
0.08467891812324524,
-0.00002668775232450571,
-0.013796081766486168,
-0.09464187920093536,
0.0636085718870163,
-0.035775523632764816,
-0.0206647589802742,
-0.1442236751317978,
-0.016089124605059624,
0.01771959848701954,
-0.1381399780511856,
0.0014452963368967175,
-0.12018341571092606,
0.06737711280584335,
0.005342571996152401,
-0.050444044172763824,
-0.04374247416853905,
-0.002577848033979535,
-0.0044412435963749886,
-0.061488714069128036,
-0.23698198795318604,
-0.025010906159877777,
-0.052404481917619705,
0.17132504284381866,
-0.2317884862422943,
0.04182033985853195,
0.0035999033134430647,
0.11974582821130753,
0.004399395547807217,
-0.059057049453258514,
0.022792967036366463,
-0.06243915855884552,
-0.024900730699300766,
-0.06855564564466476,
-0.0043016234412789345,
0.0034342100843787193,
-0.02846984751522541,
0.014674222096800804,
-0.12139137834310532,
-0.06430231034755707,
0.09529095888137817,
0.06014416739344597,
-0.14503102004528046,
0.008223640732467175,
-0.039966583251953125,
-0.05673050135374069,
-0.06781600415706635,
-0.0718679428100586,
0.084385447204113,
0.053049758076667786,
0.047856543213129044,
-0.08300822973251343,
-0.06746778637170792,
0.0034881478641182184,
-0.024375656619668007,
-0.013754853047430515,
0.1263294816017151,
0.0910896584391594,
-0.09857907146215439,
0.09208352118730545,
0.0702979788184166,
0.02078418992459774,
0.08543384820222855,
-0.023306185379624367,
-0.10623372346162796,
-0.026145050302147865,
0.056796252727508545,
0.010644526220858097,
0.17062047123908997,
-0.07161949574947357,
0.055644404143095016,
0.04700854420661926,
-0.057312507182359695,
0.04834717512130737,
-0.09276522696018219,
0.00648490572348237,
-0.00304055237211287,
-0.016390539705753326,
0.03746310994029045,
-0.016710294410586357,
0.004765782970935106,
0.09012790769338608,
0.06235261261463165,
0.021259654313325882,
0.012547078542411327,
-0.03656672313809395,
-0.14217142760753632,
0.17982566356658936,
-0.09221989661455154,
-0.238255575299263,
-0.15024535357952118,
0.055758535861968994,
0.057310279458761215,
-0.014234625734388828,
0.030911585316061974,
-0.053577207028865814,
-0.09477101266384125,
-0.08852027356624603,
0.0038970138411968946,
0.033775899559259415,
-0.06099589914083481,
-0.06302347034215927,
0.03564249351620674,
0.03882383555173874,
-0.12014775723218918,
0.02394936978816986,
0.056450679898262024,
-0.0020400190260261297,
-0.004255637060850859,
0.045932650566101074,
0.09282688796520233,
0.2042648047208786,
-0.0029902453534305096,
0.005519507918506861,
0.05900596082210541,
0.2758841812610626,
-0.15884141623973846,
0.11355440318584442,
0.13835501670837402,
-0.06560976803302765,
0.07683132588863373,
0.19025172293186188,
0.030262501910328865,
-0.0939931571483612,
0.018768295645713806,
0.03134610503911972,
-0.02410808950662613,
-0.27108055353164673,
-0.05080442875623703,
-0.023897584527730942,
-0.07520309835672379,
0.08098097890615463,
0.08847048878669739,
0.0916871652007103,
0.028044668957591057,
-0.06392549723386765,
-0.0988224670290947,
0.026554882526397705,
0.11190618574619293,
-0.017336886376142502,
0.0024288217537105083,
0.07985373586416245,
-0.04878299683332443,
0.005801938474178314,
0.08532202243804932,
-0.020526492968201637,
0.12679286301136017,
0.05521775782108307,
0.10699477046728134,
0.08260072022676468,
0.08279095590114594,
-0.009515267796814442,
0.0307913888245821,
0.0026503463741391897,
0.02060583420097828,
0.020737893879413605,
-0.09097623825073242,
0.017064353451132774,
0.11505641788244247,
0.014642301015555859,
0.02032553404569626,
0.01459258608520031,
-0.058910876512527466,
0.03713325411081314,
0.1933179348707199,
0.03199465200304985,
-0.2061154991388321,
-0.08038612455129623,
0.0542338602244854,
-0.07409846037626266,
-0.15521550178527832,
-0.007620801217854023,
0.015986477956175804,
-0.15715132653713226,
0.018397698178887367,
-0.039695415645837784,
0.1073032021522522,
-0.06608713418245316,
-0.03798544034361839,
0.10192058235406876,
0.048482101410627365,
-0.028460949659347534,
0.04989929124712944,
-0.19209927320480347,
0.10784655809402466,
0.028568807989358902,
0.06779257208108902,
-0.08892970532178879,
0.08724246174097061,
-0.0015932342503219843,
-0.011674707755446434,
0.1649181991815567,
-0.0020832798909395933,
-0.06093902885913849,
-0.07643745839595795,
-0.07959781587123871,
-0.00531748915091157,
0.07980035990476608,
-0.1370832473039627,
0.0751100406050682,
-0.033452875912189484,
-0.0313371904194355,
-0.007147870492190123,
-0.0859275683760643,
-0.1180790364742279,
-0.16262733936309814,
0.061198484152555466,
-0.08442038297653198,
0.02453313022851944,
-0.0804985836148262,
-0.052395667880773544,
0.03343388810753822,
0.17713890969753265,
-0.20273298025131226,
-0.10893429815769196,
-0.14334061741828918,
-0.10114128142595291,
0.15259984135627747,
-0.04731695353984833,
0.08691044896841049,
-0.007201332598924637,
0.16218975186347961,
0.00008136751421261579,
-0.01844305917620659,
0.0842253640294075,
-0.09492965042591095,
-0.18502108752727509,
-0.04672754928469658,
0.18379199504852295,
0.1312231868505478,
0.028197286650538445,
-0.011228502728044987,
0.02626769058406353,
-0.06610091030597687,
-0.10937976092100143,
0.030172422528266907,
0.14868010580539703,
0.06737270951271057,
-0.019497528672218323,
-0.04252130538225174,
-0.09584315866231918,
-0.064703069627285,
-0.043597862124443054,
-0.003050177590921521,
0.2050287425518036,
-0.070322684943676,
0.1546923965215683,
0.11148670315742493,
-0.060102127492427826,
-0.21141009032726288,
0.03290581330657005,
0.0397295206785202,
0.017268575727939606,
0.0323413647711277,
-0.191900834441185,
0.08756529539823532,
-0.025466935709118843,
-0.08177987486124039,
0.17957407236099243,
-0.19307510554790497,
-0.1297256052494049,
0.10760422796010971,
0.02142084389925003,
-0.20138975977897644,
-0.1500365287065506,
-0.10363522171974182,
-0.01805701106786728,
-0.11922493577003479,
0.047810714691877365,
0.008540397509932518,
0.011494984850287437,
0.011662069708108902,
0.020514773204922676,
0.04130404442548752,
-0.04830506443977356,
0.2029387205839157,
-0.0446561835706234,
-0.005516673903912306,
-0.05273920297622681,
-0.07709458470344543,
0.013759861700236797,
-0.05520083010196686,
0.12393932789564133,
-0.016330868005752563,
0.034237537533044815,
-0.16240613162517548,
-0.04284835234284401,
-0.0630665048956871,
0.035096801817417145,
-0.09589596092700958,
-0.07887813448905945,
-0.04441095516085625,
0.08295630663633347,
0.09100662171840668,
-0.012428413145244122,
0.01169382594525814,
-0.09673134982585907,
0.09629544615745544,
0.2009706348180771,
0.19358405470848083,
0.0633295327425003,
-0.05314387008547783,
0.03046184778213501,
-0.038307055830955505,
0.04414588585495949,
-0.2188640832901001,
0.04247678443789482,
0.06522481888532639,
0.02688462845981121,
0.06974633783102036,
-0.005771987605839968,
-0.16294898092746735,
-0.09104243665933609,
0.08826496452093124,
-0.0636831521987915,
-0.17394186556339264,
-0.03367873653769493,
0.04159823805093765,
-0.20945408940315247,
-0.04618501663208008,
0.03989119082689285,
-0.018465913832187653,
-0.041446298360824585,
0.026032190769910812,
0.08097519725561142,
-0.021863466128706932,
0.08518407493829727,
0.0950760617852211,
0.08999811112880707,
-0.09490492194890976,
0.052841171622276306,
0.07905349135398865,
-0.01940363086760044,
0.030652064830064774,
0.13818641006946564,
-0.03646538779139519,
-0.04622619226574898,
0.07921917736530304,
0.12132767587900162,
-0.0031819341238588095,
-0.05521034821867943,
0.004267476033419371,
-0.049098532646894455,
0.06138637661933899,
0.12190491706132889,
0.021225154399871826,
-0.011393753811717033,
0.07855856418609619,
0.025349637493491173,
-0.09237981587648392,
0.12371013313531876,
0.04185954108834267,
0.02055821754038334,
-0.03515569865703583,
-0.02861037105321884,
-0.014265110716223717,
-0.0017343778163194656,
-0.014768585562705994,
0.00027858768589794636,
-0.09069497883319855,
0.0015787361189723015,
-0.11658487468957901,
0.017843464389443398,
-0.06725388765335083,
-0.0005016532959416509,
0.028911152854561806,
-0.04888756573200226,
-0.0036345093976706266,
-0.005113163031637669,
-0.07815469056367874,
-0.0527898333966732,
-0.022131646052002907,
0.07822831720113754,
-0.14035795629024506,
0.03465598076581955,
0.07448223978281021,
-0.10333461314439774,
0.06892556697130203,
-0.008120646700263023,
0.013014335185289383,
0.007627950515598059,
-0.14365744590759277,
0.05623140186071396,
-0.02945813164114952,
-0.00653649540618062,
0.0014239016454666853,
-0.1796097308397293,
-0.011555179953575134,
-0.042533792555332184,
-0.07062070071697235,
0.013413636945188046,
-0.012913190759718418,
-0.12236585468053818,
0.110450379550457,
0.007939588278532028,
-0.06537073850631714,
-0.015546583570539951,
0.045166563242673874,
0.0718785896897316,
-0.012280561961233616,
0.1080503761768341,
-0.027087215334177017,
0.08267063647508621,
-0.18073134124279022,
-0.005913458298891783,
-0.016532042995095253,
0.05379952862858772,
-0.018159737810492516,
-0.045532725751399994,
0.05673592537641525,
-0.02058895118534565,
0.164139062166214,
-0.0012772013433277607,
0.07334963232278824,
0.051869072020053864,
0.009864007122814655,
0.04401415213942528,
0.07277357578277588,
0.06407126784324646,
-0.01681201159954071,
-0.004601679742336273,
0.03282833471894264,
-0.0018730671145021915,
-0.045296087861061096,
-0.14011424779891968,
0.07312480360269547,
0.17647749185562134,
0.07014699280261993,
0.022771956399083138,
0.008567322045564651,
-0.13379374146461487,
-0.07430650293827057,
0.10477078706026077,
-0.01792232133448124,
-0.03040623478591442,
-0.06671395152807236,
0.22744539380073547,
0.14973148703575134,
-0.19028127193450928,
0.07572297751903534,
-0.054439276456832886,
-0.038440149277448654,
-0.1435355246067047,
-0.1686539649963379,
-0.05750570073723793,
-0.048870500177145004,
-0.03220955654978752,
-0.05980883911252022,
0.051531530916690826,
0.039265114814043045,
-0.004663804545998573,
-0.02181880548596382,
0.10881195217370987,
0.03113744780421257,
-0.04014651104807854,
0.0462324395775795,
0.06068835407495499,
0.042648766189813614,
-0.09921442717313766,
0.011160290800035,
0.002389137865975499,
0.008375519886612892,
0.06241960823535919,
0.02345135807991028,
-0.0694429948925972,
0.029791612178087234,
-0.017441386356949806,
-0.1206028163433075,
0.04920424520969391,
-0.007382280193269253,
-0.02161260135471821,
0.15011633932590485,
0.03557932749390602,
0.007705779280513525,
-0.0102414945140481,
0.2399473935365677,
-0.07250183820724487,
-0.08194702863693237,
-0.1303110122680664,
0.08548229187726974,
-0.06418672949075699,
0.02347709611058235,
0.015268092043697834,
-0.12437192350625992,
0.012704011984169483,
0.1793370544910431,
0.11639893800020218,
-0.020440157502889633,
0.013169199228286743,
0.0458420030772686,
0.00944592896848917,
-0.035121262073516846,
0.012001585215330124,
0.056230898946523666,
0.20680110156536102,
-0.07782764732837677,
0.061167020350694656,
-0.017614347860217094,
-0.06891374289989471,
-0.03134419769048691,
0.10733165591955185,
-0.011959775350987911,
-0.010875735431909561,
-0.05935762822628021,
0.14111125469207764,
-0.07606784254312515,
-0.21537406742572784,
0.05105280876159668,
-0.0821949914097786,
-0.13868452608585358,
-0.04909532889723778,
0.026993973180651665,
-0.026116052642464638,
0.006130906753242016,
0.06018061563372612,
-0.05275861173868179,
0.18007898330688477,
0.028724515810608864,
-0.04316242039203644,
-0.0947510153055191,
0.05672045797109604,
-0.1623714119195938,
0.28204894065856934,
0.02151842787861824,
0.048691339790821075,
0.10992647707462311,
-0.02214217185974121,
-0.13270075619220734,
0.015201215632259846,
0.11275465786457062,
-0.06338974088430405,
0.06375198066234589,
0.1610059142112732,
0.0029797141905874014,
0.12227534502744675,
0.06605304777622223,
-0.05792946740984917,
0.03564044460654259,
-0.06739834696054459,
-0.05393943563103676,
-0.11609920114278793,
0.07782628387212753,
-0.09682848304510117,
0.15295644104480743,
0.12124107778072357,
-0.07323046773672104,
-0.0026875033508986235,
-0.020534852519631386,
0.08149248361587524,
0.018438078463077545,
0.10931225121021271,
0.008929711766541004,
-0.186166912317276,
0.046575866639614105,
0.008079797960817814,
0.09851927310228348,
-0.2115471214056015,
-0.04853019490838051,
0.04166003316640854,
-0.017471889033913612,
-0.08520112186670303,
0.11405480653047562,
0.03882638365030289,
0.01746544800698757,
-0.034990351647138596,
-0.04833139479160309,
0.01700066402554512,
0.15264013409614563,
-0.10562065243721008,
-0.014745889231562614
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-300m-england-0215-parallel-dim32-avatar
This model is a fine-tuned version of [vitouphy/wav2vec2-xls-r-300m-english](https://huggingface.co/vitouphy/wav2vec2-xls-r-300m-english) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2242
- Wer: 0.1663
## Model description
More information needed
## Intended uses & limitations
More information needed
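For illustration only, a minimal transcription sketch using the 🤗 Transformers `pipeline` API. The audio path is a placeholder, and the input is assumed to be 16 kHz mono speech matching the model's training conditions.

```python
from transformers import pipeline

# Load this checkpoint as an automatic-speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="Lin25/wav2vec2-300m-england-0215-parallel-dim32-avatar",
)

# "speech.wav" is a placeholder; pass any 16 kHz mono recording.
result = asr("speech.wav")
print(result["text"])
```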
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1227
- num_epochs: 15
- mixed_precision_training: Native AMP
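For illustration only, the hyperparameters above might be expressed with 🤗 `TrainingArguments` roughly as follows; the output directory is a placeholder, the Adam betas/epsilon are the library defaults, and the dataset/trainer wiring is omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-300m-england",   # placeholder output directory
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,        # effective train batch size: 32
    warmup_steps=1227,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    fp16=True,                            # mixed precision ("Native AMP")
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults.
)
```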
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.578 | 1.0 | 1227 | 0.2271 | 0.2289 |
| 0.2219 | 2.0 | 2454 | 0.1971 | 0.1976 |
| 0.1844 | 3.0 | 3681 | 0.1874 | 0.1813 |
| 0.1582 | 4.0 | 4908 | 0.1819 | 0.1776 |
| 0.1375 | 5.0 | 6135 | 0.1778 | 0.1698 |
| 0.1216 | 6.0 | 7362 | 0.1780 | 0.1686 |
| 0.1083 | 7.0 | 8589 | 0.1801 | 0.1679 |
| 0.0971 | 8.0 | 9816 | 0.1859 | 0.1681 |
| 0.0871 | 9.0 | 11043 | 0.1854 | 0.1672 |
| 0.0784 | 10.0 | 12270 | 0.1940 | 0.1657 |
| 0.07 | 11.0 | 13497 | 0.1990 | 0.1656 |
| 0.0633 | 12.0 | 14724 | 0.2066 | 0.1650 |
| 0.0577 | 13.0 | 15951 | 0.2117 | 0.1645 |
| 0.0523 | 14.0 | 17178 | 0.2186 | 0.1655 |
| 0.0486 | 15.0 | 18405 | 0.2242 | 0.1663 |
### Framework versions
- Transformers 4.36.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.14.7
- Tokenizers 0.15.0
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["wer"], "base_model": "vitouphy/wav2vec2-xls-r-300m-english", "model-index": [{"name": "wav2vec2-300m-england-0215-parallel-dim32-avatar", "results": []}]} | automatic-speech-recognition | Lin25/wav2vec2-300m-england-0215-parallel-dim32-avatar | [
"transformers",
"tensorboard",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:vitouphy/wav2vec2-xls-r-300m-english",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:39:23+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #base_model-vitouphy/wav2vec2-xls-r-300m-english #license-apache-2.0 #endpoints_compatible #region-us
| wav2vec2-300m-england-0215-parallel-dim32-avatar
================================================
This model is a fine-tuned version of vitouphy/wav2vec2-xls-r-300m-english on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2242
* Wer: 0.1663
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.001
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1227
* num\_epochs: 15
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.36.0.dev0
* Pytorch 1.12.1+cu113
* Datasets 2.14.7
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1227\n* num\\_epochs: 15\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 1.12.1+cu113\n* Datasets 2.14.7\n* Tokenizers 0.15.0"
] | [
"TAGS\n#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #base_model-vitouphy/wav2vec2-xls-r-300m-english #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1227\n* num\\_epochs: 15\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 1.12.1+cu113\n* Datasets 2.14.7\n* Tokenizers 0.15.0"
] | [
80,
159,
4,
40
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #base_model-vitouphy/wav2vec2-xls-r-300m-english #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1227\n* num\\_epochs: 15\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 1.12.1+cu113\n* Datasets 2.14.7\n* Tokenizers 0.15.0"
] | [
-0.12465691566467285,
0.13422846794128418,
-0.0033921669237315655,
0.04942885786294937,
0.08694503456354141,
0.02121187187731266,
0.10567963868379593,
0.14321152865886688,
-0.05826539918780327,
0.12673917412757874,
0.11136090010404587,
0.0845835879445076,
0.07596516609191895,
0.1459297239780426,
-0.02841995656490326,
-0.29495757818222046,
0.0224810428917408,
-0.016161799430847168,
-0.1142239198088646,
0.10135062783956528,
0.08888418972492218,
-0.10815936326980591,
0.030373359099030495,
0.006174994166940451,
-0.08791209757328033,
-0.013285565190017223,
-0.03192336484789848,
-0.0618252195417881,
0.10925420373678207,
0.05721037834882736,
0.0789627730846405,
0.038640573620796204,
0.0832996517419815,
-0.27146294713020325,
0.013236995786428452,
0.05100950598716736,
0.02037181332707405,
0.07485561817884445,
0.09717301279306412,
-0.01830236427485943,
0.10736017674207687,
-0.10179122537374496,
0.07814016193151474,
0.03811271861195564,
-0.09145806729793549,
-0.3031558394432068,
-0.07879561185836792,
0.05247556045651436,
0.14443330466747284,
0.08126156777143478,
-0.03472461551427841,
0.07135944068431854,
-0.056889958679676056,
0.07816186547279358,
0.22543039917945862,
-0.2625831067562103,
-0.06478251516819,
-0.01270261686295271,
0.046449512243270874,
0.05287967249751091,
-0.1144741028547287,
-0.019110586494207382,
0.020056825131177902,
0.01791469193994999,
0.08630358427762985,
0.01640322245657444,
0.05626271665096283,
0.019265267997980118,
-0.1485597789287567,
-0.031407251954078674,
0.11930002272129059,
0.09399271011352539,
-0.01598619483411312,
-0.1202399805188179,
-0.033291932195425034,
-0.1576727032661438,
-0.05974581465125084,
-0.011626476421952248,
0.0199188981205225,
-0.035975679755210876,
-0.08254094421863556,
0.020756477490067482,
-0.06593198329210281,
-0.07009761035442352,
0.013556314632296562,
0.13418807089328766,
0.04965931922197342,
-0.03940175846219063,
0.029303492978215218,
0.08000955730676651,
0.039811696857213974,
-0.1509731411933899,
0.0010899495100602508,
0.030081072822213173,
-0.10689733177423477,
-0.012487096711993217,
-0.01752214878797531,
-0.0037971828132867813,
0.031802933663129807,
0.14865058660507202,
-0.02718919701874256,
0.09470890462398529,
0.02501765266060829,
0.010753295384347439,
-0.08100786060094833,
0.14188045263290405,
-0.06458915770053864,
-0.08041055500507355,
-0.04937044158577919,
0.11280500143766403,
0.023620815947651863,
-0.016562215983867645,
-0.07768469303846359,
0.02609618753194809,
0.0910460501909256,
0.04789042845368385,
-0.002183925360441208,
0.00941375084221363,
-0.07486971467733383,
-0.023043794557452202,
0.04457709193229675,
-0.10666938871145248,
0.05648881942033768,
0.040808551013469696,
-0.0424368754029274,
-0.005682796239852905,
-0.0050771646201610565,
0.03140757232904434,
-0.005957894492894411,
0.11618456989526749,
-0.06692945957183838,
-0.01908540353178978,
-0.052657317370176315,
-0.09988176077604294,
0.03331562876701355,
-0.03511057794094086,
-0.0008417097269557416,
-0.07353773713111877,
-0.08586865663528442,
-0.05412377789616585,
0.05573324114084244,
-0.057308558374643326,
-0.062025632709264755,
-0.07954680919647217,
-0.05664779990911484,
0.0692707896232605,
-0.009371276944875717,
0.1232767179608345,
-0.0534224733710289,
0.09213440120220184,
0.004930954892188311,
0.0687665119767189,
0.05363429710268974,
0.05418974533677101,
-0.03217229247093201,
0.04693179205060005,
-0.1655811071395874,
0.07958008348941803,
-0.10107725858688354,
0.0462600402534008,
-0.16477049887180328,
-0.08749037981033325,
-0.010742590762674809,
0.0037504418287426233,
0.0900646448135376,
0.11584417521953583,
-0.1839005947113037,
-0.09751975536346436,
0.1800609976053238,
-0.08436178416013718,
-0.10292261838912964,
0.14855927228927612,
-0.018097100779414177,
-0.04493969306349754,
0.03095185197889805,
0.18467818200588226,
0.09480796754360199,
-0.1019318625330925,
-0.014242682605981827,
-0.048202622681856155,
0.125900000333786,
0.030600544065237045,
0.11503001302480698,
-0.055704984813928604,
0.015985598787665367,
-0.006210555788129568,
-0.022121546790003777,
0.05842384323477745,
-0.07523756474256516,
-0.08422631770372391,
-0.013014234602451324,
-0.07502517849206924,
0.026611171662807465,
0.05130164325237274,
0.025490496307611465,
-0.08779723197221756,
-0.13890734314918518,
0.009012717753648758,
0.112187460064888,
-0.09905129671096802,
0.02573724091053009,
-0.07166428864002228,
0.06583460420370102,
-0.02471533976495266,
-0.005061803851276636,
-0.13695982098579407,
-0.011253075674176216,
0.02891036868095398,
-0.04860220104455948,
0.006382744759321213,
-0.023464569821953773,
0.07495082914829254,
0.05582648143172264,
-0.0624234601855278,
-0.06751437485218048,
-0.03387540578842163,
0.010957157239317894,
-0.07078094780445099,
-0.2536008656024933,
-0.04739753529429436,
-0.041560348123311996,
0.17449812591075897,
-0.23287859559059143,
0.007978971116244793,
0.009731430560350418,
0.14297343790531158,
0.0404483899474144,
-0.049616739153862,
-0.004274751991033554,
0.05847722664475441,
-0.029938064515590668,
-0.06424052268266678,
0.03185473382472992,
-0.012096774764358997,
-0.13092069327831268,
0.010333947837352753,
-0.1443648487329483,
0.09557273238897324,
0.10555122047662735,
0.043106622993946075,
-0.08215141296386719,
-0.08893183618783951,
-0.056019674986600876,
-0.04625513032078743,
-0.032676562666893005,
-0.005051587242633104,
0.1368291676044464,
0.022296762093901634,
0.09638182073831558,
-0.07204465568065643,
-0.03881654888391495,
0.03599295765161514,
0.014294483698904514,
-0.04407728835940361,
0.16101762652397156,
0.07079316675662994,
-0.07004653662443161,
0.10077449679374695,
0.13098689913749695,
-0.04669530689716339,
0.12386652827262878,
-0.06117767095565796,
-0.09604673832654953,
-0.03909028694033623,
0.028104135766625404,
0.038013212382793427,
0.10401234775781631,
-0.12499450147151947,
0.00011558888218132779,
0.020688150078058243,
0.02525905705988407,
0.0072943586856126785,
-0.17663924396038055,
-0.01109160203486681,
0.051894038915634155,
-0.05910249054431915,
-0.0072919102385640144,
-0.014158312231302261,
-0.018293175846338272,
0.08396372199058533,
0.013992696069180965,
-0.06050700694322586,
-0.02008850686252117,
-0.015141600742936134,
-0.10052376240491867,
0.18735739588737488,
-0.12100609391927719,
-0.13682174682617188,
-0.1107015609741211,
-0.024012308567762375,
-0.004433472640812397,
-0.013666593469679356,
0.054017987102270126,
-0.11229386925697327,
-0.042320843786001205,
-0.08469092100858688,
0.02971469797194004,
-0.059263020753860474,
0.0501706637442112,
0.024538688361644745,
0.006641092710196972,
0.04327184706926346,
-0.0882573127746582,
0.021516606211662292,
-0.019353307783603668,
0.006218044552952051,
0.01273646391928196,
0.013273934833705425,
0.09901361912488937,
0.16751913726329803,
0.051268115639686584,
0.025012869387865067,
-0.047743625938892365,
0.17528291046619415,
-0.1031772792339325,
0.005883621983230114,
0.09682352095842361,
0.0012472504749894142,
0.04982517287135124,
0.16714616119861603,
0.04836713522672653,
-0.08160148561000824,
0.02094469591975212,
0.02905653603374958,
-0.010264560580253601,
-0.23609279096126556,
-0.04486323148012161,
-0.05981893837451935,
-0.008341739885509014,
0.11863920092582703,
0.040315914899110794,
-0.02237948402762413,
0.03303240239620209,
-0.014322753064334393,
-0.004443172365427017,
0.014414799399673939,
0.06731575727462769,
0.08740384131669998,
0.04183557257056236,
0.12013236433267593,
-0.025197764858603477,
-0.028503015637397766,
0.039633914828300476,
-0.006057452410459518,
0.22490598261356354,
0.013164778240025043,
0.15907412767410278,
0.03730512037873268,
0.14815069735050201,
0.013581855222582817,
0.04572770744562149,
0.013271297328174114,
-0.025757092982530594,
0.0042893411591649055,
-0.06349009275436401,
-0.0153342979028821,
0.06815525889396667,
0.10480351746082306,
0.015667922794818878,
-0.11365535855293274,
0.016864264383912086,
0.028722455725073814,
0.2802526652812958,
0.10110834985971451,
-0.2881394326686859,
-0.08494052290916443,
0.024226831272244453,
-0.06429651379585266,
-0.023689718917012215,
0.030396588146686554,
0.10517676174640656,
-0.055776551365852356,
0.08272742480039597,
-0.05832483991980553,
0.07868875563144684,
-0.05875645577907562,
-0.0075154732912778854,
0.040583688765764236,
0.0838610976934433,
-0.011864845640957355,
0.05436096340417862,
-0.23350310325622559,
0.300857812166214,
0.002144297119230032,
0.06185218319296837,
-0.041023120284080505,
0.02903771586716175,
0.023582953959703445,
-0.02367059886455536,
0.09743359684944153,
-0.012309964746236801,
-0.14949147403240204,
-0.1586543619632721,
-0.10755860805511475,
0.023196902126073837,
0.11868114769458771,
-0.06869807839393616,
0.10237953066825867,
-0.022582538425922394,
-0.035772960633039474,
0.06107112765312195,
-0.0437554307281971,
-0.11314672976732254,
-0.13791383802890778,
0.01837385818362236,
0.02390027418732643,
0.04385851323604584,
-0.08861307799816132,
-0.11458326876163483,
-0.09115555882453918,
0.15207993984222412,
-0.09642759710550308,
-0.008278883993625641,
-0.13828150928020477,
0.07758791744709015,
0.1609092801809311,
-0.08590144664049149,
0.04970823600888252,
0.006224216427654028,
0.12112695723772049,
-0.004389591049402952,
-0.021128958091139793,
0.12441623210906982,
-0.08886036276817322,
-0.19982311129570007,
-0.07579217851161957,
0.16541942954063416,
0.039482783526182175,
0.06833011656999588,
-0.020997148007154465,
0.041578106582164764,
-0.009769851341843605,
-0.0781874731183052,
0.08743233233690262,
0.05540665239095688,
0.022205648943781853,
0.038115572184324265,
-0.023264657706022263,
-0.03284603729844093,
-0.06224752217531204,
-0.07518796622753143,
0.13796380162239075,
0.3099845349788666,
-0.09964250773191452,
0.054175637662410736,
0.07267068326473236,
-0.04165518283843994,
-0.14749953150749207,
-0.011213596910238266,
0.11096030473709106,
0.032545384019613266,
0.019626058638095856,
-0.1909829080104828,
0.04654333367943764,
0.08012045174837112,
-0.022109389305114746,
0.054848238825798035,
-0.299514502286911,
-0.13938665390014648,
0.11130546778440475,
0.0951491966843605,
-0.023050449788570404,
-0.1629284918308258,
-0.07291495054960251,
-0.01841890625655651,
-0.08837102353572845,
0.058659877628088,
-0.021678363904356956,
0.10548903048038483,
0.0014504729770123959,
0.0050860196352005005,
0.014934753999114037,
-0.056419748812913895,
0.15836037695407867,
-0.0077098277397453785,
0.03126494586467743,
-0.0105955321341753,
0.022249100729823112,
-0.03787294030189514,
-0.0649576187133789,
0.0032912935130298138,
-0.08965753763914108,
0.03260745853185654,
-0.11746523529291153,
-0.03461860492825508,
-0.06254439055919647,
0.013990161940455437,
-0.04498837888240814,
-0.03865162655711174,
-0.041557587683200836,
0.049785126000642776,
0.07602188736200333,
-0.00866411067545414,
0.13438765704631805,
-0.03367864340543747,
0.1528163105249405,
0.09713498502969742,
0.08908714354038239,
0.0037102289497852325,
-0.06963146477937698,
-0.010377682745456696,
-0.034309402108192444,
0.03859543427824974,
-0.1360805630683899,
0.026729537174105644,
0.14457173645496368,
0.035194989293813705,
0.1566532403230667,
0.04956107586622238,
-0.08772237598896027,
0.010831729508936405,
0.07128259539604187,
-0.08107803761959076,
-0.17113368213176727,
-0.016871606931090355,
0.044225748628377914,
-0.14615696668624878,
0.002111061243340373,
0.10808148235082626,
-0.034818731248378754,
-0.008472893387079239,
0.009534978307783604,
0.041008636355400085,
-0.017468124628067017,
0.21549323201179504,
0.035228949040174484,
0.07741193473339081,
-0.08696434646844864,
0.0661788135766983,
0.0627426728606224,
-0.18075448274612427,
0.048869747668504715,
0.08888214826583862,
-0.05956956371665001,
-0.0223891269415617,
0.034298304468393326,
0.08823622018098831,
0.013687703758478165,
-0.05001280456781387,
-0.10624095052480698,
-0.1416071206331253,
0.09546911716461182,
0.08980558812618256,
0.0286586731672287,
0.010445079766213894,
-0.018578147515654564,
0.028939250856637955,
-0.08818807452917099,
0.11803793907165527,
0.08050849288702011,
0.06904196739196777,
-0.13148103654384613,
0.0962539091706276,
0.005609280429780483,
-0.014090328477323055,
0.0027510446961969137,
0.016220945864915848,
-0.12626852095127106,
0.00307014980353415,
-0.10474570840597153,
-0.010948458686470985,
-0.08163022249937057,
-0.004538469947874546,
0.007960007525980473,
-0.06569186598062515,
-0.044137779623270035,
0.0034379728604108095,
-0.10066719353199005,
-0.04576309770345688,
-0.022153012454509735,
0.06914004683494568,
-0.11724007874727249,
-0.020933514460921288,
0.033753279596567154,
-0.1112338975071907,
0.09764771163463593,
0.03073752298951149,
0.036367110908031464,
0.019938191398978233,
-0.09965749830007553,
0.021258065477013588,
0.032896630465984344,
-0.006201489828526974,
0.02797832153737545,
-0.18904319405555725,
-0.016797177493572235,
-0.026024438440799713,
0.01231157872825861,
0.0006191044813022017,
0.042001478374004364,
-0.1148499995470047,
-0.0026931690517812967,
-0.06373682618141174,
-0.07059185951948166,
-0.054164156317710876,
0.05040789768099785,
0.0708576962351799,
0.01201667357236147,
0.14907217025756836,
-0.0870274007320404,
0.053721170872449875,
-0.2193002998828888,
0.0027807094156742096,
-0.031426187604665756,
-0.05132703110575676,
-0.05167054384946823,
-0.019764194265007973,
0.08048106729984283,
-0.054605633020401,
0.0773584246635437,
-0.06151328608393669,
0.0437510684132576,
0.04007377475500107,
-0.10574456304311752,
0.025782490149140358,
0.04471040144562721,
0.19997578859329224,
0.05083022639155388,
-0.02550613135099411,
0.04279240965843201,
0.0024539080914109945,
0.07178132981061935,
0.137655571103096,
0.13864757120609283,
0.16311699151992798,
0.04619503393769264,
0.08988433331251144,
0.056616492569446564,
-0.12446460127830505,
-0.14952975511550903,
0.13274778425693512,
-0.0505509190261364,
0.12263701856136322,
-0.0036637966986745596,
0.19783629477024078,
0.1195773184299469,
-0.19866126775741577,
0.0333433635532856,
-0.03165009990334511,
-0.0888478234410286,
-0.11334225535392761,
-0.0694054514169693,
-0.09658876061439514,
-0.18413053452968597,
0.0030663402285426855,
-0.10130501538515091,
0.043609704822301865,
0.02714058943092823,
0.04914320260286331,
0.05326608940958977,
0.09732173383235931,
0.05968717113137245,
0.016358498483896255,
0.09090957045555115,
0.02789892442524433,
-0.020201245322823524,
-0.025575658306479454,
-0.08776678144931793,
0.03609349951148033,
-0.04193320870399475,
0.04367658123373985,
-0.04059930145740509,
-0.0959504172205925,
0.07702730596065521,
0.020310360938310623,
-0.10311459749937057,
0.01788952387869358,
-0.006651570554822683,
0.05427092686295509,
0.10480048507452011,
0.040353260934352875,
-0.016790350899100304,
-0.015592037700116634,
0.21644027531147003,
-0.09369787573814392,
-0.04794204607605934,
-0.13067875802516937,
0.21886341273784637,
-0.002035768935456872,
0.00789148174226284,
0.018390290439128876,
-0.08484547585248947,
-0.001614079112187028,
0.1462174952030182,
0.14836829900741577,
-0.008537434972822666,
-0.0128974923864007,
0.03556783124804497,
-0.00969971064478159,
-0.03504331409931183,
0.060224808752536774,
0.12247539311647415,
0.08845093101263046,
-0.05734921619296074,
-0.047053441405296326,
-0.045989684760570526,
-0.054885007441043854,
-0.028954679146409035,
0.07112771272659302,
0.02638692781329155,
-0.013109331950545311,
-0.007614858448505402,
0.11813981086015701,
-0.036612775176763535,
-0.13547247648239136,
0.03129667788743973,
-0.19458259642124176,
-0.18193116784095764,
-0.030999675393104553,
0.0831923708319664,
0.03019033372402191,
0.04178052768111229,
0.004377693869173527,
-0.032944366335868835,
0.1087295338511467,
0.005747510585933924,
-0.061776306480169296,
-0.09095799177885056,
0.07698226720094681,
-0.06195938214659691,
0.17110060155391693,
-0.03269008919596672,
0.02339637652039528,
0.13258329033851624,
0.07403609901666641,
-0.08537306636571884,
0.04070720076560974,
0.0881977528333664,
-0.09510976076126099,
0.06531482934951782,
0.16427946090698242,
-0.043467238545417786,
0.1507003903388977,
0.06624232977628708,
-0.10473897308111191,
0.03149082884192467,
-0.09452635049819946,
-0.06903994083404541,
-0.05181585252285004,
0.028023090213537216,
-0.04808000475168228,
0.15068504214286804,
0.18012842535972595,
-0.06588926166296005,
-0.02063954994082451,
-0.028162136673927307,
0.01489912997931242,
0.0314309261739254,
0.14450062811374664,
-0.02213136851787567,
-0.2586686909198761,
0.026346303522586823,
0.005942780990153551,
0.03500347584486008,
-0.24705246090888977,
-0.09849528968334198,
0.018885361030697823,
-0.04813682660460472,
-0.07339941710233688,
0.11952648311853409,
0.08413115888834,
0.0388621985912323,
-0.06799168139696121,
-0.1152370497584343,
-0.0226230900734663,
0.16855163872241974,
-0.16250088810920715,
-0.05446529760956764
] |
null | null | null |
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://github.com/janhq/jan/assets/89722390/35daac7d-b895-487c-a6ac-6663daaad78e" alt="Jan banner" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<p align="center">
<a href="https://jan.ai/">Jan</a>
- <a href="https://discord.gg/AsJ8krTT3N">Discord</a>
</p>
<!-- header end -->
# Model Description
This is a GGUF version of [jan-hq/stealth-finance-v1-e1](https://huggingface.co/jan-hq/stealth-finance-v1-e1)
- Model creator: [jan-hq](https://huggingface.co/jan-hq)
- Original model: [stealth-finance-v1-e1](https://huggingface.co/jan-hq/stealth-finance-v1-e1)
- Model description: [Readme](https://huggingface.co/jan-hq/stealth-finance-v1-e1/blob/main/README.md)
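For illustration only, GGUF files like these are typically loaded with a llama.cpp-compatible runtime; a minimal llama-cpp-python sketch is shown below. The exact `.gguf` filename and the prompt are placeholders, since the repository's file list is not part of this card.

```python
from llama_cpp import Llama

# The filename below is a placeholder; use the actual .gguf file from this repository.
llm = Llama(model_path="stealth-finance-v1-e1.Q4_K_M.gguf", n_ctx=2048)

# Simple text-completion call; adjust sampling settings as needed.
output = llm("What does a balance sheet show?", max_tokens=128)
print(output["choices"][0]["text"])
```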
# About Jan
Jan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.
Jan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life.
# Jan Model Converter
This is a repository for the [open-source converter](https://github.com/janhq/model-converter). We would be grateful if the community could contribute to and strengthen this repository. We aim to expand the repo so that it can convert models into various formats.
| {"language": ["en"], "license": "apache-2.0", "model_name": "stealth-finance-v1-e1", "base_model": "jan-hq/stealth-finance-v1-e1", "model_creator": "jan-hq", "quantized_by": "JanHQ"} | null | janhq/stealth-finance-v1-e1-GGUF | [
"gguf",
"en",
"base_model:jan-hq/stealth-finance-v1-e1",
"license:apache-2.0",
"region:us"
] | 2024-02-15T06:40:13+00:00 | [] | [
"en"
] | TAGS
#gguf #en #base_model-jan-hq/stealth-finance-v1-e1 #license-apache-2.0 #region-us
|
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="URL alt="Jan banner" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<p align="center">
<a href="URL
- <a href="URL
</p>
# Model Description
This is a GGUF version of jan-hq/stealth-finance-v1-e1
- Model creator: jan-hq
- Original model: stealth-finance-v1-e1
- Model description: Readme
# About Jan
Jan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.
Jan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life.
# Jan Model Converter
This is a repository for the [open-source converter](URL We would be grateful if the community could contribute and strengthen this repository. We are aiming to expand the repo that can convert into various types of format
| [
"# Model Description\nThis is a GGUF version of jan-hq/stealth-finance-v1-e1\n- Model creator: jan-hq\n- Original model: stealth-finance-v1-e1\n- Model description: Readme",
"# About Jan\nJan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.\n\nJan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life.",
"# Jan Model Converter\nThis is a repository for the [open-source converter](URL We would be grateful if the community could contribute and strengthen this repository. We are aiming to expand the repo that can convert into various types of format"
] | [
"TAGS\n#gguf #en #base_model-jan-hq/stealth-finance-v1-e1 #license-apache-2.0 #region-us \n",
"# Model Description\nThis is a GGUF version of jan-hq/stealth-finance-v1-e1\n- Model creator: jan-hq\n- Original model: stealth-finance-v1-e1\n- Model description: Readme",
"# About Jan\nJan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.\n\nJan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life.",
"# Jan Model Converter\nThis is a repository for the [open-source converter](URL We would be grateful if the community could contribute and strengthen this repository. We are aiming to expand the repo that can convert into various types of format"
] | [
40,
56,
77,
53
] | [
"passage: TAGS\n#gguf #en #base_model-jan-hq/stealth-finance-v1-e1 #license-apache-2.0 #region-us \n# Model Description\nThis is a GGUF version of jan-hq/stealth-finance-v1-e1\n- Model creator: jan-hq\n- Original model: stealth-finance-v1-e1\n- Model description: Readme# About Jan\nJan believes in the need for an open-source AI ecosystem and is building the infra and tooling to allow open-source AIs to compete on a level playing field with proprietary ones.\n\nJan's long-term vision is to build a cognitive framework for future robots, who are practical, useful assistants for humans and businesses in everyday life.# Jan Model Converter\nThis is a repository for the [open-source converter](URL We would be grateful if the community could contribute and strengthen this repository. We are aiming to expand the repo that can convert into various types of format"
] | [
-0.013293768279254436,
0.06761199235916138,
-0.002095417818054557,
-0.005686779506504536,
0.06566976010799408,
0.014988493174314499,
0.1401589959859848,
0.0987105742096901,
0.1247408390045166,
-0.01052565686404705,
0.02840522862970829,
-0.1367182433605194,
0.07093659043312073,
0.273813396692276,
0.0901867002248764,
-0.2918749153614044,
0.11652017384767532,
0.0028144337702542543,
-0.11453072726726532,
-0.036398328840732574,
0.0529448576271534,
-0.04606809467077255,
0.08041787147521973,
0.03906800225377083,
-0.02790014259517193,
-0.07658369839191437,
-0.13521821796894073,
0.03982313349843025,
0.03499319404363632,
0.10301485657691956,
-0.041702475398778915,
0.0334644578397274,
0.013426681980490685,
0.04121209308505058,
0.032137803733348846,
-0.06706918776035309,
-0.07779554277658463,
0.013976527377963066,
-0.14213144779205322,
-0.002286811824887991,
0.11389954388141632,
-0.047284796833992004,
-0.0816538855433464,
0.03881862387061119,
-0.025619903579354286,
-0.2608914375305176,
-0.01944742165505886,
-0.06318899989128113,
-0.002290603704750538,
0.06158140301704407,
-0.026912018656730652,
-0.064966581761837,
-0.07753816992044449,
-0.04214958846569061,
0.04511778801679611,
-0.2347109019756317,
0.0035982986446470022,
0.1186463013291359,
-0.01569228433072567,
0.055373892188072205,
0.019609179347753525,
0.06749334931373596,
0.05411471053957939,
0.026979900896549225,
0.0393746942281723,
-0.03965096175670624,
0.05130968242883682,
0.00010987831046804786,
-0.1171807050704956,
-0.0617896169424057,
0.32942986488342285,
0.01552900206297636,
-0.09571841359138489,
0.06309257447719574,
0.043960750102996826,
0.14125004410743713,
-0.07847555726766586,
0.040367040783166885,
0.06167171895503998,
0.0001358329609502107,
-0.005090603604912758,
-0.12130310386419296,
-0.06889539211988449,
-0.027038052678108215,
-0.11082962155342102,
0.2678784132003784,
0.03130664676427841,
0.08882869780063629,
-0.05571158602833748,
0.009347444400191307,
-0.25521788001060486,
-0.05669362470507622,
-0.13759060204029083,
-0.03301137685775757,
0.02235383354127407,
0.0490199476480484,
-0.06514827907085419,
-0.04972328990697861,
0.16906456649303436,
0.09144098311662674,
-0.11724898219108582,
-0.04551133140921593,
-0.006185796577483416,
0.06621112674474716,
0.07433804869651794,
0.015585810877382755,
-0.05119825899600983,
0.027009278535842896,
0.10088951140642166,
-0.07469828426837921,
0.05315196514129639,
-0.003997918218374252,
-0.17710468173027039,
0.02334344945847988,
-0.06748487055301666,
-0.025929484516382217,
0.004847269505262375,
0.09209803491830826,
0.04593139886856079,
-0.027438005432486534,
0.03759240359067917,
0.005236433818936348,
-0.03251676261425018,
-0.04639282822608948,
-0.058424100279808044,
-0.07607223093509674,
-0.07318145036697388,
0.02576998434960842,
-0.0165743138641119,
0.012889200821518898,
-0.04493001103401184,
-0.07047746330499649,
-0.08387434482574463,
-0.052506450563669205,
0.026717277243733406,
0.020545126870274544,
0.09972287714481354,
-0.13017170131206512,
-0.17165207862854004,
0.03826207295060158,
0.020594198256731033,
-0.030527744442224503,
-0.0991351306438446,
0.0071624466218054295,
-0.038185250014066696,
-0.08177516609430313,
-0.049011122435331345,
0.101833775639534,
-0.056030336767435074,
-0.026296747848391533,
-0.0011227118084207177,
0.09461231529712677,
-0.21043603122234344,
0.021192586049437523,
-0.0658332034945488,
-0.04070567712187767,
-0.20803476870059967,
0.010793729685246944,
-0.16646555066108704,
0.07308870553970337,
0.001964885275810957,
0.04873806610703468,
-0.07310236990451813,
0.06560588628053665,
-0.047097403556108475,
0.11889173835515976,
-0.046996332705020905,
-0.028624137863516808,
0.178716242313385,
-0.11782707273960114,
-0.08583144098520279,
0.13637734949588776,
0.02215525135397911,
0.02748546190559864,
0.061268843710422516,
0.17570476233959198,
0.05219583213329315,
-0.02484416589140892,
-0.04832726716995239,
0.04457872733473778,
-0.0646819919347763,
-0.10510311275720596,
0.09878621995449066,
0.06208633631467819,
-0.10360416769981384,
0.030380556359887123,
-0.18953101336956024,
0.07921940833330154,
-0.039472684264183044,
0.026679357513785362,
0.012281223200261593,
-0.07988372445106506,
0.03847108036279678,
-0.04418036341667175,
0.1432551145553589,
0.016575763002038002,
-0.05472150817513466,
-0.12020961195230484,
0.06281585991382599,
-0.003319247392937541,
-0.010145124979317188,
-0.1873767375946045,
0.15375228226184845,
0.02699047327041626,
0.1322496235370636,
-0.016151469200849533,
-0.09547583013772964,
0.055873893201351166,
-0.12860463559627533,
0.10481858253479004,
0.19766093790531158,
0.028841383755207062,
-0.024879485368728638,
0.020600609481334686,
0.02106029912829399,
-0.012298453599214554,
-0.030211107805371284,
0.04386018589138985,
-0.12221229076385498,
0.0022950677666813135,
-0.04040862247347832,
0.14135169982910156,
0.07968857884407043,
0.04044784978032112,
0.12105581164360046,
0.03705340251326561,
-0.01230160053819418,
0.07745236903429031,
0.06222853437066078,
0.013109816238284111,
0.04592076689004898,
0.005300230346620083,
-0.009627483785152435,
0.02440505474805832,
-0.1579090654850006,
0.18607570230960846,
0.06675838679075241,
0.11704684793949127,
0.11123979836702347,
0.07015357911586761,
0.08522401750087738,
0.03149561211466789,
-0.0375538133084774,
0.013087181374430656,
-0.012413425371050835,
-0.08605547994375229,
0.2569352388381958,
-0.054911088198423386,
0.019744964316487312,
-0.07104912400245667,
0.023349737748503685,
0.014144918881356716,
-0.08722137659788132,
0.07604849338531494,
-0.020950529724359512,
0.03793482854962349,
-0.05719362571835518,
0.07483398914337158,
-0.0834939032793045,
-0.021912243217229843,
0.19588987529277802,
-0.0036927880719304085,
-0.021802851930260658,
0.0022266407031565905,
0.04272852838039398,
-0.025494879111647606,
0.22323407232761383,
-0.22176747024059296,
-0.0857519879937172,
0.029816308990120888,
0.02016429230570793,
0.07774268090724945,
-0.07372689992189407,
-0.017217401415109634,
-0.0518203005194664,
-0.04922076687216759,
0.06261064112186432,
0.036181703209877014,
-0.06073768064379692,
0.048965856432914734,
0.10149520635604858,
0.005894693545997143,
0.04631020501255989,
0.013358702883124352,
-0.05062912404537201,
0.11052300035953522,
-0.005536952055990696,
-0.2475007027387619,
-0.04593684896826744,
-0.037550244480371475,
-0.01419161632657051,
0.023076951503753662,
0.11906463652849197,
-0.10228206217288971,
-0.06193782389163971,
-0.03892873227596283,
0.13679563999176025,
-0.030157441273331642,
-0.011143292300403118,
0.018673481419682503,
-0.04197424277663231,
-0.048927273601293564,
-0.03629171475768089,
-0.058403752744197845,
-0.03699497878551483,
-0.08717382699251175,
0.11824673414230347,
-0.009742033667862415,
0.1114601269364357,
-0.037441205233335495,
-0.0024001796264201403,
0.03675736114382744,
-0.04722319915890694,
0.17582277953624725,
-0.09894665330648422,
-0.007321320939809084,
0.11062829196453094,
0.07336593419313431,
-0.0024488032795488834,
0.1059621050953865,
0.03827039152383804,
-0.07072197645902634,
0.010074804536998272,
-0.005727873649448156,
-0.082729771733284,
-0.04865169897675514,
-0.10119209438562393,
-0.09080664813518524,
0.09019173681735992,
-0.05406428501009941,
0.06863203644752502,
0.07073269784450531,
0.11288132518529892,
-0.03555001690983772,
0.16654889285564423,
-0.05276767536997795,
0.06009211391210556,
0.07629707455635071,
-0.03202005475759506,
0.02136276662349701,
0.0015934868715703487,
-0.042424410581588745,
0.12310763448476791,
0.037617169320583344,
0.31781768798828125,
-0.003560068318620324,
0.047736164182424545,
0.12479836493730545,
0.03692273423075676,
0.10526388883590698,
-0.023026473820209503,
-0.04514814913272858,
-0.04413692653179169,
-0.05996262654662132,
-0.022158058360219002,
-0.059432655572891235,
0.10087790340185165,
-0.07161204516887665,
-0.096913643181324,
-0.011685402132570744,
0.020451879128813744,
-0.023305952548980713,
0.11552903056144714,
-0.04718527942895889,
-0.05406726896762848,
-0.139154851436615,
0.08843968063592911,
0.020639045163989067,
0.012306788004934788,
0.017962491139769554,
0.14124983549118042,
-0.09574029594659805,
-0.05428476631641388,
0.015022504143416882,
0.053474873304367065,
-0.04057877138257027,
0.07282944023609161,
-0.09619899094104767,
0.10059807449579239,
0.017908312380313873,
0.09770413488149643,
-0.1860850751399994,
0.16102911531925201,
0.025518516078591347,
0.0779561921954155,
-0.11928315460681915,
-0.003601183881983161,
0.054257407784461975,
0.08477096259593964,
0.07548443228006363,
0.04647434130311012,
-0.03816494718194008,
0.02549329772591591,
-0.08481692522764206,
0.16672930121421814,
-0.017198219895362854,
-0.08295688033103943,
0.07803425937891006,
0.05773711949586868,
0.044632550328969955,
-0.08129654824733734,
0.09886595606803894,
-0.026243848726153374,
-0.08084625750780106,
0.0589904747903347,
0.004658612888306379,
0.16906005144119263,
-0.0836150199174881,
0.008052350021898746,
0.04473089054226875,
0.06493448466062546,
0.10623365640640259,
-0.1426069289445877,
-0.06602920591831207,
0.13637959957122803,
-0.017409035935997963,
-0.07559755444526672,
-0.011171039193868637,
0.026725132018327713,
-0.0065934802405536175,
0.0288208220154047,
-0.11920702457427979,
-0.014203148894011974,
-0.06181609258055687,
-0.03583464026451111,
-0.020746316760778427,
-0.07257714122533798,
0.07615597546100616,
0.08411616086959839,
-0.007522230967879295,
-0.07869915664196014,
-0.00007042806828394532,
-0.11432590335607529,
0.002914957469329238,
0.023168403655290604,
-0.018790315836668015,
0.007558120880275965,
-0.09186574071645737,
0.03259967640042305,
0.005593676585704088,
-0.0563824437558651,
0.04476447030901909,
0.08818721026182175,
-0.03283468633890152,
0.045746099203825,
0.21561865508556366,
0.042872123420238495,
-0.24795377254486084,
-0.12090732157230377,
-0.009739411063492298,
0.0813676193356514,
-0.1339653879404068,
-0.30112606287002563,
0.09157954156398773,
0.07204435020685196,
-0.07311044633388519,
0.08718287944793701,
-0.0847342237830162,
0.0093696853145957,
0.05893325060606003,
0.06156857684254646,
0.34321898221969604,
-0.1295059621334076,
-0.048090483993291855,
0.07912203669548035,
-0.1363668292760849,
0.029519710689783096,
-0.11878649890422821,
0.09875117242336273,
-0.03302913159132004,
0.20095184445381165,
-0.00474819028750062,
0.011930068023502827,
0.08548334240913391,
-0.054044514894485474,
-0.022228743880987167,
-0.08200570940971375,
0.11572415381669998,
0.08117294311523438,
-0.027084240689873695,
0.14698705077171326,
-0.10350851714611053,
-0.03127452731132507,
-0.11797573417425156,
-0.009745707735419273,
-0.040095847100019455,
0.043039992451667786,
0.0381951704621315,
-0.11981920152902603,
-0.07229848951101303,
0.12484870851039886,
-0.01932363770902157,
0.08181963860988617,
0.0757751613855362,
-0.03294416517019272,
-0.025848934426903725,
-0.10270664095878601,
0.13599255681037903,
-0.2533024847507477,
0.11710000038146973,
0.031202778220176697,
-0.027951957657933235,
0.08918174356222153,
-0.21380972862243652,
-0.03513993322849274,
0.056583303958177567,
-0.02326151542365551,
0.07208693027496338,
0.04254816845059395,
-0.13658414781093597,
0.09366801381111145,
0.09264355897903442,
-0.0642595887184143,
-0.3164677619934082,
-0.02244316041469574,
0.11579065769910812,
0.028170624747872353,
0.152296781539917,
0.15028555691242218,
-0.06746364384889603,
-0.04603768512606621,
-0.04245926812291145,
0.0332215279340744,
-0.02497178316116333,
-0.03539382666349411,
-0.04778876528143883,
0.032645437866449356,
-0.10098568350076675,
-0.06870099902153015,
0.028125394135713577,
0.04037691280245781,
0.05839068442583084,
0.026497110724449158,
-0.10223021358251572,
-0.1204005628824234,
-0.18800349533557892,
0.10030689090490341,
-0.03320039436221123,
-0.05669128894805908,
-0.03018094226717949,
-0.148671954870224,
-0.00012104423512937501,
0.025804346427321434,
0.017256449908018112,
-0.03757881373167038,
0.07952097803354263,
0.06409814953804016,
-0.012871421873569489,
-0.04225761815905571,
-0.09177982807159424,
-0.0038176386151462793,
0.02940215729176998,
-0.11305619776248932,
0.009617678821086884,
0.064491868019104,
-0.08314720541238785,
-0.026059970259666443,
-0.17857252061367035,
-0.05695786327123642,
-0.17547591030597687,
-0.005804256070405245,
-0.04149070382118225,
-0.04810144752264023,
0.007810624781996012,
-0.05711359903216362,
-0.065904900431633,
0.11392028629779816,
-0.07872162014245987,
0.04967318847775459,
-0.010756062343716621,
0.05389595404267311,
0.047977931797504425,
0.050575122237205505,
-0.01988138072192669,
0.015186283737421036,
0.10659704357385635,
0.07094268500804901,
0.017088646069169044,
0.02129204198718071,
-0.05668329447507858,
0.06375612318515778,
0.031131714582443237,
0.013456776738166809,
0.09753582626581192,
0.05809720233082771,
-0.02112480252981186,
0.01274340134114027,
-0.0057194470427930355,
-0.011151705868542194,
0.07077876478433609,
-0.02794664353132248,
0.07739636301994324,
-0.0038719980511814356,
0.08117832243442535,
-0.0756700336933136,
0.03735324740409851,
0.06312277168035507,
0.07877290993928909,
0.02110351249575615,
-0.019933776929974556,
-0.018432848155498505,
-0.03187565132975578,
0.008609768003225327,
0.03094262070953846,
0.020975684747099876,
-0.050057925283908844,
-0.08761833608150482,
-0.005616684444248676,
0.029291318729519844,
0.2021261751651764,
0.019458463415503502,
-0.02048678882420063,
-0.023686520755290985,
0.0012383501743897796,
0.13393692672252655,
-0.0666336864233017,
-0.024269435554742813,
0.02750757895410061,
0.027072357013821602,
0.005525483284145594,
0.0650729238986969,
-0.010827001184225082,
-0.18621191382408142,
0.04977663606405258,
-0.010140985250473022,
0.12460636347532272,
0.07421725988388062,
0.05853135883808136,
-0.005662460811436176,
-0.09744127839803696,
0.10996958613395691,
-0.012126481160521507,
0.0744779035449028,
-0.03162704408168793,
0.17010493576526642,
0.1387641876935959,
-0.0037720906548202038,
0.1022925078868866,
0.061128292232751846,
0.0013920636847615242,
0.0035130728501826525,
-0.23435595631599426,
-0.04000551253557205,
-0.2169409990310669,
0.02085905335843563,
-0.16550488770008087,
-0.09500300139188766,
0.0418504923582077,
0.016246341168880463,
-0.028863955289125443,
0.015468671917915344,
0.02574288286268711,
-0.0444883294403553,
-0.004916682373732328,
-0.06561379879713058,
0.00036667429958470166,
0.022723916918039322,
-0.03709876537322998,
0.0327557697892189,
0.11125614494085312,
0.02192801423370838,
0.029582340270280838,
0.06153366342186928,
0.001108340104110539,
-0.007697936147451401,
0.00989332515746355,
-0.04099280387163162,
-0.02727154828608036,
-0.004547422751784325,
0.11057568341493607,
0.02452559769153595,
-0.055734410881996155,
0.05454351752996445,
0.16702872514724731,
-0.00017852627206593752,
-0.07281236350536346,
-0.11437957733869553,
0.14220841228961945,
-0.059384092688560486,
0.002505417913198471,
-0.035477958619594574,
-0.06954663246870041,
-0.061577681452035904,
0.21910198032855988,
0.11100691556930542,
-0.05724416300654411,
-0.012183699756860733,
-0.030175229534506798,
-0.015981487929821014,
-0.002450232859700918,
0.09045170247554779,
0.07672806084156036,
0.2771245837211609,
0.0017230578232556581,
-0.016311118379235268,
0.010793009772896767,
0.04107307270169258,
-0.145423024892807,
-0.02756388857960701,
-0.013556959107518196,
-0.013447818346321583,
-0.04475273936986923,
-0.00408343318849802,
-0.1269880086183548,
-0.125852569937706,
-0.026683513075113297,
-0.05622881278395653,
-0.040014490485191345,
0.009302924387156963,
0.02284666709601879,
0.04294392094016075,
0.12423593550920486,
-0.050706591457128525,
0.010738628916442394,
0.03551359102129936,
0.0027749116998165846,
-0.11870484054088593,
0.03232322633266449,
0.08541542291641235,
-0.11560800671577454,
0.1835288405418396,
-0.033044736832380295,
-0.05043217912316322,
0.06828350573778152,
-0.0437251552939415,
-0.1448289304971695,
0.055154141038656235,
-0.05887339636683464,
-0.17752426862716675,
-0.08317315578460693,
0.09889009594917297,
-0.026550481095910072,
0.026532242074608803,
0.06833535432815552,
-0.08156660199165344,
-0.03540872409939766,
0.1282879114151001,
0.04325302317738533,
0.03339233249425888,
-0.009548421949148178,
-0.17215250432491302,
0.06637604534626007,
0.0819329172372818,
0.006900264881551266,
-0.09091248363256454,
-0.053628209978342056,
0.08036526292562485,
0.031064821407198906,
-0.00132721324916929,
-0.10284982621669769,
-0.04846099764108658,
-0.044653113931417465,
0.06158443167805672,
-0.06574934720993042,
-0.16732469201087952,
0.03867776691913605,
-0.07843876630067825,
0.11770155280828476,
0.062133900821208954,
0.006895084865391254,
0.18618662655353546,
0.006084510590881109,
-0.0596868135035038,
-0.21875563263893127,
0.021455176174640656,
0.03356144204735756,
-0.06766248494386673,
-0.09751731902360916
] |
null | null | transformers |
# Uploaded model
- **Developed by:** ribhu
- **License:** apache-2.0
- **Finetuned from model :** unsloth/mistral-7b-bnb-4bit
This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
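
The card does not include a usage snippet; the following is a minimal, hedged sketch of loading the model for generation with plain `transformers`. The repo id is the one this card belongs to; the prompt and generation settings are illustrative assumptions, not taken from the card.

```python
# Hedged sketch (not from the card): basic text generation with transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ribhu/mistral-7b-test-finetune"  # repo id from this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # requires accelerate

# Illustrative prompt; adjust max_new_tokens and sampling to taste.
inputs = tokenizer("Explain what LoRA fine-tuning does:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```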
| {"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "mistral", "trl"], "base_model": "unsloth/mistral-7b-bnb-4bit"} | text-generation | ribhu/mistral-7b-test-finetune | [
"transformers",
"pytorch",
"mistral",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"en",
"base_model:unsloth/mistral-7b-bnb-4bit",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:47:44+00:00 | [] | [
"en"
] | TAGS
#transformers #pytorch #mistral #text-generation #text-generation-inference #unsloth #trl #en #base_model-unsloth/mistral-7b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# Uploaded model
- Developed by: ribhu
- License: apache-2.0
- Finetuned from model : unsloth/mistral-7b-bnb-4bit
This mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.
<img src="URL width="200"/>
| [
"# Uploaded model\n\n- Developed by: ribhu\n- License: apache-2.0\n- Finetuned from model : unsloth/mistral-7b-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
"TAGS\n#transformers #pytorch #mistral #text-generation #text-generation-inference #unsloth #trl #en #base_model-unsloth/mistral-7b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Uploaded model\n\n- Developed by: ribhu\n- License: apache-2.0\n- Finetuned from model : unsloth/mistral-7b-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
81,
78
] | [
"passage: TAGS\n#transformers #pytorch #mistral #text-generation #text-generation-inference #unsloth #trl #en #base_model-unsloth/mistral-7b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: ribhu\n- License: apache-2.0\n- Finetuned from model : unsloth/mistral-7b-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>"
] | [
-0.0495183989405632,
0.04368286579847336,
-0.002992406487464905,
0.13346515595912933,
0.11351648718118668,
0.06626076251268387,
0.10544061660766602,
0.10577688366174698,
0.010736566036939621,
-0.0558086633682251,
0.10909246653318405,
0.11103750765323639,
0.0014858508948236704,
0.03262502700090408,
-0.02457704208791256,
-0.20480374991893768,
0.06511876732110977,
-0.013528784736990929,
-0.0036736996844410896,
0.08080178499221802,
0.0677395612001419,
0.012280264869332314,
0.1027241200208664,
-0.07577858120203018,
-0.10246411710977554,
0.0066049974411726,
-0.0089558782055974,
0.0062087532132864,
0.04272992163896561,
0.07189507782459259,
0.006995778065174818,
0.03900608420372009,
0.047948312014341354,
-0.07299814373254776,
0.04658680781722069,
0.052619993686676025,
-0.033340856432914734,
0.09344173222780228,
0.011795605532824993,
0.015090573579072952,
0.1329524666070938,
0.02474215067923069,
-0.05596039071679115,
0.06147957965731621,
-0.04589083418250084,
-0.11101603507995605,
-0.08788459002971649,
0.11062874644994736,
0.01795848086476326,
0.04597708210349083,
0.04157653823494911,
0.10635437071323395,
-0.06484037637710571,
0.09897524863481522,
0.188904270529747,
-0.22745074331760406,
-0.05911550298333168,
0.1359235793352127,
0.03197788447141647,
0.05115843191742897,
-0.04306866228580475,
-0.014751282520592213,
0.033198270946741104,
0.03848174214363098,
0.04019731655716896,
-0.07484409213066101,
-0.1397465020418167,
0.01026991382241249,
-0.08637579530477524,
0.004254141356796026,
0.17698076367378235,
0.026945097371935844,
-0.04000401496887207,
0.03528854623436928,
-0.1419590264558792,
0.037001833319664,
-0.026124421507120132,
0.05548712983727455,
0.016118012368679047,
0.07990560680627823,
-0.020913507789373398,
-0.10724877566099167,
-0.036123570054769516,
-0.05491292476654053,
-0.09783658385276794,
0.03447764366865158,
0.039182115346193314,
0.10057377815246582,
-0.04402004927396774,
0.09434565156698227,
-0.008692406117916107,
-0.11385542154312134,
-0.021732257679104805,
-0.08725862205028534,
0.06907729804515839,
0.06325478851795197,
-0.03900527209043503,
0.06260349601507187,
0.10429250448942184,
0.21947720646858215,
0.03568646311759949,
0.051512472331523895,
0.0020613069646060467,
0.05904506519436836,
-0.07478976249694824,
0.045692041516304016,
-0.11382339149713516,
-0.07832324504852295,
0.1226644366979599,
0.019689910113811493,
0.07297158986330032,
-0.016505157575011253,
-0.10331885516643524,
-0.040457092225551605,
0.014854575507342815,
0.007215193472802639,
0.051595430821180344,
0.10830079019069672,
-0.01481243409216404,
-0.05260036885738373,
0.16889122128486633,
-0.05700615048408508,
-0.0074091944843530655,
0.025170443579554558,
-0.06282734125852585,
0.11165116727352142,
0.1308441311120987,
-0.03008747659623623,
-0.05748008191585541,
-0.043429452925920486,
-0.07426872849464417,
0.020695094019174576,
-0.029461339116096497,
-0.09610763192176819,
0.04687391594052315,
-0.09199371188879013,
-0.009221872314810753,
-0.15914687514305115,
-0.2732723355293274,
0.028880806639790535,
0.1099705696105957,
-0.04775717109441757,
-0.010897220112383366,
-0.04060354456305504,
-0.04495692625641823,
0.03690146282315254,
-0.02800651825964451,
-0.02945534884929657,
-0.06664586812257767,
0.03813721239566803,
-0.0907387062907219,
0.08022989332675934,
-0.20019564032554626,
0.0424712598323822,
-0.12293987721204758,
0.015613640658557415,
-0.05597015097737312,
0.08863970637321472,
-0.057913295924663544,
0.11447566002607346,
-0.09864538162946701,
-0.03307122737169266,
-0.040447935461997986,
-0.0030373139306902885,
0.05192897841334343,
0.15694484114646912,
-0.15412242710590363,
-0.01294700801372528,
0.12541262805461884,
-0.06435218453407288,
-0.09728321433067322,
0.1601819545030594,
0.0019289206247776747,
0.05206380784511566,
0.05504344776272774,
0.10163421928882599,
0.16950231790542603,
-0.035592738538980484,
0.061408013105392456,
0.12900902330875397,
-0.05018016695976257,
-0.07108292728662491,
0.04712047055363655,
0.0394124798476696,
-0.18031586706638336,
0.08370959758758545,
-0.02986804023385048,
0.11085688322782516,
-0.011691057123243809,
-0.0369698740541935,
-0.10300877690315247,
-0.05318748578429222,
0.07697949558496475,
0.0010163604747503996,
0.017092108726501465,
-0.013637850061058998,
-0.0496663935482502,
0.06684815883636475,
0.14306817948818207,
-0.06737542897462845,
0.049495868384838104,
-0.007475874852389097,
0.07643717527389526,
-0.07508084177970886,
0.06670340150594711,
-0.12248501181602478,
-0.016350334510207176,
-0.010592362843453884,
-0.004430505912750959,
0.06739301979541779,
0.09013595432043076,
0.07601623237133026,
0.020741518586874008,
-0.036937981843948364,
-0.015639444813132286,
0.10015074908733368,
-0.022676141932606697,
-0.055502552539110184,
-0.10683953762054443,
-0.0010471605928614736,
-0.0276025403290987,
0.11101257055997849,
-0.08772662281990051,
0.04701822251081467,
-0.10501879453659058,
0.061816535890102386,
-0.019547779113054276,
0.04523617774248123,
0.02663433365523815,
-0.038142528384923935,
-0.047313764691352844,
-0.06941383332014084,
0.11269930005073547,
0.04164068400859833,
-0.05904494225978851,
0.10490021109580994,
-0.08102574944496155,
0.08248374611139297,
0.15060099959373474,
0.014480621553957462,
0.03477301076054573,
-0.000052691328164655715,
-0.032630518078804016,
-0.03730584308505058,
0.06925950944423676,
-0.002884392160922289,
0.03032616712152958,
-0.004318554420024157,
0.1589556336402893,
-0.08080916851758957,
-0.008343937806785107,
-0.0015402535209432244,
-0.07647676765918732,
0.01297608483582735,
0.09547118097543716,
0.007903673686087132,
-0.13347004354000092,
0.06859608739614487,
0.2060413658618927,
-0.12557105720043182,
0.13257360458374023,
-0.0059647150337696075,
-0.028590895235538483,
0.029978377744555473,
0.001171910553239286,
-0.009658229537308216,
-0.023470239713788033,
-0.04713864251971245,
0.03651341050863266,
0.07195074111223221,
-0.0050133634358644485,
0.030961379408836365,
-0.10233411192893982,
0.015190436504781246,
-0.040788229554891586,
-0.05692525953054428,
-0.024616142734885216,
0.08164443075656891,
-0.031443774700164795,
0.07603359967470169,
-0.04870373383164406,
-0.10409664362668991,
0.04273726046085358,
0.0308853592723608,
-0.05295952409505844,
0.1394040584564209,
-0.13552071154117584,
-0.1730375438928604,
-0.19969718158245087,
-0.10296663641929626,
-0.1587868630886078,
-0.014830291271209717,
0.0962480902671814,
-0.038572221994400024,
-0.0398561917245388,
-0.10920628905296326,
0.0015316724311560392,
0.06258753687143326,
-0.011307423934340477,
0.015923848375678062,
0.0013200398534536362,
0.07072433084249496,
-0.14707441627979279,
-0.0009354740614071488,
0.004073118790984154,
-0.09581039100885391,
0.09169133007526398,
-0.09118290990591049,
0.06234707310795784,
0.11177986115217209,
-0.00041849276749417186,
-0.009531351737678051,
0.03584941849112511,
0.15823307633399963,
0.058938346803188324,
0.09011008590459824,
0.21294525265693665,
0.008193344809114933,
0.11160565912723541,
0.10918707400560379,
0.004944514483213425,
-0.026640363037586212,
0.02277223952114582,
-0.03122386522591114,
-0.06805054843425751,
-0.18284687399864197,
-0.0575316920876503,
-0.09082615375518799,
0.08409155160188675,
0.0821969136595726,
0.07946371287107468,
0.03675828501582146,
0.13528521358966827,
-0.0627216324210167,
0.10193455964326859,
0.05977938324213028,
0.09440526366233826,
0.07642816007137299,
0.009878689423203468,
0.08842780441045761,
-0.1102299839258194,
0.03515481948852539,
0.15520712733268738,
0.029007168486714363,
0.14306630194187164,
-0.06026710197329521,
0.04313187673687935,
0.0324908085167408,
0.16594798862934113,
0.013801871798932552,
0.13331857323646545,
-0.053957611322402954,
0.03110504522919655,
-0.07056329399347305,
-0.0538073405623436,
-0.06122136861085892,
0.04356390982866287,
-0.09866581112146378,
0.01916525885462761,
0.03045773319900036,
0.0593806654214859,
0.07474974542856216,
0.19659870862960815,
0.08148432523012161,
-0.28519487380981445,
-0.07899752259254456,
0.054739903658628464,
0.044732097536325455,
-0.037702687084674835,
0.03170064464211464,
0.020152030512690544,
-0.026693303138017654,
0.051495518535375595,
-0.04386981576681137,
0.14554819464683533,
0.04679398611187935,
0.035234373062849045,
0.04171369597315788,
0.1676771640777588,
0.03846704959869385,
0.08423542976379395,
-0.20304834842681885,
0.04529606178402901,
-0.00439444649964571,
-0.015220495872199535,
-0.04323315620422363,
-0.011265229433774948,
0.1261180192232132,
0.1394902616739273,
0.03245087340474129,
0.03498829901218414,
0.08115977048873901,
-0.021573996171355247,
-0.1363663673400879,
0.041701506823301315,
0.0008114039083011448,
0.00559498043730855,
0.02521539479494095,
-0.09840565174818039,
-0.038026113063097,
0.008912217803299427,
0.08137678354978561,
-0.06595814228057861,
-0.08056410402059555,
-0.005048065446317196,
0.07640618830919266,
-0.05107653886079788,
-0.01549489051103592,
-0.007060895673930645,
-0.03503837063908577,
0.14813293516635895,
0.028230953961610794,
-0.08168717473745346,
-0.09587656706571579,
-0.07299599051475525,
0.1316959708929062,
-0.09625513106584549,
0.008538281545042992,
-0.08503716439008713,
-0.02159002050757408,
0.015740355476737022,
-0.2537616491317749,
0.06667070090770721,
-0.10442113876342773,
-0.05958615615963936,
-0.00110432633664459,
0.04557592421770096,
-0.08588705211877823,
0.01747012510895729,
-0.0017733823042362928,
-0.030997246503829956,
-0.1110881119966507,
-0.11377303302288055,
-0.10028457641601562,
0.15092593431472778,
-0.005811788607388735,
0.06464394927024841,
-0.10434228926897049,
-0.05090804025530815,
0.006358814425766468,
0.04276435822248459,
0.06955129653215408,
0.16275866329669952,
-0.06082326918840408,
0.12147948145866394,
0.23986698687076569,
-0.04875539243221283,
-0.310397207736969,
-0.12102954089641571,
-0.04917197301983833,
-0.05386042967438698,
-0.038540951907634735,
-0.07296334952116013,
0.11027413606643677,
0.06478892266750336,
-0.019500456750392914,
0.10755444318056107,
-0.24270063638687134,
-0.09559372812509537,
0.12614065408706665,
0.040366653352975845,
0.3099881410598755,
-0.12715090811252594,
-0.046082571148872375,
-0.13589505851268768,
-0.23815593123435974,
0.06549779325723648,
-0.2168055772781372,
0.10653624683618546,
-0.07331762462854385,
0.030833762139081955,
-0.017922328785061836,
-0.02642514556646347,
0.0981639176607132,
0.0005304670776240528,
0.07840649783611298,
-0.1235123947262764,
0.08106820285320282,
0.10436086356639862,
-0.085138700902462,
0.19875457882881165,
-0.15944819152355194,
0.09563346952199936,
-0.09221349656581879,
0.013323906809091568,
-0.04998236894607544,
0.004274250939488411,
0.018060319125652313,
-0.03460989519953728,
-0.04270178824663162,
-0.014099779538810253,
0.047072287648916245,
-0.007725915871560574,
0.08212380111217499,
0.04765874892473221,
0.018281592056155205,
0.21039186418056488,
0.010677648708224297,
-0.11393151432275772,
-0.007252685725688934,
-0.05413800850510597,
-0.05050542205572128,
0.07226235419511795,
-0.19432176649570465,
0.041654348373413086,
0.06616544723510742,
-0.038613393902778625,
0.07948454469442368,
0.03995194286108017,
0.04242757335305214,
0.01919802650809288,
0.05244576185941696,
-0.13725917041301727,
-0.05817713588476181,
-0.03554706275463104,
-0.03170750290155411,
-0.0681091696023941,
0.08132953196763992,
0.18216344714164734,
-0.07273183017969131,
0.0007259888807311654,
0.015126064419746399,
0.03579018637537956,
-0.061049818992614746,
0.07170125842094421,
0.05016390606760979,
-0.015354677103459835,
-0.14260387420654297,
0.1534527987241745,
-0.010639455169439316,
0.036594923585653305,
-0.022917969152331352,
0.13433308899402618,
-0.1598886251449585,
-0.11554402112960815,
-0.00648321770131588,
0.06624826788902283,
-0.12867063283920288,
-0.009133485145866871,
-0.04542002081871033,
-0.042221639305353165,
0.044472940266132355,
0.016664691269397736,
0.07436350733041763,
0.03735337778925896,
-0.05509765446186066,
-0.04509327560663223,
-0.036987729370594025,
0.0224673580378294,
0.08375038206577301,
0.06513456255197525,
-0.14335960149765015,
-0.00917748548090458,
-0.017673993483185768,
0.0861695408821106,
-0.060745928436517715,
-0.0005155164399184287,
-0.11957097053527832,
-0.026633353903889656,
-0.30260297656059265,
0.05156268924474716,
-0.06161533668637276,
0.04077434912323952,
-0.021954499185085297,
-0.04767646640539169,
-0.040830958634614944,
0.055511508136987686,
-0.07080866396427155,
-0.04961847513914108,
-0.014789511449635029,
0.03215930238366127,
-0.08844980597496033,
-0.048427876085042953,
0.02294343151152134,
-0.0383138433098793,
0.08181773126125336,
0.09930417686700821,
-0.12207183241844177,
0.043666478246450424,
-0.16976632177829742,
-0.09328657388687134,
0.0511731281876564,
0.03182261437177658,
0.02053319662809372,
0.05511874333024025,
-0.000017222086171386763,
0.022297142073512077,
0.03421321138739586,
-0.023972254246473312,
0.09758651256561279,
-0.06141592562198639,
-0.008575343526899815,
-0.14047621190547943,
-0.0287553109228611,
-0.06888572126626968,
-0.025888903066515923,
0.10848496109247208,
0.133092001080513,
0.17548176646232605,
-0.04390913248062134,
0.0008702576160430908,
-0.16601674258708954,
-0.0306985042989254,
0.03453681990504265,
-0.14918455481529236,
-0.1323358565568924,
-0.09609081596136093,
0.015670424327254295,
0.0026414855383336544,
0.06946573406457901,
0.008354519493877888,
-0.06954529136419296,
-0.04136425256729126,
0.0976894423365593,
-0.030233154073357582,
-0.05423775315284729,
0.16999571025371552,
0.022013919427990913,
0.02056312933564186,
-0.10283123701810837,
0.023951411247253418,
0.07926694303750992,
0.017128270119428635,
0.009847942739725113,
0.08868252485990524,
0.008299187757074833,
0.12816600501537323,
0.010672035627067089,
0.052668776363134384,
-0.0071729738265275955,
0.0351073332130909,
-0.009520665742456913,
0.11187288910150528,
-0.05441790819168091,
0.1519031673669815,
0.1493932157754898,
-0.0562930703163147,
0.011563695967197418,
-0.009990989230573177,
-0.04175518453121185,
-0.12498693913221359,
-0.16910429298877716,
-0.10806955397129059,
-0.1772766262292862,
-0.015907583758234978,
-0.06271762400865555,
0.014806704595685005,
0.10830185562372208,
0.01583118736743927,
0.024116728454828262,
0.015014135278761387,
-0.035180360078811646,
-0.08263327926397324,
0.04254758358001709,
-0.03124827705323696,
-0.0704975500702858,
0.10386229306459427,
-0.0394752137362957,
0.07519165426492691,
-0.017615221440792084,
0.022367779165506363,
0.05239016190171242,
0.09647808223962784,
0.06462987512350082,
-0.0879109725356102,
-0.09454665333032608,
-0.04008123651146889,
0.08742018043994904,
-0.020372161641716957,
0.12186599522829056,
0.06533398479223251,
-0.021871894598007202,
0.03011886030435562,
0.18122035264968872,
-0.08509692549705505,
-0.11814209818840027,
-0.10977815091609955,
0.1484653651714325,
-0.04297897219657898,
0.019593950361013412,
-0.0012548234080895782,
-0.028775369748473167,
-0.013380155898630619,
0.222654327750206,
0.24959421157836914,
-0.1120804026722908,
-0.011801895685493946,
0.004182577133178711,
0.008453677408397198,
-0.009975990280508995,
0.180144265294075,
0.13939931988716125,
0.05098247528076172,
-0.04401523619890213,
-0.009133129380643368,
-0.018557237461209297,
-0.022529128938913345,
-0.11329807341098785,
-0.007670168299227953,
-0.08868810534477234,
-0.06370635330677032,
-0.02564048394560814,
0.0030861925333738327,
-0.11694598197937012,
-0.029135361313819885,
-0.03893841803073883,
-0.01809944398701191,
-0.0411134772002697,
-0.08472193777561188,
0.057926248759031296,
0.06790735572576523,
0.028039906173944473,
-0.08002106845378876,
0.06410335004329681,
0.1592608243227005,
-0.08483294397592545,
-0.1750742793083191,
-0.04869939759373665,
0.042061638087034225,
-0.012890013866126537,
0.07329064607620239,
0.018502915278077126,
0.020099684596061707,
0.05850132554769516,
-0.007464307360351086,
-0.14193134009838104,
0.06924732029438019,
-0.04141705483198166,
-0.051124174147844315,
0.018918469548225403,
0.00215438986197114,
-0.058000657707452774,
-0.051715392619371414,
0.03336942940950394,
0.0010904730297625065,
-0.0309377983212471,
0.10268773883581161,
-0.03029606305062771,
-0.04499005526304245,
0.007277834694832563,
-0.06417161226272583,
0.11105291545391083,
0.11726102977991104,
-0.032464850693941116,
-0.019011152908205986,
-0.09580501168966293,
0.009066950529813766,
0.004691624082624912,
-0.11326763778924942,
0.005050357896834612,
-0.025916555896401405,
-0.04358738660812378,
-0.023736752569675446,
0.044587016105651855,
-0.19988895952701569,
-0.0425775870680809,
-0.12238695472478867,
-0.023607458919286728,
-0.0945751965045929,
0.11643242090940475,
0.050872836261987686,
0.03663148730993271,
0.004423813428729773,
-0.11458392441272736,
-0.007241557817906141,
0.05575757101178169,
-0.06544741988182068,
-0.10101112723350525
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the IMDB dataset (per the model name and the card's dataset tag).
It achieves the following results on the evaluation set:
- Loss: 2.4253
- Perplexity: 11.20
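
The card lists no inference example; a minimal, hedged fill-mask sketch with the `transformers` pipeline is shown below. The repo id is the one this card belongs to; the example sentence is an illustrative assumption.

```python
# Hedged sketch (not from the card): masked-token prediction with the pipeline API.
from transformers import pipeline

mask_filler = pipeline(
    "fill-mask",
    model="Skier8402/distilbert-base-uncased-finetuned-imdb",  # repo id from this card
)

# [MASK] is DistilBERT's mask token; the sentence is only an example.
for pred in mask_filler("This movie was absolutely [MASK]."):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```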
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP
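
For reference, the hyperparameters above map onto a `TrainingArguments` object roughly as in the hedged sketch below; the output directory is a placeholder, and `fp16=True` stands in for the "Native AMP" mixed-precision setting. The listed Adam betas and epsilon are the `transformers` defaults, so they need no explicit arguments.

```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-imdb",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    fp16=True,  # "Native AMP" mixed precision
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults.
)
```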
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.6962 | 1.0 | 157 | 2.5423 |
| 2.5701 | 2.0 | 314 | 2.4638 |
| 2.5417 | 3.0 | 471 | 2.4253 |
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0 | {"language": ["en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["generated_from_trainer", "huggingface_course", "movies"], "datasets": ["imdb"], "metrics": ["perplexity"], "base_model": "distilbert-base-uncased", "pipeline_tag": "fill-mask", "model-index": [{"name": "distilbert-base-uncased-finetuned-imdb", "results": []}]} | fill-mask | Skier8402/distilbert-base-uncased-finetuned-imdb | [
"transformers",
"safetensors",
"distilbert",
"fill-mask",
"generated_from_trainer",
"huggingface_course",
"movies",
"en",
"dataset:imdb",
"base_model:distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:48:14+00:00 | [] | [
"en"
] | TAGS
#transformers #safetensors #distilbert #fill-mask #generated_from_trainer #huggingface_course #movies #en #dataset-imdb #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| distilbert-base-uncased-finetuned-imdb
======================================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 2.4253
* Perplexity: 11.20
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.38.0.dev0
* Pytorch 2.1.2+cu121
* Datasets 2.16.1
* Tokenizers 0.15.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.0"
] | [
"TAGS\n#transformers #safetensors #distilbert #fill-mask #generated_from_trainer #huggingface_course #movies #en #dataset-imdb #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.0"
] | [
86,
113,
4,
38
] | [
"passage: TAGS\n#transformers #safetensors #distilbert #fill-mask #generated_from_trainer #huggingface_course #movies #en #dataset-imdb #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.38.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.0"
] | [
-0.1603843867778778,
0.16201196610927582,
-0.0023044166155159473,
0.11215280741453171,
0.13463111221790314,
0.030226008966565132,
0.1523425430059433,
0.12377489358186722,
-0.04458368197083473,
0.0713699534535408,
0.11442620307207108,
0.07968707382678986,
0.055474426597356796,
0.2275700718164444,
-0.04896022379398346,
-0.22332213819026947,
0.028375860303640366,
0.011951609514653683,
-0.05516831576824188,
0.12866544723510742,
0.09776819497346878,
-0.11141278594732285,
0.05842164531350136,
-0.012427086941897869,
-0.16018913686275482,
-0.05206437036395073,
0.01969587430357933,
-0.05821991339325905,
0.13271291553974152,
0.01576032117009163,
0.12492167949676514,
0.04546339064836502,
0.11444756388664246,
-0.15149013698101044,
0.02051919884979725,
0.04063589870929718,
0.002939111553132534,
0.06913237273693085,
0.0661390945315361,
0.020148705691099167,
0.03689049929380417,
-0.07893593609333038,
0.0729316994547844,
0.001828827429562807,
-0.1151018813252449,
-0.22970247268676758,
-0.10711231827735901,
0.046720240265131,
0.059461500495672226,
0.07764609158039093,
-0.007368337828665972,
0.09260575473308563,
-0.062375571578741074,
0.08001107722520828,
0.20382265746593475,
-0.25673797726631165,
-0.0633033737540245,
-0.010284284129738808,
0.053999535739421844,
0.04486588016152382,
-0.10496874153614044,
-0.052595075219869614,
0.027599766850471497,
0.027700914070010185,
0.14503812789916992,
0.0045616808347404,
-0.02430904097855091,
-0.06825348734855652,
-0.1348697990179062,
-0.07819601148366928,
0.13344016671180725,
0.0656389370560646,
-0.06669739633798599,
-0.06822311878204346,
-0.06201056391000748,
-0.1490950733423233,
-0.053426243364810944,
0.04097972437739372,
0.022641360759735107,
-0.03738419711589813,
-0.06671934574842453,
0.01888246461749077,
-0.08617798984050751,
-0.06472515314817429,
0.011740567162632942,
0.13854360580444336,
0.06456215679645538,
0.021845553070306778,
-0.009340723045170307,
0.11275358498096466,
0.01550313364714384,
-0.16885919868946075,
0.005707830190658569,
0.0017968114698305726,
0.015009121038019657,
-0.004198002628982067,
-0.031082578003406525,
0.004559285007417202,
-0.000977327348664403,
0.1832905113697052,
-0.08693789690732956,
0.02586684376001358,
0.011368386447429657,
0.05592409148812294,
-0.10385508835315704,
0.15618683397769928,
-0.06913300603628159,
-0.030524011701345444,
0.06689346581697464,
0.15953534841537476,
0.07475382089614868,
-0.011420268565416336,
-0.09283147752285004,
-0.003033090615645051,
0.11615585535764694,
0.03170343115925789,
-0.006997119635343552,
0.042368173599243164,
-0.10153767466545105,
-0.014832763932645321,
0.11577627062797546,
-0.09868360310792923,
0.009580451063811779,
0.02013222873210907,
-0.05456685647368431,
-0.0386618971824646,
0.06491877883672714,
0.0071760546416044235,
0.005619588773697615,
0.08710914105176926,
-0.07338432967662811,
-0.003444669069722295,
-0.07718474417924881,
-0.10987348854541779,
0.03915376588702202,
-0.06673933565616608,
0.004987855441868305,
-0.10618666559457779,
-0.15150758624076843,
-0.01492281537503004,
0.03628920391201973,
-0.015837684273719788,
-0.07601945102214813,
-0.041387706995010376,
-0.06978794932365417,
0.03664902225136757,
-0.008116759359836578,
0.0645172968506813,
-0.062404509633779526,
0.1245671957731247,
0.03354310244321823,
0.07706131041049957,
-0.031405359506607056,
0.05114978179335594,
-0.07107287645339966,
0.06029628589749336,
-0.17637212574481964,
0.04156915098428726,
-0.06406335532665253,
0.016582168638706207,
-0.06236768886446953,
-0.11359934508800507,
0.02664165198802948,
-0.03560846671462059,
0.08477816730737686,
0.13525451719760895,
-0.162744402885437,
-0.05503195524215698,
0.21701429784297943,
-0.11308111250400543,
-0.1312689483165741,
0.13519088923931122,
-0.02677462249994278,
-0.03760246932506561,
0.005862070247530937,
0.12146685272455215,
0.07869292795658112,
-0.11374428123235703,
-0.014198754914104939,
-0.00608406588435173,
0.08483005315065384,
-0.02920055203139782,
0.13893362879753113,
0.022225264459848404,
-0.0012673009186983109,
0.007607220206409693,
-0.10555887222290039,
0.07991593331098557,
-0.09829367697238922,
-0.08851533383131027,
-0.0235246904194355,
-0.09279129654169083,
0.08505208045244217,
0.04554237797856331,
0.03302667662501335,
-0.08046124130487442,
-0.10817262530326843,
-0.029729437083005905,
0.1232747957110405,
-0.07623877376317978,
0.028503820300102234,
-0.07138029485940933,
0.12713201344013214,
-0.06427008658647537,
-0.03370804339647293,
-0.15869103372097015,
-0.08184093236923218,
0.03203253448009491,
-0.024936256930232048,
-0.03643135353922844,
-0.06152677908539772,
0.0461193323135376,
0.13855133950710297,
-0.060627374798059464,
-0.07751131802797318,
-0.09493325650691986,
0.000773096748162061,
-0.05411341041326523,
-0.19242429733276367,
-0.07428300380706787,
-0.04694168269634247,
0.21991083025932312,
-0.16970746219158173,
0.019948681816458702,
-0.04784967750310898,
0.11347125470638275,
0.02721107192337513,
-0.0514950230717659,
-0.0005032300250604749,
0.0657559260725975,
-0.010729040019214153,
-0.0934913158416748,
0.06435389071702957,
0.040546856820583344,
-0.07080960273742676,
0.011693891137838364,
-0.12541793286800385,
0.14422661066055298,
0.11486146599054337,
0.008183571510016918,
-0.07829535752534866,
0.01790812239050865,
-0.07296524941921234,
-0.030837800353765488,
-0.034237608313560486,
0.021016962826251984,
0.07624682784080505,
0.024111570790410042,
0.14445729553699493,
-0.08070418983697891,
-0.002486050594598055,
0.03700847923755646,
-0.03040706180036068,
-0.014420692808926105,
0.08383872359991074,
0.08772391080856323,
-0.05745340511202812,
0.13356196880340576,
0.13562355935573578,
-0.06982050836086273,
0.11831943690776825,
-0.05648200586438179,
-0.0820106565952301,
-0.025258878245949745,
0.022537553682923317,
0.041967008262872696,
0.13322089612483978,
-0.06102054566144943,
0.006527912802994251,
0.04203315079212189,
-0.025436298921704292,
-0.011480678804218769,
-0.22405698895454407,
-0.05223216861486435,
0.03162933513522148,
-0.049624014645814896,
-0.04284634068608284,
0.01370404101908207,
-0.0013818338047713041,
0.10485754162073135,
0.010842268355190754,
-0.07969257235527039,
0.03535139560699463,
-0.010925588198006153,
-0.08265092968940735,
0.18514610826969147,
-0.09226792305707932,
-0.19135473668575287,
-0.12979952991008759,
-0.024461863562464714,
-0.013099768199026585,
0.008939503692090511,
0.04995507001876831,
-0.031544964760541916,
-0.03891638666391373,
-0.07443653792142868,
-0.043105076998472214,
0.026249006390571594,
0.01888122409582138,
0.029086988419294357,
-0.013414857909083366,
0.09156198799610138,
-0.08242758363485336,
-0.012325874529778957,
0.005376671440899372,
-0.04770093038678169,
0.0854916051030159,
0.0254944059997797,
0.11349865794181824,
0.10890145599842072,
-0.018942059949040413,
-0.004428636282682419,
-0.02455238066613674,
0.22892698645591736,
-0.06777440756559372,
-0.02014387771487236,
0.16801708936691284,
-0.02278531715273857,
0.0838002860546112,
0.15032796561717987,
0.03974737599492073,
-0.07212385535240173,
0.010513746179640293,
-0.02341715805232525,
-0.03229370340704918,
-0.18623919785022736,
-0.0377461239695549,
-0.06042424216866493,
-0.02197570726275444,
0.1105666533112526,
0.005729996599256992,
-0.020452424883842468,
0.06259654462337494,
-0.03482259064912796,
0.048676688224077225,
0.004651328083127737,
0.0767085924744606,
0.06515022367238998,
0.05544476583600044,
0.1185816302895546,
-0.013610804453492165,
-0.019135607406497,
0.03228679671883583,
-0.012505496852099895,
0.21482565999031067,
-0.05183447524905205,
0.12561078369617462,
0.051543984562158585,
0.217286616563797,
0.02654515951871872,
0.06690356135368347,
0.020609015598893166,
0.00682805385440588,
-0.009413812309503555,
-0.0447968952357769,
-0.07200049608945847,
0.010428554378449917,
-0.028910258784890175,
0.04126375913619995,
-0.12891815602779388,
0.06251443922519684,
0.0012617576867341995,
0.270332008600235,
0.06637324392795563,
-0.3612574636936188,
-0.09517649561166763,
-0.004597539082169533,
0.020601749420166016,
-0.0413605161011219,
-0.011284442618489265,
0.12293803691864014,
-0.10913171619176865,
0.05837070941925049,
-0.07755058258771896,
0.060295041650533676,
-0.03164555877447128,
-0.0035052031744271517,
0.056759241968393326,
0.07361051440238953,
0.022256910800933838,
0.05064048618078232,
-0.23464807868003845,
0.24542826414108276,
-0.03130678832530975,
0.06602489948272705,
-0.036632731556892395,
0.009540996514260769,
0.058211181312799454,
0.041658058762550354,
0.12277694791555405,
0.005539776291698217,
-0.005029602441936731,
-0.18964368104934692,
-0.12377721071243286,
0.019861755892634392,
0.0644727349281311,
-0.04234364628791809,
0.10969333350658417,
-0.034905336797237396,
-0.03046562522649765,
0.044873133301734924,
0.01391509361565113,
-0.06833651661872864,
-0.08772625029087067,
0.02952723018825054,
0.003328539663925767,
0.01629137247800827,
-0.11928775906562805,
-0.1356414556503296,
-0.06834253668785095,
0.12927694618701935,
-0.05118435248732567,
-0.061277907341718674,
-0.11509872227907181,
0.0734914168715477,
0.1052950993180275,
-0.10100112110376358,
0.08880594372749329,
-0.019288869574666023,
0.1549491435289383,
-0.023935772478580475,
-0.05520028620958328,
0.07861020416021347,
-0.0862557515501976,
-0.2110317349433899,
-0.06772197037935257,
0.12955515086650848,
-0.013603655621409416,
0.06528104096651077,
-0.02413276769220829,
0.03184053674340248,
-0.005996904335916042,
-0.06269511580467224,
0.0049984208308160305,
0.03840453177690506,
0.08711913228034973,
0.024771859869360924,
-0.05043414607644081,
-0.03420741483569145,
-0.046582434326410294,
-0.026079781353473663,
0.15371859073638916,
0.3180357813835144,
-0.10344523936510086,
0.007999214343726635,
0.025982758030295372,
-0.02059345878660679,
-0.21094313263893127,
-0.03711703047156334,
0.11076948046684265,
0.022171862423419952,
0.022565020248293877,
-0.14370432496070862,
0.030957747250795364,
0.06884194165468216,
-0.0419645756483078,
0.09357339888811111,
-0.27803513407707214,
-0.12907513976097107,
0.09354992210865021,
0.1647690236568451,
0.11456667631864548,
-0.13400444388389587,
-0.02814481221139431,
0.00399739621207118,
-0.1390462964773178,
0.094261035323143,
-0.02352248504757881,
0.10861840844154358,
-0.036314744502305984,
0.02927400916814804,
0.001390226068906486,
-0.07039526849985123,
0.13896305859088898,
-0.04329322651028633,
0.057614121586084366,
-0.041418299078941345,
-0.020466282963752747,
0.05851605907082558,
-0.07364115864038467,
0.03802207484841347,
-0.05786195397377014,
0.0578625462949276,
-0.065342478454113,
-0.008905025199055672,
-0.09374070912599564,
0.017780127003788948,
-0.036575011909008026,
-0.029695497825741768,
-0.012221241369843483,
0.06032775714993477,
0.05658816173672676,
-0.0011097700335085392,
0.08654265850782394,
0.041444431990385056,
0.11957049369812012,
0.09261944144964218,
0.021617310121655464,
0.008505752310156822,
-0.06338317692279816,
-0.0470302477478981,
-0.03946338966488838,
0.05466621741652489,
-0.08188948035240173,
0.0189049169421196,
0.11781397461891174,
0.03061797097325325,
0.15242986381053925,
0.05770557373762131,
-0.04342171177268028,
0.023219740018248558,
0.08803536742925644,
-0.1402284950017929,
-0.09806685149669647,
-0.0126345194876194,
0.031010176986455917,
-0.14320829510688782,
-0.0371788889169693,
0.07913557440042496,
-0.08874589204788208,
-0.019094429910182953,
-0.030491678044199944,
0.03357599303126335,
-0.016645684838294983,
0.15615186095237732,
0.0718989297747612,
0.0638425275683403,
-0.10995190590620041,
0.09538538753986359,
0.05153024196624756,
-0.09028510749340057,
-0.0015107168583199382,
0.0665830746293068,
-0.10869614779949188,
-0.023542365059256554,
0.04817486181855202,
0.1068175733089447,
-0.017031138762831688,
-0.05318544805049896,
-0.12409607321023941,
-0.10796232521533966,
0.062035027891397476,
0.08789733052253723,
0.06488366425037384,
0.04549585282802582,
-0.006151583511382341,
-0.00346204312518239,
-0.10782834142446518,
0.11767727881669998,
0.08679592609405518,
0.07869107276201248,
-0.13591979444026947,
0.11473510414361954,
0.006994942203164101,
0.05155613645911217,
-0.013153976760804653,
0.041567977517843246,
-0.05766836181282997,
-0.018911724910140038,
-0.10951179265975952,
0.04578975960612297,
-0.04856236279010773,
0.005212461110204458,
-0.03475818783044815,
-0.05550992488861084,
-0.031775057315826416,
0.038162339478731155,
-0.08548793941736221,
-0.06718640774488449,
-0.020544901490211487,
0.02573721669614315,
-0.12309055030345917,
-0.05404657870531082,
0.046550970524549484,
-0.11472935974597931,
0.08149676024913788,
0.05896097794175148,
0.033062901347875595,
0.015831349417567253,
-0.08865087479352951,
-0.05145982652902603,
0.047371696680784225,
0.012490404769778252,
0.028537441045045853,
-0.1206250935792923,
-0.004802065435796976,
-0.012793779373168945,
-0.019216394051909447,
-0.010548483580350876,
0.09999196231365204,
-0.13421012461185455,
-0.009611163288354874,
-0.003857796546071768,
-0.021685924381017685,
-0.05902586504817009,
0.04185590520501137,
0.07984847575426102,
0.028166456148028374,
0.16957701742649078,
-0.10175996273756027,
0.04616260901093483,
-0.22232027351856232,
-0.015759769827127457,
-0.03118540346622467,
-0.08011119067668915,
-0.11812156438827515,
-0.01653124950826168,
0.07212429493665695,
-0.02709522470831871,
0.06836628913879395,
-0.045130133628845215,
0.04884927719831467,
0.011613664217293262,
-0.015227660536766052,
0.04054068773984909,
0.025970347225666046,
0.169732004404068,
0.022884858772158623,
-0.04346132650971413,
0.036814603954553604,
-0.0013906107051298022,
0.08784176409244537,
0.06201624497771263,
0.1467529833316803,
0.161802276968956,
0.041475165635347366,
0.05902661010622978,
0.03866077959537506,
-0.05154416337609291,
-0.15613137185573578,
0.034257061779499054,
-0.042761411517858505,
0.08787543326616287,
0.00866139866411686,
0.19303034245967865,
0.08677796274423599,
-0.17687025666236877,
0.036076661199331284,
-0.02581297792494297,
-0.07324985414743423,
-0.0789623036980629,
-0.05486857146024704,
-0.08725494146347046,
-0.11506496369838715,
0.019038286060094833,
-0.1105385422706604,
0.017980538308620453,
0.09848573803901672,
0.013383007608354092,
-0.020882008597254753,
0.16326142847537994,
0.04372510313987732,
-0.005025055725127459,
0.07027707993984222,
0.032454557716846466,
-0.005465412046760321,
-0.008382846601307392,
-0.10578028857707977,
0.04635012894868851,
-0.0019147746497765183,
0.07037761807441711,
-0.03758705034852028,
-0.04612881317734718,
0.07888912409543991,
0.03082428313791752,
-0.12446413934230804,
0.024834072217345238,
0.008640000596642494,
0.06042332202196121,
0.09170752018690109,
0.012487963773310184,
0.05756169557571411,
0.005065640434622765,
0.17652806639671326,
-0.053638000041246414,
-0.07413718849420547,
-0.11732063442468643,
0.1767425239086151,
-0.005067566875368357,
-0.04034247249364853,
0.04707298055291176,
-0.08221466094255447,
0.003502647392451763,
0.1374179869890213,
0.15539686381816864,
-0.08141044527292252,
-0.00850384309887886,
0.009941079653799534,
-0.016630664467811584,
-0.05296828970313072,
0.10236663371324539,
0.1241292878985405,
0.042797815054655075,
-0.09288882464170456,
-0.03900492936372757,
-0.06563174724578857,
-0.0036152061074972153,
-0.025773510336875916,
0.01475775707513094,
-0.018984777852892876,
0.003756775753572583,
-0.0763695240020752,
0.03808962553739548,
-0.013109232299029827,
-0.110169917345047,
0.06757993996143341,
-0.19911473989486694,
-0.1771886795759201,
-0.02687692455947399,
0.05273039638996124,
0.026829203590750694,
0.04841219261288643,
-0.019812216982245445,
0.019503669813275337,
0.10617301613092422,
-0.017702585086226463,
-0.03624752536416054,
-0.06009548157453537,
0.06627614051103592,
-0.03451571986079216,
0.22746145725250244,
-0.024854103103280067,
0.07832620292901993,
0.09027652442455292,
0.06845004111528397,
-0.11476223915815353,
0.061941687017679214,
0.0614938884973526,
-0.03618260845541954,
0.02174249477684498,
0.1544831395149231,
-0.05762167274951935,
0.05660233646631241,
0.03726207837462425,
-0.10012587904930115,
0.0015151113038882613,
-0.04169270023703575,
-0.07131699472665787,
-0.04497702792286873,
-0.02173076942563057,
-0.05023378133773804,
0.14102761447429657,
0.17953072488307953,
-0.058397240936756134,
-0.011138088069856167,
-0.0473807156085968,
0.01240726187825203,
0.09868785738945007,
0.07608263194561005,
-0.015079534612596035,
-0.21494676172733307,
0.022988054901361465,
-0.011373545043170452,
0.0300854854285717,
-0.24227239191532135,
-0.08792358636856079,
-0.042860809713602066,
-0.06466882675886154,
-0.09815928339958191,
0.09365256875753403,
0.024271905422210693,
0.04902874305844307,
-0.04704718291759491,
-0.028158364817500114,
-0.07103556394577026,
0.14952802658081055,
-0.14921334385871887,
-0.08241154253482819
] |
null | null | null | INX-TEXT/Bailong-instruct-7B converted to GGUF format; a hedged loading sketch follows below.
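The card itself ships no usage snippet; the following is a minimal, hedged sketch of loading one of the converted GGUF files with `llama-cpp-python`. The exact .gguf filename is an assumption (check the repo's file list), and the prompt is only an example.

```python
# Hedged sketch (not from the card): running a GGUF file with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="Bailong-instruct-7B.Q4_K_M.gguf",  # hypothetical filename; see the repo files
    n_ctx=2048,
)

out = llm("請用繁體中文簡單介紹一下台灣。", max_tokens=128)
print(out["choices"][0]["text"])
```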
original model link [https://huggingface.co/INX-TEXT/Bailong-instruct-7B] | {"language": ["en", "zh"], "license": "llama2"} | null | NapYang/Bailong-instruct-7B.GGUF | [
"gguf",
"en",
"zh",
"license:llama2",
"region:us"
] | 2024-02-15T06:50:15+00:00 | [] | [
"en",
"zh"
] | TAGS
#gguf #en #zh #license-llama2 #region-us
| Transform INX-TEXT/Bailong-instruct-7B to GGUF format
original model link [URL | [] | [
"TAGS\n#gguf #en #zh #license-llama2 #region-us \n"
] | [
20
] | [
"passage: TAGS\n#gguf #en #zh #license-llama2 #region-us \n"
] | [
-0.009824411943554878,
-0.003945640753954649,
-0.0078685712069273,
-0.015466614626348019,
0.008399118669331074,
0.04414432868361473,
0.12322155386209488,
0.05117114260792732,
0.23337407410144806,
-0.005853279028087854,
0.13256247341632843,
0.04791627451777458,
0.03526690602302551,
-0.06025400757789612,
0.018587013706564903,
-0.16490840911865234,
0.04358631372451782,
-0.027416743338108063,
-0.023277482017874718,
0.01887809857726097,
0.02433103881776333,
-0.035020459443330765,
0.033749356865882874,
-0.001967530930414796,
-0.06253597885370255,
0.030600527301430702,
0.002240664092823863,
-0.017818724736571312,
0.0940474271774292,
0.09770631790161133,
0.012173793278634548,
0.06327855587005615,
-0.016609622165560722,
-0.21817302703857422,
0.0263129323720932,
-0.1084115207195282,
-0.14453904330730438,
0.009712958708405495,
0.024161646142601967,
-0.04746881127357483,
0.13495992124080658,
0.08661964535713196,
-0.1326105296611786,
0.06048642471432686,
-0.18630553781986237,
-0.20867450535297394,
-0.08582276850938797,
0.0854739174246788,
-0.003444055328145623,
0.003449485870078206,
0.05288614705204964,
0.06984985619783401,
-0.18367083370685577,
-0.03193887695670128,
0.1667037159204483,
-0.35295239090919495,
0.05306600406765938,
0.17355291545391083,
0.05386762693524361,
0.005758903454989195,
-0.07706953585147858,
0.15959787368774414,
0.033598512411117554,
-0.08293289691209793,
-0.15382631123065948,
-0.056638266891241074,
0.14847326278686523,
0.14926546812057495,
-0.04986316338181496,
-0.055840641260147095,
0.1931774616241455,
0.05502718687057495,
-0.026729097589850426,
0.0914350226521492,
-0.0011577106779441237,
0.01601373963057995,
0.025360049679875374,
0.06395884603261948,
-0.000987494713626802,
0.15404857695102692,
0.1383153647184372,
-0.06689544767141342,
-0.12598061561584473,
-0.03705112636089325,
-0.21830272674560547,
0.22036468982696533,
0.026345042511820793,
0.11151444166898727,
-0.14570041000843048,
0.015979737043380737,
-0.15308915078639984,
-0.0361475832760334,
-0.08974514156579971,
-0.017612231895327568,
0.05547727271914482,
0.04139891639351845,
0.017581358551979065,
0.12454866617918015,
0.18901704251766205,
0.16288389265537262,
-0.012827154248952866,
0.031132971867918968,
0.024361999705433846,
0.17559827864170074,
0.0006115740980021656,
0.03244452178478241,
0.07362973690032959,
0.07690656930208206,
0.022687090560793877,
-0.17829781770706177,
0.024330630898475647,
-0.037597741931676865,
-0.152344211935997,
-0.04296750947833061,
-0.12675999104976654,
0.13607974350452423,
-0.0651857927441597,
-0.054315727204084396,
-0.019838569685816765,
0.051526185125112534,
0.03471589833498001,
0.011298087425529957,
-0.02659623883664608,
0.011806335300207138,
-0.0123324329033494,
-0.0670536607503891,
-0.09766217321157455,
0.054615557193756104,
0.12131133675575256,
0.08228165656328201,
-0.13195611536502838,
0.005455428268760443,
0.04068860039114952,
0.09867044538259506,
0.0812879428267479,
-0.05082537606358528,
0.08419972658157349,
-0.12354723364114761,
-0.12839312851428986,
0.035194359719753265,
-0.0023739999160170555,
-0.007684150245040655,
0.08385470509529114,
0.07968101650476456,
0.03557159751653671,
-0.02606605738401413,
-0.07628437876701355,
-0.018249522894620895,
-0.08313148468732834,
0.12407451122999191,
0.012920665554702282,
-0.026379426941275597,
-0.2235078066587448,
-0.019218919798731804,
-0.03678274527192116,
0.09339378029108047,
0.13353313505649567,
-0.06614167243242264,
-0.07952797412872314,
0.10794750601053238,
-0.007329953834414482,
0.046264782547950745,
-0.09659222513437271,
0.01804034411907196,
-0.05270533636212349,
0.09124398231506348,
-0.075161412358284,
-0.07987228780984879,
0.16562621295452118,
-0.09620610624551773,
-0.09814208000898361,
0.024318600073456764,
0.021156011149287224,
0.004861159715801477,
0.052908480167388916,
0.38126158714294434,
-0.04458966851234436,
-0.1868540197610855,
0.015380318276584148,
0.130705788731575,
-0.13249818980693817,
-0.171650230884552,
0.15481670200824738,
-0.1040218248963356,
-0.09290290623903275,
0.0250651016831398,
-0.026861533522605896,
0.13294120132923126,
-0.04367716610431671,
-0.07425553351640701,
0.03461790084838867,
-0.04388988018035889,
0.011689059436321259,
0.018109163269400597,
0.09608680009841919,
-0.058564912527799606,
0.037030819803476334,
-0.04961373284459114,
0.06832769513130188,
0.1308353692293167,
-0.016301168128848076,
-0.10959672927856445,
0.10213276743888855,
0.05238005891442299,
0.008731814101338387,
0.03429010882973671,
-0.0736655667424202,
-0.029779663309454918,
-0.03705664351582527,
0.09719318896532059,
0.09121045470237732,
0.043518438935279846,
0.0032677610870450735,
0.0015195347368717194,
0.028760559856891632,
0.03143295645713806,
0.03277643024921417,
0.02171635627746582,
-0.0634312555193901,
0.06263827532529831,
0.013235819526016712,
0.026150189340114594,
-0.10913699865341187,
-0.02639167010784149,
0.2131742388010025,
-0.022507183253765106,
-0.044580716639757156,
0.025358324870467186,
-0.011606454849243164,
-0.04652892425656319,
0.06500796228647232,
0.0242320504039526,
0.11252826452255249,
0.006766984239220619,
-0.11352594941854477,
0.14133109152317047,
0.015932423993945122,
0.2652145326137543,
0.12034004926681519,
-0.0005950813065283,
0.050731539726257324,
-0.11355199664831161,
-0.030814774334430695,
0.012150666676461697,
0.05990182235836983,
-0.0027346916031092405,
0.013122611679136753,
-0.07336244732141495,
0.022271180525422096,
-0.03424288332462311,
-0.0126214399933815,
-0.0195477157831192,
-0.07538595050573349,
-0.05706958845257759,
0.09021657705307007,
0.17676371335983276,
-0.2363923341035843,
0.14063145220279694,
0.2911156117916107,
0.15102903544902802,
0.24962551891803741,
-0.10980674624443054,
0.02897345833480358,
-0.08183363080024719,
0.05104343220591545,
-0.027486547827720642,
0.21348154544830322,
-0.10274621099233627,
-0.013188793323934078,
0.015126257203519344,
0.0031287625897675753,
0.07641244679689407,
-0.17135518789291382,
-0.1566690057516098,
-0.04738110676407814,
-0.060923848301172256,
-0.07248007506132126,
0.13791392743587494,
-0.14143826067447662,
0.029081085696816444,
0.02536596544086933,
-0.06957816332578659,
0.11090997606515884,
0.0014410187723115087,
-0.060880083590745926,
0.06289318948984146,
-0.0923200249671936,
-0.09086209535598755,
-0.057446498423814774,
-0.09227338433265686,
-0.042717840522527695,
0.0038079426158219576,
0.0443844348192215,
-0.07034485787153244,
-0.007785299327224493,
0.09906338900327682,
0.01210757065564394,
-0.14754880964756012,
0.016047460958361626,
0.01478691678494215,
0.03315401077270508,
-0.11569035053253174,
-0.07409185916185379,
-0.09111728519201279,
-0.11604569107294083,
-0.07842191308736801,
0.069817453622818,
-0.06453526020050049,
0.0822635293006897,
0.09477507323026657,
0.08033626526594162,
0.11505132913589478,
-0.0259584691375494,
0.14619122445583344,
-0.08231785893440247,
-0.06399601697921753,
0.1271817684173584,
0.031709156930446625,
0.050808150321245193,
0.1209394633769989,
0.09672423452138901,
-0.11305869370698929,
-0.05437573790550232,
-0.015643948689103127,
-0.14864800870418549,
-0.11977055668830872,
-0.06704166531562805,
-0.08643976598978043,
0.06633440405130386,
-0.062071267515420914,
0.10874953120946884,
0.1883133202791214,
0.02873455174267292,
0.06400325149297714,
-0.09805038571357727,
-0.03739834204316139,
-0.009432070888578892,
0.1336861401796341,
-0.03198372945189476,
-0.033462703227996826,
-0.1162111759185791,
-0.020249051973223686,
0.13471072912216187,
0.10405757278203964,
0.0784861296415329,
0.23131687939167023,
0.046751558780670166,
0.15537001192569733,
0.14641036093235016,
0.13782334327697754,
-0.0432082898914814,
-0.00421597296372056,
-0.0629204586148262,
-0.027756134048104286,
-0.03616230562329292,
0.03036409616470337,
0.03842965140938759,
0.07069987803697586,
-0.1881759911775589,
0.05480676516890526,
-0.2661299705505371,
0.0583207905292511,
-0.149963840842247,
0.03928570821881294,
0.11865367740392685,
0.06797108054161072,
0.07987934350967407,
0.025570819154381752,
0.014500786550343037,
0.07846200466156006,
-0.02460709773004055,
-0.1305973082780838,
0.03722148761153221,
0.06496939808130264,
0.03447948768734932,
0.056477148085832596,
0.059935566037893295,
-0.11631181091070175,
-0.11304086446762085,
0.030530452728271484,
0.12126971036195755,
-0.2366439253091812,
0.228202685713768,
0.04308090731501579,
-0.0464308001101017,
-0.061209190636873245,
-0.08320439606904984,
0.044071901589632034,
0.0881897583603859,
0.14929941296577454,
0.08488848805427551,
-0.06818404048681259,
-0.07746680825948715,
0.017784269526600838,
0.027765484526753426,
0.09885603189468384,
-0.06356111913919449,
-0.14333736896514893,
-0.027254370972514153,
0.04326664283871651,
-0.01994325965642929,
0.16100484132766724,
-0.07529640197753906,
-0.05074185132980347,
0.04974616691470146,
0.03081476129591465,
0.0006961822509765625,
-0.06740758568048477,
0.03839633986353874,
-0.08795785158872604,
0.00828325655311346,
-0.14379702508449554,
-0.014323382638394833,
-0.09669371694326401,
-0.10329917818307877,
-0.0029686156194657087,
-0.08701679110527039,
-0.009537040255963802,
-0.036187414079904556,
-0.15964561700820923,
-0.12706750631332397,
-0.17049188911914825,
0.08219817280769348,
-0.011600587517023087,
0.0034669663291424513,
-0.026498781517148018,
0.19666416943073273,
-0.011218319647014141,
0.03737363964319229,
-0.002645297208800912,
0.001985497772693634,
-0.0024966297205537558,
-0.17824731767177582,
0.11378725618124008,
-0.12781879305839539,
-0.05566940829157829,
-0.022527748718857765,
0.01976734586060047,
0.06598174571990967,
0.04159415513277054,
-0.1515653431415558,
0.18404507637023926,
0.3551383316516876,
-0.01214658934623003,
0.20880423486232758,
0.2151344269514084,
-0.05570060387253761,
-0.2154179811477661,
-0.19381409883499146,
-0.22409391403198242,
-0.04180561378598213,
0.012213033623993397,
-0.1605069488286972,
0.04316071793437004,
0.16251955926418304,
-0.11129610985517502,
0.35324522852897644,
-0.32162919640541077,
-0.05641666054725647,
0.11923899501562119,
-0.030001496896147728,
0.5806500315666199,
-0.1501682549715042,
-0.1356973648071289,
0.014682854525744915,
-0.14879143238067627,
0.14619198441505432,
-0.018786387518048286,
0.10308142751455307,
-0.03166358545422554,
-0.031454846262931824,
-0.004289558157324791,
-0.0026242516469210386,
0.23330970108509064,
0.03842297941446304,
0.049848634749650955,
-0.08999516814947128,
-0.18637143075466156,
0.14981091022491455,
0.028400814160704613,
-0.043854475021362305,
-0.11121351271867752,
-0.046959251165390015,
-0.1238064393401146,
-0.0010760497534647584,
-0.059029266238212585,
0.11650941520929337,
0.03781053051352501,
-0.09684021025896072,
-0.13212627172470093,
0.04547962546348572,
-0.10500536113977432,
-0.013954009860754013,
0.21773965656757355,
-0.05676752328872681,
0.011663571000099182,
-0.01406281441450119,
-0.10104648023843765,
-0.1591278314590454,
-0.025662163272500038,
-0.12865695357322693,
-0.03175433725118637,
0.07938968390226364,
-0.1420956254005432,
-0.019574126228690147,
0.10559727996587753,
0.05529526248574257,
0.05692818760871887,
0.05066855251789093,
-0.12837882339954376,
0.06770697981119156,
0.15235541760921478,
-0.12961120903491974,
-0.14805366098880768,
-0.033639539033174515,
-0.006859317887574434,
0.2609214782714844,
0.035688016563653946,
0.03378615155816078,
0.08080867677927017,
0.019164137542247772,
0.00813024491071701,
-0.015284329652786255,
-0.14647436141967773,
-0.01792021654546261,
0.08765745162963867,
-0.009386316873133183,
-0.10817527770996094,
0.07840969413518906,
0.02195042558014393,
0.18429099023342133,
-0.04465166851878166,
0.14792044460773468,
-0.04430569335818291,
-0.062120139598846436,
-0.25004681944847107,
0.09809675067663193,
-0.2365015745162964,
-0.08704765886068344,
0.040919963270425797,
-0.07855500280857086,
-0.042154598981142044,
0.22408811748027802,
0.03096037171781063,
0.11315391212701797,
0.03701438382267952,
-0.018167339265346527,
0.12240862101316452,
-0.0883733257651329,
-0.14185959100723267,
-0.009663568809628487,
-0.0896570086479187,
-0.09762056916952133,
0.017130520194768906,
0.15602755546569824,
-0.07727334648370743,
-0.09453737735748291,
-0.2140122652053833,
0.06595128774642944,
-0.10961085557937622,
-0.048520833253860474,
-0.08089014887809753,
-0.032251521944999695,
0.030923103913664818,
-0.09299018979072571,
-0.025695739313960075,
0.007792817894369364,
-0.12535738945007324,
0.04058736935257912,
0.04974360391497612,
0.0777415856719017,
-0.05693075433373451,
-0.03741157427430153,
0.11351218074560165,
0.04654785990715027,
0.12167199701070786,
0.13132530450820923,
0.09573698788881302,
0.1981518417596817,
-0.19790905714035034,
-0.031153762713074684,
0.07844505459070206,
-0.027947688475251198,
-0.00012634694576263428,
0.09220898896455765,
-0.019862806424498558,
-0.0042146253399550915,
-0.03112185001373291,
0.06449270993471146,
-0.09137333184480667,
-0.10628050565719604,
-0.08596360683441162,
-0.05269080400466919,
-0.10507967323064804,
0.02421337366104126,
-0.10464508086442947,
0.1057443842291832,
0.05230971798300743,
0.021208247169852257,
0.06538191437721252,
-0.002103552222251892,
0.023900682106614113,
0.014181514270603657,
-0.0031478211749345064,
-0.13682617247104645,
-0.012651604600250721,
-0.07420070469379425,
-0.07981903851032257,
-0.015727750957012177,
0.29801297187805176,
-0.004545154515653849,
-0.2048368901014328,
-0.003371825208887458,
0.19199351966381073,
0.07667072117328644,
-0.03639345243573189,
0.2925551235675812,
0.07094889134168625,
0.020302528515458107,
-0.13628268241882324,
0.07098894566297531,
-0.0764019638299942,
-0.2026975005865097,
0.06751640886068344,
0.014312607236206532,
-0.011863132007420063,
-0.01025864202529192,
0.09153559803962708,
-0.09709761291742325,
0.022422486916184425,
-0.05431061610579491,
0.028497347608208656,
-0.11080465465784073,
-0.020955143496394157,
0.048365164548158646,
0.24715983867645264,
-0.05767197534441948,
0.012504075653851032,
0.026808427646756172,
0.011930793523788452,
-0.1119709387421608,
-0.17836838960647583,
0.028647467494010925,
-0.029909364879131317,
0.08993059396743774,
0.03719465807080269,
0.02770397998392582,
0.17824864387512207,
0.025810973718762398,
-0.018175087869167328,
0.0005494542419910431,
-0.06769184023141861,
-0.0930037796497345,
-0.026154732331633568,
-0.046702295541763306,
-0.011420355178415775,
-0.11582835763692856,
-0.038495615124702454,
-0.04978272691369057,
-0.1505853682756424,
-0.051087092608213425,
0.014383208006620407,
-0.00011038469529012218,
-0.055072057992219925,
-0.12998276948928833,
-0.00975238811224699,
-0.06163066625595093,
0.07991599291563034,
0.021694695577025414,
0.11871574074029922,
-0.002118335571140051,
0.0015323861734941602,
0.022293195128440857,
0.06676722317934036,
0.0712004229426384,
-0.06233184412121773,
0.08563024550676346,
0.022308729588985443,
-0.011532734148204327,
0.09271056205034256,
-0.05572833493351936,
0.026787834241986275,
0.06287813931703568,
0.20335370302200317,
0.2023536115884781,
-0.09179983288049698,
0.054151829332113266,
-0.006736986339092255,
0.030071282759308815,
0.10492202639579773,
0.07697220146656036,
-0.025787504389882088,
0.304071843624115,
-0.09623304009437561,
0.0005504737491719425,
-0.02031027339398861,
0.017393693327903748,
-0.09328529238700867,
0.10325229167938232,
0.05830041691660881,
-0.052817996591329575,
-0.08237718045711517,
0.08507166057825089,
-0.18703828752040863,
0.10084541887044907,
0.12120369076728821,
-0.11454244703054428,
0.020474044606089592,
-0.006232337560504675,
0.0056864358484745026,
0.04536847397685051,
0.06591186672449112,
-0.11471787840127945,
-0.08234730362892151,
-0.2119903564453125,
0.09141099452972412,
-0.36765968799591064,
-0.12015455961227417,
0.09575545787811279,
0.1799173355102539,
0.15326307713985443,
-0.06301472336053848,
0.05408412218093872,
0.03366498276591301,
0.037078388035297394,
-0.021960215643048286,
0.14645962417125702,
0.032352980226278305,
-0.06155009567737579,
-0.17992037534713745,
-0.1542726457118988,
0.061743345111608505,
-0.1001390889286995,
0.03688664361834526,
0.06615302711725235,
0.015584360808134079,
0.15560227632522583,
-0.04999890923500061,
0.006632986944168806,
0.07043560594320297,
-0.13639360666275024,
0.07937571406364441,
-0.0845707431435585,
0.016432838514447212,
-0.06537585705518723,
-0.06808412820100784,
0.018119486048817635,
0.11659006029367447,
-0.1722428947687149,
-0.07225567102432251,
0.14314769208431244,
0.009940756484866142,
0.09597757458686829,
-0.053871046751737595,
-0.01983094960451126,
-0.027655841782689095,
-0.11434134095907211,
0.11440703272819519,
-0.06698772311210632,
0.03885084018111229,
0.19177860021591187,
-0.0014642983442172408,
0.03931370750069618,
-0.34381750226020813,
0.03722058981657028,
-0.06547262519598007,
-0.041507985442876816,
-0.05741531774401665
] |
null | null | transformers |
# Antler 7B
<img src="OIG3.UAjshTXCEJU.jpg" alt="drawing" style="width:512px;"/>
## Model Description
This is a 7B-parameter decoder-only Japanese language model fine-tuned on novel datasets, built on top of the base model Japanese Stable LM Base Gamma 7B (see also [Japanese Stable LM Instruct Gamma 7B](https://huggingface.co/stabilityai/japanese-stablelm-instruct-gamma-7b)).
## Usage
Ensure you are using Transformers 4.34.0 or newer.
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("Elizezen/Antler-7B")
model = AutoModelForCausalLM.from_pretrained(
    "Elizezen/Antler-7B",
    torch_dtype="auto",
)
model.eval()
if torch.cuda.is_available():
    model = model.to("cuda")
input_ids = tokenizer.encode(
    "吾輩は猫である。名前はまだない",
    add_special_tokens=True,
    return_tensors="pt"
)
tokens = model.generate(
    input_ids.to(device=model.device),
    max_new_tokens=512,
    temperature=1,
    top_p=0.95,
    do_sample=True,
)
out = tokenizer.decode(tokens[0][input_ids.shape[1]:], skip_special_tokens=True).strip()
print(out)
"""
output example:
とある大きなお屋敷に潜入している。
「……」
吾輩は夜の帳が下りた頃に、こっそりと昼間に見たお屋敷の裏手に回っている。目的はもちろん、昼間に見たお屋敷の主の正体を突き止めるためだ。
吾輩はくんかくんかと鼻を鳴らすと、あたりをうろつき始めた。
「……」
すると、それほど時間を要せずに吾輩の鼻腔をくすぐる甘い香りを発見した。これだ!
吾輩はその香りの元を辿っていくと、とある部屋にたどり着いた。どうやらここが今回のターゲットらしい。
「……」
吾輩は部屋の窓から中を覗いてみる。すると、そこには一人の少女がいた。
(黒髪に紫眼……噂に聞く妖怪か)
吾輩はこの世界の妖怪をほとんど見たことがないが、噂に聞くその特徴はこの少女に当てはまるようだった。
(それにしても……)
吾輩は改めて少女を観察する。
(か、可愛い……!)
この世界の人間と比べると少し幼く見えるが、それでも十分に整った顔立ちをしている。それに加えて、あの豊満なボディもたまらない。これはもう、
"""
```
### Datasets
- less than 1 GB of web novels (non-PG)
- 70 GB of web novels (PG)
### Intended Use
The model is mainly intended to be used for generating novels. It may not perform as well on instruction-based responses. | {"language": ["ja"], "tags": ["causal-lm", "not-for-all-audiences", "nsfw"], "pipeline_tag": "text-generation"} | text-generation | Elizezen/Antler-7B | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"causal-lm",
"not-for-all-audiences",
"nsfw",
"ja",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T06:50:25+00:00 | [] | [
"ja"
] | TAGS
#transformers #safetensors #mistral #text-generation #causal-lm #not-for-all-audiences #nsfw #ja #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Antler 7B
<img src="URL" alt="drawing" style="width:512px;"/>
## Model Description
This is a 7B-parameter decoder-only Japanese language model fine-tuned on novel datasets, built on top of the base model Japanese Stable LM Base Gamma 7B. Japanese Stable LM Instruct Gamma 7B
## Usage
Ensure you are using Transformers 4.34.0 or newer.
### Datasets
- less than 1 GB of web novels (non-PG)
- 70 GB of web novels (PG)
### Intended Use
The model is mainly intended to be used for generating novels. It may not be so capable with instruction-based responses. | [
"# Antler 7B\n\n<img src=\"URL\" alt=\"drawing\" style=\"width:512px;\"/>",
"## Model Description\n\nThis is a 7B-parameter decoder-only Japanese language model fine-tuned on novel datasets, built on top of the base model Japanese Stable LM Base Gamma 7B. Japanese Stable LM Instruct Gamma 7B",
"## Usage\n\nEnsure you are using Transformers 4.34.0 or newer.",
"### Datasets\n\n- less than 1GB of web novels(non-PG)\n- 70GB of web novels(PG)",
"### Intended Use\n\nThe model is mainly intended to be used for generating novels. It may not be so capable with instruction-based responses."
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #causal-lm #not-for-all-audiences #nsfw #ja #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Antler 7B\n\n<img src=\"URL\" alt=\"drawing\" style=\"width:512px;\"/>",
"## Model Description\n\nThis is a 7B-parameter decoder-only Japanese language model fine-tuned on novel datasets, built on top of the base model Japanese Stable LM Base Gamma 7B. Japanese Stable LM Instruct Gamma 7B",
"## Usage\n\nEnsure you are using Transformers 4.34.0 or newer.",
"### Datasets\n\n- less than 1GB of web novels(non-PG)\n- 70GB of web novels(PG)",
"### Intended Use\n\nThe model is mainly intended to be used for generating novels. It may not be so capable with instruction-based responses."
] | [
68,
28,
59,
17,
29,
35
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #causal-lm #not-for-all-audiences #nsfw #ja #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Antler 7B\n\n<img src=\"URL\" alt=\"drawing\" style=\"width:512px;\"/>## Model Description\n\nThis is a 7B-parameter decoder-only Japanese language model fine-tuned on novel datasets, built on top of the base model Japanese Stable LM Base Gamma 7B. Japanese Stable LM Instruct Gamma 7B## Usage\n\nEnsure you are using Transformers 4.34.0 or newer.### Datasets\n\n- less than 1GB of web novels(non-PG)\n- 70GB of web novels(PG)### Intended Use\n\nThe model is mainly intended to be used for generating novels. It may not be so capable with instruction-based responses."
] | [
-0.0033749218564480543,
0.025676431134343147,
-0.002048217225819826,
0.08499124646186829,
0.07228711992502213,
0.07131855189800262,
0.19857245683670044,
0.07573533803224564,
0.09605707973241806,
0.029106270521879196,
0.05027816817164421,
-0.03670063987374306,
0.007886464707553387,
0.12422453612089157,
-0.03257273882627487,
-0.18170295655727386,
0.06077846512198448,
0.032304562628269196,
0.1583610326051712,
0.0767415314912796,
0.0633346438407898,
-0.017769480124115944,
0.06238042563199997,
-0.012229339219629765,
-0.08154705911874771,
-0.014627651311457157,
0.11632154881954193,
-0.06381623446941376,
0.03989830240607262,
0.07299354672431946,
0.0769021064043045,
0.004122243728488684,
0.05617040768265724,
-0.10215313732624054,
0.01815543696284294,
0.055367182940244675,
-0.06082966551184654,
0.01864730566740036,
0.04639892280101776,
0.008598579093813896,
0.18138043582439423,
-0.0590163916349411,
-0.054856471717357635,
0.08317196369171143,
-0.08971838653087616,
-0.1881488710641861,
-0.014174963347613811,
0.022668231278657913,
0.03720651939511299,
0.13245339691638947,
-0.06144354119896889,
0.013611529022455215,
0.03832203522324562,
0.09570230543613434,
0.07344911247491837,
-0.22361800074577332,
-0.05136501044034958,
0.12124501913785934,
0.013887190259993076,
0.17404943704605103,
-0.04884999245405197,
0.06752786040306091,
0.08592857420444489,
0.006495393346995115,
0.047270942479372025,
-0.03394104540348053,
0.13417498767375946,
-0.01199956052005291,
-0.10539824515581131,
-0.0063137779943645,
0.26494941115379333,
-0.025102678686380386,
-0.0719309151172638,
-0.10280372947454453,
-0.11947961896657944,
0.07495138049125671,
-0.03348518908023834,
0.040422581136226654,
-0.05344454571604729,
0.04149097204208374,
0.05915170535445213,
-0.07703033089637756,
-0.10702092200517654,
-0.14316509664058685,
-0.06818056106567383,
0.1704173982143402,
0.017939457669854164,
0.06020152568817139,
-0.03514918312430382,
0.05350293219089508,
-0.21634618937969208,
-0.09205222874879837,
-0.04356415197253227,
-0.12908464670181274,
0.01869758404791355,
0.0732756182551384,
-0.06679274886846542,
-0.13923947513103485,
0.059373125433921814,
0.02530173398554325,
-0.03571265563368797,
0.05164146423339844,
0.009610798209905624,
0.04622608423233032,
0.019877543672919273,
-0.00013364330516196787,
-0.002488106256350875,
0.012618591077625751,
0.08123093098402023,
0.045830052345991135,
0.12359783053398132,
-0.06026000529527664,
-0.06270835548639297,
-0.08799365162849426,
-0.003608293831348419,
-0.009258885867893696,
-0.040680527687072754,
0.11375684291124344,
-0.00460759736597538,
-0.005172805394977331,
0.14150111377239227,
-0.092300184071064,
-0.06060261279344559,
0.0226312056183815,
0.01598271168768406,
0.020793059840798378,
0.06003037840127945,
-0.037762030959129333,
0.030694415792822838,
-0.033393342047929764,
-0.06301277875900269,
-0.058275170624256134,
-0.067076675593853,
-0.06956879049539566,
0.0013743378221988678,
0.04313920438289642,
-0.004687279928475618,
-0.13841019570827484,
-0.21860720217227936,
0.016214272007346153,
0.01741744950413704,
-0.02808852307498455,
0.08376578986644745,
0.029301071539521217,
-0.0014893603511154652,
0.04806346818804741,
0.022327017039060593,
-0.05750909820199013,
-0.06153798848390579,
0.09445659816265106,
0.12217091768980026,
0.11046379059553146,
-0.1026216372847557,
-0.004726882092654705,
-0.07515434175729752,
0.007684772368520498,
-0.18864312767982483,
0.04931214824318886,
-0.07443709671497345,
-0.04589206725358963,
-0.041890013962984085,
-0.03993780538439751,
0.030124451965093613,
0.058129068464040756,
-0.021460164338350296,
0.1765017956495285,
-0.07426734268665314,
-0.010513202287256718,
0.15116800367832184,
-0.18139433860778809,
-0.03576786443591118,
0.1860477477312088,
-0.005334638990461826,
0.042504847049713135,
0.10149534791707993,
0.20868442952632904,
0.04402780160307884,
-0.00855704303830862,
-0.009702973999083042,
0.014703701250255108,
-0.0030310212168842554,
0.06454949080944061,
0.07929348945617676,
0.08349685370922089,
-0.21275164186954498,
0.08308189362287521,
-0.06804390996694565,
-0.04443542659282684,
0.012050285004079342,
-0.07228794693946838,
-0.07532503455877304,
-0.02259666845202446,
0.09709309786558151,
0.02752121165394783,
0.017606768757104874,
-0.03735441341996193,
-0.10182567685842514,
-0.09946901351213455,
0.09090936183929443,
-0.08853241801261902,
0.00784142967313528,
-0.05465679243206978,
0.03260065242648125,
-0.07761518657207489,
0.0745103508234024,
-0.06810858100652695,
-0.06774895638227463,
0.028175437822937965,
0.010004017502069473,
0.04524023085832596,
0.03382979705929756,
0.026753554120659828,
0.05656121298670769,
-0.12561838328838348,
-0.009610270150005817,
0.0136046651750803,
-0.035134848207235336,
-0.018755750730633736,
-0.1318545639514923,
0.030323656275868416,
-0.05630357190966606,
0.07904749363660812,
-0.24108251929283142,
0.06133868917822838,
-0.028550753369927406,
-0.023869983851909637,
-0.04346609115600586,
-0.006482486147433519,
0.060893382877111435,
-0.05383291468024254,
-0.06406181305646896,
-0.009249218739569187,
0.052438873797655106,
-0.011975646018981934,
-0.16299745440483093,
0.049127910286188126,
-0.21935060620307922,
0.04612868279218674,
0.11176514625549316,
0.051092274487018585,
-0.04837639629840851,
-0.005639018956571817,
0.029031483456492424,
0.0014491422334685922,
-0.031276099383831024,
-0.05788586288690567,
0.040985170751810074,
-0.0494440421462059,
0.08867392688989639,
-0.12380664795637131,
0.03482014313340187,
0.03496420010924339,
-0.1041714996099472,
-0.022615943104028702,
0.08246646076440811,
-0.016270983964204788,
-0.19313834607601166,
0.07551092654466629,
0.06589455902576447,
-0.12130636721849442,
0.2219117432832718,
0.048223864287137985,
-0.04159320890903473,
0.0047500417567789555,
0.01800333894789219,
0.05369149148464203,
0.07614593207836151,
-0.09595943987369537,
0.025631189346313477,
0.05117417871952057,
-0.014886435121297836,
0.046049878001213074,
-0.10636301338672638,
-0.07449635118246078,
-0.010886323638260365,
-0.09737887233495712,
-0.10136771947145462,
0.06553196161985397,
-0.05891861021518707,
0.07046444714069366,
-0.060601312667131424,
0.12479531764984131,
0.055871907621622086,
-0.050920601934194565,
-0.11949209123849869,
0.17257247865200043,
-0.08312486857175827,
-0.18127687275409698,
-0.20530179142951965,
0.008617177605628967,
-0.1512271761894226,
0.06555777043104172,
0.05707155540585518,
-0.16754357516765594,
-0.10110212862491608,
-0.17158371210098267,
0.0230106208473444,
-0.0589088574051857,
-0.009257161058485508,
-0.08174547553062439,
0.021405695006251335,
0.01528123114258051,
-0.13648635149002075,
0.012347900308668613,
0.005270725581794977,
-0.11930125206708908,
0.10086600482463837,
-0.08669448643922806,
0.11082277446985245,
0.12420555204153061,
0.041958436369895935,
-0.03803149610757828,
0.020759498700499535,
0.16285593807697296,
-0.08089491724967957,
0.048749085515737534,
0.2102024108171463,
0.026078050956130028,
0.04966516047716141,
0.14024409651756287,
0.03631284460425377,
-0.0833856388926506,
0.06984532624483109,
-0.023712068796157837,
-0.10386460274457932,
-0.24485009908676147,
-0.021541418507695198,
-0.09311017394065857,
0.029552290216088295,
-0.00522663863375783,
0.052592795342206955,
0.12661391496658325,
0.12369131296873093,
-0.08235877007246017,
0.057119764387607574,
0.08415577560663223,
0.10760181397199631,
0.20060858130455017,
0.03192634508013725,
0.07330919802188873,
-0.08241835236549377,
0.03039105050265789,
0.0945976972579956,
0.011257604695856571,
0.18890751898288727,
-0.06909496337175369,
0.140081986784935,
0.065494105219841,
0.06215304881334305,
0.10371048003435135,
0.048815030604600906,
-0.028417114168405533,
0.009689601138234138,
-0.07649541646242142,
-0.09472794085741043,
-0.0552409365773201,
0.11418400704860687,
-0.16450850665569305,
-0.03608328476548195,
-0.03610716387629509,
0.001900471979752183,
0.023363104090094566,
-0.03828873857855797,
0.017207683995366096,
-0.19940564036369324,
-0.06973589211702347,
0.1118832379579544,
0.03082413412630558,
-0.05858656018972397,
0.021172722801566124,
0.1545972228050232,
-0.06090037152171135,
0.09110713005065918,
-0.012032894417643547,
0.06830096989870071,
-0.01524299569427967,
0.0004941930528730154,
-0.03576319292187691,
0.11524646729230881,
-0.019123096019029617,
0.11260277777910233,
-0.14930734038352966,
0.15972910821437836,
0.052602704614400864,
0.08989466726779938,
-0.04507433623075485,
0.008875269442796707,
0.04009753465652466,
0.1476253718137741,
0.1280900090932846,
-0.000024545999622205272,
0.0037917166482657194,
-0.07878248393535614,
-0.008860582485795021,
0.07073622196912766,
0.0722857266664505,
0.025500604882836342,
0.054018232971429825,
-0.012475515715777874,
-0.005680093076080084,
0.012546544894576073,
0.02604955993592739,
-0.19069534540176392,
-0.18153224885463715,
0.047999367117881775,
0.12593218684196472,
-0.020257366821169853,
-0.03019142523407936,
0.014674675650894642,
0.009199082851409912,
0.1582840085029602,
-0.039742160588502884,
-0.12299419194459915,
-0.1058771088719368,
-0.07840008288621902,
0.14324109256267548,
-0.047948334366083145,
0.029045335948467255,
-0.06381881982088089,
0.1181580200791359,
-0.028254354372620583,
-0.09349285066127777,
0.019134216010570526,
-0.06692501902580261,
-0.155455082654953,
-0.024306995794177055,
0.06772104650735855,
0.02136884815990925,
0.027684517204761505,
0.01183861494064331,
0.0020377442706376314,
-0.05035398155450821,
-0.1529654860496521,
0.01118519064038992,
0.12767471373081207,
0.05578863248229027,
0.0004998050862923265,
-0.0785493552684784,
-0.09657033532857895,
-0.05596727877855301,
-0.06815048307180405,
-0.01776900701224804,
0.2144755870103836,
-0.031115323305130005,
0.06698838621377945,
0.17433834075927734,
-0.07672984898090363,
-0.14320580661296844,
-0.05202900990843773,
-0.052249666303396225,
-0.004563288763165474,
-0.07119811326265335,
-0.12004127353429794,
0.08382339030504227,
0.06195152923464775,
-0.0008748676627874374,
0.08624552190303802,
-0.1024891585111618,
-0.118656225502491,
-0.027540333569049835,
0.10974928736686707,
0.11334265768527985,
-0.17200510203838348,
-0.06032855808734894,
-0.07070785015821457,
-0.0820663645863533,
0.03755226358771324,
-0.1866762489080429,
0.15222236514091492,
-0.012142462655901909,
0.08349597454071045,
-0.0009108137455768883,
-0.055826108902692795,
0.17048850655555725,
-0.03226344287395477,
0.07480362057685852,
-0.07170738279819489,
0.021189002320170403,
0.12768664956092834,
-0.03266023099422455,
0.1496659219264984,
-0.10150017589330673,
0.03820334002375603,
-0.018719477578997612,
-0.10366702824831009,
-0.06199478730559349,
0.024609239771962166,
-0.017561322078108788,
-0.0950596034526825,
-0.08235716074705124,
0.04128308594226837,
-0.011215800419449806,
-0.05062089115381241,
0.09130633622407913,
-0.09314439445734024,
0.09490049630403519,
0.114105224609375,
0.16185608506202698,
0.0073624406941235065,
0.0439235158264637,
-0.012594488449394703,
-0.048371657729148865,
0.10895240306854248,
-0.24971342086791992,
0.026552068069577217,
0.013478463515639305,
0.004987351130694151,
0.09396553039550781,
0.05245538055896759,
-0.03992919251322746,
0.06189326196908951,
0.10336977988481522,
-0.15993423759937286,
-0.13188320398330688,
-0.01635095849633217,
0.010636231862008572,
-0.05186526104807854,
0.06098441034555435,
0.11316434293985367,
-0.08918071538209915,
0.022188318893313408,
-0.014079118147492409,
0.030133338645100594,
0.01781160570681095,
0.06192588061094284,
0.022420797497034073,
0.024621378630399704,
-0.07675369083881378,
0.04913676530122757,
0.04102989658713341,
-0.048375897109508514,
0.028266601264476776,
0.07151255756616592,
-0.11360441148281097,
-0.07211585342884064,
-0.05240948870778084,
0.027242181822657585,
-0.03134728595614433,
-0.046683743596076965,
-0.07470760494470596,
-0.07497523725032806,
-0.038978975266218185,
0.09624188393354416,
0.046839844435453415,
-0.015520916320383549,
-0.038764048367738724,
0.041499678045511246,
-0.08671850711107254,
0.044141676276922226,
0.04843664914369583,
0.033789634704589844,
-0.13369770348072052,
0.04707534238696098,
0.03341002017259598,
0.0016302207950502634,
-0.08536043763160706,
0.019868357107043266,
-0.10370568931102753,
-0.003570778761059046,
-0.12212641537189484,
0.06758448481559753,
-0.08711788803339005,
-0.014036241918802261,
-0.08132912218570709,
-0.03146235644817352,
-0.08086373656988144,
0.014498818665742874,
-0.021033350378274918,
0.005639270879328251,
-0.0015509375371038914,
0.007551225367933512,
-0.08361873030662537,
-0.014644195325672626,
0.031816355884075165,
-0.06277908384799957,
0.027137914672493935,
0.06756376475095749,
-0.07580874860286713,
0.04168567806482315,
-0.18033762276172638,
0.09212801605463028,
0.02884768135845661,
0.033822350203990936,
-0.0644318088889122,
-0.0002673665585462004,
-0.010745598003268242,
0.014415799640119076,
0.02539975382387638,
0.029702162370085716,
0.06369516998529434,
-0.09127525240182877,
0.06611253321170807,
-0.037933528423309326,
-0.01925087906420231,
-0.08581573516130447,
0.022141695022583008,
0.11503292620182037,
0.029095573350787163,
0.10927188396453857,
-0.0989416167140007,
0.02371537685394287,
-0.1303146779537201,
0.013982555828988552,
0.023097272962331772,
-0.10094437748193741,
-0.0625113844871521,
-0.032528139650821686,
0.032865747809410095,
-0.0024733315221965313,
0.242319718003273,
-0.035084985196590424,
-0.04474910348653793,
-0.004566645715385675,
-0.003417924977838993,
0.1481602042913437,
-0.04038684442639351,
0.2671704590320587,
0.042011965066194534,
-0.001319303410127759,
-0.0034126825630664825,
0.03474332019686699,
0.06488437205553055,
0.043823376297950745,
0.09543993324041367,
0.07386921346187592,
0.0024714507162570953,
0.08471031486988068,
-0.056708551943302155,
0.04101167991757393,
-0.015273358672857285,
0.02178681083023548,
-0.03882277384400368,
0.016124214977025986,
-0.008103135041892529,
0.024211358278989792,
0.19372576475143433,
-0.03997502848505974,
0.045736052095890045,
-0.024452360346913338,
-0.09957230091094971,
-0.12302812933921814,
-0.14396491646766663,
-0.10174711793661118,
-0.14044618606567383,
0.019654659554362297,
-0.12230142951011658,
0.061655305325984955,
-0.02922847308218479,
0.018519381061196327,
-0.053390663117170334,
0.18058651685714722,
0.06775226444005966,
-0.025841480121016502,
0.06664945185184479,
-0.007427126634865999,
-0.047427136451005936,
0.12879113852977753,
-0.018284020945429802,
0.030799441039562225,
-0.06543254107236862,
0.0176218431442976,
0.10215841978788376,
0.028200563043355942,
0.040376223623752594,
-0.040471795946359634,
-0.07572197169065475,
-0.08282386511564255,
0.08820219337940216,
0.06333287060260773,
0.11243092268705368,
0.0045945048332214355,
-0.10720161348581314,
0.008533928543329239,
0.17167522013187408,
-0.04488299787044525,
-0.03804542124271393,
-0.0750882551074028,
0.19323471188545227,
-0.11297190934419632,
-0.011993484571576118,
-0.03376792371273041,
-0.11994726955890656,
0.05168357118964195,
0.2271765172481537,
0.12710249423980713,
-0.045892976224422455,
-0.008399730548262596,
-0.05031951144337654,
-0.0076606180518865585,
0.009809022769331932,
0.1442500799894333,
-0.0016677059466019273,
0.2762211263179779,
-0.03180611506104469,
0.03222249448299408,
-0.0401807501912117,
0.026504624634981155,
-0.06399086862802505,
0.12113039940595627,
0.019900288432836533,
-0.04064261540770531,
-0.08995512127876282,
0.06577116996049881,
-0.13961820304393768,
0.02333013340830803,
-0.059899501502513885,
0.028971681371331215,
-0.011593950912356377,
-0.008816963993012905,
0.04000789672136307,
-0.05980486422777176,
0.03128262236714363,
-0.04518415778875351,
0.08581524342298508,
-0.005136599764227867,
-0.005254419986158609,
-0.12704727053642273,
0.020617403090000153,
0.059092435985803604,
0.09152837842702866,
0.20025046169757843,
0.023169929161667824,
0.05943934991955757,
0.052684031426906586,
-0.0577128641307354,
-0.08800110965967178,
0.15283694863319397,
0.0005481779808178544,
-0.12406060099601746,
0.031286872923374176,
0.147233247756958,
-0.03202127292752266,
0.016385365277528763,
0.07439231127500534,
-0.02762867696583271,
0.04149385914206505,
0.11628173291683197,
-0.042014457285404205,
-0.12059565633535385,
0.032068535685539246,
-0.162638321518898,
0.11171548813581467,
0.13131219148635864,
-0.02944924682378769,
-0.06980881094932556,
-0.05067765712738037,
0.10404358059167862,
0.027519533410668373,
0.08393692970275879,
0.04890777915716171,
-0.17374831438064575,
-0.04140574485063553,
0.16305746138095856,
0.03845067322254181,
-0.3537651002407074,
-0.034606922417879105,
-0.07538527995347977,
0.015236852690577507,
-0.07795225083827972,
0.0029223181772977114,
0.12403597682714462,
-0.019635673612356186,
-0.02080700360238552,
-0.20843608677387238,
0.02122403122484684,
0.09321405738592148,
-0.10704496502876282,
-0.11870130151510239
] |
null | null | transformers | # Kyllene 34B v1.1

## Model Details
- A result of a new merge method provided by the [MergeMonster](https://github.com/Gryphe/MergeMonster/) tool with an extended RPG preset.
- models used for merge:
[jondurbin/bagel-dpo-34b-v0.2](https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2)
[NousResearch/Nous-Capybara-34B](https://huggingface.co/NousResearch/Nous-Capybara-34B)
[NousResearch_Nous-Hermes-2-Yi-34B](https://huggingface.co/NousResearch/Nous-Hermes-2-Yi-34B)
[SUSTech/SUS-Chat-34B](https://huggingface.co/SUSTech/SUS-Chat-34B)
- The method aims to maximize the probability of certain phrases and minimize the probability of other phrases (a minimal sketch of this scoring idea is shown below).
- The RPG preset was extended with examples of typical, nonsensical output of most models, like 'unbreakable bond', 'send shivers down her spine', etc.
- The resulting model has approximately 34 billion parameters.
- See [mergekit-config.yml](https://huggingface.co/TeeZee/Kyllene-34B-v1.1/resolve/main/merge-config.yml) for details on the merge method used and RPG presets.
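
The phrase-probability objective mentioned above can be pictured with a short sketch. The snippet below is only an illustration, not MergeMonster's actual implementation: the function names, contexts, and phrase lists are assumptions, and the real tool additionally evaluates candidate merges tensor by tensor against this kind of score.

```python
# Illustrative sketch of a phrase-based merge objective (assumed, simplified).
import torch
import torch.nn.functional as F

def phrase_logprob(model, tokenizer, context: str, phrase: str) -> float:
    """Summed log-probability of `phrase` continuing `context` under `model`."""
    ctx_ids = tokenizer(context, return_tensors="pt").input_ids.to(model.device)
    phrase_ids = tokenizer(phrase, add_special_tokens=False,
                           return_tensors="pt").input_ids.to(model.device)
    input_ids = torch.cat([ctx_ids, phrase_ids], dim=-1)
    with torch.no_grad():
        logits = model(input_ids).logits
    # Token log-probabilities, each predicted from the preceding position.
    logprobs = F.log_softmax(logits[:, :-1, :], dim=-1)
    targets = input_ids[:, 1:]
    token_lp = logprobs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    # Keep only the positions that belong to the phrase.
    return token_lp[:, ctx_ids.shape[1] - 1:].sum().item()

def merge_score(model, tokenizer, contexts, good_phrases, bad_phrases) -> float:
    """Higher is better: reward desired phrases, penalize cliché phrases."""
    score = 0.0
    for ctx in contexts:
        score += sum(phrase_logprob(model, tokenizer, ctx, p) for p in good_phrases)
        score -= sum(phrase_logprob(model, tokenizer, ctx, p) for p in bad_phrases)
    return score

# Hypothetical usage: keep a candidate merge only if it improves the score.
# if merge_score(candidate, tok, contexts, good, bad) > merge_score(current, tok, contexts, good, bad):
#     current = candidate
```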
**Warning: This model can produce NSFW content!**
## Results
- produces SFW and NSFW content without issues and switches context seamlessly.
- 200K context length
- good at following instructions
- different from [TeeZee/Kyllene-57B-v1.0](https://huggingface.co/TeeZee/Kyllene-57B-v1.0), but also surprisingly entertaining (more tests are needed)
## Side notes
- The [MergeMonster](https://github.com/Gryphe/MergeMonster/) method works; however, the project would benefit greatly from some more love from developers.
- In its current state, MergeMonster consumes insane amounts of RAM (256 GB+) or VRAM and takes a really long time to process model data; this merge took 24 hours on 1x ADA6000.
- MergeMonster is not a silver bullet; other experiments have shown that it can also produce incredibly stupid models.
All comments are greatly appreciated. Download, test, and if you appreciate my work, consider buying me my fuel:
<a href="https://www.buymeacoffee.com/TeeZee" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a> | {"license": "other", "tags": ["merge"], "license_name": "yi-license", "license_link": "https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE"} | text-generation | LoneStriker/Kyllene-34B-v1.1-2.7bpw-h6-exl2 | [
"transformers",
"safetensors",
"llama",
"text-generation",
"merge",
"conversational",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T06:50:41+00:00 | [] | [] | TAGS
#transformers #safetensors #llama #text-generation #merge #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| # Kyllene 34B v1.1
!image/png
## Model Details
- A result of new merge method provided by MergeMonster tool with extended RPG preset.
- models used for merge:
jondurbin/bagel-dpo-34b-v0.2
NousResearch/Nous-Capybara-34B
NousResearch_Nous-Hermes-2-Yi-34B
SUSTech/SUS-Chat-34B
- Method is aimed to maximize probability of certain phrases and minimize probability of other phrases.
- RPG preset was extended with examples of typical, nonsensical output of most models like 'unbreakable bond', 'send shivers down her spine' etc.
- The resulting model has approximately 34 billion parameters.
- See URL for details on the merge method used and RPG presets.
Warning: This model can produce NSFW content!
## Results
- produces SFW and NSFW content without issues, switches context seamlessly.
- 200K context length
- good at following instructions
- different than TeeZee/Kyllene-57B-v1.0, but also surprisingly entertaining (but more tests are needed)
## Side notes
- MergeMonster method works, however project would benefit greatly from some more love from developers.
- In its current state MergeMonster consumes insane amounts of RAM (256GB+) or VRAM and takes a really long time to process model data, this merge took 24H on 1xADA6000
- MergeMonster is not a golden bullet, other experiments has shown that it can also produce incredibly stupid models.
All comments are greatly appreciated, download, test and if you appreciate my work, consider buying me my fuel:
<a href="URL target="_blank"><img src="URL alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a> | [
"# Kyllene 34B v1.1\n\n!image/png",
"## Model Details\n\n- A result of new merge method provided by MergeMonster tool with extended RPG preset.\n- models used for merge:\n jondurbin/bagel-dpo-34b-v0.2\n NousResearch/Nous-Capybara-34B\n NousResearch_Nous-Hermes-2-Yi-34B\n SUSTech/SUS-Chat-34B\n- Method is aimed to maximize probability of certain phrases and minimize probablility of other phrases.\n- RPG preset was extened with examples of typical, nonsensical output of most models like 'unbreakable bond', 'send shivers down her spine' etc.\n- The resulting model has approximately 34 billion parameters.\n- See URL for details on the merge method used and RPG presets.\n\nWarning: This model can produce NSFW content!",
"## Results\n\n- produces SFW nad NSFW content without issues, switches context seamlessly.\n- 200K context length\n- good at following instructions\n- different than TeeZee/Kyllene-57B-v1.0, but also surprisingly entertaining (but more tests are needed)",
"## Side notes\n\n - MergeMonster method works, however project would benefit greatly from some more love from developers.\n - In its current state MergeMonster consumes insane amounts of RAM (256GB+) or VRAM and takes a really long time to process model data, this merge took 24H on 1xADA6000\n - MergeMonster is not a golden bullet, other experiments has shown that it can also produce incredibly stupid models.\n\nAll comments are greatly appreciated, download, test and if you appreciate my work, consider buying me my fuel:\n<a href=\"URL target=\"_blank\"><img src=\"URL alt=\"Buy Me A Coffee\" style=\"height: 60px !important;width: 217px !important;\" ></a>"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #merge #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Kyllene 34B v1.1\n\n!image/png",
"## Model Details\n\n- A result of new merge method provided by MergeMonster tool with extended RPG preset.\n- models used for merge:\n jondurbin/bagel-dpo-34b-v0.2\n NousResearch/Nous-Capybara-34B\n NousResearch_Nous-Hermes-2-Yi-34B\n SUSTech/SUS-Chat-34B\n- Method is aimed to maximize probability of certain phrases and minimize probablility of other phrases.\n- RPG preset was extened with examples of typical, nonsensical output of most models like 'unbreakable bond', 'send shivers down her spine' etc.\n- The resulting model has approximately 34 billion parameters.\n- See URL for details on the merge method used and RPG presets.\n\nWarning: This model can produce NSFW content!",
"## Results\n\n- produces SFW nad NSFW content without issues, switches context seamlessly.\n- 200K context length\n- good at following instructions\n- different than TeeZee/Kyllene-57B-v1.0, but also surprisingly entertaining (but more tests are needed)",
"## Side notes\n\n - MergeMonster method works, however project would benefit greatly from some more love from developers.\n - In its current state MergeMonster consumes insane amounts of RAM (256GB+) or VRAM and takes a really long time to process model data, this merge took 24H on 1xADA6000\n - MergeMonster is not a golden bullet, other experiments has shown that it can also produce incredibly stupid models.\n\nAll comments are greatly appreciated, download, test and if you appreciate my work, consider buying me my fuel:\n<a href=\"URL target=\"_blank\"><img src=\"URL alt=\"Buy Me A Coffee\" style=\"height: 60px !important;width: 217px !important;\" ></a>"
] | [
59,
11,
183,
62,
170
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #merge #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Kyllene 34B v1.1\n\n!image/png## Model Details\n\n- A result of new merge method provided by MergeMonster tool with extended RPG preset.\n- models used for merge:\n jondurbin/bagel-dpo-34b-v0.2\n NousResearch/Nous-Capybara-34B\n NousResearch_Nous-Hermes-2-Yi-34B\n SUSTech/SUS-Chat-34B\n- Method is aimed to maximize probability of certain phrases and minimize probablility of other phrases.\n- RPG preset was extened with examples of typical, nonsensical output of most models like 'unbreakable bond', 'send shivers down her spine' etc.\n- The resulting model has approximately 34 billion parameters.\n- See URL for details on the merge method used and RPG presets.\n\nWarning: This model can produce NSFW content!## Results\n\n- produces SFW nad NSFW content without issues, switches context seamlessly.\n- 200K context length\n- good at following instructions\n- different than TeeZee/Kyllene-57B-v1.0, but also surprisingly entertaining (but more tests are needed)## Side notes\n\n - MergeMonster method works, however project would benefit greatly from some more love from developers.\n - In its current state MergeMonster consumes insane amounts of RAM (256GB+) or VRAM and takes a really long time to process model data, this merge took 24H on 1xADA6000\n - MergeMonster is not a golden bullet, other experiments has shown that it can also produce incredibly stupid models.\n\nAll comments are greatly appreciated, download, test and if you appreciate my work, consider buying me my fuel:\n<a href=\"URL target=\"_blank\"><img src=\"URL alt=\"Buy Me A Coffee\" style=\"height: 60px !important;width: 217px !important;\" ></a>"
] | [
-0.04345127195119858,
0.03651237487792969,
-0.003862067824229598,
0.041928429156541824,
0.015448260121047497,
0.0037857056595385075,
0.09202281385660172,
0.04567190632224083,
-0.009149615652859211,
0.11935081332921982,
-0.017030417919158936,
-0.005398779641836882,
0.08406251668930054,
0.12793268263339996,
0.0382562093436718,
-0.240144744515419,
0.10789000988006592,
-0.06110549718141556,
0.09957525879144669,
0.07266922295093536,
0.0975169762969017,
-0.07726587355136871,
0.0502295047044754,
0.00914758536964655,
-0.02211177535355091,
0.036503247916698456,
-0.05387347936630249,
-0.019353536888957024,
0.09972843527793884,
0.01848491281270981,
0.05054822191596031,
0.02131974883377552,
-0.007063377182930708,
-0.17458200454711914,
0.03261023387312889,
0.039738159626722336,
0.001401805318892002,
0.0486963614821434,
0.08223322033882141,
0.06755927950143814,
0.1592220515012741,
-0.06243234500288963,
0.038955457508563995,
0.06050859019160271,
-0.054638221859931946,
-0.12557943165302277,
-0.1447276622056961,
0.13996092975139618,
0.031052520498633385,
0.04085245355963707,
-0.04809140786528587,
0.1327163130044937,
-0.0312538743019104,
-0.01546384021639824,
0.2582966387271881,
-0.22224529087543488,
-0.033203884959220886,
0.04440365359187126,
0.016129225492477417,
0.06401611864566803,
-0.06816613674163818,
-0.005725965369492769,
0.05114321410655975,
0.03645199537277222,
0.006382267456501722,
-0.0023141340352594852,
0.17665225267410278,
-0.01965753734111786,
-0.07931197434663773,
-0.001569767715409398,
0.13052697479724884,
0.08430684357881546,
-0.049035362899303436,
-0.1773148477077484,
-0.05769268423318863,
-0.007750179618597031,
-0.06882686167955399,
-0.07830701768398285,
0.01980471983551979,
-0.003911132924258709,
0.07953860610723495,
-0.0256295558065176,
-0.0979941263794899,
0.017894288524985313,
-0.08809833973646164,
0.04499076306819916,
0.021265828981995583,
0.004763930570334196,
0.02865978516638279,
0.05645444616675377,
-0.16319093108177185,
-0.08897867798805237,
-0.10327187180519104,
-0.0748019814491272,
-0.10540719330310822,
-0.015396497212350368,
-0.010522022843360901,
-0.0371452271938324,
0.038853004574775696,
0.1324481964111328,
-0.04722423478960991,
0.06446927785873413,
-0.0025353815872222185,
-0.0178549624979496,
0.03315283730626106,
0.11384216696023941,
-0.04676610231399536,
-0.152350515127182,
0.0031057323794811964,
0.081587053835392,
0.028377411887049675,
-0.0001736826088745147,
-0.02451314777135849,
-0.05491621792316437,
0.006056175567209721,
0.0018416533712297678,
0.029302410781383514,
0.06730110943317413,
-0.05444372445344925,
-0.02958974428474903,
0.16298997402191162,
-0.12955524027347565,
0.00909517053514719,
0.010086972266435623,
-0.013320320285856724,
0.08355691283941269,
-0.005397580098360777,
-0.007860153913497925,
-0.04762543365359306,
0.06260531395673752,
-0.09547106176614761,
-0.053877051919698715,
-0.07339057326316833,
-0.025481170043349266,
0.03897523880004883,
0.06735294312238693,
-0.11879962682723999,
-0.07065773755311966,
-0.14080055058002472,
-0.04894954338669777,
-0.009219749830663204,
-0.026558643206954002,
-0.01202448084950447,
-0.018199626356363297,
-0.049377571791410446,
0.02047847956418991,
0.006300315726548433,
0.018897779285907745,
-0.01449591014534235,
-0.010602660477161407,
-0.030191443860530853,
0.03182055801153183,
0.01671591028571129,
-0.0036084253806620836,
-0.0777730867266655,
0.040853243321180344,
-0.22050125896930695,
0.10627900063991547,
-0.10967228561639786,
0.007243528962135315,
-0.09522172808647156,
-0.02733975648880005,
-0.14395786821842194,
-0.004914381541311741,
-0.00963778980076313,
0.1182248517870903,
-0.19118495285511017,
-0.058165594935417175,
0.1480737030506134,
-0.1472090482711792,
-0.06639117747545242,
0.10792971402406693,
-0.022694680839776993,
0.029947886243462563,
0.1467716246843338,
0.0808487981557846,
0.08813869208097458,
-0.11216176301240921,
-0.07752753794193268,
-0.018899420276284218,
0.010816971771419048,
0.1335265040397644,
0.059575095772743225,
-0.08677078038454056,
0.014582989737391472,
0.05892859399318695,
-0.0027734683826565742,
-0.093770332634449,
0.011277955956757069,
-0.018133779987692833,
-0.046794343739748,
0.0057582552544772625,
-0.01809176616370678,
-0.04463734105229378,
-0.06227308139204979,
-0.06134289130568504,
-0.1374817192554474,
0.0009585548541508615,
0.1315302699804306,
-0.01993883028626442,
0.0036763420794159174,
-0.07850147783756256,
0.041412603110075,
0.0624772347509861,
0.0059486557729542255,
-0.13430045545101166,
-0.0678471028804779,
0.04832307994365692,
-0.11793665587902069,
0.09734335541725159,
0.07504449039697647,
0.05388205870985985,
0.05558663606643677,
-0.01915167272090912,
-0.0046800789423286915,
-0.07160396873950958,
0.010655781254172325,
-0.05872945114970207,
-0.10686565935611725,
-0.020047446712851524,
-0.021561915054917336,
0.09949613362550735,
-0.018222425132989883,
0.004873295780271292,
0.06880971044301987,
0.13036826252937317,
0.00772569514811039,
-0.0553915835916996,
-0.020349755883216858,
-0.04636618122458458,
-0.017237666994333267,
-0.03594047203660011,
-0.026090841740369797,
-0.012068009003996849,
-0.03589288145303726,
0.1010037213563919,
-0.17510657012462616,
-0.0676933005452156,
0.09381168335676193,
0.11528937518596649,
-0.056598540395498276,
-0.03562028333544731,
-0.039553578943014145,
-0.02758328802883625,
-0.039449721574783325,
-0.14706899225711823,
0.1244426816701889,
0.06555075198411942,
0.08027449250221252,
-0.10667359828948975,
-0.041482243686914444,
-0.02510618232190609,
0.03324102982878685,
-0.06366975605487823,
0.06031794100999832,
0.018417930230498314,
-0.1433476358652115,
0.04980771988630295,
0.06081005185842514,
0.0928170457482338,
0.10271813720464706,
-0.0040372381918132305,
-0.07076475024223328,
-0.08220415562391281,
0.0009337732917629182,
-0.019009478390216827,
0.033904049545526505,
0.012398527935147285,
0.07031942903995514,
0.061352893710136414,
0.021312419325113297,
0.07391855865716934,
-0.06774498522281647,
0.02871939353644848,
0.03212212771177292,
-0.03038955293595791,
0.06784679740667343,
0.034931592643260956,
0.04345805570483208,
0.10872609913349152,
0.02136741764843464,
0.038410428911447525,
-0.05432744324207306,
-0.02850126288831234,
-0.07067202776670456,
0.13384060561656952,
-0.08139624446630478,
-0.13047946989536285,
-0.14204667508602142,
0.10756713896989822,
-0.028598656877875328,
-0.0005281806224957108,
0.014363763853907585,
-0.06411627680063248,
-0.09993957728147507,
-0.14393176138401031,
0.11225306242704391,
0.01468387246131897,
-0.030171135440468788,
0.022850463166832924,
0.0032147574238479137,
-0.013773949816823006,
-0.12731926143169403,
-0.03713212534785271,
0.006395877804607153,
-0.04003915190696716,
-0.049717579036951065,
-0.047552552074193954,
0.07687046378850937,
0.10139591991901398,
0.0022416028659790754,
-0.02662438340485096,
0.013300033286213875,
0.17761026322841644,
-0.03282253071665764,
0.12749959528446198,
0.1428588479757309,
0.08632377535104752,
0.1221705973148346,
0.12725874781608582,
0.02366155944764614,
-0.01668146625161171,
0.013131336309015751,
0.06780599057674408,
0.011985723860561848,
-0.22534197568893433,
-0.026614097878336906,
-0.06394663453102112,
-0.03453509137034416,
0.05214903876185417,
0.0420004278421402,
0.011871171183884144,
0.08499763906002045,
-0.14024581015110016,
0.042438894510269165,
0.05183309316635132,
0.054173219949007034,
0.15540006756782532,
0.009630924090743065,
0.04583122208714485,
-0.09527836740016937,
-0.04915674030780792,
0.12586799263954163,
-0.05203702673316002,
0.18336157500743866,
0.003179412567988038,
0.24961769580841064,
0.07055184990167618,
0.003906418103724718,
0.008685029111802578,
0.03847181424498558,
0.02855118364095688,
0.015519085340201855,
-0.04884004965424538,
-0.11050340533256531,
-0.0074131847359240055,
0.08619391918182373,
0.008781198412179947,
0.04052681475877762,
-0.0007318013231270015,
0.01763536036014557,
0.0715695321559906,
0.1366233378648758,
-0.018845394253730774,
-0.14876382052898407,
-0.060621898621320724,
0.053185347467660904,
-0.054241422563791275,
-0.02545415237545967,
-0.00946839526295662,
0.09300988167524338,
-0.08129237592220306,
0.10248425602912903,
0.0052355024963617325,
0.08714063465595245,
-0.034916941076517105,
0.023066574707627296,
-0.03764955699443817,
0.1240444928407669,
-0.021770015358924866,
0.09607935696840286,
-0.055144406855106354,
0.07862567156553268,
0.003856042632833123,
0.0525839664041996,
-0.06987592577934265,
0.04954277724027634,
0.0438990481197834,
0.023653386160731316,
0.03901105374097824,
-0.002667538123205304,
-0.21012026071548462,
-0.033305663615465164,
-0.08141867816448212,
0.003274330636486411,
0.08580424636602402,
-0.07935799658298492,
0.11301976442337036,
-0.0529075488448143,
-0.0025854127015918493,
-0.03103148564696312,
0.05132822319865227,
-0.20009925961494446,
-0.2049650400876999,
0.07772954553365707,
-0.03773541748523712,
-0.04541757330298424,
-0.09764672815799713,
-0.04308658093214035,
-0.051910750567913055,
0.1755332052707672,
-0.1093507781624794,
-0.09618279337882996,
-0.11370371282100677,
-0.018718594685196877,
0.19517992436885834,
-0.12024443596601486,
-0.017311908304691315,
-0.04948600381612778,
0.19866444170475006,
-0.09421038627624512,
-0.11779788136482239,
-0.037245165556669235,
-0.03393998369574547,
-0.16044186055660248,
-0.029796935617923737,
0.1491098552942276,
0.04248088225722313,
0.043735578656196594,
0.031545620411634445,
0.028088204562664032,
-0.026950882747769356,
-0.1482645720243454,
-0.03193289414048195,
0.10221022367477417,
0.04577018693089485,
0.034667618572711945,
-0.020419396460056305,
-0.12971070408821106,
-0.06747829914093018,
-0.0012237216578796506,
0.00968259945511818,
0.2138911634683609,
-0.055051546543836594,
0.053989991545677185,
0.10403314232826233,
-0.0479595884680748,
-0.15396642684936523,
-0.03511998429894447,
0.06938976049423218,
-0.014502928592264652,
0.035594191402196884,
-0.1044517308473587,
0.0820029005408287,
0.09821950644254684,
-0.023121513426303864,
0.022742968052625656,
-0.28792718052864075,
-0.13412277400493622,
0.011007142253220081,
-0.00015720337978564203,
-0.035658907145261765,
-0.0874997228384018,
-0.09302183985710144,
-0.06187746673822403,
-0.10384655743837357,
0.09529221802949905,
-0.02576160989701748,
0.07464233785867691,
0.0036873312201350927,
0.09974254667758942,
0.04795850068330765,
-0.03727428987622261,
0.18802711367607117,
0.020427444949746132,
0.09029104560613632,
-0.08747140318155289,
-0.013072267174720764,
0.06849689036607742,
-0.07861638814210892,
0.13289663195610046,
-0.03793126344680786,
0.04992414265871048,
-0.05613073334097862,
-0.04183117300271988,
-0.07354990392923355,
0.059427231550216675,
-0.05890713259577751,
-0.004515106324106455,
-0.14029814302921295,
0.06761188060045242,
0.04167171195149422,
-0.03165668994188309,
0.02849840559065342,
-0.034355729818344116,
0.05176416411995888,
0.1199563518166542,
0.09262209385633469,
0.013300031423568726,
-0.11562784761190414,
-0.033548399806022644,
-0.0102989561855793,
0.039268966764211655,
-0.12138454616069794,
0.03317466750741005,
0.08949647098779678,
-0.01987709291279316,
0.11547081172466278,
-0.01537422463297844,
-0.19736254215240479,
-0.0394352562725544,
0.09942980855703354,
-0.10395067185163498,
-0.22039325535297394,
-0.02037382312119007,
0.09989901632070541,
-0.09661175310611725,
-0.04927469417452812,
0.11046510189771652,
-0.03653186559677124,
-0.017941676080226898,
0.004266080912202597,
0.03240282088518143,
0.01065752562135458,
0.14083929359912872,
0.009299065917730331,
0.04566601663827896,
-0.06563592702150345,
0.04425700008869171,
0.048708219081163406,
-0.0802423283457756,
0.00987790897488594,
0.11946358531713486,
-0.06828400492668152,
-0.06111987307667732,
0.005701475776731968,
0.08151727169752121,
-0.0026069884188473225,
-0.017034074291586876,
-0.006741741672158241,
-0.10326109826564789,
0.0520470067858696,
0.05859893560409546,
0.014570043422281742,
0.08690785616636276,
0.03650110960006714,
0.015298780053853989,
0.014679293148219585,
0.08630861341953278,
0.021626057103276253,
0.06867781281471252,
-0.06560094654560089,
0.08085887879133224,
-0.0020943861454725266,
-0.031149206683039665,
-0.0010657341917976737,
0.013428870588541031,
-0.01833404414355755,
-0.07611935585737228,
-0.1297290176153183,
0.041058704257011414,
-0.0723038762807846,
-0.0220517385751009,
0.003897695103660226,
0.009587040171027184,
-0.027568025514483452,
0.025211870670318604,
-0.06901150196790695,
-0.06526483595371246,
-0.05224450305104256,
0.09456419199705124,
-0.13883376121520996,
0.06709672510623932,
0.11683856695890427,
-0.08030647784471512,
0.061019349843263626,
0.000998714822344482,
-0.022257475182414055,
0.02015177719295025,
-0.0681958720088005,
0.019686583429574966,
-0.027116229757666588,
0.080900639295578,
0.0066026668064296246,
-0.03954731673002243,
0.016529524698853493,
-0.014797979965806007,
0.018970197066664696,
-0.031205546110868454,
0.026631459593772888,
-0.13939544558525085,
-0.035278916358947754,
0.018181541934609413,
-0.09413054585456848,
-0.05173448473215103,
0.00730216596275568,
0.09758125990629196,
-0.04813696816563606,
0.14865361154079437,
-0.029905587434768677,
-0.016092881560325623,
-0.2187018096446991,
-0.020901599898934364,
0.035805873572826385,
-0.012033600360155106,
0.04552285373210907,
-0.09295075386762619,
0.046670060604810715,
0.0053010317496955395,
0.10879877954721451,
-0.03924126178026199,
-0.043702635914087296,
0.04113684967160225,
-0.018373090773820877,
0.0016335437539964914,
0.027638226747512817,
0.10450215637683868,
-0.014224641025066376,
-0.014130675233900547,
-0.0007916061440482736,
-0.0601373054087162,
-0.0214301235973835,
-0.05658821016550064,
0.14737041294574738,
0.1374145746231079,
0.08178631961345673,
0.015141515992581844,
0.11030157655477524,
0.03225225210189819,
-0.032743461430072784,
0.0768183246254921,
-0.03703939914703369,
0.006793771870434284,
-0.051713164895772934,
0.01903151348233223,
0.15865564346313477,
-0.15259936451911926,
0.11266814917325974,
-0.023993441835045815,
-0.006125836633145809,
-0.05646755173802376,
-0.1304776966571808,
-0.0941862016916275,
-0.03904660418629646,
0.00837575364857912,
-0.11934089660644531,
0.014912359416484833,
0.10041689872741699,
0.019672637805342674,
-0.03837433457374573,
0.11920024454593658,
-0.0963212326169014,
-0.12008866667747498,
0.06504262983798981,
0.024215376004576683,
-0.04918421059846878,
0.08712517470121384,
-0.03048710525035858,
0.08720246702432632,
0.04825969785451889,
0.047520264983177185,
0.06965652108192444,
0.0188317708671093,
-0.020498942583799362,
-0.051366232335567474,
-0.06134401634335518,
0.010035373270511627,
-0.02895992621779442,
-0.005011155270040035,
0.05428459867835045,
0.0343172661960125,
-0.04742937162518501,
-0.03513394668698311,
0.09468422830104828,
-0.01708536222577095,
-0.125680074095726,
-0.11758251488208771,
0.11762434989213943,
-0.032524578273296356,
0.055230528116226196,
0.025060905143618584,
-0.10507243871688843,
-0.002374213421717286,
0.11549371480941772,
0.07675378769636154,
-0.028877971693873405,
-0.00860647950321436,
-0.002306196838617325,
0.004940152168273926,
-0.0178971104323864,
0.1027505025267601,
-0.001910414663143456,
0.18171709775924683,
-0.0013602232793346047,
0.15914389491081238,
-0.04211113229393959,
-0.013382740318775177,
-0.07712815701961517,
0.15702185034751892,
-0.08647900074720383,
0.04394875839352608,
-0.06650317460298538,
0.08449048548936844,
0.03763711452484131,
-0.26447954773902893,
0.021044079214334488,
-0.05411621183156967,
-0.09733304381370544,
0.0384591743350029,
0.08534058928489685,
0.006386253051459789,
0.051252320408821106,
0.0038562528789043427,
0.006252779625356197,
0.1140139177441597,
-0.029571877792477608,
-0.11995801329612732,
-0.047561053186655045,
0.060426339507102966,
-0.08136644959449768,
0.17915980517864227,
0.04270083084702492,
0.06391369551420212,
0.11477864533662796,
-0.051538411527872086,
-0.10833416134119034,
0.08781320601701736,
0.050124190747737885,
-0.016571326181292534,
0.03336106985807419,
0.10162296146154404,
0.011001347564160824,
0.03595568239688873,
0.09857571870088577,
-0.04580715671181679,
0.035868242383003235,
0.04260101914405823,
-0.03104018047451973,
-0.10717781633138657,
0.13772112131118774,
-0.10268033295869827,
0.12690961360931396,
0.1476212590932846,
-0.02651520073413849,
0.002863786183297634,
-0.034923166036605835,
0.030452948063611984,
-0.022531477734446526,
0.08748497813940048,
-0.021911554038524628,
-0.14653554558753967,
0.002417441224679351,
0.07692887634038925,
0.10165394842624664,
-0.20422738790512085,
-0.061265978962183,
-0.00971593800932169,
0.020103400573134422,
0.017271244898438454,
0.14268134534358978,
0.03654786944389343,
-0.00039310389547608793,
-0.007941193878650665,
-0.19411654770374298,
-0.019837960600852966,
0.10025530308485031,
-0.056536342948675156,
-0.00570423249155283
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
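Since the card does not yet include a snippet, here is only a minimal sketch of how a Table Transformer object-detection checkpoint is typically loaded with 🤗 Transformers. The repository id is taken from this repo's metadata; the input image path, the 0.7 confidence threshold, and the assumption that the checkpoint is compatible with `AutoImageProcessor` / `TableTransformerForObjectDetection` are all illustrative assumptions, not statements from the original card.

```python
# Minimal sketch (not from the original card): assumes this checkpoint follows the
# standard Table Transformer layout. "table.png" is a placeholder input image.
import torch
from PIL import Image
from transformers import AutoImageProcessor, TableTransformerForObjectDetection

repo_id = "oraul/table_transformer_TSR_v1"  # id from this repository's metadata
processor = AutoImageProcessor.from_pretrained(repo_id)
model = TableTransformerForObjectDetection.from_pretrained(repo_id)

image = Image.open("table.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Post-process raw detections; the 0.7 threshold is an arbitrary example value.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.7, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), [round(v, 1) for v in box.tolist()])
```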
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | object-detection | oraul/table_transformer_TSR_v1 | [
"transformers",
"safetensors",
"table-transformer",
"object-detection",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T06:58:31+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #table-transformer #object-detection #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #table-transformer #object-detection #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
41,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #table-transformer #object-detection #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.053896769881248474,
0.18985764682292938,
-0.004845323972404003,
0.021511230617761612,
0.09961056709289551,
0.00250054313801229,
0.06178539618849754,
0.11068109422922134,
-0.008710930123925209,
0.12965236604213715,
0.02196524851024151,
0.09819786250591278,
0.11521566659212112,
0.16163109242916107,
-0.006316293030977249,
-0.2332613319158554,
0.06448189169168472,
-0.10584572702646255,
0.005951083265244961,
0.11411421000957489,
0.13461008667945862,
-0.11268643289804459,
0.08168956637382507,
-0.005840268451720476,
-0.0019032636191695929,
-0.023005664348602295,
-0.05934227630496025,
-0.06709644198417664,
0.055881161242723465,
0.07462269067764282,
0.06538130342960358,
0.016434337943792343,
0.07650427520275116,
-0.2880500555038452,
0.014189272187650204,
0.08077903091907501,
0.010454624891281128,
0.06849627196788788,
0.08778498321771622,
-0.07122921943664551,
0.12222716212272644,
-0.06050044670701027,
0.14018215239048004,
0.07400192320346832,
-0.0901309996843338,
-0.19486217200756073,
-0.06881159543991089,
0.05961989611387253,
0.1303035020828247,
0.05849425494670868,
-0.033369891345500946,
0.15627358853816986,
-0.11015134304761887,
0.005073630716651678,
0.11629632860422134,
-0.08019883930683136,
-0.05913447588682175,
0.025984153151512146,
0.103952556848526,
0.08505066484212875,
-0.12755541503429413,
-0.01517707109451294,
0.024290138855576515,
0.018858786672353745,
0.09130894392728806,
0.016291048377752304,
0.14698940515518188,
0.03207651898264885,
-0.14204683899879456,
-0.050434812903404236,
0.08758741617202759,
0.032813090831041336,
-0.047627560794353485,
-0.2331787496805191,
-0.025487830862402916,
0.00007420060137519613,
-0.038170237094163895,
-0.03985035419464111,
0.04962104186415672,
-0.03546081855893135,
0.07195594161748886,
0.010509306564927101,
-0.07574505358934402,
-0.034930795431137085,
0.059204764664173126,
0.08394253253936768,
0.026304660364985466,
-0.011434021405875683,
0.024071570485830307,
0.1154872253537178,
0.09382809698581696,
-0.12664204835891724,
-0.06311384588479996,
-0.07818222045898438,
-0.09379224479198456,
-0.044364962726831436,
0.038751211017370224,
0.06396675854921341,
0.06151125207543373,
0.20379723608493805,
-0.005686802789568901,
0.05062498524785042,
0.03708554431796074,
0.0038161547854542732,
0.06926463544368744,
0.09810096770524979,
-0.06372009962797165,
-0.137046679854393,
-0.05052396282553673,
0.10365185886621475,
0.0020032688044011593,
-0.02784918062388897,
-0.026833128184080124,
0.04488638415932655,
0.04450235515832901,
0.11589072644710541,
0.0918738842010498,
0.0054393610917031765,
-0.07741430401802063,
-0.04211326688528061,
0.21741873025894165,
-0.1487889289855957,
0.033147625625133514,
0.014047853648662567,
-0.046055398881435394,
-0.012484665028750896,
0.000793092476669699,
0.020197641104459763,
-0.03717091679573059,
0.10450247675180435,
-0.07543838769197464,
-0.036034807562828064,
-0.1111527606844902,
-0.060001831501722336,
0.02364167384803295,
0.025387465953826904,
-0.025458790361881256,
-0.038539428263902664,
-0.09763406962156296,
-0.06037333607673645,
0.07439575344324112,
-0.07656611502170563,
-0.060659259557724,
0.005402243696153164,
-0.04793621599674225,
0.009693598374724388,
0.00225981161929667,
0.10939191281795502,
-0.03692612424492836,
0.031855519860982895,
-0.05163314566016197,
0.06722023338079453,
0.1360289603471756,
0.0320834182202816,
-0.09313187748193741,
0.07229655981063843,
-0.23608939349651337,
0.11010845750570297,
-0.10086704790592194,
0.02263675257563591,
-0.14402420818805695,
-0.03296544775366783,
0.019474664703011513,
0.02503880113363266,
-0.012435270473361015,
0.1324254423379898,
-0.19238010048866272,
-0.028728676959872246,
0.1660696119070053,
-0.1241338774561882,
-0.09097551554441452,
0.06425117701292038,
-0.05913051590323448,
0.10340268164873123,
0.047215670347213745,
-0.02083822712302208,
0.051907576620578766,
-0.1267014741897583,
-0.042514290660619736,
-0.02989351749420166,
-0.015349352732300758,
0.1581287682056427,
0.06398312747478485,
-0.06374841928482056,
0.059225648641586304,
0.019244423136115074,
-0.029702600091695786,
-0.03355392813682556,
-0.0360160768032074,
-0.08947702497243881,
0.0087362639605999,
-0.06819707900285721,
0.03526466712355614,
-0.017074614763259888,
-0.08760028332471848,
-0.03860391303896904,
-0.17240197956562042,
0.007247101049870253,
0.08174939453601837,
0.008531020022928715,
-0.024532418698072433,
-0.08698130398988724,
0.008766962215304375,
0.000002328865548406611,
-0.02249380573630333,
-0.16371896862983704,
-0.06364418566226959,
0.04461904615163803,
-0.20705866813659668,
0.014290501363575459,
-0.04589878395199776,
0.045884937047958374,
0.030599605292081833,
-0.03977600112557411,
-0.008089592680335045,
0.015229171141982079,
0.02262193150818348,
-0.02453632466495037,
-0.21511344611644745,
-0.023388078436255455,
-0.02649925835430622,
0.1471385657787323,
-0.23034916818141937,
0.02397211454808712,
0.078488789498806,
0.14707455039024353,
0.009920164942741394,
-0.04560774937272072,
0.02437875047326088,
-0.05930383875966072,
-0.049630582332611084,
-0.06486861407756805,
-0.009278226643800735,
-0.030267292633652687,
-0.04073875769972801,
0.07060664147138596,
-0.18862277269363403,
-0.038378141820430756,
0.1139380931854248,
0.08075502514839172,
-0.14830443263053894,
-0.03811204433441162,
-0.041455354541540146,
-0.06684859842061996,
-0.09182824939489365,
-0.05680965632200241,
0.10552947223186493,
0.05406331270933151,
0.04692131653428078,
-0.08191195130348206,
-0.05267970263957977,
0.0057449606247246265,
-0.012869192287325859,
-0.030552193522453308,
0.0892425924539566,
0.0852329283952713,
-0.11755501478910446,
0.08797714859247208,
0.08087699115276337,
0.06464790552854538,
0.08450039476156235,
-0.004672153852880001,
-0.10432245582342148,
-0.020206112414598465,
0.01546332985162735,
0.02244335040450096,
0.12910160422325134,
-0.03977353498339653,
0.05886797979474068,
0.05667756870388985,
-0.02756022848188877,
0.017511876299977303,
-0.10928512364625931,
0.032340917736291885,
0.035339485853910446,
-0.005716424435377121,
0.0328737273812294,
-0.033340081572532654,
0.03340640664100647,
0.09013866633176804,
0.033997513353824615,
0.026867734268307686,
0.011716018430888653,
-0.037244003266096115,
-0.10510455816984177,
0.171413853764534,
-0.09629801660776138,
-0.27357324957847595,
-0.12262832373380661,
-0.009736437350511551,
0.04878897964954376,
-0.020917948335409164,
0.01011411938816309,
-0.04047318547964096,
-0.119237519800663,
-0.10229230672121048,
0.007487248629331589,
0.0560445562005043,
-0.0845072865486145,
-0.07479758560657501,
0.05580433830618858,
0.03758729249238968,
-0.13407142460346222,
0.019758667796850204,
0.04359801486134529,
-0.04316617175936699,
-0.0032637405674904585,
0.07013295590877533,
0.10042545944452286,
0.1720646172761917,
0.007179040461778641,
-0.02813372015953064,
0.02597983554005623,
0.24086466431617737,
-0.14426027238368988,
0.10666214674711227,
0.14719100296497345,
-0.06297770142555237,
0.1034480556845665,
0.2012205570936203,
0.03007160872220993,
-0.07733233273029327,
0.04285002499818802,
0.04189614579081535,
-0.04116341471672058,
-0.23780110478401184,
-0.06555622816085815,
-0.0014978675171732903,
-0.07844412326812744,
0.08754359930753708,
0.09024514257907867,
0.10796608775854111,
0.051689524203538895,
-0.0901881530880928,
-0.07019747793674469,
0.03573935478925705,
0.110515296459198,
-0.01013439241796732,
0.014352538622915745,
0.0936531126499176,
-0.03637070581316948,
-0.005942917428910732,
0.10272877663373947,
0.0005009321612305939,
0.19775447249412537,
0.027380118146538734,
0.15599669516086578,
0.0780603364109993,
0.04872450232505798,
0.025878099724650383,
0.017263295128941536,
0.033391501754522324,
0.01223544031381607,
-0.013041124679148197,
-0.09584459662437439,
0.012771572917699814,
0.1326114386320114,
0.06314895302057266,
0.03481246903538704,
0.03105885162949562,
-0.03291463851928711,
0.07247482985258102,
0.16872058808803558,
0.007816868834197521,
-0.20723967254161835,
-0.035055194050073624,
0.08608231693506241,
-0.08529336750507355,
-0.11878781765699387,
-0.009928903542459011,
0.02254212461411953,
-0.18599684536457062,
0.046389978379011154,
-0.025730641558766365,
0.11018697172403336,
-0.11890815198421478,
-0.027398502454161644,
0.0428323820233345,
0.0899224653840065,
-0.032575443387031555,
0.07917756587266922,
-0.17726963758468628,
0.12186236679553986,
0.010240131057798862,
0.059918489307165146,
-0.11130647361278534,
0.09836231917142868,
0.02054828219115734,
-0.01032648142427206,
0.16435828804969788,
-0.0038455668836832047,
-0.08222708851099014,
-0.05304618179798126,
-0.07943323999643326,
-0.018566587939858437,
0.09453383088111877,
-0.10170958936214447,
0.08677323162555695,
-0.01890096813440323,
-0.04254455119371414,
0.00010234722140012309,
-0.11912921071052551,
-0.13951049745082855,
-0.1903744637966156,
0.06202134117484093,
-0.11293336004018784,
0.0060536968521773815,
-0.11114640533924103,
-0.05319005623459816,
-0.03297508507966995,
0.20137345790863037,
-0.1402655839920044,
-0.0955023393034935,
-0.14809376001358032,
-0.09922617673873901,
0.16853204369544983,
-0.046823617070913315,
0.08776923269033432,
-0.004836251959204674,
0.21514403820037842,
0.015055883675813675,
-0.007189025636762381,
0.08686351776123047,
-0.09207946062088013,
-0.1970943659543991,
-0.0812477096915245,
0.12810958921909332,
0.12377389520406723,
0.04462341591715813,
-0.013672680594027042,
0.03230961412191391,
-0.03223620727658272,
-0.11545951664447784,
0.0057435124181210995,
0.12144786864519119,
0.06534624099731445,
0.044576212763786316,
0.001676240935921669,
-0.1321355551481247,
-0.08290743082761765,
-0.04295523464679718,
0.015137597918510437,
0.1821272075176239,
-0.0770491510629654,
0.1485561579465866,
0.1435382068157196,
-0.04652554169297218,
-0.21024169027805328,
0.029603442177176476,
0.040501900017261505,
-0.00850088894367218,
0.058609090745449066,
-0.17139142751693726,
0.0883507952094078,
0.029937930405139923,
-0.05878322571516037,
0.1538875550031662,
-0.1748322993516922,
-0.15564566850662231,
0.08036808669567108,
0.061596255749464035,
-0.22746199369430542,
-0.12722599506378174,
-0.09651684015989304,
-0.06621198356151581,
-0.14994527399539948,
0.07509175688028336,
0.01192533876746893,
-0.003097836161032319,
0.04728495329618454,
0.010389857925474644,
0.021158823743462563,
-0.05525732412934303,
0.21573437750339508,
-0.006654130294919014,
0.03454458341002464,
-0.07771413028240204,
-0.09978178888559341,
0.05522996932268143,
-0.04747195541858673,
0.07393359392881393,
-0.022641196846961975,
0.0064969612285494804,
-0.0741257295012474,
-0.05938330665230751,
-0.059075821191072464,
0.027261465787887573,
-0.08088944107294083,
-0.10125406831502914,
-0.07247894257307053,
0.10214890539646149,
0.0972798764705658,
-0.031054385006427765,
-0.04064827039837837,
-0.08935251832008362,
0.03712157905101776,
0.21843789517879486,
0.17296740412712097,
0.06064308062195778,
-0.0957874059677124,
0.0022928572725504637,
-0.016179464757442474,
0.04036194086074829,
-0.18458811938762665,
0.05467572063207626,
0.04525376856327057,
0.02278442680835724,
0.1181579977273941,
-0.021386127918958664,
-0.16324907541275024,
-0.04676032438874245,
0.05559495463967323,
-0.04484201967716217,
-0.20253068208694458,
-0.00796900037676096,
0.06279757618904114,
-0.17616285383701324,
-0.06738551706075668,
0.01685149595141411,
-0.010227978229522705,
-0.028138024732470512,
0.007530041504651308,
0.07183258980512619,
0.031745463609695435,
0.1034717708826065,
0.06313232332468033,
0.09989684820175171,
-0.11343235522508621,
0.088556207716465,
0.09649699926376343,
-0.08222965151071548,
0.0031950101256370544,
0.0820591077208519,
-0.057729385793209076,
-0.024447636678814888,
0.027812425047159195,
0.06069275364279747,
0.01827177405357361,
-0.058563482016325,
-0.023210931569337845,
-0.1131163239479065,
0.058853648602962494,
0.12622076272964478,
0.03794603422284126,
-0.0012824046425521374,
0.05226685851812363,
0.019789865240454674,
-0.08146467804908752,
0.11700969934463501,
0.04511731117963791,
0.04166313633322716,
-0.05943675711750984,
-0.024652279913425446,
0.04533801227807999,
-0.013803993351757526,
-0.017854582518339157,
-0.026689855381846428,
-0.05790664628148079,
-0.012994321063160896,
-0.19042237102985382,
0.02657165378332138,
-0.08754143118858337,
0.0031881569884717464,
0.016301354393363,
-0.03669215366244316,
-0.014885609969496727,
0.012825794517993927,
-0.08131971955299377,
-0.051158349961042404,
-0.00438035000115633,
0.09929584711790085,
-0.15266433358192444,
0.0038159538526088,
0.09505754709243774,
-0.12431690096855164,
0.06802330911159515,
-0.0062525831162929535,
-0.012738212011754513,
0.0059532527811825275,
-0.13297978043556213,
0.03856383264064789,
-0.0034136006142944098,
0.015030334703624249,
0.0396854430437088,
-0.17830196022987366,
0.0012825396843254566,
-0.040559325367212296,
-0.047739364206790924,
-0.01593058928847313,
-0.08126797527074814,
-0.12212362140417099,
0.1137058213353157,
0.004203297663480043,
-0.09291407465934753,
-0.013971133157610893,
0.045322317630052567,
0.11068380624055862,
-0.04977858066558838,
0.13614477217197418,
-0.006055842153728008,
0.06359837204217911,
-0.17514245212078094,
-0.02376708574593067,
-0.01469358243048191,
0.008100277744233608,
0.008844252675771713,
-0.0023512770421802998,
0.04648870974779129,
-0.01235913299024105,
0.25048092007637024,
-0.02393345534801483,
0.05937084183096886,
0.06372595578432083,
0.016160152852535248,
0.01804032176733017,
0.0935516506433487,
0.05263444408774376,
0.01912546716630459,
0.0035660085268318653,
0.023014549165964127,
-0.04327422380447388,
-0.02317485585808754,
-0.17132113873958588,
0.0829366147518158,
0.16010233759880066,
0.08645261824131012,
0.0022717430256307125,
0.06431861221790314,
-0.10154232382774353,
-0.10923823714256287,
0.087091363966465,
-0.043625328689813614,
-0.005169228184968233,
-0.05517043545842171,
0.1570768654346466,
0.1489749699831009,
-0.17115329205989838,
0.072262704372406,
-0.05029889568686485,
-0.0501953586935997,
-0.11837446689605713,
-0.1600906401872635,
-0.07064758986234665,
-0.03152783587574959,
-0.005098439287394285,
-0.05311264842748642,
0.07201356440782547,
0.11473295837640762,
-0.0015408979961648583,
-0.0002822616370394826,
0.09849988669157028,
-0.022463176399469376,
-0.01625908352434635,
0.032777950167655945,
0.05281274393200874,
0.02829718217253685,
-0.04442012310028076,
0.01609094813466072,
0.00021448447660077363,
0.03456650301814079,
0.05111145228147507,
0.02765604853630066,
-0.0415780134499073,
0.01634812355041504,
-0.009177451953291893,
-0.10677926242351532,
0.03157821297645569,
-0.02928728610277176,
-0.06586195528507233,
0.13235220313072205,
0.03461612015962601,
0.018147198483347893,
-0.032193783670663834,
0.2159709632396698,
-0.07285205274820328,
-0.0768788531422615,
-0.1385231465101242,
0.12383091449737549,
-0.0412251316010952,
0.06036591902375221,
0.061002880334854126,
-0.11438033729791641,
0.007967420853674412,
0.1287284791469574,
0.12771983444690704,
-0.03913014009594917,
0.009629350155591965,
0.027373021468520164,
0.004018411505967379,
-0.04699529707431793,
0.04949912428855896,
0.034944165498018265,
0.15095548331737518,
-0.06894240528345108,
0.0731968954205513,
-0.002508282894268632,
-0.08797654509544373,
-0.04326790198683739,
0.13094164431095123,
0.007371340412646532,
0.032398294657468796,
-0.07835353165864944,
0.10842498391866684,
-0.06125984340906143,
-0.23533765971660614,
0.042783670127391815,
-0.06500391662120819,
-0.1572353094816208,
-0.01726554147899151,
0.01528225652873516,
-0.006051359698176384,
0.026124773547053337,
0.06613563746213913,
-0.06688439100980759,
0.15352368354797363,
0.04562787711620331,
-0.07863350957632065,
-0.053857866674661636,
0.08235954493284225,
-0.08295497298240662,
0.3005877435207367,
0.013585677370429039,
0.046472277492284775,
0.10270290821790695,
-0.021888306364417076,
-0.1376582235097885,
0.039291247725486755,
0.10312914103269577,
-0.09848074615001678,
0.06898271292448044,
0.17673836648464203,
-0.0009152397979050875,
0.10669633001089096,
0.0855925977230072,
-0.06351079046726227,
0.06396494060754776,
-0.07857616990804672,
-0.074799545109272,
-0.09962234646081924,
0.06167346239089966,
-0.06651102006435394,
0.13983160257339478,
0.13268128037452698,
-0.04157791659235954,
0.00369839183986187,
-0.023645659908652306,
0.04781581833958626,
0.0055615538731217384,
0.12286534905433655,
0.00800529308617115,
-0.1681613326072693,
0.034481167793273926,
0.009411446750164032,
0.10510578006505966,
-0.20890279114246368,
-0.08676018565893173,
0.040113672614097595,
-0.02732236683368683,
-0.05539212375879288,
0.10613667219877243,
0.046571582555770874,
0.04761878401041031,
-0.04517625644803047,
-0.050160687416791916,
-0.007909558713436127,
0.15566523373126984,
-0.1152821034193039,
-0.005057927221059799
] |
null | null | peft |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model_IMDB_bert_base_peft
This model is a fine-tuned version of [nlptown/bert-base-multilingual-uncased-sentiment](https://huggingface.co/nlptown/bert-base-multilingual-uncased-sentiment) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2116
- Accuracy: 0.9224
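The card does not include a usage snippet, so the following is only a hedged sketch of loading the adapter on top of the base model with PEFT for inference. The two repository ids come from this card's metadata; whether the saved adapter also carries a task-specific classification head (and how many sentiment labels it uses) is not stated here, so treat that part — and the interpretation of the output classes — as an assumption.

```python
# Minimal sketch (assumptions noted above): attach the PEFT adapter from this
# repository to the base sentiment model and run a single forward pass.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import PeftModel

base_id = "nlptown/bert-base-multilingual-uncased-sentiment"
adapter_id = "Kudod/model_IMDB_bert_base_peft"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForSequenceClassification.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

inputs = tokenizer("A surprisingly heartfelt and well-acted film.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # class meanings depend on how the adapter was trained
```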
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
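As a rough illustration (not part of the original card), these settings map onto 🤗 `TrainingArguments` as sketched below. The output directory is an assumption, and Adam with the listed betas/epsilon is simply the Trainer's default optimizer, so it needs no explicit argument.

```python
# Rough mapping of the listed hyperparameters onto TrainingArguments (sketch only;
# dataset preparation, PEFT wrapping and metric computation are omitted).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="model_IMDB_bert_base_peft",   # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    fp16=True,                                # "Native AMP" mixed precision
)
```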
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.2559 | 1.0 | 1563 | 0.2355 | 0.9088 |
| 0.2406 | 2.0 | 3126 | 0.2285 | 0.9134 |
| 0.2322 | 3.0 | 4689 | 0.2185 | 0.9173 |
| 0.2291 | 4.0 | 6252 | 0.2174 | 0.9193 |
| 0.2177 | 5.0 | 7815 | 0.2171 | 0.9186 |
| 0.2218 | 6.0 | 9378 | 0.2154 | 0.9202 |
| 0.2116 | 7.0 | 10941 | 0.2127 | 0.9221 |
| 0.2133 | 8.0 | 12504 | 0.2101 | 0.9225 |
| 0.2076 | 9.0 | 14067 | 0.2125 | 0.9221 |
| 0.2029 | 10.0 | 15630 | 0.2116 | 0.9224 |
### Framework versions
- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.0.1+cu117
- Datasets 2.17.0
- Tokenizers 0.15.2 | {"license": "mit", "library_name": "peft", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "nlptown/bert-base-multilingual-uncased-sentiment", "model-index": [{"name": "model_IMDB_bert_base_peft", "results": []}]} | null | Kudod/model_IMDB_bert_base_peft | [
"peft",
"tensorboard",
"safetensors",
"generated_from_trainer",
"base_model:nlptown/bert-base-multilingual-uncased-sentiment",
"license:mit",
"region:us"
] | 2024-02-15T06:58:33+00:00 | [] | [] | TAGS
#peft #tensorboard #safetensors #generated_from_trainer #base_model-nlptown/bert-base-multilingual-uncased-sentiment #license-mit #region-us
| model\_IMDB\_bert\_base\_peft
=============================
This model is a fine-tuned version of nlptown/bert-base-multilingual-uncased-sentiment on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2116
* Accuracy: 0.9224
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 10
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* PEFT 0.8.2
* Transformers 4.37.2
* Pytorch 2.0.1+cu117
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#peft #tensorboard #safetensors #generated_from_trainer #base_model-nlptown/bert-base-multilingual-uncased-sentiment #license-mit #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
53,
113,
4,
39
] | [
"passage: TAGS\n#peft #tensorboard #safetensors #generated_from_trainer #base_model-nlptown/bert-base-multilingual-uncased-sentiment #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.0.1+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.12518620491027832,
0.045327700674533844,
-0.0014217204879969358,
0.10750575363636017,
0.13674011826515198,
0.01751725748181343,
0.11976133286952972,
0.10521597415208817,
-0.05259047821164131,
0.06436603516340256,
0.09166216850280762,
0.09926864504814148,
0.02357524447143078,
0.12531420588493347,
-0.04826624318957329,
-0.2305188775062561,
-0.003054286353290081,
0.0075844950042665005,
-0.012102539651095867,
0.11169492453336716,
0.06665754318237305,
-0.13065019249916077,
0.058139681816101074,
-0.015201550908386707,
-0.16443154215812683,
0.021053209900856018,
0.01209214422851801,
-0.003172504948452115,
0.12097536772489548,
0.017683207988739014,
0.14306630194187164,
0.014104068279266357,
0.09369280934333801,
-0.235812708735466,
0.02210165746510029,
0.06868337839841843,
0.005479949992150068,
0.05971531569957733,
0.08239848166704178,
-0.010826760903000832,
0.12489449977874756,
-0.092460036277771,
0.06046735867857933,
0.011974864639341831,
-0.13650676608085632,
-0.23501081764698029,
-0.08432910591363907,
0.024632206186652184,
0.07159867137670517,
0.0724250078201294,
-0.028910553082823753,
0.15272940695285797,
-0.08902629464864731,
0.08554632216691971,
0.208609938621521,
-0.2817172110080719,
-0.08296260237693787,
0.01993468776345253,
0.02508079633116722,
0.11127308011054993,
-0.12706854939460754,
-0.044477254152297974,
0.060034822672605515,
0.04839091747999191,
0.08174026757478714,
-0.027071552351117134,
-0.07366222888231277,
0.020507100969552994,
-0.1484593003988266,
0.016763607040047646,
0.08917617052793503,
0.06377196311950684,
-0.041611190885305405,
-0.021049803122878075,
-0.053502023220062256,
-0.14461183547973633,
-0.06459333002567291,
-0.009975171647965908,
0.04232947155833244,
-0.02579227276146412,
-0.07949993014335632,
-0.03152301907539368,
-0.09194064140319824,
-0.06865736842155457,
-0.04330804944038391,
0.18377244472503662,
0.0445048026740551,
0.003146421629935503,
-0.005866925232112408,
0.10325460880994797,
-0.050793591886758804,
-0.11982611566781998,
0.02562204748392105,
0.016731247305870056,
-0.026317404583096504,
-0.061485376209020615,
-0.04318588599562645,
-0.04496253654360771,
0.015251997858285904,
0.09307732433080673,
-0.09950988739728928,
0.07535111159086227,
-0.010462028905749321,
0.043978411704301834,
-0.09989093989133835,
0.10977867990732193,
-0.06237487122416496,
0.012623915448784828,
0.007515230681747198,
0.09464992582798004,
0.03519615903496742,
0.0020609113853424788,
-0.09213215112686157,
0.05111599341034889,
0.07958930730819702,
0.013347309082746506,
-0.08441682159900665,
0.035348135977983475,
-0.08235394209623337,
-0.0050996774807572365,
-0.007904824800789356,
-0.09166181832551956,
0.05908823385834694,
0.01969432644546032,
-0.05396556109189987,
-0.01681520789861679,
-0.0003473830292932689,
0.03157088905572891,
0.006226921454071999,
0.1065351665019989,
-0.08481136709451675,
0.05834582448005676,
-0.10416539758443832,
-0.1247870922088623,
0.023356111720204353,
-0.026583068072795868,
0.010808483697474003,
-0.08522607386112213,
-0.15931767225265503,
-0.035905588418245316,
0.051463764160871506,
-0.047245901077985764,
-0.003345085773617029,
-0.07771796733140945,
-0.06574846804141998,
0.002947864355519414,
-0.012244193814694881,
0.10133564472198486,
-0.06917861849069595,
0.11107833683490753,
0.018444428220391273,
0.054594989866018295,
-0.06534816324710846,
0.03117983415722847,
-0.09166911989450455,
0.02530575916171074,
-0.18706004321575165,
0.028540654107928276,
-0.08109638094902039,
0.06197764351963997,
-0.07539092749357224,
-0.08451972156763077,
-0.04885196313261986,
0.006398268975317478,
0.11017851531505585,
0.10748948901891708,
-0.20424382388591766,
-0.05972997471690178,
0.18416884541511536,
-0.08442976325750351,
-0.09560026973485947,
0.1454329788684845,
-0.05986291915178299,
0.0547359436750412,
0.07490149140357971,
0.22490762174129486,
0.04972943291068077,
-0.14239759743213654,
0.03316478058695793,
-0.01744813285768032,
0.08179023116827011,
-0.01445156242698431,
0.03966416046023369,
-0.016373617574572563,
-0.027045900002121925,
0.015062288381159306,
-0.018581096082925797,
0.07367682456970215,
-0.09934844076633453,
-0.07538976520299911,
-0.03782160207629204,
-0.10548693686723709,
0.05547502264380455,
0.07392735034227371,
0.04707656800746918,
-0.13164301216602325,
-0.07944652438163757,
0.08435762673616409,
0.0882273018360138,
-0.06720820814371109,
0.03355447202920914,
-0.0523340217769146,
0.08289223164319992,
-0.04387902095913887,
-0.046495381742715836,
-0.15213632583618164,
-0.04915527254343033,
0.012309624813497066,
0.012724907137453556,
0.020712943747639656,
0.0030774828046560287,
0.08229106664657593,
0.0819331631064415,
-0.07681470364332199,
-0.024483932182192802,
-0.029597852379083633,
0.0037127751857042313,
-0.1219797283411026,
-0.22692717611789703,
-0.008673369884490967,
-0.03692825883626938,
0.12067563086748123,
-0.23347844183444977,
0.029403081163764,
-0.00959231611341238,
0.08789831399917603,
0.0375075563788414,
-0.04408944398164749,
-0.025909235700964928,
0.10027571022510529,
-0.00849491823464632,
-0.06069445610046387,
0.07703667879104614,
0.0074846697971224785,
-0.07460583746433258,
-0.040305089205503464,
-0.1323348581790924,
0.16172660887241364,
0.12603206932544708,
-0.05189801752567291,
-0.09028053283691406,
-0.04659639298915863,
-0.04046818986535072,
-0.023553188890218735,
-0.06555580347776413,
0.04171689972281456,
0.1668361872434616,
0.007377423346042633,
0.12033719569444656,
-0.09140738099813461,
-0.03058113344013691,
0.011778128333389759,
-0.035966869443655014,
0.0269396360963583,
0.13220366835594177,
0.08733242005109787,
-0.07757347077131271,
0.1187545657157898,
0.13906657695770264,
-0.09413955360651016,
0.14623494446277618,
-0.048125971108675,
-0.06575250625610352,
-0.03147882595658302,
0.052435774356126785,
-0.0031551707070320845,
0.15457706153392792,
-0.10571068525314331,
0.014459699392318726,
-0.022108495235443115,
0.021454131230711937,
0.01966584473848343,
-0.24269992113113403,
-0.06124469265341759,
0.011754407547414303,
-0.054422371089458466,
-0.03351817652583122,
-0.03079012967646122,
0.007331226021051407,
0.10728106647729874,
-0.02609848417341709,
-0.05707823485136032,
0.014626065269112587,
0.0020311372354626656,
-0.08195630460977554,
0.19561699032783508,
-0.09891762584447861,
-0.07778892666101456,
-0.07803292572498322,
-0.03732696920633316,
-0.03999671712517738,
-0.009015757590532303,
0.04774290323257446,
-0.09622500836849213,
-0.01250546146184206,
-0.1051710769534111,
0.0028902445919811726,
0.005679840222001076,
-0.0037097476888448,
-0.0378563329577446,
-0.011057768017053604,
0.08112692087888718,
-0.0998271107673645,
0.006776182912290096,
-0.05389592424035072,
-0.048096198588609695,
0.03411070257425308,
0.03446333855390549,
0.11759787797927856,
0.1605335772037506,
-0.003307438688352704,
-0.004320652224123478,
-0.03492875397205353,
0.22028614580631256,
-0.0622166246175766,
-0.022473981603980064,
0.07474393397569656,
-0.004445204511284828,
0.06778889894485474,
0.1383197009563446,
0.08381829410791397,
-0.11271354556083679,
0.024370059370994568,
0.05861157551407814,
-0.02206818200647831,
-0.19866670668125153,
-0.03526908904314041,
-0.05114132910966873,
-0.048827290534973145,
0.06744284927845001,
0.033610325306653976,
-0.042208265513181686,
0.044536370784044266,
0.02153036743402481,
0.01902483031153679,
-0.02863660454750061,
0.055666930973529816,
0.07117344439029694,
0.026054272428154945,
0.10352510213851929,
-0.035739291459321976,
-0.05959051474928856,
0.04760769009590149,
0.00526039581745863,
0.21450506150722504,
0.0016895068110898137,
0.08675266057252884,
0.06583663821220398,
0.21300876140594482,
-0.009122277610003948,
0.051641467958688736,
-0.005226423032581806,
-0.06313278526067734,
-0.004064170178025961,
-0.046860381960868835,
-0.006522894371300936,
0.02361520379781723,
-0.04130695387721062,
0.06206924840807915,
-0.10610389709472656,
-0.055599186569452286,
0.06168787181377411,
0.21593880653381348,
0.06822462379932404,
-0.27577149868011475,
-0.05101972445845604,
0.024272339418530464,
-0.013082309626042843,
-0.03363993391394615,
0.02670014277100563,
0.16373233497142792,
-0.0425821915268898,
0.041096437722444534,
-0.07087145000696182,
0.07888027280569077,
-0.0294613316655159,
0.028051888570189476,
0.03567532077431679,
0.12099301069974899,
-0.035228949040174484,
0.05005516856908798,
-0.26583126187324524,
0.28041499853134155,
0.026195935904979706,
0.10103558003902435,
-0.028094947338104248,
-0.040199581533670425,
0.044448431581258774,
0.053528524935245514,
0.05832485109567642,
-0.0044018300250172615,
-0.05163026601076126,
-0.23241057991981506,
-0.04562859609723091,
0.05216490477323532,
0.11934627592563629,
0.015191547572612762,
0.09282267838716507,
0.0016507460968568921,
0.02089928649365902,
0.07416286319494247,
-0.031060826033353806,
-0.14055532217025757,
-0.0783085823059082,
-0.04660959541797638,
0.042563315480947495,
-0.0589272640645504,
-0.07314973324537277,
-0.11047404259443283,
-0.12347433716058731,
0.10504336655139923,
0.009416298940777779,
-0.015064483508467674,
-0.11486560851335526,
0.0657103955745697,
0.07079420238733292,
-0.058716483414173126,
0.008080965839326382,
0.025960447266697884,
0.05146035924553871,
0.007722529117017984,
-0.03017384000122547,
0.1226397156715393,
-0.05026622116565704,
-0.13955052196979523,
-0.07518722862005234,
0.1151559129357338,
0.07458065450191498,
0.05172095447778702,
-0.021259792149066925,
0.0012348305899649858,
0.0012759915553033352,
-0.10115697234869003,
0.02887999266386032,
-0.002397754229605198,
0.033927999436855316,
0.03434501588344574,
-0.07385741919279099,
0.0475921593606472,
-0.06087103858590126,
-0.0285771694034338,
0.1289164423942566,
0.3093409836292267,
-0.07053011655807495,
0.00613013980910182,
0.032464999705553055,
-0.07111643999814987,
-0.19029134511947632,
0.0731751024723053,
0.05095846205949783,
0.02038521319627762,
0.08413732796907425,
-0.14733117818832397,
0.09133312851190567,
0.11634865403175354,
-0.013663227669894695,
0.13661789894104004,
-0.30024075508117676,
-0.12967947125434875,
0.10219520330429077,
0.17899249494075775,
0.10737523436546326,
-0.1651516705751419,
-0.027783503755927086,
-0.001936382264830172,
-0.07415860891342163,
0.07644393295049667,
-0.17684721946716309,
0.10295338183641434,
0.0058808885514736176,
0.09159877896308899,
0.013971865177154541,
-0.04364918917417526,
0.1479191780090332,
-0.0031093936413526535,
0.1324928253889084,
-0.04534079134464264,
0.004676740150898695,
0.009522187523543835,
-0.038844168186187744,
0.00799324456602335,
-0.07374904304742813,
0.019876185804605484,
-0.06336604803800583,
-0.02187815122306347,
-0.08461779356002808,
0.03272110968828201,
-0.0394623838365078,
-0.046329565346241,
-0.042484961450099945,
0.05041498318314552,
0.026449233293533325,
-0.007053714711219072,
0.10347963124513626,
-0.024588288739323616,
0.17956550419330597,
0.10194031894207001,
0.0786551684141159,
-0.060288019478321075,
-0.019678829237818718,
0.014395237900316715,
-0.0362628772854805,
0.03828859701752663,
-0.1451362520456314,
0.01704416051506996,
0.12440845370292664,
0.02689717710018158,
0.1387477070093155,
0.06730744242668152,
-0.0706285759806633,
0.0321463942527771,
0.06851735711097717,
-0.1209932491183281,
-0.12581251561641693,
-0.0020984902512282133,
-0.016087284311652184,
-0.11478804796934128,
0.02947981469333172,
0.08643316477537155,
-0.06442805379629135,
-0.0014912590850144625,
-0.013378343544900417,
0.004047836642712355,
-0.060351643711328506,
0.18236108124256134,
0.08101718872785568,
0.023419253528118134,
-0.08850531280040741,
0.0686069056391716,
0.03129340335726738,
-0.08835972100496292,
0.005405739415436983,
0.05976596102118492,
-0.06234748288989067,
-0.037515196949243546,
0.10643023252487183,
0.22575342655181885,
0.0026177039835602045,
-0.04975180700421333,
-0.11744232475757599,
-0.11389441043138504,
0.05290607362985611,
0.16669540107250214,
0.07874228805303574,
-0.007658400107175112,
-0.02710040844976902,
0.01801295019686222,
-0.10582786798477173,
0.08906210213899612,
0.05692174285650253,
0.06112675368785858,
-0.12372798472642899,
0.15308693051338196,
0.006092132534831762,
-0.0011806075926870108,
-0.018902724608778954,
0.03727628290653229,
-0.13379840552806854,
0.013174798339605331,
-0.1521628499031067,
-0.02143864333629608,
-0.03663302958011627,
0.018739936873316765,
0.006283349823206663,
-0.065764881670475,
-0.06396298855543137,
0.004654163960367441,
-0.11995808035135269,
-0.015308267436921597,
0.040676459670066833,
0.03697235882282257,
-0.11819471418857574,
-0.039077818393707275,
0.023923685774207115,
-0.05842122063040733,
0.04170285537838936,
0.020033854991197586,
0.022301802411675453,
0.07387049496173859,
-0.22993840277194977,
0.03734644502401352,
0.04624750837683678,
-0.015247435308992863,
0.04098597913980484,
-0.07016398757696152,
-0.025579217821359634,
-0.0087916050106287,
0.09010666608810425,
0.023126758635044098,
0.07763025909662247,
-0.11051125824451447,
-0.006486293859779835,
-0.028577812016010284,
-0.08867339044809341,
-0.04318387433886528,
0.0318220816552639,
0.06111147254705429,
0.022659774869680405,
0.16117233037948608,
-0.1094108447432518,
0.020843982696533203,
-0.21862538158893585,
-0.009453714825212955,
-0.0025638050865381956,
-0.07560411840677261,
-0.08091426640748978,
-0.05051550641655922,
0.07197856903076172,
-0.03413749858736992,
0.11938240379095078,
0.010659079067409039,
0.04462562873959541,
0.018185801804065704,
-0.0640144869685173,
0.011605416424572468,
0.02348555251955986,
0.21574708819389343,
0.008245817385613918,
-0.036112718284130096,
0.025446247309446335,
0.045667074620723724,
0.08606486022472382,
0.13716943562030792,
0.19075001776218414,
0.1666029840707779,
-0.012118486687541008,
0.091514952480793,
0.033371880650520325,
-0.06290069967508316,
-0.06739019602537155,
0.08872896432876587,
-0.028248002752661705,
0.06378953158855438,
-0.042755790054798126,
0.23691841959953308,
0.0790594220161438,
-0.17935356497764587,
0.03975050896406174,
-0.04256387799978256,
-0.10417026281356812,
-0.11563267558813095,
-0.06140581890940666,
-0.07892853021621704,
-0.18500731885433197,
0.00498255854472518,
-0.11536519229412079,
0.02770383656024933,
0.0970509722828865,
0.019368503242731094,
0.0023155619855970144,
0.1718807816505432,
0.03663000836968422,
0.026545248925685883,
0.0767136737704277,
0.002784373937174678,
-0.02127990685403347,
-0.08424541354179382,
-0.09484188258647919,
0.021856777369976044,
-0.0678722932934761,
0.03168077766895294,
-0.050375182181596756,
-0.058215558528900146,
0.04237544536590576,
-0.018919724971055984,
-0.102309450507164,
0.013697318732738495,
0.027758896350860596,
0.07597602158784866,
0.08302252739667892,
0.0380115807056427,
-0.003056412097066641,
-0.020777255296707153,
0.222819522023201,
-0.06383133679628372,
-0.04510669410228729,
-0.09711090475320816,
0.24665866792201996,
0.014696604572236538,
0.011036102660000324,
0.004822360817342997,
-0.0998920276761055,
0.057993125170469284,
0.19495145976543427,
0.1505342572927475,
-0.12363342195749283,
0.0029481835663318634,
-0.02406689152121544,
-0.010866360738873482,
-0.05688510090112686,
0.10290157049894333,
0.10612968355417252,
-0.005643353797495365,
-0.10324325412511826,
-0.04770297557115555,
-0.046986933797597885,
-0.0074813030660152435,
-0.04887907952070236,
0.03522861748933792,
0.050046224147081375,
0.008362522348761559,
-0.06725092232227325,
0.09796269237995148,
-0.053826164454221725,
-0.07906518876552582,
0.07898762822151184,
-0.17083488404750824,
-0.15104630589485168,
-0.026011204347014427,
0.0885014757514,
0.003842563135549426,
0.05146954208612442,
-0.05917515978217125,
0.012108643539249897,
0.07643647491931915,
-0.04372018948197365,
-0.06631258875131607,
-0.1222451850771904,
0.0655534639954567,
-0.09681244939565659,
0.18630219995975494,
-0.02658008225262165,
0.04915887489914894,
0.12331817299127579,
0.04419625550508499,
-0.047867435961961746,
0.10569219291210175,
0.04297434538602829,
-0.09058956056833267,
0.008238565176725388,
0.10392860323190689,
-0.046633969992399216,
0.09759284555912018,
0.05540664121508598,
-0.10924242436885834,
0.04326613247394562,
-0.06892146915197372,
-0.07772291451692581,
-0.06630777567625046,
-0.018340621143579483,
-0.07746104151010513,
0.13214486837387085,
0.18343131244182587,
-0.018974289298057556,
0.025247836485505104,
-0.04565145820379257,
0.026019904762506485,
0.04398452490568161,
0.09679954499006271,
-0.029777342453598976,
-0.23264305293560028,
0.026063933968544006,
0.08311080187559128,
-0.02024906314909458,
-0.2986432611942291,
-0.07960019260644913,
0.011424430646002293,
-0.04915672168135643,
-0.08657913655042648,
0.10197938978672028,
0.08688362687826157,
0.05684289336204529,
-0.05350521206855774,
-0.1727522760629654,
-0.04896949604153633,
0.1685127168893814,
-0.09066347777843475,
-0.06756370514631271
] |
null | null | null |
# Lora of matsukaze/松風/松风 (Azur Lane)
## What Is This?
This is the LoRA model of waifu matsukaze/松風/松风 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* Dataset used for training is the `stage3-p480-800` in [CyberHarem/matsukaze_azurlane](https://huggingface.co/datasets/CyberHarem/matsukaze_azurlane), which contains 47 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 800 steps; 40 checkpoints were saved and evaluated.
* **Trigger word is `matsukaze_azurlane`.**
* Pruned core tags for this waifu are `animal_ears, yellow_eyes, black_hair, fox_ears, long_hair, brown_hair, ponytail, tail, multicolored_hair, hair_between_eyes, hair_ornament, bangs, brown_eyes`. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like a classic LoRA**. The LoRA files we provide are bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 140, you need to download [`140/matsukaze_azurlane.pt`](https://huggingface.co/CyberHarem/matsukaze_azurlane/resolve/main/140/matsukaze_azurlane.pt) as the embedding and [`140/matsukaze_azurlane.safetensors`](https://huggingface.co/CyberHarem/matsukaze_azurlane/resolve/main/140/matsukaze_azurlane.safetensors) for loading Lora. By using both files together, you can generate images for the desired characters.
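For illustration only (this phrasing is not from the original card), a typical A1111 prompt would then combine the trigger word with the LoRA via the standard extra-networks syntax, e.g. `matsukaze_azurlane, <lora:matsukaze_azurlane:0.8>, best quality` — the 0.8 weight and the extra quality tag are arbitrary example choices, and the names assume you kept the default file names from the downloads above.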
## Which Step Should I Use?
We selected 5 good steps for you to choose from. The best one is step 140.
1520 images (1.62 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the preview of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_0_2 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:---------------------------------------------------------------------------------------------------------|:---------------------------------------------|:---------------------------------------------|:---------------------------------------------|:-------------------------------------------|:-------------------------------------------|:-------------------------------------------|:---------------------------------------------|:---------------------------------------------|:-----------------------------------------|:-----------------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-------------------------------|:-----------------------------------|:-------------------------------|:---------------------------------|:---------------------------------------|:---------------------------------------|:---------------------------------------|:-----------------------------|:---------------------------------|:---------------------------------|:-------------------------------|:-----------------------------------------------|:---------------------------------|:---------------------------------|:-----------------------------|:-------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------|:-----------------------------------------|:-----------------------------------------|:-------------------------------------|:-------------------------------------|
| 140 | 12 | **0.960** | **0.893** | **0.854** | **0.735** | [Download](https://huggingface.co/CyberHarem/matsukaze_azurlane/resolve/main/140/matsukaze_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 520 | 45 | 0.946 | 0.842 | 0.846 | 0.713 | [Download](https://huggingface.co/CyberHarem/matsukaze_azurlane/resolve/main/520/matsukaze_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 480 | 41 | 0.943 | 0.842 | 0.844 | 0.709 | [Download](https://huggingface.co/CyberHarem/matsukaze_azurlane/resolve/main/480/matsukaze_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 320 | 28 | 0.941 | 0.869 | 0.843 | 0.706 | [Download](https://huggingface.co/CyberHarem/matsukaze_azurlane/resolve/main/320/matsukaze_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 260 | 23 | 0.938 | 0.835 | 0.839 | 0.697 | [Download](https://huggingface.co/CyberHarem/matsukaze_azurlane/resolve/main/260/matsukaze_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, to whom we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download the files, via the following links:
* [Steps From 620 to 800](all/0.md)
* [Steps From 420 to 600](all/1.md)
* [Steps From 220 to 400](all/2.md)
* [Steps From 20 to 200](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/matsukaze_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/matsukaze_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/matsukaze_azurlane",
"license:mit",
"region:us"
] | 2024-02-15T07:00:20+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/matsukaze_azurlane #license-mit #region-us
| Lora of matsukaze/松風/松风 (Azur Lane)
===================================
What Is This?
-------------
This is the LoRA model of waifu matsukaze/松風/松风 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* Dataset used for training is the 'stage3-p480-800' in CyberHarem/matsukaze\_azurlane, which contains 47 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 800 steps, 40 checkpoints were saved and evaluated.
* Trigger word is 'matsukaze\_azurlane'.
* Pruned core tags for this waifu are 'animal\_ears, yellow\_eyes, black\_hair, fox\_ears, long\_hair, brown\_hair, ponytail, tail, multicolored\_hair, hair\_between\_eyes, hair\_ornament, bangs, brown\_eyes'. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like a classic LoRA. The LoRA we provide is bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 140, you need to download '140/matsukaze\_azurlane.pt' as the embedding and '140/matsukaze\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose. The best one is step 140.
1520 images (1.62 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the previews of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, to whom we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We uploaded the files for all steps. You can check the images and metrics, and download the files, via the following links:
* Steps From 620 to 800
* Steps From 420 to 600
* Steps From 220 to 400
* Steps From 20 to 200
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 140, you need to download '140/matsukaze\\_azurlane.pt' as the embedding and '140/matsukaze\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 140.\n\n\n1520 images (1.62 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 620 to 800\n* Steps From 420 to 600\n* Steps From 220 to 400\n* Steps From 20 to 200"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/matsukaze_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 140, you need to download '140/matsukaze\\_azurlane.pt' as the embedding and '140/matsukaze\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 140.\n\n\n1520 images (1.62 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 620 to 800\n* Steps From 420 to 600\n* Steps From 220 to 400\n* Steps From 20 to 200"
] | [
45,
38,
467
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/matsukaze_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.013386927545070648,
-0.02720283344388008,
-0.0038376478478312492,
0.06964344531297684,
0.08400978147983551,
0.07949457317590714,
0.23539738357067108,
0.07692112028598785,
0.12471944093704224,
-0.06795642524957657,
0.088332399725914,
0.05282527580857277,
-0.005793762393295765,
0.0465380996465683,
-0.02394755370914936,
-0.15512384474277496,
-0.05928372964262962,
-0.030830208212137222,
0.018486402928829193,
0.01707657054066658,
0.08575329929590225,
0.0034101714845746756,
0.1025686264038086,
-0.04543961584568024,
-0.05141787230968475,
0.0516214519739151,
-0.03792653977870941,
-0.0464775487780571,
0.025490256026387215,
0.07458741217851639,
0.1178346648812294,
0.005383300129324198,
0.0664607360959053,
-0.15375393629074097,
0.06344512104988098,
-0.012808538973331451,
-0.10903459042310715,
-0.012638685293495655,
0.028825782239437103,
-0.042033977806568146,
0.14678560197353363,
0.010342658497393131,
-0.11017881333827972,
0.04166826605796814,
-0.12491339445114136,
-0.02538471110165119,
-0.04757336527109146,
0.04488665610551834,
0.14810767769813538,
0.06494028121232986,
0.014946041628718376,
0.06203891336917877,
-0.05132472887635231,
0.08306791633367538,
0.11967010796070099,
-0.14849470555782318,
-0.0694136917591095,
0.11650285124778748,
0.018424242734909058,
0.11887438595294952,
-0.09726410359144211,
0.09569112211465836,
0.07100578397512436,
-0.05158769711852074,
-0.14184817671775818,
-0.09419409185647964,
-0.18121500313282013,
-0.015656689181923866,
0.006493149325251579,
0.024571426212787628,
0.4025522768497467,
0.06204288825392723,
0.03928755968809128,
0.04664769768714905,
-0.058813225477933884,
0.03879431635141373,
-0.1041751503944397,
0.1453630030155182,
0.03847503289580345,
0.09101776778697968,
-0.0461779460310936,
-0.11173851042985916,
-0.12120212614536285,
-0.06807387620210648,
-0.07570728659629822,
-0.010375846177339554,
0.022437550127506256,
0.11143860965967178,
-0.1947125643491745,
0.0026804120279848576,
-0.050637952983379364,
-0.12959544360637665,
0.0251690112054348,
-0.10011515021324158,
0.160032257437706,
0.0685892328619957,
-0.019059909507632256,
-0.016245387494564056,
0.25110089778900146,
0.1229357123374939,
0.21165479719638824,
0.056298255920410156,
-0.09162325412034988,
0.12734147906303406,
0.04710140451788902,
-0.07255277782678604,
-0.007681744638830423,
-0.11105575412511826,
0.13497796654701233,
-0.05206603184342384,
0.1110764890909195,
-0.05968247354030609,
-0.11614637076854706,
0.017254937440156937,
-0.10330037772655487,
0.06646133214235306,
0.03625386580824852,
0.014471936039626598,
-0.04427212476730347,
0.04645224288105965,
0.03154730424284935,
-0.04298745468258858,
-0.01254555955529213,
-0.01156079862266779,
-0.054593782871961594,
0.05727725103497505,
0.09631379693746567,
0.034178152680397034,
0.06339256465435028,
0.005850860383361578,
-0.017805466428399086,
0.0032466098200529814,
-0.045006297528743744,
0.003912834450602531,
0.04516078531742096,
0.02102326974272728,
0.08850418776273727,
-0.159627765417099,
-0.0695493146777153,
-0.014197980053722858,
0.05665803700685501,
0.011632985435426235,
0.10165981948375702,
-0.013899014331400394,
0.050731342285871506,
0.009677186608314514,
-0.018372196704149246,
0.046574532985687256,
-0.099915511906147,
0.08187288790941238,
-0.01610354334115982,
0.09120575338602066,
-0.1873406618833542,
-0.003477870486676693,
-0.04721749946475029,
0.014886134304106236,
0.06649670004844666,
-0.01313917525112629,
-0.11277754604816437,
0.1282862275838852,
-0.01177047286182642,
0.07378992438316345,
-0.09342607110738754,
0.0435199998319149,
0.027466628700494766,
0.07878704369068146,
-0.09200495481491089,
0.0046496307477355,
0.10497444868087769,
-0.14236700534820557,
-0.1587103009223938,
0.08927176147699356,
-0.022220700979232788,
0.04153118282556534,
0.03810850903391838,
0.17214946448802948,
0.1566394716501236,
-0.19042108952999115,
-0.026124384254217148,
0.05791959911584854,
-0.02736932598054409,
-0.08300407230854034,
-0.00865284912288189,
0.10576433688402176,
0.0334591306746006,
0.04323297366499901,
-0.02676943503320217,
0.12329604476690292,
-0.02861846424639225,
-0.08546140044927597,
-0.033865734934806824,
-0.08096946030855179,
-0.05711492896080017,
0.05051473528146744,
-0.0025576024781912565,
-0.05527306720614433,
0.012855809181928635,
-0.13379719853401184,
0.16811469197273254,
0.01913384534418583,
0.01995936408638954,
-0.07760409265756607,
0.1189284399151802,
0.01182956900447607,
0.003930701874196529,
0.008040150627493858,
-0.05812642723321915,
-0.10917256772518158,
0.23743870854377747,
0.08730915188789368,
0.09086471050977707,
0.06212012469768524,
-0.05563340336084366,
-0.07029561698436737,
0.01987975835800171,
0.0036606767680495977,
-0.03378647193312645,
0.02413984015583992,
-0.10001927614212036,
0.052419066429138184,
-0.02002062276005745,
0.043959133327007294,
-0.007884892635047436,
-0.029176827520132065,
0.07354635745286942,
0.020618101581931114,
-0.019285351037979126,
0.08786146342754364,
0.061185531318187714,
-0.023637905716896057,
-0.06592078506946564,
0.00384042807854712,
0.07728572934865952,
-0.007006660103797913,
-0.08835818618535995,
0.030690571293234825,
-0.00618363730609417,
0.03359843045473099,
0.20587441325187683,
-0.2183924913406372,
0.036746054887771606,
0.01415029726922512,
0.04902450367808342,
0.03792310133576393,
0.007937269285321236,
-0.032182179391384125,
0.02262282371520996,
-0.019805772230029106,
0.06720566749572754,
-0.016126599162817,
0.06251978874206543,
-0.02513725683093071,
-0.14422455430030823,
-0.022324517369270325,
-0.023704934865236282,
0.16336525976657867,
-0.17795401811599731,
0.06540475785732269,
0.19900038838386536,
-0.11801163107156754,
0.1520964503288269,
-0.004630113020539284,
-0.011815124191343784,
0.007041825447231531,
0.033673934638500214,
-0.006439460441470146,
0.10616050660610199,
-0.08567771315574646,
-0.03291121870279312,
0.023196425288915634,
-0.08859389275312424,
0.03877562656998634,
-0.1196385845541954,
-0.11192859709262848,
-0.0760294571518898,
-0.03159228712320328,
-0.03157106786966324,
0.036988936364650726,
-0.05162898451089859,
0.07131370157003403,
-0.08929956704378128,
-0.07711431384086609,
-0.02371363528072834,
-0.0835786834359169,
0.02022334560751915,
0.005478669889271259,
-0.05851242318749428,
-0.1340646892786026,
-0.11626799404621124,
-0.08995307236909866,
-0.1543535739183426,
-0.009770741686224937,
0.06297071278095245,
-0.1127713993191719,
-0.04385025054216385,
0.009945465251803398,
-0.05451156944036484,
0.08413708209991455,
-0.07882863283157349,
0.02062569372355938,
0.06407653540372849,
-0.04707730561494827,
-0.16105175018310547,
-0.0020166707690805197,
-0.0716901570558548,
-0.06623093038797379,
0.1609640270471573,
-0.15448510646820068,
0.18125519156455994,
-0.028890807181596756,
0.057844653725624084,
0.06150202453136444,
0.02662183716893196,
0.12300872057676315,
-0.11898303031921387,
0.08098834753036499,
0.1842946857213974,
0.042319029569625854,
0.07973872870206833,
0.12185217440128326,
0.07769422233104706,
-0.10991024225950241,
0.0419759526848793,
0.07850415259599686,
-0.10211965441703796,
-0.08813539147377014,
-0.058705996721982956,
-0.11525082588195801,
-0.06247026473283768,
0.047538090497255325,
0.05814831331372261,
0.03829103335738182,
0.12460885941982269,
-0.048790741711854935,
-0.01755189150571823,
0.09595775604248047,
0.05099773034453392,
0.08477772027254105,
0.015303337946534157,
0.061565060168504715,
-0.15497015416622162,
-0.045241452753543854,
0.16058577597141266,
0.20702368021011353,
0.23743826150894165,
0.027179397642612457,
0.061677057296037674,
0.1250884234905243,
0.08149804919958115,
0.09457897394895554,
0.04567209631204605,
0.003084275871515274,
0.013270416297018528,
-0.07372960448265076,
-0.054063472896814346,
0.022994646802544594,
0.0119364894926548,
-0.03929523006081581,
-0.15597926080226898,
0.10046258568763733,
-0.006887626368552446,
0.08861448615789413,
0.14180472493171692,
0.036107633262872696,
-0.09095637500286102,
0.1554243266582489,
0.09601742029190063,
0.08819139748811722,
-0.05769069120287895,
0.12324810773134232,
0.05120712146162987,
-0.017038172110915184,
0.17070303857326508,
0.036225177347660065,
0.14637787640094757,
-0.01872497983276844,
-0.07683240622282028,
-0.08950141072273254,
-0.052883997559547424,
0.008714096620678902,
0.03669213131070137,
-0.22654728591442108,
0.11381591111421585,
0.05953211709856987,
0.01379980705678463,
-0.010448189452290535,
-0.057804785668849945,
0.18540838360786438,
0.14831236004829407,
0.07717745006084442,
0.027731752023100853,
-0.06720656156539917,
-0.02773880958557129,
-0.07484651356935501,
0.05190461128950119,
0.02714027836918831,
0.07588303834199905,
-0.03660536929965019,
-0.09992986172437668,
-0.02473355457186699,
-0.0047734747640788555,
0.030770419165492058,
-0.07854984700679779,
-0.1096002608537674,
-0.04553002491593361,
0.2500379681587219,
-0.08165101706981659,
0.051495350897312164,
0.05558675527572632,
0.01748465746641159,
-0.02268679440021515,
0.043596766889095306,
-0.031821589916944504,
-0.016907811164855957,
-0.059535808861255646,
0.008230732753872871,
0.002875515026971698,
-0.03857220336794853,
-0.05749375745654106,
-0.02407522313296795,
-0.10776346921920776,
-0.10122714936733246,
0.001511639915406704,
-0.04269832745194435,
0.008288269862532616,
-0.012239094823598862,
0.02111956477165222,
-0.09634412825107574,
-0.03018491342663765,
0.031148405745625496,
0.040868375450372696,
-0.081175796687603,
-0.13632474839687347,
-0.0026111521292477846,
0.004103110637515783,
-0.06726670265197754,
0.02479119412600994,
-0.11607223749160767,
-0.11920899152755737,
-0.057961124926805496,
-0.04306826740503311,
0.13258256018161774,
0.24692992866039276,
-0.025505736470222473,
0.002597060287371278,
0.14685384929180145,
-0.09320887178182602,
-0.3157961964607239,
-0.15630683302879333,
-0.16115961968898773,
-0.09727471321821213,
0.02999083139002323,
-0.07352576404809952,
0.02684212289750576,
0.07725102454423904,
-0.04131092131137848,
0.19538573920726776,
-0.19883427023887634,
-0.10544109344482422,
0.0832412987947464,
0.09044068306684494,
0.3070005774497986,
-0.24037887156009674,
0.016251705586910248,
-0.11907393485307693,
-0.028814060613512993,
0.010533010587096214,
-0.08284599334001541,
0.12300972640514374,
0.04181570187211037,
0.0786546915769577,
-0.0024808773305267096,
-0.009387879632413387,
0.1464996039867401,
-0.06466567516326904,
0.13071633875370026,
-0.12224774062633514,
-0.10052882879972458,
0.21332652866840363,
-0.03396547958254814,
0.014608791097998619,
-0.20874159038066864,
-0.0377560593187809,
-0.0423271618783474,
0.03182389959692955,
-0.006535637192428112,
0.06156760826706886,
-0.0034181519877165556,
-0.0198537427932024,
-0.13988658785820007,
-0.010340259410440922,
-0.03857458010315895,
0.05659046396613121,
0.21658223867416382,
-0.06700046360492706,
-0.0689702183008194,
0.036351460963487625,
-0.015264848247170448,
0.10408883541822433,
0.025704722851514816,
-0.05957026779651642,
-0.050361260771751404,
0.0909285843372345,
-0.2037736028432846,
0.04678884148597717,
0.00415491359308362,
-0.0027069360949099064,
0.011216835118830204,
0.0030813792254775763,
0.010011257603764534,
0.12357421219348907,
0.18114113807678223,
0.002268566284328699,
-0.03216997906565666,
-0.018479717895388603,
0.02204570546746254,
0.12841224670410156,
-0.02372051402926445,
0.10725574195384979,
0.023282120004296303,
0.03704488277435303,
0.008129332214593887,
0.04982764273881912,
-0.08056946098804474,
-0.09297828376293182,
0.09954220056533813,
-0.04787523299455643,
-0.07857108116149902,
0.08917948603630066,
0.050583139061927795,
0.08408571779727936,
0.009324015118181705,
0.044289905577898026,
0.01662207953631878,
-0.12347489595413208,
0.017275698482990265,
0.200795978307724,
-0.06516963243484497,
-0.062096308916807175,
-0.06502190977334976,
0.014853178523480892,
-0.12689630687236786,
0.08259688317775726,
0.04136370122432709,
-0.03392846882343292,
0.11937613040208817,
-0.04445454105734825,
-0.03055289387702942,
0.003575011156499386,
-0.04957782104611397,
0.029984071850776672,
-0.1421220749616623,
-0.18342657387256622,
0.0582328736782074,
-0.0035228233318775892,
-0.06783567368984222,
-0.09443618357181549,
-0.0868898555636406,
0.06757398694753647,
-0.14204402267932892,
0.14081743359565735,
-0.08450091630220413,
0.05501491576433182,
-0.041916374117136,
-0.050872549414634705,
-0.11216775327920914,
-0.020229987800121307,
-0.05354521796107292,
-0.023439651355147362,
0.056588687002658844,
0.018311018124222755,
-0.12875136733055115,
-0.11632183939218521,
0.0632157027721405,
-0.004664926324039698,
-0.005963998846709728,
0.021192288026213646,
-0.06274774670600891,
0.03306831791996956,
-0.236707404255867,
-0.06616731733083725,
0.08849964290857315,
0.03944731131196022,
-0.09428492188453674,
0.12736254930496216,
0.045461054891347885,
-0.022152457386255264,
0.048653244972229004,
0.011457940563559532,
0.175214946269989,
-0.08229222893714905,
0.023697633296251297,
-0.1265970915555954,
-0.1688593327999115,
-0.026612041518092155,
0.035830628126859665,
0.23257674276828766,
0.08692673593759537,
0.12449383735656738,
-0.05393344908952713,
0.025955531746149063,
-0.009429149329662323,
0.07225634902715683,
0.00804140791296959,
-0.11007130146026611,
-0.03953329846262932,
-0.17290274798870087,
-0.06946521252393723,
-0.06833280622959137,
0.16753217577934265,
0.03600728139281273,
-0.14763005077838898,
0.00039751792792230844,
0.12113570421934128,
-0.1716925948858261,
-0.010781441815197468,
0.17937706410884857,
-0.03966435045003891,
0.02904607728123665,
-0.16193968057632446,
0.025201940909028053,
0.07473938912153244,
-0.03076810948550701,
-0.0043221693485975266,
0.12848687171936035,
-0.005279185250401497,
0.006293507292866707,
0.04148079827427864,
-0.02998870238661766,
0.07994424551725388,
-0.06390182673931122,
0.048909660428762436,
0.0016942613292485476,
-0.05545550957322121,
-0.09093949943780899,
0.19789612293243408,
-0.01951746456325054,
0.009322135709226131,
-0.05042179673910141,
0.0031706485897302628,
-0.0960976630449295,
-0.09976016730070114,
-0.07476341724395752,
-0.11840127408504486,
0.0780918151140213,
-0.04988226294517517,
0.011829206719994545,
-0.0047614965587854385,
0.017118602991104126,
-0.07538773864507675,
0.020261716097593307,
-0.1850755214691162,
-0.04876171424984932,
0.021641122177243233,
-0.02020629309117794,
-0.03599979355931282,
-0.04780019447207451,
-0.04037712514400482,
0.01928722858428955,
-0.06062301993370056,
-0.06665197759866714,
0.05945594981312752,
0.07603567838668823,
0.05797874554991722,
-0.16559764742851257,
-0.10322433710098267,
-0.07255655527114868,
0.03991987556219101,
0.08355753123760223,
0.16550791263580322,
0.0399351492524147,
-0.010559742338955402,
0.03557205572724342,
0.13565278053283691,
0.021469440311193466,
-0.09042651951313019,
-0.0604572556912899,
-0.13546700775623322,
-0.13540761172771454,
-0.022480489686131477,
-0.06770550459623337,
-0.023917751386761665,
0.021680738776922226,
0.22884035110473633,
0.19976848363876343,
-0.14352329075336456,
0.04341214522719383,
-0.08195342868566513,
0.040582433342933655,
-0.03983806446194649,
0.16545899212360382,
0.03844171389937401,
0.1522519737482071,
-0.027516081929206848,
-0.031964171677827835,
-0.07229088991880417,
0.015621219761669636,
-0.1011577695608139,
0.03179416060447693,
-0.0018084070179611444,
-0.07561365514993668,
-0.060796286910772324,
0.10601655393838882,
-0.12173210829496384,
0.08201780915260315,
0.16295580565929413,
-0.14114871621131897,
-0.020558204501867294,
-0.03094344586133957,
0.03900658339262009,
0.11777114868164062,
0.021071448922157288,
-0.07656388729810715,
-0.012343994341790676,
-0.006231005769222975,
0.03211856633424759,
-0.17016643285751343,
-0.10955069214105606,
0.0018083843169733882,
-0.10942631214857101,
0.13840943574905396,
-0.006504702847450972,
0.0008887552539817989,
0.03421813249588013,
-0.06454434990882874,
-0.0024013356305658817,
0.18343867361545563,
0.019762296229600906,
-0.02514735795557499,
-0.018247291445732117,
-0.05962580814957619,
-0.10910868644714355,
0.07227542251348495,
0.09248591214418411,
0.06353702396154404,
0.007806904148310423,
0.14782090485095978,
-0.013516569510102272,
-0.040661778301000595,
0.14910058677196503,
-0.18057627975940704,
0.09309937804937363,
0.0013537958730012178,
-0.01914140023291111,
-0.07876081764698029,
-0.04003143683075905,
0.047944024205207825,
0.0762704536318779,
-0.17576003074645996,
-0.052091263234615326,
0.06068268418312073,
-0.0972239300608635,
0.06374560296535492,
0.03905697166919708,
-0.09571431577205658,
0.008132262155413628,
-0.12717846035957336,
0.002808287274092436,
-0.10214989632368088,
0.043498363345861435,
0.19370023906230927,
-0.035692520439624786,
0.013164643198251724,
-0.1485345959663391,
0.05479064956307411,
-0.035792917013168335,
-0.04393197223544121,
-0.07630790770053864
] |
null | null | diffusers | # FurSho
<Gallery />
## Download model
Weights for this model are available in Safetensors format.
[Download](/Akimitsujiro/FurSho/tree/main) them in the Files & versions tab.
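As a hedged illustration only (the card gives no usage instructions), the sketch below shows one way this LoRA might be applied on its stated base model with `diffusers`. The weight filename and prompt are placeholders; substitute the actual `.safetensors` file from this repository.

```python
# Hedged sketch only: the weight filename, prompt, and settings are assumptions.
import torch
from diffusers import StableDiffusionXLPipeline

# Base model listed in this card's metadata.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "cagliostrolab/animagine-xl-3.0", torch_dtype=torch.float16
).to("cuda")

# Replace weight_name with the actual .safetensors file from the Files & versions tab.
pipe.load_lora_weights("Akimitsujiro/FurSho", weight_name="FurSho.safetensors")

image = pipe(
    "masterpiece, best quality, 1girl",  # example prompt only
    num_inference_steps=28,
    guidance_scale=7.0,
).images[0]
image.save("fursho_sample.png")
```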
| {"tags": ["text-to-image", "stable-diffusion", "lora", "diffusers", "template:sd-lora"], "widget": [{"text": "-", "output": {"url": "images/1000079717.webp"}}], "base_model": "cagliostrolab/animagine-xl-3.0"} | text-to-image | Akimitsujiro/FurSho | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"base_model:cagliostrolab/animagine-xl-3.0",
"region:us"
] | 2024-02-15T07:00:54+00:00 | [] | [] | TAGS
#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #region-us
| # FurSho
<Gallery />
## Download model
Weights for this model are available in Safetensors format.
Download them in the Files & versions tab.
| [
"# FurSho\n\n<Gallery />",
"## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
"TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #region-us \n",
"# FurSho\n\n<Gallery />",
"## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
51,
8,
28
] | [
"passage: TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-cagliostrolab/animagine-xl-3.0 #region-us \n# FurSho\n\n<Gallery />## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab."
] | [
-0.12385847419500351,
0.0766042098402977,
0.0007902913494035602,
0.053417567163705826,
0.1449761837720871,
0.044889092445373535,
0.16689260303974152,
0.020726947113871574,
0.10821358859539032,
0.08039796352386475,
0.00904275942593813,
0.03703676164150238,
0.017804693430662155,
0.25021952390670776,
-0.07891611754894257,
-0.24089400470256805,
0.03091476485133171,
-0.01490290742367506,
-0.05638384446501732,
0.0014032755279913545,
0.028249341994524002,
-0.1053362488746643,
0.08260399103164673,
-0.09620841592550278,
0.0061814566142857075,
0.017061900347471237,
-0.004903910215944052,
-0.04304983466863632,
-0.013493455946445465,
0.06683721393346786,
0.06327115744352341,
0.10949931293725967,
0.11215636879205704,
-0.09529010206460953,
0.05798392370343208,
0.00010067301627714187,
-0.04779409244656563,
0.05732380971312523,
0.026814257726073265,
-0.10250597447156906,
0.10588487237691879,
-0.03378714993596077,
-0.02976396307349205,
-0.007833635434508324,
-0.017048850655555725,
-0.014748913235962391,
-0.0043561020866036415,
-0.05667081102728844,
0.063988097012043,
0.007316621020436287,
0.023156823590397835,
0.06447937339544296,
-0.0005441703251563013,
0.05428050458431244,
0.23286840319633484,
-0.1876089721918106,
-0.0744609385728836,
0.24106983840465546,
0.027528198435902596,
0.22239315509796143,
-0.06076684221625328,
0.09535352140665054,
0.12145212292671204,
-0.05418666452169418,
0.03279430791735649,
-0.04855719208717346,
0.06839198619127274,
-0.05347839742898941,
-0.07924404740333557,
0.0877339094877243,
0.2826252281665802,
0.05054778605699539,
-0.020387567579746246,
-0.08520792424678802,
-0.0907561182975769,
0.08386760205030441,
-0.12488418072462082,
0.06951898336410522,
0.029048416763544083,
-0.012962552718818188,
-0.014516949653625488,
-0.10220137983560562,
-0.0803152471780777,
-0.08028516918420792,
-0.08049744367599487,
0.2395610213279724,
-0.019072871655225754,
0.06665373593568802,
-0.013218024745583534,
0.08230630308389664,
-0.16949552297592163,
-0.13468505442142487,
0.07069902122020721,
-0.10961093008518219,
0.05663115158677101,
0.051733359694480896,
-0.021729005500674248,
-0.12453582882881165,
0.13495750725269318,
0.04417509213089943,
-0.001507418928667903,
-0.020467037335038185,
-0.04676163196563721,
0.09532081335783005,
0.038805022835731506,
-0.015442210249602795,
-0.041765861213207245,
-0.19104407727718353,
0.0771869346499443,
0.09869858622550964,
0.1214275136590004,
-0.034194815903902054,
-0.14523227512836456,
-0.00852937065064907,
-0.1481349915266037,
-0.003093795385211706,
0.009705913253128529,
-0.024014513939619064,
-0.07208984345197678,
-0.0645919069647789,
0.2740132212638855,
0.002029637573286891,
-0.05756231024861336,
-0.006709015928208828,
-0.0341225229203701,
0.1827452927827835,
0.09489724040031433,
0.028276216238737106,
0.13921256363391876,
0.039212990552186966,
-0.06694789975881577,
-0.04027640447020531,
-0.027545828372240067,
-0.09250691533088684,
-0.01964215189218521,
-0.08066229522228241,
0.042429301887750626,
-0.15038062632083893,
-0.21416307985782623,
0.015705902129411697,
0.005019601434469223,
-0.03341279923915863,
0.06364315003156662,
-0.03665376827120781,
0.010069741867482662,
-0.0027651709970086813,
0.015193435363471508,
-0.022397231310606003,
-0.08188354223966599,
0.05736705660820007,
0.0007529206923209131,
0.15032997727394104,
-0.13653288781642914,
-0.020340075716376305,
-0.07153362035751343,
0.021411804482340813,
-0.23019686341285706,
0.03066818043589592,
-0.10167670249938965,
0.08531329780817032,
-0.041477128863334656,
-0.06461361050605774,
-0.16174821555614471,
0.011789367534220219,
-0.0010419149184599519,
0.17709733545780182,
-0.1639239639043808,
-0.038088053464889526,
0.1076740175485611,
-0.1946256458759308,
-0.08527068793773651,
0.0517251119017601,
0.012768086977303028,
0.04597902297973633,
0.06397747248411179,
0.14934153854846954,
0.1031235083937645,
-0.22264638543128967,
0.010112094692885876,
0.07580834627151489,
-0.0015556864673271775,
-0.10042169690132141,
0.11068377643823624,
0.0655660629272461,
-0.052050553262233734,
0.04438154771924019,
-0.1410854011774063,
0.10487007349729538,
-0.06996073573827744,
-0.002387477084994316,
-0.01700461655855179,
-0.1834074705839157,
0.06312108784914017,
0.0507112517952919,
0.0519215352833271,
-0.022031385451555252,
0.037113092839717865,
0.0180477574467659,
0.11844412237405777,
-0.09451983124017715,
-0.04272174462676048,
0.035486821085214615,
0.18258629739284515,
-0.1423407644033432,
-0.02588939107954502,
-0.06888003647327423,
-0.13501127064228058,
0.0292925164103508,
0.18523772060871124,
-0.017517326399683952,
0.04301562160253525,
0.10259008407592773,
0.11285806447267532,
-0.09099765121936798,
-0.0008586198091506958,
0.10715614259243011,
-0.005667407996952534,
-0.022428225725889206,
-0.11455284059047699,
0.013006696477532387,
-0.09113716334104538,
0.12828147411346436,
-0.22924084961414337,
0.02061202935874462,
-0.023037003353238106,
0.07790303975343704,
0.08616134524345398,
0.01949063502252102,
0.06837259978055954,
-0.0606999434530735,
-0.0540747344493866,
-0.027625426650047302,
0.015884244814515114,
0.0029630414210259914,
-0.12328136712312698,
0.09011459350585938,
-0.11600992828607559,
0.19157540798187256,
0.16291439533233643,
0.059252265840768814,
0.006304046604782343,
-0.21972554922103882,
0.0016147163696587086,
0.01179532427340746,
-0.09552447497844696,
-0.04323258996009827,
-0.14126348495483398,
-0.024953214451670647,
0.08019280433654785,
-0.06398700177669525,
0.1271461546421051,
0.06556539237499237,
-0.031072264537215233,
-0.01693609170615673,
0.08130363374948502,
0.10270002484321594,
-0.035170868039131165,
0.026550769805908203,
0.1761571615934372,
-0.0684724822640419,
0.13643550872802734,
-0.004605020396411419,
-0.11878246068954468,
0.055771857500076294,
0.01306088175624609,
0.04454951733350754,
0.1507483571767807,
0.07225454598665237,
0.028921954333782196,
0.042520664632320404,
-0.05860130861401558,
0.018499428406357765,
-0.08097749948501587,
-0.06303426623344421,
0.010515090078115463,
-0.0467078797519207,
0.025755353271961212,
0.0604417584836483,
-0.07667626440525055,
0.0764099508523941,
-0.11620743572711945,
-0.048622701317071915,
-0.019846586510539055,
-0.03342261537909508,
-0.042083773761987686,
0.0822809487581253,
-0.03809882327914238,
-0.08716411143541336,
-0.09656832367181778,
0.05287130922079086,
-0.038483913987874985,
0.04647053778171539,
0.019382329657673836,
-0.02598957158625126,
-0.09376107901334763,
-0.10808565467596054,
0.0013802817557007074,
0.16605424880981445,
0.025277029722929,
-0.006247803568840027,
0.002413514768704772,
-0.05092194676399231,
-0.08196204900741577,
-0.004021977540105581,
-0.06850586086511612,
0.005813582334667444,
0.07008232921361923,
-0.1701791137456894,
0.15957581996917725,
0.09808243066072464,
0.024220770224928856,
0.018391726538538933,
0.00237291376106441,
0.11992782354354858,
-0.05174034461379051,
0.06916750222444534,
0.24537239968776703,
0.11445479840040207,
0.031151896342635155,
0.06885439157485962,
0.057912327349185944,
-0.04729263857007027,
0.08648862689733505,
-0.03248840942978859,
-0.1341746598482132,
-0.05114733800292015,
-0.1400497704744339,
-0.04862092807888985,
0.030446872115135193,
0.07413802295923233,
0.021681450307369232,
0.00004088068817509338,
0.17219765484333038,
-0.006475508213043213,
-0.07733548432588577,
0.07351363450288773,
0.02463144063949585,
-0.054342348128557205,
-0.0026193102821707726,
0.11599507182836533,
-0.08017763495445251,
-0.007950150407850742,
0.1894376575946808,
0.0037305213045328856,
0.18132562935352325,
-0.0679444819688797,
0.008443856611847878,
-0.026566121727228165,
0.07309127599000931,
0.12447656691074371,
0.14952726662158966,
0.012678056955337524,
-0.05092410370707512,
-0.04373838007450104,
-0.12003803253173828,
0.05333372950553894,
0.06554370373487473,
-0.02693207934498787,
-0.012766287662088871,
-0.023190299049019814,
0.04564887657761574,
0.028932100161910057,
-0.054804492741823196,
0.06891165673732758,
-0.30802667140960693,
0.02037811651825905,
0.07653577625751495,
0.14853891730308533,
-0.005812488030642271,
0.05488630384206772,
0.1840921938419342,
-0.014604296535253525,
0.06167251244187355,
-0.03986305743455887,
0.07797025889158249,
0.041462529450654984,
-0.057794079184532166,
-0.04354371130466461,
0.10970950871706009,
-0.025681104511022568,
-0.022810272872447968,
-0.01118429098278284,
0.09044866263866425,
0.008599520660936832,
0.004867938347160816,
-0.009756948798894882,
-0.0708315297961235,
0.12067177891731262,
0.15309713780879974,
0.11777456849813461,
-0.018204210326075554,
0.07978889346122742,
-0.07068324834108353,
-0.19476065039634705,
0.018408825621008873,
0.04584158957004547,
-0.07215742766857147,
0.02504173293709755,
0.019185364246368408,
-0.06130224093794823,
0.02741060219705105,
-0.05422499030828476,
-0.10289818048477173,
-0.10467942804098129,
-0.03089580126106739,
0.12520983815193176,
-0.03694211319088936,
-0.031353093683719635,
-0.1286928802728653,
-0.17306935787200928,
0.036131419241428375,
0.11450566351413727,
-0.09704724699258804,
-0.09217266738414764,
-0.021238824352622032,
0.14994333684444427,
-0.01628526858985424,
0.07215669751167297,
-0.02249901369214058,
0.05517984554171562,
-0.04460188001394272,
-0.10618459433317184,
0.038151081651449203,
-0.10752860456705093,
-0.08242212980985641,
-0.036882054060697556,
0.07392942160367966,
-0.028487171977758408,
0.012146761640906334,
0.00759132532402873,
0.009199202992022038,
0.04873329773545265,
-0.11725860089063644,
-0.060124464333057404,
0.1218327209353447,
0.058867331594228745,
0.1261177659034729,
-0.04205437749624252,
-0.10852167010307312,
0.005048064980655909,
0.02765742316842079,
0.003158445004373789,
0.22603988647460938,
-0.06799223273992538,
-0.018436582759022713,
0.0769161731004715,
0.017741749063134193,
-0.22521372139453888,
0.014925588853657246,
-0.023840975016355515,
-0.0046324776485562325,
0.1231422871351242,
-0.029304174706339836,
0.1911662518978119,
0.13119296729564667,
-0.04547940567135811,
0.2294992357492447,
-0.32102760672569275,
-0.09867431968450546,
0.012288262136280537,
0.17211492359638214,
0.25403231382369995,
-0.20959462225437164,
-0.028595061972737312,
-0.06507018208503723,
-0.17828315496444702,
0.10203138738870621,
-0.20177799463272095,
0.04750993102788925,
0.028184568509459496,
-0.06953472644090652,
0.020891882479190826,
-0.05161435529589653,
0.1640864908695221,
-0.0772521048784256,
0.03878159448504448,
-0.034049104899168015,
-0.0073228804394602776,
0.12449219077825546,
-0.02584660053253174,
0.06212107464671135,
-0.16395628452301025,
0.06468012183904648,
-0.004834728315472603,
-0.05465675890445709,
0.03741072490811348,
0.01116821076720953,
0.026515530422329903,
-0.0462135374546051,
-0.033512815833091736,
0.014167550019919872,
0.022044293582439423,
0.05397307872772217,
0.11124931275844574,
-0.036846522241830826,
-0.019802678376436234,
0.13970401883125305,
0.0018800460966303945,
0.046743959188461304,
-0.01651918515563011,
-0.09164818376302719,
-0.06664149463176727,
0.05695806443691254,
-0.1734836995601654,
-0.01101449504494667,
0.11035317927598953,
0.03161061182618141,
0.04142856225371361,
0.018896598368883133,
0.07481670379638672,
0.09150341153144836,
0.14253295958042145,
-0.10386431962251663,
-0.09803629666566849,
-0.055597275495529175,
-0.011147266253829002,
-0.0013773852260783315,
0.08119704574346542,
0.10637437552213669,
-0.036726467311382294,
0.04143238812685013,
-0.029974348843097687,
0.012267365120351315,
-0.009018384851515293,
0.12727810442447662,
0.10980536788702011,
-0.029577316716313362,
-0.0852440744638443,
0.08414740115404129,
-0.03013525903224945,
-0.005798907484859228,
-0.12251196801662445,
0.03741702809929848,
-0.10583996772766113,
-0.04551489278674126,
-0.025866979733109474,
0.0446053072810173,
-0.14734286069869995,
-0.021116765215992928,
-0.13491126894950867,
-0.023487890139222145,
-0.08959989994764328,
0.0828704684972763,
0.10460314899682999,
-0.05437232553958893,
0.009861462749540806,
-0.014587241224944592,
-0.02616409957408905,
0.05751270800828934,
0.1103048250079155,
0.09461558610200882,
-0.16475741565227509,
-0.10526435077190399,
-0.012992313131690025,
-0.031140195205807686,
-0.11507445573806763,
-0.03139198571443558,
-0.06570961326360703,
-0.017768338322639465,
-0.09659750759601593,
0.14672040939331055,
-0.1326460987329483,
-0.01582975685596466,
-0.07166177034378052,
-0.06002269685268402,
-0.02904115803539753,
-0.00601171562448144,
-0.032947856932878494,
0.040471822023391724,
0.01819947175681591,
0.025026574730873108,
-0.05799208581447601,
-0.06278344243764877,
0.0026317911688238382,
-0.04359940439462662,
0.022570015862584114,
0.0369204580783844,
-0.024976132437586784,
0.029800761491060257,
-0.191193625330925,
0.018716217949986458,
0.11751782149076462,
0.050184037536382675,
0.0010399293387308717,
0.13642282783985138,
0.022571556270122528,
0.031083257868885994,
0.01919802837073803,
-0.022518621757626534,
-0.000632906740065664,
-0.08095448464155197,
0.06028345227241516,
-0.11432866752147675,
-0.005023793317377567,
-0.02663532830774784,
-0.007576741743832827,
0.12644033133983612,
0.09869695454835892,
0.09740722924470901,
-0.08993195742368698,
0.004671395290642977,
-0.09613728523254395,
0.011071116663515568,
0.004930065479129553,
-0.12040355801582336,
-0.0905308946967125,
0.02387109212577343,
0.019366702064871788,
-0.009703620336949825,
0.18726730346679688,
-0.0036375720519572496,
0.015839366242289543,
-0.051641203463077545,
0.045355066657066345,
0.24500735104084015,
-0.008033642545342445,
0.3406907618045807,
0.10507046431303024,
0.06785251200199127,
-0.13232465088367462,
0.12261214107275009,
0.10864005982875824,
-0.03609597310423851,
0.011531337164342403,
0.13417218625545502,
-0.1153489276766777,
0.06889936327934265,
-0.004348194692283869,
0.03322022780776024,
0.01956965960562229,
0.031804706901311874,
-0.04862631857395172,
-0.008737516589462757,
-0.041251182556152344,
0.016114475205540657,
0.12533611059188843,
-0.07774174213409424,
-0.06820916384458542,
0.0876537635922432,
-0.020477378740906715,
-0.0911230519413948,
-0.18900081515312195,
-0.09961380809545517,
-0.21102777123451233,
0.007074144668877125,
-0.08818411082029343,
-0.027697360143065453,
0.07636518776416779,
0.014817360788583755,
0.021098699420690536,
0.113338902592659,
-0.09905879199504852,
-0.022364305332303047,
0.07151875644922256,
-0.012827781029045582,
-0.010109561495482922,
0.059735558927059174,
-0.10228128731250763,
0.0699242427945137,
-0.02440890111029148,
-0.03852156177163124,
0.0036953191738575697,
0.04752730950713158,
0.051404327154159546,
0.023360172286629677,
-0.09065048396587372,
-0.06906510144472122,
0.017697995528578758,
-0.03561810404062271,
0.16173072159290314,
0.074434295296669,
-0.001994459191337228,
-0.004579123575240374,
0.17769460380077362,
-0.041227374225854874,
-0.06053667515516281,
-0.11512739956378937,
0.015410522930324078,
-0.08074933290481567,
0.06473560631275177,
-0.0387294664978981,
-0.13352060317993164,
0.001198062440380454,
0.19629058241844177,
0.23710890114307404,
-0.1290702223777771,
0.03552442416548729,
-0.028398554772138596,
-0.008367951028048992,
0.003909665625542402,
0.03133254870772362,
0.0015993536217138171,
0.17345762252807617,
-0.02312777377665043,
-0.1023857519030571,
-0.07571328431367874,
-0.005070926155894995,
-0.07414115220308304,
-0.13435140252113342,
-0.02630753628909588,
-0.04231520742177963,
-0.08361818641424179,
0.1116269901394844,
-0.05748683586716652,
-0.008191178552806377,
0.03662210330367088,
-0.10944326221942902,
-0.03236200660467148,
-0.11568916589021683,
-0.013700653798878193,
0.10861273109912872,
-0.00883752852678299,
-0.1392277181148529,
-0.001419675420038402,
-0.051809363067150116,
-0.009280716069042683,
-0.09269163757562637,
-0.05593078210949898,
-0.06137734279036522,
-0.0626896321773529,
0.1242966577410698,
-0.03742920979857445,
0.009681379422545433,
-0.007837457582354546,
-0.010148810222744942,
-0.02794325351715088,
0.10269378870725632,
0.007467519957572222,
-0.12720000743865967,
-0.005830382462590933,
0.040693920105695724,
-0.10644414275884628,
0.10861935466527939,
0.010287569835782051,
-0.08743535727262497,
0.01001899503171444,
0.08253658562898636,
-0.09815046936273575,
-0.10221243649721146,
0.013562577776610851,
-0.1146145686507225,
0.11666300147771835,
0.0368388406932354,
-0.000935611198656261,
-0.034946948289871216,
0.01992642693221569,
0.08874279260635376,
0.0655096173286438,
-0.045711223036050797,
0.04478520527482033,
-0.09247695654630661,
-0.0923958420753479,
0.03352619707584381,
0.013743367046117783,
-0.18381185829639435,
0.007303543854504824,
-0.19836297631263733,
0.023526236414909363,
-0.005169142968952656,
0.017178773880004883,
0.2225179821252823,
-0.006149116437882185,
-0.011515671387314796,
-0.21864478290081024,
0.02423912100493908,
0.07946382462978363,
-0.11323942244052887,
-0.1039784699678421
] |
null | null | null | This model was converted to gguf format from daekeun-ml/phi-2-ko-v0.1. | {"license": "cc-by-sa-3.0"} | null | changok/phi-2-ko-v0.1-gguf | [
"gguf",
"license:cc-by-sa-3.0",
"region:us"
] | 2024-02-15T07:01:27+00:00 | [] | [] | TAGS
#gguf #license-cc-by-sa-3.0 #region-us
| This model was converted to gguf format from daekeun-ml/phi-2-ko-v0.1. | [] | [
"TAGS\n#gguf #license-cc-by-sa-3.0 #region-us \n"
] | [
20
] | [
"passage: TAGS\n#gguf #license-cc-by-sa-3.0 #region-us \n"
] | [
-0.0007700848509557545,
0.009697864763438702,
-0.006551571190357208,
-0.00969603005796671,
0.008047436363995075,
0.05594426020979881,
0.17804472148418427,
0.001738075166940689,
0.17765402793884277,
-0.04426407814025879,
0.15514712035655975,
0.06496353447437286,
0.034679289907217026,
0.0644635260105133,
0.00902540609240532,
-0.08746431022882462,
0.07436653226613998,
-0.03594089671969414,
-0.00972598697990179,
0.014168228022754192,
0.03821675479412079,
0.018273962661623955,
0.007366461679339409,
-0.03877338767051697,
-0.14467470347881317,
-0.02019619755446911,
0.08043427020311356,
-0.039114098995923996,
0.0648907944560051,
0.058926571160554886,
0.03911150246858597,
0.11396243423223495,
-0.036547038704156876,
-0.16285063326358795,
0.021297017112374306,
-0.07978416234254837,
-0.1660817414522171,
0.02813168801367283,
0.022754952311515808,
0.07552699744701385,
0.12169899791479111,
0.14369870722293854,
-0.11916446685791016,
0.06250054389238358,
-0.20387256145477295,
-0.13469800353050232,
-0.13527584075927734,
0.06146489083766937,
0.027860304340720177,
0.028957687318325043,
0.07177362591028214,
0.027930989861488342,
-0.19467228651046753,
-0.040617965161800385,
0.05369352176785469,
-0.3312735855579376,
0.03405322507023811,
0.3271135091781616,
-0.004000644665211439,
0.08075297623872757,
-0.044094499200582504,
0.14310841262340546,
0.04711473360657692,
-0.009125867858529091,
-0.07327696681022644,
-0.0801963284611702,
-0.015592626295983791,
0.14989756047725677,
-0.03017389215528965,
-0.0785786584019661,
0.28128501772880554,
0.026761123910546303,
-0.076514333486557,
0.07129929214715958,
0.02942510135471821,
0.016807328909635544,
-0.004387585446238518,
0.08407056331634521,
0.05395376309752464,
0.1737309843301773,
0.09531787037849426,
-0.06594083458185196,
-0.14439816772937775,
-0.1239498034119606,
-0.20662176609039307,
0.12085460871458054,
-0.04053746908903122,
0.12084117531776428,
-0.09954322129487991,
0.012183468788862228,
-0.2296762615442276,
0.0014001531526446342,
-0.11057707667350769,
-0.06054414436221123,
0.08910593390464783,
0.039299800992012024,
-0.06508926302194595,
0.157795712351799,
0.14740294218063354,
0.14023005962371826,
-0.12046608328819275,
-0.002976141171529889,
-0.057743534445762634,
0.16322533786296844,
-0.02391940914094448,
0.004660416394472122,
0.03069104254245758,
0.2358216792345047,
0.06988880783319473,
-0.14385643601417542,
0.02740306966006756,
-0.026975469663739204,
-0.1511678397655487,
-0.015162855386734009,
-0.22009825706481934,
0.12604455649852753,
-0.04659144580364227,
-0.0803997591137886,
-0.055732134729623795,
0.07110879570245743,
0.20757587254047394,
0.03155345842242241,
-0.026349885389208794,
0.021232381463050842,
0.04068875312805176,
-0.11317646503448486,
-0.037826307117938995,
0.04794050380587578,
0.12967337667942047,
0.09311303496360779,
-0.13652296364307404,
-0.009766235947608948,
0.04311386123299599,
0.08264521509408951,
0.11846792697906494,
-0.03890921175479889,
0.03226502612233162,
-0.1322207897901535,
-0.11758273839950562,
0.038610100746154785,
-0.006549044977873564,
-0.012556839734315872,
0.029290931299328804,
0.1395150125026703,
0.03499891236424446,
-0.020232273265719414,
-0.07518018782138824,
-0.07447938621044159,
-0.08798712491989136,
0.10140179842710495,
-0.04096008464694023,
-0.023667549714446068,
-0.26074859499931335,
-0.026448646560311317,
-0.06731270998716354,
0.042543042451143265,
-0.06163138151168823,
-0.044012825936079025,
-0.1524728387594223,
0.10746222734451294,
-0.02453800104558468,
0.007870165631175041,
-0.10993871092796326,
0.016788780689239502,
-0.06791788339614868,
0.11543654650449753,
-0.05778978392481804,
-0.051421597599983215,
0.1701514571905136,
-0.16070207953453064,
-0.10942874103784561,
0.03140641003847122,
0.061289072036743164,
-0.08855128288269043,
0.02432307042181492,
0.30958056449890137,
-0.0651014968752861,
-0.11309808492660522,
0.04071018099784851,
0.20226018130779266,
-0.060154568403959274,
-0.21066443622112274,
0.17667533457279205,
-0.1788433939218521,
-0.20695511996746063,
0.010602272115647793,
-0.18775177001953125,
0.13254587352275848,
-0.0053656031377613544,
-0.0635855495929718,
0.00780601566657424,
-0.033085454255342484,
-0.042736705392599106,
-0.02276868186891079,
0.07030799239873886,
-0.022940056398510933,
0.0658326968550682,
-0.1309145838022232,
0.019006630405783653,
0.0940321609377861,
0.02371963858604431,
-0.08074698597192764,
0.0983794704079628,
0.0018107456853613257,
0.017634011805057526,
0.009991690516471863,
-0.07416889816522598,
0.02949490211904049,
-0.009084563702344894,
0.10276395082473755,
0.06555159389972687,
0.030940553173422813,
-0.011457554996013641,
-0.002469760598614812,
0.06541533023118973,
-0.02050379477441311,
0.0036516760010272264,
0.06127466261386871,
-0.0682436004281044,
0.09272307902574539,
0.02487223595380783,
0.06432729959487915,
-0.040307044982910156,
-0.03106631338596344,
0.2223721146583557,
-0.107884980738163,
-0.06463851779699326,
-0.0063223461620509624,
0.02738475613296032,
-0.014220036566257477,
0.08433395624160767,
0.03015238791704178,
0.11553467065095901,
0.043618764728307724,
-0.13213957846164703,
0.25205618143081665,
0.009864564053714275,
0.21337920427322388,
0.15368497371673584,
-0.007184917107224464,
0.01897692121565342,
-0.10350217670202255,
0.013206916861236095,
0.023633873090147972,
0.04068606719374657,
0.0008379233186133206,
0.08650100231170654,
-0.05582459643483162,
0.015273402445018291,
-0.043145377188920975,
0.06298404932022095,
0.011564482003450394,
-0.057225193828344345,
-0.08236382156610489,
0.016007788479328156,
0.20618820190429688,
-0.124642014503479,
0.15163184702396393,
0.354807585477829,
0.09043475240468979,
0.11462227255105972,
-0.11130896955728531,
-0.006505927070975304,
-0.08181685209274292,
0.05717938020825386,
-0.002918352372944355,
0.17463678121566772,
-0.0423070527613163,
0.0007169600576162338,
0.04060779884457588,
0.033955659717321396,
0.07257070392370224,
-0.18196964263916016,
-0.16849903762340546,
-0.03431541845202446,
-0.09818164259195328,
-0.17749230563640594,
0.07523902505636215,
-0.1445445567369461,
0.02271212451159954,
0.03446560725569725,
-0.06557516008615494,
0.16731858253479004,
-0.013263781554996967,
-0.08947334438562393,
0.11690288782119751,
-0.16698028147220612,
-0.10397785902023315,
-0.1499418020248413,
-0.07240868359804153,
-0.05443732067942619,
0.05218072235584259,
0.046389054507017136,
-0.07586198300123215,
-0.059830501675605774,
0.04016020894050598,
-0.08184046298265457,
-0.10333117842674255,
-0.01816747523844242,
0.06100945547223091,
0.007320871576666832,
-0.02238325960934162,
-0.07625823467969894,
-0.06807080656290054,
-0.03912292420864105,
-0.09773695468902588,
0.08469358831644058,
-0.061450812965631485,
0.11581259965896606,
0.11330179125070572,
0.07762962579727173,
0.07214251160621643,
-0.04994143918156624,
0.1443222314119339,
-0.040524035692214966,
-0.15563668310642242,
0.1622788906097412,
0.033644016832113266,
0.007672982290387154,
0.09706190973520279,
0.11700452119112015,
-0.11058702319860458,
-0.05338157340884209,
-0.09873080998659134,
-0.1633751392364502,
-0.1728254109621048,
-0.05354972556233406,
-0.09898076206445694,
0.0992104783654213,
-0.02670331299304962,
0.1303616762161255,
0.09208273887634277,
0.06680154800415039,
0.061758607625961304,
-0.011830511502921581,
0.04204186424612999,
-0.013499190099537373,
0.13005562126636505,
-0.02857551909983158,
-0.03179716691374779,
-0.10550358146429062,
0.0971907377243042,
0.182448610663414,
0.13305984437465668,
0.09933293610811234,
0.2505817711353302,
0.10973197221755981,
0.16120411455631256,
0.08356232196092606,
0.14692775905132294,
-0.0050190649926662445,
0.013241936452686787,
-0.04941802844405174,
-0.022833168506622314,
-0.05834232643246651,
-0.0006725969724357128,
0.0312780886888504,
-0.008553426712751389,
-0.24348388612270355,
0.05886209011077881,
-0.2624271810054779,
0.05402198061347008,
-0.034524280577898026,
0.05313645303249359,
-0.01570812426507473,
0.07907221466302872,
0.06143688037991524,
0.13551387190818787,
0.020545413717627525,
0.11738487333059311,
-0.014385928399860859,
-0.059462565928697586,
0.044120509177446365,
0.030232960358262062,
0.02456749975681305,
0.03438256308436394,
0.03272028639912605,
-0.03609780594706535,
-0.12321091443300247,
0.03410102799534798,
0.08456593751907349,
-0.24216361343860626,
0.2175341248512268,
0.03493085503578186,
-0.0768032968044281,
-0.007594410330057144,
-0.021684370934963226,
0.0484347827732563,
0.18269002437591553,
0.16656892001628876,
0.0853193998336792,
-0.17771081626415253,
-0.10378468781709671,
-0.039515625685453415,
0.02059190161526203,
0.07280559092760086,
-0.07055274397134781,
-0.18663619458675385,
-0.022626591846346855,
0.07147631794214249,
0.001142982393503189,
0.1197531595826149,
-0.10412182658910751,
-0.07581270486116409,
0.06728065758943558,
0.11584460735321045,
0.061212360858917236,
-0.11085298657417297,
0.08460020273923874,
-0.10920879989862442,
0.11329670995473862,
-0.19340431690216064,
0.008228148333728313,
-0.07806357741355896,
-0.13270652294158936,
0.003405734896659851,
-0.030063649639487267,
0.014898230321705341,
-0.06549739092588425,
-0.1495208591222763,
-0.13848336040973663,
-0.1969091147184372,
0.11317717283964157,
-0.07059919089078903,
0.03008798509836197,
0.0029237319249659777,
0.10870935767889023,
-0.062230538576841354,
0.03933002054691315,
-0.021195033565163612,
0.028688795864582062,
0.029277736321091652,
-0.17220012843608856,
0.12437888234853745,
-0.07128254324197769,
0.030843930318951607,
0.07679547369480133,
0.01641440950334072,
0.04151221737265587,
0.07092767208814621,
-0.1098107174038887,
0.17116324603557587,
0.35193243622779846,
-0.06616746634244919,
0.20047925412654877,
0.3209896981716156,
-0.07634276151657104,
-0.2475428581237793,
-0.12587691843509674,
-0.2318907529115677,
-0.0858849510550499,
0.044459518045186996,
-0.21427680552005768,
-0.005194509401917458,
0.2419145256280899,
-0.1423467993736267,
0.24334537982940674,
-0.24242104589939117,
-0.029346147552132607,
0.08666101843118668,
-0.00368698313832283,
0.41644641757011414,
-0.18702517449855804,
-0.13182158768177032,
0.012846425175666809,
-0.1921416074037552,
0.10748744755983353,
-0.016739562153816223,
0.08599084615707397,
0.005839558783918619,
-0.08709678053855896,
-0.042738769203424454,
-0.040146201848983765,
0.23799987137317657,
-0.020324649289250374,
0.08385396003723145,
-0.06732401251792908,
-0.05756695568561554,
0.2067701816558838,
0.03359714522957802,
-0.028108293190598488,
-0.125055730342865,
-0.029776470735669136,
-0.009300182573497295,
0.015969350934028625,
-0.04314899072051048,
0.08205387741327286,
0.012571784667670727,
-0.08553598076105118,
-0.086797334253788,
0.017748210579156876,
-0.15269242227077484,
-0.02352292276918888,
0.19844745099544525,
-0.09201347827911377,
0.044395238161087036,
0.05011944845318794,
-0.027934163808822632,
-0.13461104035377502,
-0.06067204847931862,
-0.07887956500053406,
-0.0729023665189743,
0.06299986690282822,
-0.19787776470184326,
-0.022969970479607582,
0.048838060349226,
0.004055542405694723,
0.08299599587917328,
0.08991780132055283,
-0.04217529296875,
0.08539735525846481,
0.17997229099273682,
-0.14036326110363007,
-0.07812557369470596,
-0.00950013380497694,
-0.0010412639239802957,
0.16670788824558258,
0.03340047970414162,
0.04792344570159912,
0.03760717436671257,
0.03582676872611046,
0.011414540000259876,
0.038630809634923935,
-0.16419368982315063,
-0.03714478388428688,
0.019489634782075882,
-0.029915759339928627,
-0.1352764070034027,
0.14953060448169708,
0.035426992923021317,
0.04314475134015083,
-0.044767092913389206,
0.04736030474305153,
-0.07615260779857635,
-0.08669809252023697,
-0.26792868971824646,
-0.08619297295808792,
-0.20562483370304108,
-0.09435253590345383,
0.03214513137936592,
-0.1251870095729828,
-0.053912144154310226,
0.060784246772527695,
0.03576309606432915,
0.14303892850875854,
0.07294543832540512,
0.0006125193904154003,
0.10219082981348038,
-0.10748365521430969,
-0.26521947979927063,
0.017983941361308098,
-0.08307063579559326,
-0.12453439831733704,
0.015628404915332794,
0.06794790923595428,
-0.04671559855341911,
-0.027188381180167198,
-0.15367497503757477,
0.056045785546302795,
-0.045064907521009445,
0.007479534018784761,
-0.08656448870897293,
-0.00850484799593687,
0.04486239328980446,
-0.015059937722980976,
0.0014975132653489709,
0.03362729027867317,
-0.1215219721198082,
0.010658507235348225,
0.013608868233859539,
0.06219056248664856,
-0.04895275831222534,
-0.023384129628539085,
0.07383551448583603,
0.06877917796373367,
0.15129315853118896,
0.11845091730356216,
0.06299860030412674,
0.13879235088825226,
-0.24487273395061493,
-0.006956758443266153,
0.08701267093420029,
-0.047172609716653824,
-0.04662870243191719,
0.0364747978746891,
0.0055897715501487255,
0.0368533656001091,
-0.12210657447576523,
0.07791433483362198,
-0.03035132586956024,
-0.1340971291065216,
-0.1042039766907692,
-0.027991585433483124,
-0.07217179983854294,
-0.007200233172625303,
-0.11173557490110397,
0.22877641022205353,
0.09534735232591629,
0.033799055963754654,
0.02962321974337101,
-0.029027476906776428,
0.0256549920886755,
-0.004515738692134619,
-0.0256769061088562,
-0.10173682123422623,
-0.18991483747959137,
-0.04807035252451897,
-0.0711340680718422,
-0.004368167836219072,
0.35723617672920227,
-0.011841628700494766,
-0.19235195219516754,
0.043251145631074905,
0.12550412118434906,
0.18579161167144775,
0.001722323358990252,
0.2531207799911499,
0.0512743704020977,
0.003871186403557658,
-0.1091604232788086,
0.08924359828233719,
-0.05323107913136482,
-0.24009323120117188,
0.06749307364225388,
-0.00633244076743722,
-0.04936536028981209,
-0.008278653025627136,
0.11120717972517014,
-0.13088838756084442,
0.01913370005786419,
0.0038714956026524305,
0.00045064030564390123,
-0.013664660044014454,
-0.03736960142850876,
-0.025232620537281036,
0.19364921748638153,
-0.07055506855249405,
0.006163692567497492,
-0.00252225692383945,
-0.007799651473760605,
-0.1560903638601303,
-0.15517354011535645,
0.020449139177799225,
-0.1334521621465683,
0.1002301350235939,
-0.02106686495244503,
0.040035102516412735,
0.27627477049827576,
0.013146108947694302,
-0.026562223210930824,
-0.009272121824324131,
-0.11822500824928284,
-0.051092710345983505,
0.026129072532057762,
-0.0322076790034771,
-0.013824998401105404,
-0.10624122619628906,
-0.10432400554418564,
0.028536414727568626,
-0.17377535998821259,
0.008439146913588047,
0.015502101741731167,
0.06439683586359024,
-0.014524002559483051,
-0.10947515815496445,
-0.020589129999279976,
-0.08254910260438919,
0.09448450803756714,
-0.011553692631423473,
0.20497506856918335,
0.008876387029886246,
-0.02118419110774994,
0.07697881013154984,
0.030915990471839905,
-0.0015803264686837792,
-0.03218121454119682,
-0.02108156681060791,
0.12649615108966827,
-0.05505751073360443,
0.10001954436302185,
-0.0521758608520031,
-0.016780398786067963,
0.04199712350964546,
0.19703112542629242,
0.20398880541324615,
-0.11459182947874069,
0.02786925435066223,
-0.013508942909538746,
0.023438824340701103,
0.10643148422241211,
0.1913890242576599,
-0.00842409860342741,
0.26529476046562195,
-0.06582465022802353,
-0.03079272247850895,
-0.008903443813323975,
0.04580523446202278,
-0.06758272647857666,
0.04770633578300476,
0.01503016334027052,
-0.05511722341179848,
-0.07719441503286362,
0.11023501306772232,
-0.11305411905050278,
0.13944394886493683,
0.16671502590179443,
-0.04241015017032623,
0.08980467170476913,
-0.03056059218943119,
-0.018700215965509415,
-0.005378247704356909,
0.0661730095744133,
-0.13570278882980347,
-0.07651478797197342,
-0.1357082575559616,
0.03553564473986626,
-0.30721235275268555,
-0.0987430289387703,
0.06129671633243561,
0.1868092566728592,
0.18355417251586914,
-0.005010142922401428,
0.12599842250347137,
0.009567514061927795,
0.05428631976246834,
-0.09137740731239319,
0.17351768910884857,
-0.004629292991012335,
-0.10226863622665405,
-0.15565012395381927,
-0.16775424778461456,
-0.019524062052369118,
0.007426881697028875,
0.010032618418335915,
0.08338627219200134,
0.05983962118625641,
0.13706181943416595,
-0.06249342858791351,
-0.014441467821598053,
-0.03554921597242355,
-0.13935025036334991,
0.07396825402975082,
-0.05043525993824005,
0.010333872400224209,
-0.10591236501932144,
-0.06775392591953278,
0.005171519238501787,
0.08686131983995438,
-0.13459426164627075,
-0.03339244797825813,
0.12292363494634628,
0.028347894549369812,
0.20804612338542938,
-0.00279408716596663,
-0.03636467829346657,
0.014604859985411167,
-0.06324740499258041,
0.1406550407409668,
-0.07900936901569366,
0.04158153757452965,
0.16743142902851105,
-0.01431555300951004,
0.007826578803360462,
-0.18861456215381622,
0.0396672822535038,
-0.05219857767224312,
-0.017660850659012794,
-0.057784680277109146
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Small tr Beta - tgrhn
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Common Voice 11.0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2629
- Wer: 165.9327
## Model description
More information needed
## Intended uses & limitations
More information needed
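A minimal inference sketch, assuming the fine-tuned checkpoint is published at `tgrhn/whisper-small-turkish-cv16` (the repository id in this card's metadata) and that a 16 kHz audio file is available locally; the file path and generation settings are illustrative, not part of the original card.

```python
# Illustrative usage only; not an official snippet from the model author.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="tgrhn/whisper-small-turkish-cv16",  # repository id taken from this card's metadata
)

# Whisper expects 16 kHz audio; "sample.wav" is a placeholder path.
result = asr(
    "sample.wav",
    generate_kwargs={"language": "turkish", "task": "transcribe"},
)
print(result["text"])
```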
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 1000
- mixed_precision_training: Native AMP
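As a rough illustration, the list above maps onto `Seq2SeqTrainingArguments` roughly as follows; this is a reconstruction for readability, not the original training script, the output directory name is a placeholder, and the Adam betas/epsilon listed above are the Trainer defaults.

```python
# Illustrative mapping of the hyperparameters above; not the original training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-tr",   # placeholder name
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,   # total train batch size: 128
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=1000,
    fp16=True,                       # "Native AMP" mixed precision
)
```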
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.0233 | 5.14 | 1000 | 0.2629 | 165.9327 |
### Framework versions
- Transformers 4.37.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2
| {"language": ["tr"], "license": "apache-2.0", "tags": ["whisper-event", "generated_from_trainer"], "datasets": ["mozilla-foundation/common_voice_11_0"], "metrics": ["wer"], "base_model": "openai/whisper-small", "model-index": [{"name": "Whisper Small tr Beta - tgrhn", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 11.0", "type": "mozilla-foundation/common_voice_11_0", "config": "tr", "split": "test", "args": "tr"}, "metrics": [{"type": "wer", "value": 165.93265614565394, "name": "Wer"}]}]}]} | automatic-speech-recognition | tgrhn/whisper-small-turkish-cv16 | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"whisper-event",
"generated_from_trainer",
"tr",
"dataset:mozilla-foundation/common_voice_11_0",
"base_model:openai/whisper-small",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-15T07:08:35+00:00 | [] | [
"tr"
] | TAGS
#transformers #pytorch #tensorboard #safetensors #whisper #automatic-speech-recognition #whisper-event #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us
| Whisper Small tr Beta - tgrhn
=============================
This model is a fine-tuned version of openai/whisper-small on the Common Voice 11.0 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2629
* Wer: 165.9327
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 64
* eval\_batch\_size: 32
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* training\_steps: 1000
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.37.2
* Pytorch 2.2.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 1000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
"TAGS\n#transformers #pytorch #tensorboard #safetensors #whisper #automatic-speech-recognition #whisper-event #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 1000\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
104,
158,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #safetensors #whisper #automatic-speech-recognition #whisper-event #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 1000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2"
] | [
-0.1257021278142929,
0.14095734059810638,
-0.004959451034665108,
0.07058548927307129,
0.09222663938999176,
0.020834149792790413,
0.12315312772989273,
0.1628658026456833,
-0.04900912195444107,
0.10520748794078827,
0.07880094647407532,
0.07118409126996994,
0.08277484774589539,
0.16314482688903809,
-0.016055813059210777,
-0.3173292279243469,
0.011403183452785015,
-0.03570970892906189,
-0.11313175410032272,
0.10763303935527802,
0.07779829949140549,
-0.09979165345430374,
0.040921032428741455,
0.002333519048988819,
-0.043072398751974106,
-0.019727714359760284,
-0.03631584346294403,
-0.0547366589307785,
0.09380167722702026,
0.020476507022976875,
0.041202664375305176,
0.02940499223768711,
0.10205477476119995,
-0.23679347336292267,
0.005932425148785114,
0.05979544296860695,
0.034459929913282394,
0.06511414051055908,
0.10754018276929855,
-0.019067035987973213,
0.07876434922218323,
-0.09383200854063034,
0.05682852119207382,
0.05171951279044151,
-0.09391506761312485,
-0.2810356616973877,
-0.06018809601664543,
0.05024564638733864,
0.13561037182807922,
0.061828967183828354,
-0.019324859604239464,
0.04994094744324684,
-0.03529111295938492,
0.09362209588289261,
0.21522577106952667,
-0.2527739405632019,
-0.08038699626922607,
-0.022279193624854088,
0.044358670711517334,
0.054475024342536926,
-0.10759205371141434,
-0.011708163656294346,
-0.002334154676645994,
0.017715882509946823,
0.10885826498270035,
-0.0053166295401751995,
0.02316933684051037,
-0.01101131271570921,
-0.13409626483917236,
-0.04529302194714546,
0.1294175237417221,
0.059423770755529404,
-0.023655833676457405,
-0.11969845741987228,
-0.050232645124197006,
-0.15621410310268402,
-0.05359766259789467,
0.018697913736104965,
0.02975119836628437,
-0.03149985149502754,
-0.06738660484552383,
-0.005964499432593584,
-0.04617496579885483,
-0.0874323695898056,
0.056850615888834,
0.11996385455131531,
0.03872992470860481,
-0.018229173496365547,
0.022279050201177597,
0.0957607626914978,
0.05449235439300537,
-0.17192953824996948,
-0.017316706478595734,
0.018615946173667908,
-0.0997706726193428,
-0.003228859743103385,
0.0008150681969709694,
0.029019368812441826,
0.04974576085805893,
0.148410826921463,
-0.01733362302184105,
0.0934358611702919,
0.03316296637058258,
0.009291814640164375,
-0.08848397433757782,
0.1612873077392578,
-0.06016843393445015,
-0.10776783525943756,
-0.03675061836838722,
0.13473328948020935,
0.01686873286962509,
-0.012948393821716309,
-0.05860775336623192,
0.025536181405186653,
0.08919576555490494,
0.06396714597940445,
0.0037924230564385653,
0.04028815031051636,
-0.05444841459393501,
-0.0333264134824276,
-0.0025919622275978327,
-0.12568126618862152,
0.041047800332307816,
0.04849182441830635,
-0.07747140526771545,
-0.03385414928197861,
-0.00860475655645132,
0.0016684348229318857,
-0.03255356103181839,
0.07741092145442963,
-0.05207675322890282,
-0.01869935542345047,
-0.07297022640705109,
-0.07975789159536362,
0.019712820649147034,
-0.05301116779446602,
-0.01267329603433609,
-0.044534988701343536,
-0.1467037796974182,
-0.05313549190759659,
0.0651731863617897,
-0.06757691502571106,
-0.06134145334362984,
-0.06565611809492111,
-0.09057583659887314,
0.04468507319688797,
-0.008841881528496742,
0.1239682286977768,
-0.0565769225358963,
0.08294747769832611,
0.008522250689566135,
0.05924233794212341,
0.10368012636899948,
0.05131954699754715,
-0.04538212716579437,
0.07166960835456848,
-0.15360620617866516,
0.09442711621522903,
-0.11059217900037766,
0.06668071448802948,
-0.14004400372505188,
-0.10397061705589294,
-0.002381742699071765,
-0.007243076339364052,
0.08745352923870087,
0.13499099016189575,
-0.18018803000450134,
-0.07189905643463135,
0.18395459651947021,
-0.08212964981794357,
-0.08699353784322739,
0.12168917059898376,
-0.004966855049133301,
-0.026855088770389557,
0.02339176833629608,
0.1866104155778885,
0.15232819318771362,
-0.07877466827630997,
0.014782099053263664,
-0.023697255179286003,
0.102857306599617,
0.042021941393613815,
0.09173624962568283,
-0.03555203974246979,
0.03527066484093666,
0.012025120668113232,
-0.0441022552549839,
0.02998959831893444,
-0.07545232772827148,
-0.08969931304454803,
-0.01393266674131155,
-0.07738541066646576,
0.031002862378954887,
0.05331725627183914,
0.01004979107528925,
-0.07971176505088806,
-0.12782786786556244,
-0.0029870790895074606,
0.1120138093829155,
-0.09695281088352203,
-0.0012137715239077806,
-0.08452790975570679,
0.05822737514972687,
0.016545671969652176,
-0.001995309954509139,
-0.13962888717651367,
-0.03119528852403164,
0.04624952748417854,
-0.07008164376020432,
0.009962603449821472,
-0.03081817924976349,
0.08974232524633408,
0.0647500678896904,
-0.04085762798786163,
-0.07970795035362244,
-0.016930563375353813,
-0.011175069957971573,
-0.056980133056640625,
-0.22681282460689545,
-0.08369658142328262,
-0.02543099783360958,
0.15420496463775635,
-0.19182781875133514,
0.02520322985947132,
0.030913203954696655,
0.12161321938037872,
0.03543161228299141,
-0.04327670857310295,
0.028443563729524612,
0.04157243296504021,
-0.007791435346007347,
-0.08909951895475388,
0.03549077734351158,
0.0009941302705556154,
-0.0992826595902443,
-0.0009842715226113796,
-0.1498987227678299,
0.09686709195375443,
0.06748313456773758,
0.040563955903053284,
-0.06610027700662613,
-0.047529738396406174,
-0.0598708875477314,
-0.05543601140379906,
-0.011897324584424496,
-0.008823180571198463,
0.16315093636512756,
0.014159703627228737,
0.09830448776483536,
-0.07861963659524918,
-0.05110495537519455,
0.024211527779698372,
-0.0009610956767573953,
-0.008131294511258602,
0.1381697952747345,
0.019141070544719696,
-0.0752754658460617,
0.09700106084346771,
0.05102882534265518,
-0.05502864345908165,
0.15463633835315704,
-0.0772782638669014,
-0.07402680814266205,
-0.04433272033929825,
0.04525494948029518,
0.037048667669296265,
0.11206428706645966,
-0.143828883767128,
-0.013052277266979218,
0.02458227425813675,
0.006598493549972773,
0.010649086907505989,
-0.16786593198776245,
0.001873439527116716,
0.029140358790755272,
-0.09166070818901062,
0.018286168575286865,
-0.009695719927549362,
-0.00016756162222009152,
0.0860067754983902,
0.0051276423037052155,
-0.06573031097650528,
-0.025882873684167862,
-0.04252360388636589,
-0.0761789008975029,
0.17674946784973145,
-0.1006430983543396,
-0.14930549263954163,
-0.12666933238506317,
0.007186199072748423,
-0.015723979100584984,
-0.007645728997886181,
0.0276455320417881,
-0.06537974625825882,
-0.03607027605175972,
-0.08637536317110062,
-0.0021164496429264545,
-0.007457710802555084,
0.02628682181239128,
0.024906618520617485,
0.016965119168162346,
0.08625317364931107,
-0.09407497197389603,
0.012157713994383812,
-0.005575753282755613,
-0.052979618310928345,
0.0016388853546231985,
0.026396872475743294,
0.07964000105857849,
0.14673210680484772,
0.04482179880142212,
0.03446614369750023,
-0.027486294507980347,
0.17604568600654602,
-0.10933312773704529,
0.029997002333402634,
0.12904506921768188,
-0.002671360271051526,
0.06727534532546997,
0.1705319583415985,
0.04077429696917534,
-0.07460134476423264,
0.002358728088438511,
0.029402367770671844,
-0.023756446316838264,
-0.2068311721086502,
-0.031010346487164497,
-0.05887942016124725,
0.03451118245720863,
0.10754270106554031,
0.04262910783290863,
0.00790821947157383,
0.0346490740776062,
-0.030418699607253075,
-0.0344061441719532,
0.053217966109514236,
0.06342196464538574,
0.04685475304722786,
0.03507788106799126,
0.11076779663562775,
-0.009981049224734306,
-0.03658745065331459,
0.03036007657647133,
-0.0032870753202587366,
0.2275257259607315,
-0.02592656761407852,
0.17763304710388184,
0.03175739943981171,
0.13775911927223206,
-0.0017843228997662663,
0.05605890229344368,
-0.00004090027141501196,
0.011428730562329292,
0.01635299250483513,
-0.0626174807548523,
-0.023076418787240982,
0.04117433726787567,
0.05934860557317734,
0.04490195959806442,
-0.08639772236347198,
0.032277680933475494,
0.04472627490758896,
0.34019750356674194,
0.07772593945264816,
-0.28555747866630554,
-0.08036031574010849,
0.027935294434428215,
-0.07039240002632141,
-0.04397929832339287,
0.02431788295507431,
0.14454540610313416,
-0.08831564337015152,
0.07783135026693344,
-0.07549355924129486,
0.07701494544744492,
-0.07269641011953354,
0.009604042395949364,
0.10462024807929993,
0.10613629966974258,
0.006048326846212149,
0.0551750473678112,
-0.19942732155323029,
0.2713257670402527,
-0.0032102942932397127,
0.0658864974975586,
-0.05632137507200241,
0.0515979640185833,
0.034930258989334106,
-0.021674301475286484,
0.1013181284070015,
-0.010896454565227032,
-0.11411318182945251,
-0.16310662031173706,
-0.11208252608776093,
0.0079620610922575,
0.13439619541168213,
-0.07630433887243271,
0.11160802096128464,
-0.04173198342323303,
-0.055187441408634186,
0.02297733537852764,
-0.09307799488306046,
-0.07728417962789536,
-0.08726070076227188,
0.0387197807431221,
0.0004318362334743142,
0.04500691220164299,
-0.09748121351003647,
-0.08389478921890259,
-0.032392989844083786,
0.1501060128211975,
-0.11299970000982285,
-0.04944736510515213,
-0.13848178088665009,
0.03954533860087395,
0.1772933453321457,
-0.072721928358078,
0.04723341390490532,
0.013967927545309067,
0.12281996011734009,
0.04160521924495697,
-0.034516189247369766,
0.09909708797931671,
-0.08183841407299042,
-0.22152195870876312,
-0.049603987485170364,
0.18309733271598816,
0.01671968773007393,
0.06142285466194153,
-0.02268124744296074,
0.026871835812926292,
0.0058228266425430775,
-0.07915234565734863,
0.05258503928780556,
0.03694240376353264,
-0.0016956705367192626,
0.040722623467445374,
-0.03239605203270912,
-0.012374170124530792,
-0.06575068831443787,
-0.01364095788449049,
0.09641719609498978,
0.23813585937023163,
-0.09184347093105316,
0.0559636615216732,
0.04851216450333595,
-0.05527583882212639,
-0.17314186692237854,
-0.022627638652920723,
0.11834337562322617,
0.030252281576395035,
0.0028745057061314583,
-0.19520987570285797,
0.02876725047826767,
0.072946697473526,
-0.0339568555355072,
0.08927428722381592,
-0.34751036763191223,
-0.12999869883060455,
0.0698304995894432,
0.07664826512336731,
-0.04780363664031029,
-0.17319701611995697,
-0.07666235417127609,
0.005138334818184376,
-0.021985745057463646,
0.023841962218284607,
-0.0328635647892952,
0.12076705694198608,
-0.002189685357734561,
0.005802413448691368,
0.026913747191429138,
-0.05047564208507538,
0.14199799299240112,
-0.011022805236279964,
0.05161333084106445,
-0.034295812249183655,
0.021580953150987625,
0.03174220770597458,
-0.07255373895168304,
0.019043779000639915,
-0.10134657472372055,
0.03827552869915962,
-0.12613612413406372,
-0.023592937737703323,
-0.0717955082654953,
0.021792778745293617,
-0.041354890912771225,
-0.03480608016252518,
-0.0003787409805227071,
0.0631621703505516,
0.08153803646564484,
0.010652263648808002,
0.08348093926906586,
-0.04296455532312393,
0.12612101435661316,
0.16153164207935333,
0.11548328399658203,
0.025278355926275253,
-0.09766118228435516,
-0.005865820683538914,
0.004909226670861244,
0.026702886447310448,
-0.10979261249303818,
0.04037532955408096,
0.13690266013145447,
0.04234657809138298,
0.12472167611122131,
0.04459058493375778,
-0.06391539424657822,
-0.0016350207151845098,
0.06524401903152466,
-0.09524831175804138,
-0.16707907617092133,
-0.02343258075416088,
0.011349942535161972,
-0.15294477343559265,
0.018335852771997452,
0.11358024179935455,
-0.030526788905262947,
0.001659401343204081,
0.013449431397020817,
0.05453939735889435,
-0.013663049787282944,
0.22708101570606232,
0.01632661558687687,
0.09452533721923828,
-0.09421216696500778,
0.07660996913909912,
0.03972862660884857,
-0.0746190994977951,
0.034841012209653854,
0.1320069581270218,
-0.06193147227168083,
-0.024051323533058167,
0.03392414748668671,
0.0654270127415657,
0.07087863981723785,
-0.03366671875119209,
-0.11137556284666061,
-0.138474240899086,
0.07233802229166031,
0.10633468627929688,
0.021164966747164726,
0.02079061232507229,
-0.01965218037366867,
0.03321921452879906,
-0.07292722910642624,
0.13460229337215424,
0.09755709022283554,
0.055046454071998596,
-0.1386113166809082,
0.12375731021165848,
0.0009211389115080237,
0.004748483654111624,
-0.008696159347891808,
0.00631742924451828,
-0.11460388451814651,
0.004474041052162647,
-0.1033908948302269,
0.007008197717368603,
-0.063454769551754,
-0.0029613475780934095,
0.002204359043389559,
-0.058903228491544724,
-0.05351712927222252,
0.018753454089164734,
-0.09355384856462479,
-0.05046924203634262,
-0.029796306043863297,
0.05883586406707764,
-0.09839590638875961,
-0.02975776232779026,
0.031042786315083504,
-0.1250118762254715,
0.09865231812000275,
0.040637675672769547,
0.012886243872344494,
0.005801037885248661,
-0.0666549950838089,
0.01356973685324192,
0.02542215771973133,
0.008881640620529652,
0.025142762809991837,
-0.14991535246372223,
-0.010644682683050632,
-0.04471217468380928,
-0.004001094494014978,
-0.014022331684827805,
0.028355034068226814,
-0.12095886468887329,
0.028878090903162956,
-0.031745795160532,
-0.04799250140786171,
-0.05014127492904663,
0.04565229266881943,
0.08334099501371384,
0.001015458139590919,
0.14436931908130646,
-0.07942714542150497,
0.07344765216112137,
-0.24047783017158508,
0.0023070608731359243,
-0.0006761919939890504,
-0.07536137849092484,
-0.06265680491924286,
-0.019568370655179024,
0.09935492277145386,
-0.06285790354013443,
0.0772947445511818,
-0.020598577335476875,
0.0530962198972702,
0.016932884231209755,
-0.08907624334096909,
0.05101209878921509,
0.06691790372133255,
0.1483464241027832,
0.03836183622479439,
-0.03234119713306427,
0.0812462642788887,
-0.017183218151330948,
0.0374927781522274,
0.07301139086484909,
0.14424936473369598,
0.13855615258216858,
0.060020096600055695,
0.06196164712309837,
0.09924305230379105,
-0.11583111435174942,
-0.16385355591773987,
0.10628028959035873,
-0.04753516986966133,
0.13241267204284668,
-0.040264520794153214,
0.18454140424728394,
0.10849609971046448,
-0.19670316576957703,
0.06461628526449203,
-0.024018896743655205,
-0.09171505272388458,
-0.11033271998167038,
-0.12390491366386414,
-0.08405152708292007,
-0.1461828052997589,
0.0012078065192326903,
-0.1073678582906723,
0.05012756958603859,
0.04523071274161339,
0.027763891965150833,
0.039491768926382065,
0.11626969277858734,
0.040792033076286316,
0.020894447341561317,
0.10231298208236694,
0.03169224038720131,
-0.006645877379924059,
-0.013598267920315266,
-0.10298582911491394,
0.024273524060845375,
-0.010949007235467434,
0.04625743627548218,
-0.04056846350431442,
-0.08165444433689117,
0.04817504063248634,
-0.0017287798691540956,
-0.09881846606731415,
0.02518613636493683,
-0.02126658521592617,
0.033540382981300354,
0.057019636034965515,
0.04056377336382866,
-0.030907893553376198,
-0.01794401742517948,
0.23465946316719055,
-0.11119730770587921,
-0.07097099721431732,
-0.12523745000362396,
0.21443408727645874,
-0.02412276528775692,
-0.010180958546698093,
0.03422127664089203,
-0.06236635893583298,
-0.00987631268799305,
0.16308188438415527,
0.18114274740219116,
-0.023419618606567383,
-0.008488030172884464,
0.009278430603444576,
-0.009003719314932823,
-0.028977137058973312,
0.07849756628274918,
0.10605386644601822,
0.07391797006130219,
-0.04757744446396828,
-0.01032557338476181,
-0.003468970535323024,
-0.05984041467308998,
-0.0681496411561966,
0.07837502658367157,
0.01732957363128662,
0.006296597886830568,
-0.019692230969667435,
0.10802976787090302,
-0.08404634892940521,
-0.1227639690041542,
0.0412445142865181,
-0.18580131232738495,
-0.18732285499572754,
-0.0377892442047596,
0.053324222564697266,
0.048923395574092865,
0.055440086871385574,
0.009902584366500378,
-0.029417268931865692,
0.08122070133686066,
0.0010001113405451179,
-0.045975759625434875,
-0.09246277064085007,
0.0655767023563385,
-0.14255444705486298,
0.2000429332256317,
-0.0415126197040081,
-0.0007429320830851793,
0.11951067298650742,
0.023171430453658104,
-0.10467690974473953,
0.02492896467447281,
0.09041567891836166,
-0.13591906428337097,
0.04042193293571472,
0.1842716783285141,
-0.04248055815696716,
0.13554100692272186,
0.04056796431541443,
-0.08312149345874786,
-0.005203412380069494,
-0.043750230222940445,
-0.03166516497731209,
-0.0576251856982708,
0.00031916049192659557,
-0.04230356588959694,
0.1463632732629776,
0.21364614367485046,
-0.07081857323646545,
-0.01386431884020567,
-0.037132881581783295,
0.006135535426437855,
0.032814864069223404,
0.10172593593597412,
-0.03343212977051735,
-0.2586893141269684,
0.0009027360938489437,
-0.013270863331854343,
0.03145352751016617,
-0.1878887563943863,
-0.09225532412528992,
0.019796976819634438,
-0.03880150243639946,
-0.0731988474726677,
0.1156904399394989,
0.07335425168275833,
0.03565674275159836,
-0.041026633232831955,
-0.0825980007648468,
-0.028295326977968216,
0.181443989276886,
-0.1871238499879837,
-0.04810649901628494
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
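No snippet is provided by the author; the sketch below is a hypothetical starting point that assumes the checkpoint at `MaggieZhang/lora_distilled_bert_classification` (this card's repository id) loads directly with the `transformers` text-classification pipeline. The input sentence is a placeholder and the label set is undocumented.

```python
# Hypothetical getting-started sketch; not provided or confirmed by the model author.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="MaggieZhang/lora_distilled_bert_classification",  # repository id from this card's metadata
)

# Placeholder input; the actual labels depend on the undocumented fine-tuning data.
print(classifier("This is an example sentence."))
```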
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | text-classification | MaggieZhang/lora_distilled_bert_classification | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-15T07:14:16+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #distilbert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #distilbert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
48,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #distilbert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.07302619516849518,
0.15942901372909546,
-0.0037264563143253326,
0.025167323648929596,
0.1172078475356102,
0.008749904111027718,
0.07480382919311523,
0.10722988098859787,
-0.02045559324324131,
0.12543776631355286,
0.039410512894392014,
0.10394789278507233,
0.11057476699352264,
0.19190818071365356,
-0.002767085563391447,
-0.20705340802669525,
0.06408926099538803,
-0.1160135567188263,
0.01611061953008175,
0.12252140045166016,
0.14288343489170074,
-0.10968661308288574,
0.07077977806329727,
-0.03890860080718994,
-0.02716642990708351,
-0.03298870474100113,
-0.06216486915946007,
-0.05656079575419426,
0.06601378321647644,
0.0591726079583168,
0.0694754421710968,
0.024590101093053818,
0.08098499476909637,
-0.28943222761154175,
0.019404316321015358,
0.07737266272306442,
0.0035657489206641912,
0.06202864274382591,
0.07916183769702911,
-0.07883433997631073,
0.10480276495218277,
-0.05524575710296631,
0.15781880915164948,
0.07489447295665741,
-0.0982506051659584,
-0.18013636767864227,
-0.08504346013069153,
0.09836910665035248,
0.17099453508853912,
0.05479501187801361,
-0.03633468225598335,
0.14033685624599457,
-0.08061470836400986,
0.01687866821885109,
0.06772492825984955,
-0.0681740939617157,
-0.05254287272691727,
0.05607175827026367,
0.07204149663448334,
0.09469678997993469,
-0.13314193487167358,
-0.00799859780818224,
0.04471778869628906,
0.01626773551106453,
0.10993628203868866,
0.023736488074064255,
0.12237779796123505,
0.02965191937983036,
-0.14525189995765686,
-0.06473848968744278,
0.11550955474376678,
0.035839054733514786,
-0.060212019830942154,
-0.24547284841537476,
-0.0033295010216534138,
-0.034091219305992126,
-0.026645053178071976,
-0.04267369583249092,
0.0422113798558712,
-0.030090559273958206,
0.09037542343139648,
0.006197172217071056,
-0.06834427267313004,
-0.051925547420978546,
0.09237229824066162,
0.06009524688124657,
0.026748334988951683,
-0.027690796181559563,
0.02246314287185669,
0.12014351040124893,
0.1042904481291771,
-0.11278197169303894,
-0.06414325535297394,
-0.06594311445951462,
-0.08891893923282623,
-0.04987602308392525,
0.034507665783166885,
0.07233929634094238,
0.045586712658405304,
0.20240989327430725,
0.004374812822788954,
0.051542725414037704,
0.027702245861291885,
0.0163713525980711,
0.06719023734331131,
0.06825068593025208,
-0.05008118972182274,
-0.12756529450416565,
-0.03760921210050583,
0.11946272850036621,
0.0017542883288115263,
-0.03342404216527939,
-0.0370357446372509,
0.06106860190629959,
0.04892996326088905,
0.12214437127113342,
0.0646679550409317,
0.018874529749155045,
-0.07587674260139465,
-0.046467430889606476,
0.18032068014144897,
-0.15827614068984985,
0.02335406094789505,
0.01725550927221775,
-0.0531628392636776,
-0.033948007971048355,
0.018618647009134293,
0.009236144833266735,
-0.029387421905994415,
0.1005609855055809,
-0.06605342775583267,
-0.04097803682088852,
-0.10903192311525345,
-0.055316012352705,
0.03402786701917648,
-0.025065315887331963,
-0.02789347805082798,
-0.040971264243125916,
-0.1248450055718422,
-0.07442043721675873,
0.06903292238712311,
-0.06419821083545685,
-0.06718063354492188,
-0.040221910923719406,
-0.06102044880390167,
0.014937658794224262,
0.0008266915683634579,
0.1269739717245102,
-0.02968521974980831,
0.04848431050777435,
-0.0539008304476738,
0.06914420425891876,
0.13718481361865997,
0.03272920101881027,
-0.06809919327497482,
0.06580659747123718,
-0.21232743561267853,
0.10933514684438705,
-0.09396476298570633,
0.026792176067829132,
-0.16038449108600616,
-0.02306288480758667,
0.03069896623492241,
0.039205435663461685,
-0.01574552245438099,
0.1448679268360138,
-0.1747746467590332,
-0.03626937046647072,
0.18672682344913483,
-0.12991686165332794,
-0.09265255182981491,
0.06183605268597603,
-0.0648084431886673,
0.13347013294696808,
0.05529501289129257,
-0.01992315798997879,
0.05587787926197052,
-0.13651202619075775,
-0.023517979308962822,
-0.058770496398210526,
-0.011057188734412193,
0.15450166165828705,
0.06303975731134415,
-0.04996807500720024,
0.024645399302244186,
0.017310835421085358,
-0.024148117750883102,
-0.04886231571435928,
-0.03430904448032379,
-0.09810014069080353,
0.005970593076199293,
-0.07982048392295837,
0.025509681552648544,
-0.02279755286872387,
-0.08887400478124619,
-0.040562164038419724,
-0.15593992173671722,
0.009587006643414497,
0.0986250564455986,
0.0006499737501144409,
-0.029481856152415276,
-0.09914560616016388,
0.0014640848385170102,
0.016265012323856354,
-0.010709897615015507,
-0.1529860496520996,
-0.05147454887628555,
0.025713054463267326,
-0.16740785539150238,
0.02983911894261837,
-0.04416975751519203,
0.03472619876265526,
0.04469497501850128,
-0.047529187053442,
-0.02975785918533802,
0.015605244785547256,
0.02078833244740963,
-0.024411868304014206,
-0.25051596760749817,
-0.013653411529958248,
-0.051656268537044525,
0.17981497943401337,
-0.25592783093452454,
0.04935307428240776,
0.0690855160355568,
0.12038503587245941,
0.005616906564682722,
-0.04484110698103905,
0.038755834102630615,
-0.05312656611204147,
-0.04079194739460945,
-0.06756321340799332,
-0.004968787543475628,
-0.03330003470182419,
-0.04708937928080559,
0.040533605962991714,
-0.18370530009269714,
-0.026839453727006912,
0.11585007607936859,
0.06803574413061142,
-0.17149686813354492,
-0.07743752747774124,
-0.034665726125240326,
-0.05996506288647652,
-0.08542647957801819,
-0.056485775858163834,
0.09173574298620224,
0.04302561655640602,
0.055119626224040985,
-0.07221351563930511,
-0.0563325397670269,
0.015307560563087463,
-0.011831860989332199,
-0.032375045120716095,
0.08966241031885147,
0.07603370398283005,
-0.12257120013237,
0.10713227838277817,
0.06915293633937836,
0.06829847395420074,
0.10371299833059311,
0.006018918938934803,
-0.0951351672410965,
-0.012076831422746181,
0.028954172506928444,
0.013578351587057114,
0.14422492682933807,
-0.07140666991472244,
0.03330845758318901,
0.04359918460249901,
-0.027328653261065483,
0.009608421474695206,
-0.10246647149324417,
0.018117014318704605,
0.03343784064054489,
-0.008881162852048874,
0.017250988632440567,
-0.05481864511966705,
0.014968239702284336,
0.10633815079927444,
0.03211374580860138,
0.027500580996274948,
0.01981731504201889,
-0.040416620671749115,
-0.12751449644565582,
0.1772654801607132,
-0.09383377432823181,
-0.2552470862865448,
-0.13026653230190277,
-0.009479007683694363,
0.045126691460609436,
-0.010854403488337994,
0.019198866561055183,
-0.05917074531316757,
-0.1081017553806305,
-0.10490734130144119,
0.026286281645298004,
0.054074980318546295,
-0.08816048502922058,
-0.064018115401268,
0.05169869586825371,
0.0385097898542881,
-0.12403316795825958,
0.021811455488204956,
0.046125855296850204,
-0.07025353610515594,
0.00821257010102272,
0.052987806499004364,
0.08472178876399994,
0.1826072335243225,
0.007897963747382164,
-0.016298603266477585,
0.008750800043344498,
0.2144501805305481,
-0.1484457403421402,
0.092045359313488,
0.14109621942043304,
-0.06516804546117783,
0.08377774804830551,
0.20131921768188477,
0.030504774302244186,
-0.09844772517681122,
0.03905881568789482,
0.03513709455728531,
-0.0375148244202137,
-0.24395905435085297,
-0.0748228207230568,
0.0031239830423146486,
-0.06623414903879166,
0.10724245756864548,
0.08736731112003326,
0.1171678826212883,
0.05268942564725876,
-0.11185546219348907,
-0.06449731439352036,
0.05344700068235397,
0.12066427618265152,
-0.028124094009399414,
0.0008641352178528905,
0.09650425612926483,
-0.02977217361330986,
0.02383269928395748,
0.09186029434204102,
0.018334977328777313,
0.1854310929775238,
0.04487955570220947,
0.1315774768590927,
0.08984522521495819,
0.06165572628378868,
0.01767764426767826,
0.01994951255619526,
0.022676948457956314,
0.028990833088755608,
-0.022242991253733635,
-0.0817873626947403,
-0.00921230111271143,
0.14159180223941803,
0.026489878073334694,
0.03602421656250954,
0.001440341817215085,
-0.04777481406927109,
0.07105493545532227,
0.16661210358142853,
0.012482628226280212,
-0.22979335486888885,
-0.06520283222198486,
0.07564391940832138,
-0.07074891030788422,
-0.11627703160047531,
-0.013096708804368973,
0.024812309071421623,
-0.18332423269748688,
0.04349841922521591,
-0.024669349193572998,
0.1018587276339531,
-0.11199972778558731,
-0.02344847284257412,
0.035318560898303986,
0.06107853353023529,
-0.035138774663209915,
0.07848566025495529,
-0.20783106982707977,
0.1402515470981598,
0.007242240011692047,
0.06469187885522842,
-0.10684854537248611,
0.08134520798921585,
0.020340995863080025,
0.006346969865262508,
0.1665121465921402,
-0.005634299945086241,
-0.072713203728199,
-0.09345488250255585,
-0.07864519953727722,
-0.017188850790262222,
0.0979963019490242,
-0.11784757673740387,
0.09015297889709473,
-0.007544329855591059,
-0.03196582943201065,
-0.0007019630284048617,
-0.12950846552848816,
-0.13376227021217346,
-0.18478168547153473,
0.04834262654185295,
-0.12510578334331512,
0.041554566472768784,
-0.10858581960201263,
-0.060765668749809265,
-0.041379012167453766,
0.19413886964321136,
-0.20414148271083832,
-0.08119912445545197,
-0.14911502599716187,
-0.0672706589102745,
0.11254695802927017,
-0.03948867693543434,
0.08191721886396408,
0.008871423080563545,
0.2073923498392105,
-0.004810879472643137,
0.0006135239964351058,
0.09140623360872269,
-0.09588538110256195,
-0.2094263732433319,
-0.0959051325917244,
0.13635295629501343,
0.13115985691547394,
0.04470321163535118,
0.00023247375793289393,
0.02411508932709694,
-0.0018883526790887117,
-0.11162916570901871,
0.03426937386393547,
0.15202432870864868,
0.10249507427215576,
0.044034719467163086,
-0.0260105412453413,
-0.13932733237743378,
-0.1056612879037857,
-0.054744839668273926,
0.013206261210143566,
0.1903214454650879,
-0.0706305131316185,
0.1657869964838028,
0.1536196768283844,
-0.06531279534101486,
-0.21233291923999786,
0.03679078444838524,
0.030905993655323982,
-0.00751135777682066,
0.04347773641347885,
-0.2047269195318222,
0.07352772355079651,
0.01412410382181406,
-0.05716951563954353,
0.1305869072675705,
-0.17576472461223602,
-0.14771407842636108,
0.09065452963113785,
0.07857703417539597,
-0.2075619101524353,
-0.12917637825012207,
-0.0950717106461525,
-0.05231890827417374,
-0.10034287720918655,
0.09251669049263,
-0.0036216825246810913,
0.005252200644463301,
0.036232154816389084,
0.01758572831749916,
0.01728934422135353,
-0.05098523199558258,
0.19524237513542175,
-0.00017524124996270984,
0.05021730437874794,
-0.07728931307792664,
-0.07839185744524002,
0.03842216357588768,
-0.06752927601337433,
0.08417709171772003,
-0.02161126770079136,
0.0039355861954391,
-0.11725787818431854,
-0.06764968484640121,
-0.04570414870977402,
0.03315238282084465,
-0.08949651569128036,
-0.09646400064229965,
-0.0555412657558918,
0.10287721455097198,
0.09537502378225327,
-0.03549838066101074,
-0.06785823404788971,
-0.09521738439798355,
0.05743926018476486,
0.2211635708808899,
0.18752726912498474,
0.07758046686649323,
-0.07665256410837173,
-0.008446265943348408,
-0.02362825535237789,
0.05575858801603317,
-0.2147134691476822,
0.04626009985804558,
0.03838435187935829,
0.030744675546884537,
0.1351434588432312,
-0.022784622386097908,
-0.16072605550289154,
-0.04722895845770836,
0.05541609972715378,
-0.07028964161872864,
-0.15762348473072052,
0.003693870734423399,
0.08388359844684601,
-0.15567344427108765,
-0.05364304408431053,
0.030349692329764366,
-0.03299986198544502,
-0.02724997140467167,
0.002993965055793524,
0.08165504038333893,
0.02525121532380581,
0.10604418069124222,
0.06794179975986481,
0.11212385445833206,
-0.10361232608556747,
0.07820820808410645,
0.08721207082271576,
-0.11143109202384949,
0.03750693425536156,
0.059706296771764755,
-0.06430401653051376,
-0.03306615725159645,
0.028105957433581352,
0.08702781051397324,
0.02858729287981987,
-0.07410863786935806,
0.0023060773964971304,
-0.11285153776407242,
0.06773319095373154,
0.13773435354232788,
0.037572041153907776,
0.009064391255378723,
0.04253077879548073,
0.030666319653391838,
-0.1025259718298912,
0.11677869409322739,
0.04715273529291153,
0.03828616067767143,
-0.053534768521785736,
-0.002754961373284459,
0.04357896372675896,
-0.015574077144265175,
-0.017309002578258514,
-0.03927738964557648,
-0.06638500094413757,
-0.009345067664980888,
-0.16059128940105438,
0.027963994070887566,
-0.06438141316175461,
0.011313637718558311,
0.015024027787148952,
-0.02930280566215515,
0.006326301023364067,
0.010901868343353271,
-0.07644513994455338,
-0.04005778953433037,
-0.0025265931617468596,
0.11033432930707932,
-0.16255317628383636,
0.006753581576049328,
0.08725008368492126,
-0.12882095575332642,
0.07888396829366684,
-0.003228981513530016,
-0.008663777261972427,
0.019871357828378677,
-0.1389452964067459,
0.06426677107810974,
-0.007317067123949528,
0.006886337883770466,
0.024405626580119133,
-0.20780570805072784,
0.002691886154934764,
-0.049495045095682144,
-0.06124653294682503,
-0.003442719578742981,
-0.03931323438882828,
-0.11277955025434494,
0.10321920365095139,
0.017737101763486862,
-0.08050814270973206,
-0.018862100318074226,
0.05358913913369179,
0.11278057098388672,
-0.053978823125362396,
0.14271147549152374,
-0.018007846549153328,
0.05715036392211914,
-0.1816556304693222,
-0.017987793311476707,
-0.017368610948324203,
0.016075139865279198,
-0.03470727428793907,
-0.008873502723872662,
0.05237460881471634,
-0.01958826184272766,
0.22800102829933167,
-0.023029034957289696,
0.01981639862060547,
0.06532696634531021,
0.0016252564964815974,
-0.010984939523041248,
0.09684767574071884,
0.048498742282390594,
0.015143456868827343,
0.0203377865254879,
0.013252451084554195,
-0.04566340893507004,
-0.008616970852017403,
-0.12847718596458435,
0.08234056085348129,
0.1677752137184143,
0.08175479620695114,
-0.006052352488040924,
0.047567352652549744,
-0.11316590011119843,
-0.09173060953617096,
0.10132203251123428,
-0.03303218260407448,
-0.013127516023814678,
-0.05242474004626274,
0.1442553550004959,
0.15683847665786743,
-0.1846613585948944,
0.0673123374581337,
-0.06864999234676361,
-0.058019280433654785,
-0.10558338463306427,
-0.17708730697631836,
-0.0631738007068634,
-0.033932529389858246,
-0.009048123843967915,
-0.060769032686948776,
0.06745719909667969,
0.10813924670219421,
0.01437336578965187,
0.004817943554371595,
0.08580505102872849,
-0.03281113877892494,
0.006333827041089535,
0.04443316161632538,
0.052908364683389664,
0.015542974695563316,
-0.06320759654045105,
0.004275370854884386,
0.006610610987991095,
0.0376921184360981,
0.055147934705019,
0.030873596668243408,
-0.0092905443161726,
0.007207514252513647,
-0.020693093538284302,
-0.10057692229747772,
0.04111333191394806,
-0.025823315605521202,
-0.047910936176776886,
0.1509503871202469,
0.020467912778258324,
-0.003414076054468751,
-0.022258523851633072,
0.2298767864704132,
-0.06479788571596146,
-0.07484833151102066,
-0.13822507858276367,
0.14135941863059998,
-0.03916965425014496,
0.05368134006857872,
0.049936410039663315,
-0.10397564619779587,
0.03804606944322586,
0.14477981626987457,
0.14261196553707123,
-0.034453462809324265,
0.008940902538597584,
0.009526451118290424,
0.004399977158755064,
-0.02350606769323349,
0.05356355383992195,
0.04485337436199188,
0.11325705051422119,
-0.06528755277395248,
0.09648586809635162,
-0.005538135301321745,
-0.09084830433130264,
-0.019364742562174797,
0.1391776204109192,
0.002899263286963105,
0.024846963584423065,
-0.08323919028043747,
0.12169293314218521,
-0.06053123623132706,
-0.2529715597629547,
0.06497339904308319,
-0.06441039592027664,
-0.1503337174654007,
-0.019829563796520233,
0.015622834675014019,
-0.0025740223936736584,
0.022466620430350304,
0.06178610026836395,
-0.06470615416765213,
0.15161879360675812,
0.03660573810338974,
-0.07138057053089142,
-0.07539889216423035,
0.07816334068775177,
-0.08136013150215149,
0.30430659651756287,
0.007375960238277912,
0.05443240702152252,
0.09480572491884232,
-0.03690790757536888,
-0.13316340744495392,
0.0335354208946228,
0.09097745269536972,
-0.047231536358594894,
0.06487338244915009,
0.20800761878490448,
-0.011225960217416286,
0.11401397734880447,
0.07447969168424606,
-0.08660271763801575,
0.05096733942627907,
-0.0917983278632164,
-0.09906064718961716,
-0.0893944799900055,
0.0902828648686409,
-0.059031881392002106,
0.1506001204252243,
0.12994202971458435,
-0.04605574533343315,
0.005047217011451721,
-0.022221196442842484,
0.05354851856827736,
-0.0026379574555903673,
0.11034536361694336,
0.03008626215159893,
-0.19489215314388275,
0.03033076599240303,
-0.00037526662345044315,
0.10122878104448318,
-0.25035029649734497,
-0.08561131358146667,
0.03936697542667389,
-0.007475157734006643,
-0.057129982858896255,
0.12413015216588974,
0.054405856877565384,
0.047805771231651306,
-0.05493326112627983,
-0.05230220779776573,
-0.007250586990267038,
0.1655176728963852,
-0.10096944123506546,
-0.0014428504509851336
] |
null | null | diffusers |
# DreamBooth trained by AutoTrain
Text encoder was not trained.
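Below is a minimal, hedged sketch of how this LoRA could be applied with diffusers; the weight layout of the AutoTrain output repo is an assumption, so adjust the weight name if loading fails.

```python
# Hedged sketch: apply this AutoTrain DreamBooth LoRA to the SDXL base model
# with diffusers. Assumes the repo stores standard LoRA weights resolvable by
# load_lora_weights(); the text encoder was not trained, so only UNet weights apply.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",  # base model from the card metadata
    torch_dtype=torch.float16,
).to("cuda")

pipe.load_lora_weights("AmilaUvaz/autotrain-3dfyt-882ss")  # this repository

# The instance prompt from the metadata acts as the trigger phrase.
image = pipe("<deepika padukone>, portrait photo, studio lighting").images[0]
image.save("dreambooth_sample.png")
```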
| {"tags": ["text-to-image", "diffusers", "autotrain"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "<deepika padukone>", "inference": true} | text-to-image | AmilaUvaz/autotrain-3dfyt-882ss | [
"diffusers",
"text-to-image",
"autotrain",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"has_space",
"region:us"
] | 2024-02-15T07:16:25+00:00 | [] | [] | TAGS
#diffusers #text-to-image #autotrain #base_model-stabilityai/stable-diffusion-xl-base-1.0 #has_space #region-us
|
# DreamBooth trained by AutoTrain
Text encoder was not trained.
| [
"# DreamBooth trained by AutoTrain\n\nText encoder was not trained."
] | [
"TAGS\n#diffusers #text-to-image #autotrain #base_model-stabilityai/stable-diffusion-xl-base-1.0 #has_space #region-us \n",
"# DreamBooth trained by AutoTrain\n\nText encoder was not trained."
] | [
45,
19
] | [
"passage: TAGS\n#diffusers #text-to-image #autotrain #base_model-stabilityai/stable-diffusion-xl-base-1.0 #has_space #region-us \n# DreamBooth trained by AutoTrain\n\nText encoder was not trained."
] | [
-0.02063869684934616,
0.12998254597187042,
-0.00014558587281499058,
0.05282456427812576,
0.16523675620555878,
0.04722703993320465,
0.16625140607357025,
0.08092519640922546,
-0.021600954234600067,
0.06268861889839172,
0.19911405444145203,
-0.005327701102942228,
0.005592701490968466,
0.22998546063899994,
-0.094501793384552,
-0.15147385001182556,
0.05843960493803024,
-0.017813973128795624,
0.08953600376844406,
0.04556926712393761,
0.01589704304933548,
-0.08332102000713348,
0.06851272284984589,
-0.1127990260720253,
-0.21184474229812622,
0.06736689060926437,
0.028242893517017365,
-0.08190154284238815,
0.023159906268119812,
0.057201284915208817,
0.11752432584762573,
0.05736266449093819,
0.06915528327226639,
-0.09377864748239517,
0.030588991940021515,
0.09211067855358124,
-0.037628743797540665,
0.060378964990377426,
0.002463718643411994,
0.007739691063761711,
-0.03909904137253761,
0.01951049454510212,
0.05348891019821167,
0.033195290714502335,
-0.09112479537725449,
0.09422965347766876,
0.008997537195682526,
0.05966416001319885,
0.005606517195701599,
0.1256808042526245,
-0.02887202799320221,
0.0914452075958252,
0.0028242841362953186,
0.10286186635494232,
0.050214264541864395,
-0.15577325224876404,
-0.05811230465769768,
0.22586119174957275,
0.06323451548814774,
0.18434374034404755,
-0.1056840792298317,
0.08215278387069702,
0.1282002329826355,
0.0043175057508051395,
-0.024307064712047577,
-0.0056144483387470245,
-0.053464896976947784,
-0.0875391811132431,
-0.04101261869072914,
-0.04863812029361725,
0.19171690940856934,
0.013884141109883785,
-0.014532854780554771,
-0.08809809386730194,
-0.1092078685760498,
-0.03936294838786125,
0.015471521764993668,
0.009576751850545406,
-0.05643317475914955,
0.06334297358989716,
-0.04036302492022514,
-0.0881064385175705,
-0.048688579350709915,
-0.03869857266545296,
-0.07886603474617004,
0.09238439798355103,
-0.0456368625164032,
0.0745692178606987,
-0.0938243567943573,
0.13909384608268738,
-0.026598775759339333,
-0.12820684909820557,
0.06501864641904831,
-0.0971466526389122,
0.015486733056604862,
0.06505174934864044,
-0.019916843622922897,
-0.1562809944152832,
0.019901327788829803,
0.030637366697192192,
0.07526841759681702,
0.05189061909914017,
-0.08258821815252304,
0.09015702456235886,
0.007376048713922501,
0.09042561054229736,
-0.016077103093266487,
-0.024903813377022743,
0.06223255768418312,
0.080438993871212,
0.023856146261096,
-0.14336538314819336,
-0.16565988957881927,
0.06790684908628464,
-0.017159676179289818,
0.04283891245722771,
0.03642508387565613,
-0.010275715962052345,
-0.031149128451943398,
-0.004403593484312296,
0.047221966087818146,
-0.04838476702570915,
0.023466823622584343,
-0.07434477657079697,
-0.008917812258005142,
0.014335056766867638,
0.1431507170200348,
0.007567800115793943,
-0.006044706329703331,
-0.008012169972062111,
-0.10112743824720383,
-0.01249670796096325,
-0.06397054344415665,
-0.082596056163311,
-0.05697616934776306,
-0.11640746891498566,
0.03807840123772621,
-0.16242456436157227,
-0.1366284042596817,
-0.010717466473579407,
0.012121928855776787,
-0.08239061385393143,
-0.0024879504926502705,
-0.08431833982467651,
-0.12462550401687622,
0.1450532078742981,
-0.013907280750572681,
-0.03597475588321686,
0.0006233238964341581,
0.06648663431406021,
-0.010329908691346645,
0.10745283216238022,
-0.17473040521144867,
0.01794232614338398,
-0.07896706461906433,
-0.0015359485987573862,
-0.08321953564882278,
0.16549469530582428,
-0.03203589841723442,
0.033024370670318604,
-0.03292569890618324,
0.04207007214426994,
0.0021412093192338943,
0.008031118661165237,
0.05329214408993721,
0.15599198639392853,
-0.19367799162864685,
-0.04072578251361847,
0.0876203402876854,
-0.08026987314224243,
-0.011655561625957489,
0.041991058737039566,
-0.022804416716098785,
0.047191135585308075,
0.005142057780176401,
0.15102070569992065,
-0.07513030618429184,
-0.1523657888174057,
-0.00003674626350402832,
0.019653983414173126,
-0.03947019204497337,
0.06174682825803757,
-0.03899246081709862,
0.060578037053346634,
-0.07573825865983963,
0.03253980353474617,
-0.005597305484116077,
0.08249075710773468,
-0.06469673663377762,
-0.07055705785751343,
-0.06726926565170288,
-0.021799663081765175,
0.06577687710523605,
0.01678086258471012,
0.07544080168008804,
-0.030378416180610657,
-0.07784181833267212,
0.03869107738137245,
0.04462023451924324,
-0.009920100681483746,
-0.007784112356603146,
-0.013205957598984241,
-0.04446694254875183,
-0.12920789420604706,
0.003658822737634182,
-0.09591405093669891,
-0.0857297033071518,
0.00785818975418806,
0.23912277817726135,
0.09514347463846207,
0.14679308235645294,
0.059998251497745514,
0.04194987192749977,
-0.031193705275654793,
-0.12705348432064056,
-0.0008300838526338339,
0.029192514717578888,
-0.08331938832998276,
-0.09998124092817307,
0.0904180034995079,
-0.09146905690431595,
-0.004678551107645035,
-0.1545001119375229,
0.007734695915132761,
-0.07803455740213394,
0.15830396115779877,
0.028678199276328087,
-0.031181402504444122,
-0.03010755404829979,
0.0402386300265789,
-0.09691616147756577,
-0.1099129319190979,
-0.0022663131821900606,
0.0153842493891716,
-0.0945914015173912,
0.06970567256212234,
-0.2405780851840973,
0.0574164092540741,
0.14391222596168518,
-0.005025625228881836,
-0.07321476936340332,
0.11765623092651367,
0.0489165261387825,
-0.013706451281905174,
-0.023128986358642578,
-0.02168380096554756,
0.1244552806019783,
-0.07626726478338242,
0.19949495792388916,
-0.01798384077847004,
0.08187845349311829,
0.05062877759337425,
-0.06974431127309799,
-0.135806143283844,
-0.000004087520210305229,
-0.03837069496512413,
-0.0334748737514019,
0.11700894683599472,
0.09331324696540833,
-0.060808680951595306,
0.27977684140205383,
0.002255344530567527,
-0.0019275352824479342,
-0.03330899775028229,
-0.014577753841876984,
-0.0332055389881134,
0.12854062020778656,
-0.012121065519750118,
0.00992091279476881,
0.015768490731716156,
-0.014307437464594841,
0.01476898044347763,
-0.09258662909269333,
-0.015657516196370125,
-0.029646404087543488,
-0.0163404643535614,
0.1258670836687088,
0.016155531629920006,
-0.035148244351148605,
0.07309972494840622,
-0.04378744959831238,
-0.0816405862569809,
0.11111503094434738,
-0.022147411480545998,
-0.0004421356425154954,
0.05905456468462944,
-0.15857146680355072,
-0.2807832360267639,
-0.1459890753030777,
0.005951586179435253,
-0.11860986053943634,
0.04109755903482437,
0.052975885570049286,
-0.10799627006053925,
-0.07004248350858688,
-0.08202385157346725,
-0.08629177510738373,
-0.05557532608509064,
0.0011311533162370324,
0.11728531867265701,
-0.06409677118062973,
0.05387398600578308,
-0.06229059770703316,
-0.00887343194335699,
-0.013896237127482891,
0.0027349803131073713,
0.09634215384721756,
0.02155768871307373,
0.04409273341298103,
0.20931857824325562,
-0.01992671564221382,
0.03497228026390076,
-0.007471531629562378,
0.25480857491493225,
-0.07225025445222855,
0.051100753247737885,
0.11487668752670288,
0.031045233830809593,
0.052618835121393204,
0.1828797161579132,
-0.01034550741314888,
-0.0642908588051796,
0.06494352221488953,
-0.012484862469136715,
-0.10492375493049622,
-0.11105634272098541,
-0.0924028679728508,
-0.04872503876686096,
-0.06293869018554688,
0.029581304639577866,
0.06633029878139496,
0.18465307354927063,
0.03403869643807411,
-0.0085936663672328,
0.038062650710344315,
-0.038405340164899826,
0.05253121256828308,
0.05000557377934456,
-0.054350171238183975,
0.10506314784288406,
-0.05272989347577095,
-0.07878284156322479,
0.09704536944627762,
0.029444830492138863,
0.08175686746835709,
-0.005787411238998175,
-0.051862932741642,
-0.054340463131666183,
0.05357728153467178,
0.12942302227020264,
0.016036581248044968,
0.0732298195362091,
-0.037278078496456146,
-0.04033561050891876,
-0.043483830988407135,
-0.012224663980305195,
0.08897408843040466,
0.023024603724479675,
0.013343557715415955,
-0.06517297029495239,
0.09141328185796738,
-0.0036450172774493694,
0.03365681692957878,
0.10284296423196793,
-0.24468940496444702,
0.03720756992697716,
0.05340345576405525,
0.009430313482880592,
-0.15917426347732544,
-0.001802100450731814,
0.2596781551837921,
-0.0778416246175766,
-0.016604389995336533,
-0.005158600863069296,
0.07767105102539062,
0.07948087900876999,
-0.01405559852719307,
-0.12727415561676025,
0.08470404893159866,
-0.03762264549732208,
-0.009994231164455414,
-0.21587730944156647,
0.04233643785119057,
0.006741201039403677,
0.09690377861261368,
-0.02572929486632347,
0.016345487907528877,
0.0344662107527256,
0.14141175150871277,
0.0716816708445549,
0.00973005685955286,
-0.08598282933235168,
-0.14106571674346924,
-0.08402053266763687,
-0.05161529779434204,
0.10742203146219254,
0.09498894214630127,
-0.004010304808616638,
-0.011004406958818436,
0.029761290177702904,
0.04038768634200096,
-0.048020366579294205,
-0.20780979096889496,
-0.12313251197338104,
0.03342318534851074,
0.18468953669071198,
0.07250070571899414,
-0.042261723428964615,
-0.07773694396018982,
0.058913350105285645,
0.15853528678417206,
-0.06002082675695419,
-0.03646547347307205,
-0.12438587844371796,
-0.01314868126064539,
0.04682208597660065,
-0.004984802100807428,
0.07632478326559067,
-0.11283677071332932,
0.055372435599565506,
-0.05680480971932411,
-0.15995463728904724,
0.08369133621454239,
-0.09573204070329666,
-0.09156695753335953,
-0.09880076348781586,
-0.02600095607340336,
-0.07628563791513443,
-0.01809440366923809,
0.02631893940269947,
0.03644336014986038,
-0.09317634254693985,
-0.08042453974485397,
0.07387512177228928,
0.052659958600997925,
-0.0790650025010109,
0.11336636543273926,
0.039935242384672165,
-0.05932047963142395,
0.009086593985557556,
-0.020160207524895668,
0.16297784447669983,
0.2692966163158417,
-0.09637150168418884,
0.1332009732723236,
0.10272762179374695,
-0.07975436747074127,
-0.2972416281700134,
-0.06331747770309448,
-0.001001058961264789,
0.033033158630132675,
-0.037056490778923035,
-0.08421573042869568,
0.01754319854080677,
-0.037301890552043915,
-0.026686429977416992,
0.09380273520946503,
-0.25594666600227356,
-0.07236529886722565,
0.12090659141540527,
0.011188359931111336,
0.3046357333660126,
-0.12652114033699036,
-0.03758466988801956,
-0.07161959260702133,
0.030579380691051483,
0.09310808032751083,
0.05593981221318245,
0.1552010029554367,
-0.01064409501850605,
0.029015347361564636,
0.016381043940782547,
-0.03504854813218117,
0.15569667518138885,
-0.09976516664028168,
0.07290340214967728,
-0.09811180084943771,
0.02065517008304596,
0.1682867556810379,
-0.07824182510375977,
0.06025531142950058,
-0.08820004016160965,
0.08328087627887726,
-0.14803707599639893,
0.024164263159036636,
-0.030000343918800354,
0.019950132817029953,
0.023836227133870125,
-0.09545804560184479,
-0.05183679237961769,
-0.024305418133735657,
0.031683988869190216,
0.0011127261677756906,
0.008928169496357441,
-0.03344632312655449,
0.021105246618390083,
0.31053033471107483,
-0.045023828744888306,
-0.08844760805368423,
-0.032576143741607666,
0.0008607114432379603,
-0.07616515457630157,
0.15518175065517426,
-0.140009805560112,
0.016880689188838005,
0.08636961877346039,
-0.028658051043748856,
0.19429416954517365,
0.04890631139278412,
-0.034792251884937286,
0.06410761177539825,
0.08606549352407455,
-0.17321881651878357,
0.023975208401679993,
-0.08413522690534592,
0.03825248405337334,
0.07573363929986954,
-0.08445089310407639,
0.1707473248243332,
-0.07278440147638321,
0.0452447347342968,
-0.039885539561510086,
0.022516414523124695,
-0.02864324487745762,
0.07788124680519104,
0.05243882164359093,
0.03179828077554703,
-0.08249194175004959,
0.1251235008239746,
0.038169246166944504,
-0.00042698116158135235,
0.13369235396385193,
0.09562437236309052,
-0.02339347079396248,
-0.029987553134560585,
-0.006221109069883823,
0.24116981029510498,
-0.1580258458852768,
-0.008135645650327206,
-0.04209064692258835,
-0.0893833190202713,
-0.022283220663666725,
0.033660776913166046,
0.004361500032246113,
0.008071556687355042,
-0.06307882070541382,
-0.04562815651297569,
-0.10188619047403336,
0.03915635868906975,
0.04616845026612282,
0.06768101453781128,
-0.2191275805234909,
0.009082616306841373,
0.027556031942367554,
0.05952044948935509,
-0.13306017220020294,
-0.09101494401693344,
-0.15259279310703278,
0.00039742272929288447,
-0.13059686124324799,
0.06406794488430023,
0.061592768877744675,
-0.04854949936270714,
0.035067036747932434,
-0.043882932513952255,
0.0004143699479755014,
0.028861405327916145,
-0.04535970091819763,
-0.011117871850728989,
0.015505447052419186,
0.006177510134875774,
-0.030567757785320282,
-0.053487807512283325,
-0.043063435703516006,
-0.029680561274290085,
0.054787784814834595,
0.02104547619819641,
-0.0758507251739502,
-0.023473115637898445,
-0.18298010528087616,
-0.01812969706952572,
0.13245733082294464,
0.002356436103582382,
-0.008200457319617271,
0.14597384631633759,
-0.03255922719836235,
0.02350054867565632,
0.045941680669784546,
0.00834833923727274,
0.04824364185333252,
-0.10032500326633453,
-0.11206801235675812,
-0.07519271969795227,
-0.05291133001446724,
-0.07905688136816025,
0.08345643430948257,
0.10799416899681091,
0.07442577183246613,
0.11548545211553574,
-0.13865616917610168,
0.0669560506939888,
-0.07766007632017136,
-0.0069593568332493305,
-0.02534155361354351,
-0.07888507097959518,
0.010779611766338348,
-0.010162390768527985,
0.045325834304094315,
-0.0136204082518816,
0.14034360647201538,
0.09084946662187576,
-0.13309991359710693,
-0.0024138707667589188,
-0.00929619837552309,
-0.02631843276321888,
-0.016799401491880417,
0.2551889717578888,
0.10145963728427887,
-0.006956462282687426,
-0.08942729234695435,
0.021860230714082718,
0.13579104840755463,
0.12235307693481445,
0.0031318129040300846,
0.015644969418644905,
0.024258719757199287,
0.16481736302375793,
0.003965459298342466,
-0.016579868271946907,
-0.059937287122011185,
0.03426060453057289,
-0.10462383925914764,
0.12247732281684875,
-0.11873367428779602,
-0.14781181514263153,
0.10172251611948013,
-0.02050800621509552,
-0.03943636640906334,
0.0037147165276110172,
-0.0770074650645256,
-0.09716072678565979,
-0.027703063562512398,
-0.06800327450037003,
-0.17266669869422913,
0.027315571904182434,
-0.06229201331734657,
0.12446222454309464,
0.06566920876502991,
0.0076821851544082165,
-0.07402129471302032,
0.09486519545316696,
0.02963947132229805,
-0.0744946077466011,
0.11698131263256073,
0.006778121460229158,
-0.004627263639122248,
-0.0994558110833168,
-0.04522183537483215,
0.07144024223089218,
0.1097252368927002,
-0.0014914624625816941,
0.062032222747802734,
0.03687533363699913,
0.07175064831972122,
-0.021852314472198486,
-0.1358582228422165,
0.009430473670363426,
0.06657078862190247,
-0.014820176176726818,
0.17750272154808044,
0.0526888370513916,
0.01400719489902258,
-0.033916763961315155,
0.20015233755111694,
-0.1060686707496643,
-0.08181434869766235,
-0.08442502468824387,
0.1438642144203186,
-0.10308082401752472,
0.12064629793167114,
-0.08840858936309814,
-0.10166779905557632,
-0.1078825369477272,
0.13028131425380707,
0.14787504076957703,
-0.1705704927444458,
-0.00973743386566639,
-0.059736791998147964,
-0.007106183096766472,
-0.04775403439998627,
0.1790471076965332,
0.027036966755986214,
0.0724797174334526,
-0.06618554890155792,
0.02645842544734478,
-0.05293412134051323,
-0.10053800046443939,
-0.07210712134838104,
-0.07806676626205444,
0.0033898563124239445,
-0.046734727919101715,
-0.1227606013417244,
-0.054156865924596786,
-0.1298540085554123,
0.07478922605514526,
0.13676925003528595,
-0.09898433089256287,
-0.036553580313920975,
0.0014106096932664514,
0.16058985888957977,
-0.02301349863409996,
-0.022042766213417053,
-0.07060083746910095,
0.055989354848861694,
0.09736833721399307,
-0.06496482342481613,
-0.017015384510159492,
-0.018570678308606148,
-0.058492738753557205,
-0.2682178318500519,
0.16817118227481842,
-0.004298606421798468,
0.03949829190969467,
0.031070971861481667,
0.03424987196922302,
-0.05919113755226135,
0.13132870197296143,
-0.048278603702783585,
-0.026828566566109657,
-0.026347270235419273,
0.1921045184135437,
-0.024049637839198112,
0.05515880510210991,
0.037085678428411484,
-0.14625179767608643,
-0.030122820287942886,
0.011054210364818573,
-0.07361718267202377,
0.004916774109005928,
-0.043997615575790405,
-0.020118191838264465,
0.11343158781528473,
0.033629145473241806,
-0.016268683597445488,
0.012032032944262028,
-0.01385825127363205,
0.004258410073816776,
-0.01866539753973484,
-0.0066040013916790485,
0.027589673176407814,
-0.1268121898174286,
-0.026856685057282448,
0.0937948226928711,
0.038554634898900986,
-0.2399676889181137,
-0.056827764958143234,
-0.20476673543453217,
0.0481138676404953,
-0.07275717705488205,
0.13741889595985413,
0.15254093706607819,
-0.023496365174651146,
-0.007380692288279533,
-0.12191148102283478,
0.015328208915889263,
0.04335465282201767,
0.00498174037784338,
-0.033090222626924515
] |
null | null | null |
# Lora of jade/ヤーデ/亚德 (Azur Lane)
## What Is This?
This is the LoRA model of waifu jade/ヤーデ/亚德 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* Dataset used for training is the `stage3-p480-800` in [CyberHarem/jade_azurlane](https://huggingface.co/datasets/CyberHarem/jade_azurlane), which contains 117 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1200 steps, 40 checkpoints were saved and evaluated.
* **Trigger word is `jade_azurlane`.**
* Pruned core tags for this waifu are `breasts, blue_eyes, bangs, grey_hair, hair_bun, hair_ornament, large_breasts, short_hair, hair_between_eyes, hairclip, hat, double_bun, mole`. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like a classic LoRA**. The LoRA we provide is bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the chosen step, you need to use them together: the pt file is loaded as an embedding, while the safetensors file is loaded as the LoRA.
For example, if you want to use the model from step 1020, you need to download [`1020/jade_azurlane.pt`](https://huggingface.co/CyberHarem/jade_azurlane/resolve/main/1020/jade_azurlane.pt) as the embedding and [`1020/jade_azurlane.safetensors`](https://huggingface.co/CyberHarem/jade_azurlane/resolve/main/1020/jade_azurlane.safetensors) for loading Lora. By using both files together, you can generate images for the desired characters.
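If you are not using A1111 at all, the following is a rough, hedged sketch of the same idea with diffusers: the safetensors file is loaded as a LoRA and the pt file as a textual-inversion embedding. It assumes the preview base model ships diffusers-format weights and that the pt file is a standard embedding whose token matches the trigger word; neither assumption is guaranteed by the training pipeline.

```python
# Hedged sketch (not the A1111 workflow described above): load the step-1020
# files with diffusers. Assumes the pt file is a standard textual-inversion
# embedding and that the base model repo provides diffusers-format weights.
import os
import torch
from diffusers import StableDiffusionPipeline
from huggingface_hub import hf_hub_download

repo = "CyberHarem/jade_azurlane"
lora_path = hf_hub_download(repo, "1020/jade_azurlane.safetensors")
emb_path = hf_hub_download(repo, "1020/jade_azurlane.pt")

pipe = StableDiffusionPipeline.from_pretrained(
    "Meina/MeinaMix_V11",  # base model used for the previews below
    torch_dtype=torch.float16,
).to("cuda")

# safetensors -> LoRA, pt -> embedding bound to the trigger word
pipe.load_lora_weights(os.path.dirname(lora_path), weight_name="jade_azurlane.safetensors")
pipe.load_textual_inversion(emb_path, token="jade_azurlane")

image = pipe("jade_azurlane, 1girl, best quality").images[0]
image.save("jade_preview.png")
```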
## Which Step Should I Use?
We selected 5 good steps for you to choose from. The best one is step 1020.
1600 images (1.66 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the preview of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_1_0 | pattern_1_1 | pattern_1_2 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:------------------------------------------------------------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:----------------------------------------------|:--------------------------------------------|:--------------------------------------------|:--------------------------------------------|:----------------------------------------------|:----------------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:------------------------------------|:--------------------------------|:------------------------------------|:--------------------------------|:----------------------------------|:----------------------------------------|:----------------------------------------|:----------------------------------------|:------------------------------|:----------------------------------|:----------------------------------|:--------------------------------|:------------------------------------------------|:----------------------------------|:----------------------------------|:------------------------------|:--------------------------------|:--------------------------------------|:--------------------------------------|:------------------------------------------|:------------------------------------------|:------------------------------------------|:--------------------------------------|:--------------------------------------|
| 1020 | 35 | **0.941** | 0.955 | 0.839 | **0.761** | [Download](https://huggingface.co/CyberHarem/jade_azurlane/resolve/main/1020/jade_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 870 | 30 | 0.919 | 0.960 | 0.840 | 0.742 | [Download](https://huggingface.co/CyberHarem/jade_azurlane/resolve/main/870/jade_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 330 | 12 | 0.914 | 0.961 | 0.841 | 0.738 | [Download](https://huggingface.co/CyberHarem/jade_azurlane/resolve/main/330/jade_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 840 | 29 | 0.929 | 0.954 | 0.831 | 0.737 | [Download](https://huggingface.co/CyberHarem/jade_azurlane/resolve/main/840/jade_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 630 | 22 | 0.895 | **0.982** | **0.844** | 0.723 | [Download](https://huggingface.co/CyberHarem/jade_azurlane/resolve/main/630/jade_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download them, via the following links:
* [Steps From 930 to 1200](all/0.md)
* [Steps From 630 to 900](all/1.md)
* [Steps From 330 to 600](all/2.md)
* [Steps From 30 to 300](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/jade_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/jade_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/jade_azurlane",
"license:mit",
"region:us"
] | 2024-02-15T07:17:58+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/jade_azurlane #license-mit #region-us
| Lora of jade/ヤーデ/亚德 (Azur Lane)
===============================
What Is This?
-------------
This is the LoRA model of waifu jade/ヤーデ/亚德 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* Dataset used for training is the 'stage3-p480-800' in CyberHarem/jade\_azurlane, which contains 117 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 1200 steps, 40 checkpoints were saved and evaluated.
* Trigger word is 'jade\_azurlane'.
* Pruned core tags for this waifu are 'breasts, blue\_eyes, bangs, grey\_hair, hair\_bun, hair\_ornament, large\_breasts, short\_hair, hair\_between\_eyes, hairclip, hat, double\_bun, mole'. You can add them to the prompt when some features of waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like a classic LoRA. The LoRA we provide is bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 1020, you need to download '1020/jade\_azurlane.pt' as the embedding and '1020/jade\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose from. The best one is step 1020.
1600 images (1.66 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the preview of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We uploaded the files for all steps. You can check the images and metrics, and download them, via the following links:
* Steps From 930 to 1200
* Steps From 630 to 900
* Steps From 330 to 600
* Steps From 30 to 300
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 1020, you need to download '1020/jade\\_azurlane.pt' as the embedding and '1020/jade\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 1020.\n\n\n1600 images (1.66 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 930 to 1200\n* Steps From 630 to 900\n* Steps From 330 to 600\n* Steps From 30 to 300"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/jade_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 1020, you need to download '1020/jade\\_azurlane.pt' as the embedding and '1020/jade\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 1020.\n\n\n1600 images (1.66 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 930 to 1200\n* Steps From 630 to 900\n* Steps From 330 to 600\n* Steps From 30 to 300"
] | [
44,
38,
468
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/jade_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.018105151131749153,
-0.002311146119609475,
-0.003986647818237543,
0.08558131009340286,
0.07890419661998749,
0.08021663874387741,
0.2236534208059311,
0.0820227712392807,
0.12177492678165436,
-0.07682682573795319,
0.09385376423597336,
0.06342301517724991,
-0.0067930701188743114,
0.03333399444818497,
-0.02584691345691681,
-0.1504276543855667,
-0.0670713409781456,
-0.02978837490081787,
0.0023649376817047596,
0.01901407353579998,
0.08095414936542511,
0.010018202476203442,
0.10569135844707489,
-0.049527738243341446,
-0.04081787168979645,
0.04782133176922798,
-0.03214836120605469,
-0.03924065828323364,
0.032487764954566956,
0.07784231007099152,
0.12032559514045715,
0.013574685901403427,
0.06430172920227051,
-0.16881594061851501,
0.06976702809333801,
-0.008745377883315086,
-0.10791534930467606,
-0.005961024668067694,
0.011951585300266743,
-0.036334242671728134,
0.12838231027126312,
0.0277507696300745,
-0.10895933210849762,
0.041692305356264114,
-0.13420559465885162,
-0.04003286734223366,
-0.05126294866204262,
0.035985518246889114,
0.14251571893692017,
0.05037957429885864,
0.024109171703457832,
0.05103248357772827,
-0.04604429006576538,
0.08152646571397781,
0.1122787743806839,
-0.1372183859348297,
-0.06800185889005661,
0.11135074496269226,
0.017166446894407272,
0.14073793590068817,
-0.09598063677549362,
0.0971110612154007,
0.07109708338975906,
-0.04552137479186058,
-0.13483677804470062,
-0.0978144034743309,
-0.22109965980052948,
-0.0033510171342641115,
0.018414774909615517,
0.02206689864397049,
0.4119005799293518,
0.05655302479863167,
0.03923945873975754,
0.06467106193304062,
-0.071074478328228,
0.01856667548418045,
-0.09767927974462509,
0.13707783818244934,
0.04365067183971405,
0.09600687026977539,
-0.04025065898895264,
-0.10035280883312225,
-0.11354725807905197,
-0.07094535231590271,
-0.06958404183387756,
-0.04322778433561325,
0.02311611920595169,
0.11731716990470886,
-0.199631005525589,
0.006106850691139698,
-0.04313969984650612,
-0.12524321675300598,
0.01659535989165306,
-0.10609831660985947,
0.16738781332969666,
0.06688713282346725,
-0.012883365154266357,
0.014615163207054138,
0.2482033371925354,
0.12598061561584473,
0.18835335969924927,
0.058556247502565384,
-0.08994341641664505,
0.12777309119701385,
0.041258055716753006,
-0.09032727032899857,
-0.006162284407764673,
-0.10442245751619339,
0.14655525982379913,
-0.04558673873543739,
0.1104201152920723,
-0.05887393653392792,
-0.11278092861175537,
0.023799503222107887,
-0.09727046638727188,
0.06389250606298447,
0.03317173570394516,
0.010521589778363705,
-0.04995870217680931,
0.047649066895246506,
0.04098621383309364,
-0.03096865862607956,
-0.008298685774207115,
-0.008372192271053791,
-0.052115678787231445,
0.03981387987732887,
0.11662568897008896,
0.035734012722969055,
0.06520943343639374,
-0.012038248591125011,
-0.022784970700740814,
0.001554663060232997,
-0.049775779247283936,
0.002426928374916315,
0.047161929309368134,
0.04527754709124565,
0.09229394048452377,
-0.15926939249038696,
-0.08404581248760223,
-0.00934215635061264,
0.06339757144451141,
0.006592150311917067,
0.09118771553039551,
-0.009434904903173447,
0.05329505726695061,
0.01017471682280302,
-0.02274155803024769,
0.03231312334537506,
-0.10256020724773407,
0.08514313399791718,
-0.008434352464973927,
0.09447458386421204,
-0.19652919471263885,
-0.005776737350970507,
-0.04285437613725662,
0.017752060666680336,
0.06256134808063507,
-0.005184923764318228,
-0.10617896914482117,
0.12226511538028717,
-0.012666997499763966,
0.07047845423221588,
-0.09143614023923874,
0.04235081002116203,
0.021830230951309204,
0.07441604882478714,
-0.10638941079378128,
0.012769513763487339,
0.1112007424235344,
-0.13323955237865448,
-0.17061656713485718,
0.09177660197019577,
-0.029566854238510132,
0.030316807329654694,
0.03967364877462387,
0.1455419659614563,
0.168297678232193,
-0.18988963961601257,
-0.014673511497676373,
0.05906959995627403,
-0.013729303143918514,
-0.07602351158857346,
-0.008733094669878483,
0.11313536018133163,
0.015780841931700706,
0.03303659334778786,
-0.027247969061136246,
0.1310730129480362,
-0.03015035204589367,
-0.08381512761116028,
-0.029453188180923462,
-0.07744065672159195,
-0.07313957065343857,
0.052139632403850555,
-0.009678129106760025,
-0.0474945604801178,
0.018618697300553322,
-0.1503300815820694,
0.16504159569740295,
0.01801564358174801,
0.018892936408519745,
-0.07648298889398575,
0.1073903813958168,
-0.001755456905812025,
0.005262867081910372,
0.010271543636918068,
-0.05599750578403473,
-0.10590860247612,
0.22901876270771027,
0.08019466698169708,
0.0874541699886322,
0.06228050962090492,
-0.04711121320724487,
-0.06805093586444855,
0.01888381689786911,
0.009207292459905148,
-0.034542471170425415,
0.023669838905334473,
-0.10135006159543991,
0.04754013940691948,
-0.01766873709857464,
0.03827376663684845,
-0.008702575229108334,
-0.03018326498568058,
0.0618237629532814,
0.010450510308146477,
-0.013581601902842522,
0.08747164905071259,
0.05493392422795296,
-0.01954749785363674,
-0.06927286088466644,
0.0036583854816854,
0.0753774419426918,
-0.010946293361485004,
-0.07329843938350677,
0.022554738447070122,
-0.00511406734585762,
0.036785610020160675,
0.1986389309167862,
-0.2192591279745102,
0.04445182904601097,
0.01579144224524498,
0.05161936208605766,
0.035054948180913925,
0.0064338017255067825,
-0.025198284536600113,
0.03511708602309227,
-0.024270856752991676,
0.07547967880964279,
-0.017265550792217255,
0.065149687230587,
-0.02988445572555065,
-0.13960830867290497,
-0.00687812315300107,
-0.031002329662442207,
0.16258665919303894,
-0.17121343314647675,
0.06452521681785583,
0.18553121387958527,
-0.12313695251941681,
0.13153903186321259,
0.003996989689767361,
-0.006241690833121538,
0.009077515453100204,
0.030840186402201653,
-0.0042946250177919865,
0.1065220758318901,
-0.06846442073583603,
-0.030694853514432907,
0.026592165231704712,
-0.08794346451759338,
0.026936911046504974,
-0.11819799244403839,
-0.11176461726427078,
-0.07006873190402985,
-0.034291185438632965,
-0.03710174188017845,
0.028994129970669746,
-0.050676144659519196,
0.07334250956773758,
-0.08724573254585266,
-0.07856278866529465,
-0.025098806247115135,
-0.0849665179848671,
0.02152954414486885,
0.0095761613920331,
-0.0617765374481678,
-0.14664515852928162,
-0.11184472590684891,
-0.09381669014692307,
-0.13854268193244934,
-0.00557469017803669,
0.06645811349153519,
-0.11192025989294052,
-0.04331091791391373,
0.018737031146883965,
-0.05152216553688049,
0.09764516353607178,
-0.08271253108978271,
0.02063279040157795,
0.04717997461557388,
-0.028890829533338547,
-0.16621102392673492,
0.0008676476427353919,
-0.059707190841436386,
-0.0563693642616272,
0.15549887716770172,
-0.15555310249328613,
0.17837755382061005,
-0.03989293798804283,
0.05300343781709671,
0.06132981926202774,
0.03587620332837105,
0.1316325068473816,
-0.10825073719024658,
0.08245637267827988,
0.19098228216171265,
0.04396047815680504,
0.07649584114551544,
0.11856724321842194,
0.07927259802818298,
-0.10715620219707489,
0.036196958273649216,
0.07785532623529434,
-0.09978093951940536,
-0.07363050431013107,
-0.05076810345053673,
-0.11482725292444229,
-0.062379561364650726,
0.05765029042959213,
0.0625632107257843,
0.051747798919677734,
0.12345073372125626,
-0.05581359565258026,
0.002035649260506034,
0.09939607232809067,
0.04991358518600464,
0.07402065396308899,
0.017807984724640846,
0.05283182114362717,
-0.14886412024497986,
-0.044523004442453384,
0.1597931832075119,
0.22946040332317352,
0.21726395189762115,
0.02560378797352314,
0.07403457164764404,
0.12383738160133362,
0.0930650606751442,
0.10117992013692856,
0.04445060342550278,
0.00021686076070182025,
0.017085108906030655,
-0.07627318799495697,
-0.047557033598423004,
0.007664840668439865,
0.008717861026525497,
-0.052484601736068726,
-0.14008264243602753,
0.10622701048851013,
0.005818954668939114,
0.08322661370038986,
0.13590870797634125,
0.03853234276175499,
-0.11739511042833328,
0.15860939025878906,
0.10360704362392426,
0.09096614271402359,
-0.06769051402807236,
0.13128729164600372,
0.047677334398031235,
-0.009150112979114056,
0.16289031505584717,
0.02343706786632538,
0.1511257439851761,
-0.038066793233156204,
-0.07818464189767838,
-0.06245904788374901,
-0.0602819100022316,
0.013522178865969181,
0.03308841213583946,
-0.22867052257061005,
0.08975288271903992,
0.057325247675180435,
0.012053528800606728,
-0.0005015481146983802,
-0.05561180040240288,
0.18331383168697357,
0.15351836383342743,
0.07786419987678528,
0.02500545233488083,
-0.03356369584798813,
-0.010457297787070274,
-0.07886287569999695,
0.05543842539191246,
0.004740877076983452,
0.06910692155361176,
-0.03691571578383446,
-0.10401776432991028,
-0.02037481591105461,
-0.004539214074611664,
0.026643497869372368,
-0.07989039272069931,
-0.11184220761060715,
-0.049921900033950806,
0.2557159960269928,
-0.05966866388916969,
0.04961198568344116,
0.060312580317258835,
0.02242952398955822,
-0.04035679250955582,
0.027397727593779564,
-0.035734158009290695,
-0.01133908424526453,
-0.04142925515770912,
-0.0004286464536562562,
0.008720967918634415,
-0.05033566802740097,
-0.06315609812736511,
-0.03206127882003784,
-0.10049207508563995,
-0.10778100788593292,
0.004380453377962112,
-0.045561641454696655,
0.016405809670686722,
-0.024340113624930382,
0.00799159612506628,
-0.09932759404182434,
-0.03427883982658386,
0.023960111662745476,
0.03670892119407654,
-0.08448603749275208,
-0.1291622817516327,
-0.009898480027914047,
-0.008304797112941742,
-0.054662760347127914,
0.02016589604318142,
-0.11064240336418152,
-0.08400098979473114,
-0.04873180761933327,
-0.027821289375424385,
0.12524215877056122,
0.22331033647060394,
-0.02660704217851162,
0.008316375315189362,
0.14995115995407104,
-0.09756791591644287,
-0.3150548040866852,
-0.1788792461156845,
-0.1627342849969864,
-0.1043674573302269,
0.032447442412376404,
-0.07833059132099152,
0.019734812900424004,
0.06982466578483582,
-0.03837939724326134,
0.19257698953151703,
-0.19819170236587524,
-0.09524018317461014,
0.08365649729967117,
0.09333856403827667,
0.31365180015563965,
-0.24681472778320312,
0.017008356750011444,
-0.11654126644134521,
-0.04448062181472778,
0.013685614801943302,
-0.07975547015666962,
0.1197255402803421,
0.03394508361816406,
0.07757864892482758,
-0.009209135547280312,
-0.005161031614989042,
0.14615432918071747,
-0.07794082164764404,
0.13996592164039612,
-0.12043380737304688,
-0.10058736801147461,
0.18946285545825958,
-0.036359403282403946,
0.007707692217081785,
-0.21915005147457123,
-0.033109284937381744,
-0.042601555585861206,
0.041719552129507065,
-0.008482868783175945,
0.04715585336089134,
-0.006365184206515551,
-0.010330636985599995,
-0.13089247047901154,
-0.01718859001994133,
-0.0294339656829834,
0.06265705078840256,
0.2355467975139618,
-0.05846427381038666,
-0.057586364448070526,
0.03760785236954689,
-0.009720401838421822,
0.10970613360404968,
0.01572311669588089,
-0.05258810892701149,
-0.04066624864935875,
0.0967750996351242,
-0.21585042774677277,
0.05731682479381561,
0.0024655787274241447,
-0.005943371914327145,
0.020225882530212402,
0.01015550084412098,
0.023389065638184547,
0.12483678013086319,
0.17960472404956818,
-0.008021015673875809,
-0.03367405757308006,
-0.018093880265951157,
0.029013225808739662,
0.12976539134979248,
-0.02518337592482567,
0.10940508544445038,
0.0180329829454422,
0.03302731737494469,
0.010661344043910503,
0.062229856848716736,
-0.08329592645168304,
-0.09430596232414246,
0.09645677357912064,
-0.043248217552900314,
-0.08432083576917648,
0.09337357431650162,
0.05348481237888336,
0.06801071763038635,
0.005817783996462822,
0.04835619404911995,
0.01373402588069439,
-0.12513472139835358,
0.022025585174560547,
0.21764910221099854,
-0.07176496833562851,
-0.06297145783901215,
-0.06538484990596771,
0.01264660619199276,
-0.12306933104991913,
0.07753506302833557,
0.03847890719771385,
-0.02582893893122673,
0.12036027759313583,
-0.04762949049472809,
-0.030452007427811623,
0.015287396498024464,
-0.062801793217659,
0.03498641774058342,
-0.15164940059185028,
-0.212413027882576,
0.049676988273859024,
0.003144423943012953,
-0.06587561219930649,
-0.09059159457683563,
-0.0817175880074501,
0.06296288967132568,
-0.16939403116703033,
0.1442982703447342,
-0.06688346713781357,
0.05980130657553673,
-0.03491887450218201,
-0.04886871203780174,
-0.10822433978319168,
-0.016972988843917847,
-0.049004871398210526,
-0.022797128185629845,
0.05927484109997749,
0.01631953753530979,
-0.12132805585861206,
-0.11916018277406693,
0.06201575696468353,
-0.0017812262522056699,
-0.0023182835429906845,
0.013604536652565002,
-0.06978810578584671,
0.0177262295037508,
-0.21924342215061188,
-0.06654036790132523,
0.0881333276629448,
0.04244370758533478,
-0.08838248252868652,
0.11458612978458405,
0.046991050243377686,
-0.02929641120135784,
0.04008028283715248,
0.0016257340321317315,
0.18239690363407135,
-0.07313255965709686,
0.018351871520280838,
-0.12585967779159546,
-0.15965551137924194,
-0.03228223696351051,
0.03053850121796131,
0.22533629834651947,
0.08978050947189331,
0.12228401005268097,
-0.05022631213068962,
0.015959814190864563,
-0.01573505438864231,
0.07150733470916748,
0.012045652605593204,
-0.10142543911933899,
-0.07007335871458054,
-0.16936619579792023,
-0.0686839371919632,
-0.06671680510044098,
0.1565845012664795,
0.0335392989218235,
-0.14877575635910034,
-0.003527659922838211,
0.11350366473197937,
-0.18465620279312134,
-0.015252254903316498,
0.16063761711120605,
-0.05043110251426697,
0.02196754887700081,
-0.1477271467447281,
0.03444007784128189,
0.07913503795862198,
-0.03079954721033573,
-0.011292177252471447,
0.1251596361398697,
0.006962950807064772,
-0.001245490275323391,
0.027517177164554596,
-0.031414564698934555,
0.09233711659908295,
-0.06022096052765846,
0.05935516953468323,
-0.002823511604219675,
-0.04707656428217888,
-0.11603350937366486,
0.1993931233882904,
-0.014706515707075596,
0.012973819859325886,
-0.06889006495475769,
-0.0010565933771431446,
-0.0961298942565918,
-0.09530485421419144,
-0.07388363033533096,
-0.13231314718723297,
0.06816960871219635,
-0.061558958142995834,
0.005890954285860062,
0.0037852979730814695,
0.01495804637670517,
-0.07606997340917587,
0.01064944639801979,
-0.1769399791955948,
-0.05017067492008209,
0.009497985243797302,
-0.013810072094202042,
-0.02128165028989315,
-0.03902759030461311,
-0.03152630850672722,
0.03071613982319832,
-0.0648311898112297,
-0.06491444259881973,
0.06140781566500664,
0.08786221593618393,
0.06494501233100891,
-0.16458141803741455,
-0.11036977171897888,
-0.07258577644824982,
0.03822898492217064,
0.0731598511338234,
0.18322929739952087,
0.03946195915341377,
-0.005576093215495348,
0.045181844383478165,
0.14122840762138367,
0.017086567357182503,
-0.07161372154951096,
-0.07228255271911621,
-0.11826013028621674,
-0.14245308935642242,
-0.01710057444870472,
-0.05792902782559395,
-0.016359824687242508,
0.01830320805311203,
0.2278801053762436,
0.1862805187702179,
-0.15585489571094513,
0.033766619861125946,
-0.0812360942363739,
0.04120035842061043,
-0.03726594150066376,
0.1583452969789505,
0.05449500307440758,
0.14719440042972565,
-0.0345853716135025,
-0.03461598977446556,
-0.0637049749493599,
0.022151559591293335,
-0.09865725040435791,
0.029390545561909676,
-0.014084573835134506,
-0.06505692005157471,
-0.0609976090490818,
0.09962880611419678,
-0.11367050558328629,
0.07023836672306061,
0.18221554160118103,
-0.1416911482810974,
-0.013575843535363674,
-0.04171216860413551,
0.054754145443439484,
0.10963895916938782,
0.009417104534804821,
-0.07809403538703918,
-0.02010231651365757,
0.01379480492323637,
0.027303265407681465,
-0.17100770771503448,
-0.10514380037784576,
0.005188444629311562,
-0.12706080079078674,
0.1284683793783188,
-0.008302178233861923,
0.0006995138828642666,
0.036729324609041214,
-0.06920353323221207,
-0.005455189850181341,
0.16919732093811035,
0.020444689318537712,
-0.023261070251464844,
-0.026474911719560623,
-0.05712882801890373,
-0.09796532988548279,
0.07540777325630188,
0.09155140072107315,
0.06436330825090408,
-0.0052882409654557705,
0.1555110216140747,
-0.021823300048708916,
-0.03601229935884476,
0.13777680695056915,
-0.16739414632320404,
0.09351874142885208,
-0.009284975938498974,
-0.020433152094483376,
-0.06465592980384827,
-0.043260812759399414,
0.03914878889918327,
0.07856690138578415,
-0.17547059059143066,
-0.04565659910440445,
0.070294588804245,
-0.09175746142864227,
0.05148477479815483,
0.04385018348693848,
-0.08995816111564636,
0.016845276579260826,
-0.1235792487859726,
-0.005234155338257551,
-0.09575838595628738,
0.05333502218127251,
0.19501599669456482,
-0.028799783438444138,
0.00887447502464056,
-0.13470768928527832,
0.05796405300498009,
-0.030615560710430145,
-0.04378281533718109,
-0.08297271281480789
] |
null | null | transformers |
Starting from the BKAI Vietnamese Llama2 120GB 7B model, I continued pretraining on law and online public service documents crawled from VBPL.
### Training process
The model is pretrained on a single A600 system.
Hyperparameters are set as follows:
- Training Regime: BFloat16 mixed precision
- Lora Config:
```
{
"base_model_name_or_path": "meta-llama/Llama-2-7b-hf",
"bias": "none",
"enable_lora": null,
"fan_in_fan_out": false,
"inference_mode": true,
"lora_alpha": 32.0,
"lora_dropout": 0.05,
"merge_weights": false,
"modules_to_save": [
"embed_tokens",
"lm_head"
],
"peft_type": "LORA",
"r": 8,
"target_modules": [
"q_proj",
"v_proj",
"k_proj",
"o_proj",
"gate_proj",
"down_proj",
"up_proj"
],
"task_type": "CAUSAL_LM"
}
```
Please note that **this model requires further supervised fine-tuning (SFT)** to be used in practice!
Usage and other considerations: Please refer to the [Llama 2](https://github.com/facebookresearch/llama)
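As a rough, hedged sketch, loading the published checkpoint for inspection or further SFT might look like the snippet below; it assumes the repo hosts a standard transformers causal-LM checkpoint (as its transformers/safetensors tags suggest) rather than a raw PEFT adapter, and, as noted above, the model still needs SFT before practical use.

```python
# Hedged sketch: load this pretrained checkpoint with transformers.
# Assumes standard causal-LM weights (merged LoRA) in the repo, per its
# transformers/safetensors tags; the card states SFT is still required.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "huyhuyvu01/VietLlama2_Law_Pretrain_7B"  # this repository

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the BFloat16 training regime above
    device_map="auto",
)

# "What are the steps to obtain an ordinary passport?"
prompt = "Thủ tục cấp hộ chiếu phổ thông gồm những bước nào?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```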
### Training loss
To be updated.
### Disclaimer
This project is built upon bkai-foundation-models/vietnamese-llama2-7b-120GB, which is built upon Meta's Llama-2 model. It is essential to strictly adhere to the open-source license agreement of Llama-2 when using this model. If you incorporate third-party code, please ensure compliance with the relevant open-source license agreements.
It's important to note that the content generated by the model may be influenced by various factors, such as calculation methods, random elements, and potential inaccuracies in quantification. Consequently, this project does not offer any guarantees regarding the accuracy of the model's outputs, and it disclaims any responsibility for consequences resulting from the use of the model's resources and its output.
For those employing the models from this project for commercial purposes, developers must adhere to local laws and regulations to ensure the compliance of the model's output content. This project is not accountable for any products or services derived from such usage.
### Contact
[email protected] (persional email)
https://github.com/huyhuyvu01 (Github) | {"language": ["vi", "en"], "license": "llama2"} | text-generation | huyhuyvu01/VietLlama2_Law_Pretrain_7B | [
"transformers",
"safetensors",
"llama",
"text-generation",
"vi",
"en",
"license:llama2",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T07:19:04+00:00 | [] | [
"vi",
"en"
] | TAGS
#transformers #safetensors #llama #text-generation #vi #en #license-llama2 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
Starting from the BKAI Vietnamese Llama2 120GB 7B model, I continued pretraining on law and online public service documents crawled from VBPL.
### Training process
The model is pretrained on a single A600 system.
Hyperparameters are set as follows:
- Training Regime: BFloat16 mixed precision
- Lora Config:
Please note that this model requires further supervised fine-tuning (SFT) to be used in practice!
Usage and other considerations: Please refer to the Llama 2
### Training loss
To be updated.
### Disclaimer
This project is built upon bkai-foundation-models/vietnamese-llama2-7b-120GB, which is built upon Meta's Llama-2 model. It is essential to strictly adhere to the open-source license agreement of Llama-2 when using this model. If you incorporate third-party code, please ensure compliance with the relevant open-source license agreements.
It's important to note that the content generated by the model may be influenced by various factors, such as calculation methods, random elements, and potential inaccuracies in quantification. Consequently, this project does not offer any guarantees regarding the accuracy of the model's outputs, and it disclaims any responsibility for consequences resulting from the use of the model's resources and its output.
For those employing the models from this project for commercial purposes, developers must adhere to local laws and regulations to ensure the compliance of the model's output content. This project is not accountable for any products or services derived from such usage.
### Contact
huyhuyvu01@URL (personal email)
URL (Github) | [
"### Training process\nThe model is pretrain on a single A600 system. \n\nHyperparameters are set as follows:\n- Training Regime: BFloat16 mixed precision\n- Lora Config: \n \n \n\nPlease note that this model requires further supervised fine-tuning (SFT) to be used in practice!\n\nUsage and other considerations: Please refer to the Llama 2",
"### Training loss \nTo be updated.",
"### Disclaimer\n\nThis project is built upon bkai-foundation-models/vietnamese-llama2-7b-120GB, which is built upon Meta's Llama-2 model. It is essential to strictly adhere to the open-source license agreement of Llama-2 when using this model. If you incorporate third-party code, please ensure compliance with the relevant open-source license agreements.\nIt's important to note that the content generated by the model may be influenced by various factors, such as calculation methods, random elements, and potential inaccuracies in quantification. Consequently, this project does not offer any guarantees regarding the accuracy of the model's outputs, and it disclaims any responsibility for consequences resulting from the use of the model's resources and its output.\nFor those employing the models from this project for commercial purposes, developers must adhere to local laws and regulations to ensure the compliance of the model's output content. This project is not accountable for any products or services derived from such usage.",
"### Contact\nhuyhuyvu01@URL (persional email)\nURL (Github)"
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #vi #en #license-llama2 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training process\nThe model is pretrain on a single A600 system. \n\nHyperparameters are set as follows:\n- Training Regime: BFloat16 mixed precision\n- Lora Config: \n \n \n\nPlease note that this model requires further supervised fine-tuning (SFT) to be used in practice!\n\nUsage and other considerations: Please refer to the Llama 2",
"### Training loss \nTo be updated.",
"### Disclaimer\n\nThis project is built upon bkai-foundation-models/vietnamese-llama2-7b-120GB, which is built upon Meta's Llama-2 model. It is essential to strictly adhere to the open-source license agreement of Llama-2 when using this model. If you incorporate third-party code, please ensure compliance with the relevant open-source license agreements.\nIt's important to note that the content generated by the model may be influenced by various factors, such as calculation methods, random elements, and potential inaccuracies in quantification. Consequently, this project does not offer any guarantees regarding the accuracy of the model's outputs, and it disclaims any responsibility for consequences resulting from the use of the model's resources and its output.\nFor those employing the models from this project for commercial purposes, developers must adhere to local laws and regulations to ensure the compliance of the model's output content. This project is not accountable for any products or services derived from such usage.",
"### Contact\nhuyhuyvu01@URL (persional email)\nURL (Github)"
] | [
58,
84,
8,
233,
21
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #vi #en #license-llama2 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training process\nThe model is pretrain on a single A600 system. \n\nHyperparameters are set as follows:\n- Training Regime: BFloat16 mixed precision\n- Lora Config: \n \n \n\nPlease note that this model requires further supervised fine-tuning (SFT) to be used in practice!\n\nUsage and other considerations: Please refer to the Llama 2### Training loss \nTo be updated.### Disclaimer\n\nThis project is built upon bkai-foundation-models/vietnamese-llama2-7b-120GB, which is built upon Meta's Llama-2 model. It is essential to strictly adhere to the open-source license agreement of Llama-2 when using this model. If you incorporate third-party code, please ensure compliance with the relevant open-source license agreements.\nIt's important to note that the content generated by the model may be influenced by various factors, such as calculation methods, random elements, and potential inaccuracies in quantification. Consequently, this project does not offer any guarantees regarding the accuracy of the model's outputs, and it disclaims any responsibility for consequences resulting from the use of the model's resources and its output.\nFor those employing the models from this project for commercial purposes, developers must adhere to local laws and regulations to ensure the compliance of the model's output content. This project is not accountable for any products or services derived from such usage.### Contact\nhuyhuyvu01@URL (persional email)\nURL (Github)"
] | [
-0.0919533371925354,
0.14760653674602509,
-0.0030848446767777205,
0.016362106427550316,
0.07051864266395569,
-0.05911470204591751,
0.13610464334487915,
0.002405600156635046,
0.05102020874619484,
0.07078015059232712,
0.05068269371986389,
0.02582324855029583,
0.05578428506851196,
0.12469885498285294,
0.004302104469388723,
-0.08463641256093979,
0.07337380200624466,
-0.046791911125183105,
0.04017806798219681,
0.07779323309659958,
0.05563511326909065,
-0.06618381291627884,
0.10574928671121597,
-0.002046949462965131,
-0.07227201759815216,
0.026044420897960663,
0.02955561690032482,
-0.04812351241707802,
0.06435304135084152,
0.051885053515434265,
0.04558877646923065,
0.0696089118719101,
0.02867579460144043,
-0.17921748757362366,
0.014872283674776554,
0.0008265785872936249,
-0.03319110348820686,
0.011300760321319103,
0.05445228889584541,
0.03657558187842369,
0.07507872581481934,
-0.011257065460085869,
0.030819721519947052,
0.028712192550301552,
-0.11687309294939041,
0.03368502855300903,
-0.12353213876485825,
0.035634845495224,
0.13916628062725067,
0.0613473542034626,
-0.004599112551659346,
0.154830664396286,
-0.028538085520267487,
0.06409729272127151,
0.04665086045861244,
-0.3291419744491577,
-0.01572680100798607,
0.12754370272159576,
-0.0036120095755904913,
0.031832050532102585,
0.03767963871359825,
0.018897563219070435,
0.12268487364053726,
0.020197568461298943,
0.1059432253241539,
-0.07316827028989792,
-0.023346606642007828,
-0.0655427873134613,
-0.1575431227684021,
-0.0328754223883152,
0.2597822844982147,
0.020082036033272743,
-0.13450337946414948,
-0.08677539229393005,
-0.013971646316349506,
0.03996714949607849,
0.0580972358584404,
-0.015610173344612122,
0.042357370257377625,
0.0023483727127313614,
0.10078862309455872,
-0.02735389955341816,
-0.1304098665714264,
-0.05812748149037361,
-0.0502886027097702,
0.09526001662015915,
0.03928579017519951,
0.034301020205020905,
-0.05611554533243179,
0.10000517964363098,
-0.16474059224128723,
-0.08230796456336975,
-0.10950528085231781,
-0.06631825864315033,
0.00033351461752317846,
-0.03603259474039078,
0.005170759744942188,
-0.06260371208190918,
0.10292471200227737,
0.0966915562748909,
-0.17111974954605103,
-0.02433766983449459,
-0.024719303473830223,
0.046987298876047134,
0.09921003878116608,
0.08088196069002151,
-0.07794562727212906,
0.035170864313840866,
0.11606145650148392,
0.00950190145522356,
0.13936810195446014,
0.02259964495897293,
-0.02697039395570755,
0.030036266893148422,
-0.023734023794531822,
0.10777454078197479,
-0.0034766995813697577,
0.049405671656131744,
-0.04185275733470917,
0.024725308641791344,
0.24220581352710724,
-0.11643551290035248,
-0.014386245049536228,
-0.05376836657524109,
0.028663842007517815,
-0.029792916029691696,
0.09426002949476242,
0.04536094516515732,
-0.030459076166152954,
0.005572461057454348,
-0.0570901557803154,
-0.02207575924694538,
-0.06258129328489304,
-0.043617747724056244,
0.052102066576480865,
-0.00980961974710226,
0.009726835414767265,
-0.1646159589290619,
-0.13230276107788086,
0.02105814777314663,
0.00007338909927057102,
-0.032100431621074677,
-0.005131198093295097,
0.040804218500852585,
0.004250749945640564,
-0.10792345553636551,
-0.030673077329993248,
-0.148304745554924,
-0.030685843899846077,
0.04871934652328491,
0.05257338657975197,
0.01071673259139061,
-0.13891932368278503,
-0.00526089034974575,
-0.1642594337463379,
0.12107335776090622,
-0.06773008406162262,
0.016037309542298317,
0.02226932719349861,
0.03432884067296982,
-0.033775001764297485,
-0.029883358627557755,
-0.02628083899617195,
-0.0022629774175584316,
0.04596177488565445,
0.17455817759037018,
-0.08121282607316971,
0.02768024615943432,
0.16308344900608063,
-0.21429821848869324,
-0.14069119095802307,
0.10226182639598846,
0.012517670169472694,
0.08299025893211365,
0.07121478021144867,
0.06406748294830322,
0.0015364388236775994,
-0.1468503475189209,
-0.07871051132678986,
-0.02506815642118454,
-0.017114391550421715,
0.027701882645487785,
0.0244428850710392,
0.03470378741621971,
-0.016006680205464363,
0.023641027510166168,
-0.035206764936447144,
-0.02360924333333969,
0.0052890595979988575,
-0.06111527979373932,
-0.03695164993405342,
-0.07992339879274368,
-0.05653714761137962,
-0.019377512857317924,
-0.03085186891257763,
-0.027844849973917007,
0.0030131470412015915,
-0.031092289835214615,
0.07275894284248352,
-0.07912224531173706,
0.009012525901198387,
-0.09906868636608124,
0.1383616179227829,
-0.06639446318149567,
-0.018525592982769012,
-0.09504589438438416,
-0.062077224254608154,
0.08795348554849625,
-0.01849856972694397,
0.047911740839481354,
-0.029655542224645615,
0.008750754408538342,
0.15880970656871796,
-0.0140567971393466,
-0.03313547000288963,
0.05423497408628464,
-0.002712358022108674,
0.0018012686632573605,
-0.13145311176776886,
-0.014814786612987518,
-0.08700034767389297,
0.22393493354320526,
-0.16243566572666168,
0.03167473524808884,
0.058863718062639236,
0.02858368121087551,
0.015119761228561401,
-0.0049649449065327644,
0.05803542584180832,
0.012858693487942219,
-0.07049720734357834,
-0.012417025864124298,
0.00567256985232234,
0.05806034803390503,
-0.053016431629657745,
0.06142770126461983,
-0.11654949188232422,
0.12215345352888107,
0.07252730429172516,
0.03071613796055317,
-0.0993809923529625,
-0.10038352012634277,
-0.019030561670660973,
0.017861422151327133,
-0.05922813341021538,
0.009536604396998882,
0.0734964907169342,
0.009103355929255486,
0.06575058400630951,
-0.10363560169935226,
0.00011893878399860114,
0.05226382985711098,
-0.15351074934005737,
-0.07885202020406723,
0.004329233430325985,
0.08699075877666473,
-0.1471123844385147,
0.06654360145330429,
0.07446476072072983,
-0.0006039206054992974,
0.12253434956073761,
0.025019727647304535,
-0.035660069435834885,
-0.036308348178863525,
0.09862849861383438,
0.02516019158065319,
0.10109241306781769,
0.006143270991742611,
0.039902571588754654,
0.01838696002960205,
0.003164505586028099,
0.003323390381410718,
-0.13084906339645386,
-0.014480136334896088,
-0.008556770160794258,
-0.028347503393888474,
-0.056776486337184906,
-0.006250101141631603,
-0.06372128427028656,
0.09911121428012848,
-0.044342510402202606,
0.041594650596380234,
0.013309603556990623,
-0.04798581451177597,
-0.15614870190620422,
0.16073770821094513,
-0.07343865185976028,
-0.1946677714586258,
-0.19028936326503754,
0.04559572786092758,
-0.09963655471801758,
0.0037232120521366596,
0.06167663261294365,
-0.029232924804091454,
-0.09323138743638992,
-0.1461731195449829,
-0.03514881059527397,
0.018134193494915962,
-0.06351406872272491,
-0.08061164617538452,
0.024448638781905174,
0.012868707068264484,
-0.10587621480226517,
-0.05168034881353378,
-0.006106893066316843,
-0.1498258113861084,
0.07258284837007523,
-0.05177011340856552,
0.13121618330478668,
0.1532108038663864,
0.018390730023384094,
-0.03959984332323074,
-0.006209747865796089,
0.1704343557357788,
-0.00892927497625351,
0.05780203640460968,
0.25830575823783875,
-0.003697456791996956,
0.04807068780064583,
0.1675652265548706,
0.013399324379861355,
-0.10561558604240417,
0.05106228590011597,
-0.061360254883766174,
-0.10035324841737747,
-0.15888634324073792,
-0.06237439811229706,
-0.04743511229753494,
0.022091364488005638,
0.011098483577370644,
0.019825782626867294,
0.06668252497911453,
0.108283132314682,
-0.051717597991228104,
-0.020508430898189545,
0.0527462512254715,
0.10015912353992462,
0.10026615858078003,
0.0018351366743445396,
0.05997200310230255,
-0.07973119616508484,
0.056329015642404556,
0.09085403382778168,
0.04601440578699112,
0.17079445719718933,
-0.033348869532346725,
-0.012021787464618683,
0.1431780755519867,
0.1585354208946228,
-0.016875626519322395,
0.08413109183311462,
-0.05040251091122627,
0.03065207228064537,
-0.04354234039783478,
-0.14079619944095612,
-0.058733489364385605,
0.09975989907979965,
-0.11060451716184616,
0.05342132970690727,
-0.05057976394891739,
-0.04302409663796425,
0.012935920618474483,
0.18512143194675446,
0.09813261032104492,
-0.23161937296390533,
-0.07506972551345825,
0.07261046767234802,
0.08756665140390396,
-0.051613446325063705,
0.030169209465384483,
0.09397025406360626,
-0.10189975053071976,
0.10652349889278412,
-0.00002527391552575864,
0.013554797507822514,
-0.028341593220829964,
-0.0053354729898273945,
-0.027980022132396698,
0.005036644171923399,
-0.0013568004360422492,
0.07754504680633545,
-0.2012546807527542,
0.20704199373722076,
0.008130698464810848,
0.033234670758247375,
-0.013988365419209003,
0.0479755736887455,
-0.044649556279182434,
0.21395933628082275,
0.14361268281936646,
-0.005798810627311468,
0.09710036963224411,
-0.12701652944087982,
-0.03353884816169739,
0.05076306313276291,
-0.0031792004592716694,
-0.0009624728700146079,
0.07401084154844284,
0.02021968737244606,
0.005262948106974363,
-0.05024024099111557,
-0.016642000526189804,
-0.1551135927438736,
-0.14459019899368286,
0.05501365661621094,
0.022261861711740494,
0.012455493211746216,
-0.1078903004527092,
-0.004614969715476036,
-0.029617508873343468,
0.1691780537366867,
-0.1652103066444397,
-0.09328457713127136,
-0.04047892242670059,
-0.12600606679916382,
0.04468432813882828,
-0.04940266162157059,
0.06667190045118332,
-0.09368419647216797,
0.08339902013540268,
-0.05565641075372696,
-0.07688038796186447,
0.030999114736914635,
-0.1253064125776291,
-0.10935768485069275,
-0.023958181962370872,
0.10945526510477066,
0.03250013291835785,
-0.025972679257392883,
0.07859323173761368,
0.02240999974310398,
0.05577101558446884,
-0.13724222779273987,
-0.07364266365766525,
0.15242809057235718,
0.14625284075737,
0.0020357731264084578,
-0.12512914836406708,
-0.027256464585661888,
-0.07293897122144699,
-0.07453396916389465,
0.018550844863057137,
0.28401896357536316,
-0.056013837456703186,
0.08762387186288834,
0.16118869185447693,
-0.08838609606027603,
-0.1504879742860794,
-0.04460633546113968,
-0.033834222704172134,
0.018939634785056114,
0.09712696075439453,
-0.08178568631410599,
0.05579761415719986,
0.03966460004448891,
-0.02990221045911312,
-0.012816846370697021,
-0.20354117453098297,
-0.07803338766098022,
0.007736026309430599,
0.15561868250370026,
0.13504382967948914,
-0.13809283077716827,
-0.025136716663837433,
-0.02884732000529766,
-0.23751811683177948,
0.06911443918943405,
-0.11048559844493866,
0.03709506615996361,
0.0029925345443189144,
-0.010391253978013992,
0.019556347280740738,
-0.049337174743413925,
0.1527658849954605,
-0.05740649253129959,
0.06814879179000854,
-0.08181819319725037,
0.06189417839050293,
0.035160258412361145,
-0.0693378895521164,
0.18715651333332062,
0.00337304943241179,
0.028414327651262283,
-0.0019604566041380167,
-0.06883145123720169,
-0.009575068950653076,
0.08014747500419617,
-0.04132923483848572,
-0.0962778702378273,
-0.03222830966114998,
0.06437727063894272,
-0.0012478575808927417,
-0.010813111439347267,
-0.06401197612285614,
-0.08382634073495865,
0.04438839107751846,
0.27420908212661743,
0.20885635912418365,
-0.044009312987327576,
0.06671498715877533,
0.00524761900305748,
-0.04757746309041977,
0.04204497113823891,
-0.060113731771707535,
0.03342835605144501,
0.0584011971950531,
0.010592437349259853,
0.050101302564144135,
-0.022258074954152107,
-0.11076473444700241,
0.042407844215631485,
0.0800587385892868,
-0.13922114670276642,
-0.06637107580900192,
0.010078733786940575,
0.10719592869281769,
-0.041487179696559906,
0.12380144000053406,
0.1431264877319336,
-0.07003259658813477,
-0.00007416197331622243,
-0.03537366911768913,
0.07700344920158386,
-0.04301474243402481,
0.058974575251340866,
-0.04050791636109352,
0.015695199370384216,
-0.04941602796316147,
0.07936585694551468,
0.05252355709671974,
0.01947902701795101,
0.04053306207060814,
-0.058532338589429855,
-0.07691445201635361,
-0.05710214748978615,
-0.1291358470916748,
-0.027569681406021118,
-0.08608188480138779,
-0.14790716767311096,
-0.06189917027950287,
-0.11503420025110245,
-0.07734548300504684,
0.19803361594676971,
0.0522899404168129,
0.08177454769611359,
0.05408451706171036,
-0.029169030487537384,
-0.11779743432998657,
0.025568634271621704,
0.036028701812028885,
0.059630557894706726,
-0.13983695209026337,
0.05590631812810898,
0.07024617493152618,
0.08273942023515701,
-0.031673651188611984,
-0.05746414139866829,
-0.09819867461919785,
-0.019470389932394028,
-0.09692972153425217,
0.03093392588198185,
-0.10283738374710083,
0.0075174979865550995,
-0.024641579017043114,
0.008542117662727833,
-0.05679497495293617,
0.04975543171167374,
-0.0074010733515024185,
0.05077704042196274,
-0.06284084916114807,
0.1149754673242569,
-0.15339331328868866,
-0.049674540758132935,
0.036957647651433945,
-0.07711920887231827,
0.07669235020875931,
-0.07367987930774689,
-0.061641912907361984,
0.02801130712032318,
-0.12331393361091614,
0.10277239233255386,
0.04144558310508728,
0.02023097313940525,
-0.0056525650434195995,
-0.11378440260887146,
0.016020942479372025,
0.06275251507759094,
-0.09056634455919266,
0.021859178319573402,
0.09435759484767914,
-0.08014639467000961,
0.007916569709777832,
0.007823484018445015,
-0.0887557789683342,
-0.06167354807257652,
0.022762922570109367,
0.05690806731581688,
0.006263922434300184,
0.11278579384088516,
-0.10150649398565292,
-0.030762769281864166,
-0.11370823532342911,
-0.027587585151195526,
-0.024399854242801666,
0.018528901040554047,
-0.14732353389263153,
-0.052382368594408035,
0.001294653513468802,
0.05611151084303856,
0.23059304058551788,
-0.01023425068706274,
0.04702199250459671,
0.01768769696354866,
-0.07465646415948868,
0.1355990767478943,
0.02117050066590309,
-0.006753386463969946,
-0.01832868717610836,
0.017187373712658882,
-0.11703938245773315,
0.04341565817594528,
-0.0399443544447422,
-0.08897969126701355,
0.1128309965133667,
-0.016512803733348846,
-0.05326021835207939,
0.04193786159157753,
0.06734449416399002,
-0.03644970431923866,
0.0672459751367569,
-0.12709194421768188,
-0.008631612174212933,
-0.006695587653666735,
-0.021083179861307144,
0.08573199808597565,
0.16215559840202332,
-0.13762694597244263,
0.015906497836112976,
-0.0019429308595135808,
-0.03041045553982258,
-0.1019081398844719,
-0.19931749999523163,
-0.033819086849689484,
-0.07087801396846771,
0.00017080226098187268,
-0.10635266453027725,
0.013897344470024109,
0.03821178898215294,
0.010531744919717312,
-0.04308735579252243,
0.08415307104587555,
-0.030640017241239548,
-0.0001798678276827559,
-0.04459190368652344,
0.004500311799347401,
0.032171137630939484,
0.06187857314944267,
0.05946607142686844,
0.008972562849521637,
0.008357111364603043,
0.01943136937916279,
0.12235137820243835,
0.11507118493318558,
0.018944205716252327,
-0.020222699269652367,
-0.03464870527386665,
-0.026205100119113922,
0.003434516256675124,
0.0562717467546463,
0.11633093655109406,
0.06866076588630676,
-0.06987032294273376,
0.04609762877225876,
0.14881405234336853,
-0.03836733102798462,
-0.073548823595047,
-0.14734336733818054,
0.35484808683395386,
-0.039778295904397964,
0.03954322263598442,
0.009157280437648296,
-0.08600743860006332,
0.029129743576049805,
0.20026150345802307,
0.07210355997085571,
-0.02102297730743885,
0.0016280845738947392,
-0.11029131710529327,
0.0234740749001503,
-0.05622538924217224,
0.1115519180893898,
0.018846919760107994,
0.30011752247810364,
-0.060427092015743256,
0.03506835177540779,
-0.05228740721940994,
0.06598887592554092,
-0.08950985223054886,
0.011453812010586262,
-0.04810425266623497,
-0.0185726135969162,
-0.0501236617565155,
0.09160326421260834,
-0.0004018574545625597,
-0.07099214941263199,
0.020107079297304153,
-0.04088512808084488,
-0.025727886706590652,
0.028278101235628128,
-0.08028924465179443,
-0.06398880481719971,
0.06906452775001526,
-0.06653884798288345,
-0.019031325355172157,
0.1683443784713745,
0.036850906908512115,
-0.036657072603702545,
0.0022169409785419703,
0.09634727239608765,
0.03458007425069809,
0.23766005039215088,
0.02734019048511982,
0.13909795880317688,
0.06594772636890411,
-0.015197326429188251,
-0.07059188187122345,
0.12964345514774323,
0.02967793494462967,
-0.03357156366109848,
0.011503391899168491,
-0.005094922613352537,
-0.010933554731309414,
0.10286983847618103,
0.029283961281180382,
0.009563245810568333,
0.08420733362436295,
-0.03985615819692612,
0.03471285104751587,
-0.12470883131027222,
0.08044776320457458,
-0.10143807530403137,
0.15817390382289886,
0.051257140934467316,
-0.056773651391267776,
0.0011094348737969995,
-0.06674904376268387,
0.11137136071920395,
0.02686849609017372,
-0.010658369399607182,
0.05449555814266205,
-0.16894596815109253,
0.06484544277191162,
-0.02692975103855133,
0.03603949770331383,
-0.2556813955307007,
-0.0492730513215065,
0.012744319625198841,
0.031948961317539215,
-0.10932479798793793,
0.07251615822315216,
0.11777467280626297,
0.03457304462790489,
-0.07187113165855408,
-0.08377023041248322,
-0.022435123100876808,
0.08660291135311127,
-0.05563287064433098,
-0.10325400531291962
] |
null | null | null | This repo contains the llamafile version of the Phi 2 Electrical Engineering model. The llamafile was created from the GGUF format of the original model's quantized version.
# About llamafile
llamafile is a new framework that collapses all the complexity of large language models (LLMs) down to a single-file executable (called a "llamafile") that runs locally on most computers, with no installation. The first release of llamafile is a product of Mozilla’s innovation group and was developed by [Justine Tunney](https://justine.lol/); in short, as the [introductory post](https://hacks.mozilla.org/2023/11/introducing-llamafile/) puts it:
**llamafile lets you turn large language model (LLM) weights into executables.**
**Say you have a set of LLM weights in the form of a 4GB file (in the commonly-used GGUF format). With llamafile you can transform that 4GB file into a binary that runs on six OSes without needing to be installed.**
Basically, llamafile lets anyone distribute and run LLMs with a single file.
Here on [GitHub](https://github.com/Mozilla-Ocho/llamafile/issues/242#issuecomment-1930700064) you can find how I created a llamafile version from the GGUF version of another model.
# Model Description
## [From the original model card](https://huggingface.co/STEM-AI-mtl/phi-2-electrical-engineering)
These are the adapters from the LoRA fine-tuning of the phi-2 model from Microsoft. It was trained on the STEM-AI-mtl/Electrical-engineering dataset combined with garage-bAInd/Open-Platypus.
**Developed by: STEM.AI**
Model type: **Q&A and code generation**
Language(s) (NLP): **English**
Finetuned from model [optional]: **microsoft/phi-2**
Direct Use: **Q&A related to electrical engineering, and Kicad software. Creation of Python code in general, and for Kicad's scripting console.**
**Refer to microsoft/phi-2 model card for recommended prompt format.**
## GGUF version
The original model was quantized by the [Great Mr. Bloke](https://huggingface.co/TheBloke/phi-2-electrical-engineering-GGUF).
As per TheBloke's model card, the recommended file is phi-2-electrical-engineering.Q5_K_M.gguf, quantized with the Q5_K_M method; this GGUF file can take a maximum of 4.50 GB of RAM.
# How to run
To run the llamafile version on Windows, just rename the file by adding .exe at the end and run it as you would any other .exe file.
To run it on Linux, run it like any other binary; just make sure the llamafile is executable by running chmod.
That's it.
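For completeness, the same manual steps can be scripted. The sketch below is only a rough cross-platform illustration; the filename is an assumed placeholder for whichever llamafile you downloaded.

```python
# Rough sketch of the steps above: rename on Windows, chmod and execute on Linux.
import os
import stat
import subprocess
import sys

llamafile = "phi-2-electrical-engineering.Q5_K_M.llamafile"  # assumed local filename

if sys.platform.startswith("win"):
    target = llamafile + ".exe"   # Windows only executes files with an executable extension
    os.replace(llamafile, target)
else:
    # Equivalent of `chmod +x <file>`.
    os.chmod(llamafile, os.stat(llamafile).st_mode | stat.S_IXUSR)
    target = os.path.join(".", llamafile)

subprocess.run([target], check=True)  # launches the model just like running it from a shell
```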
# Prompt format
```
Instruction: <prompt> (without the <>)
Response:
```
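As a concrete illustration, the sketch below builds a prompt in that format and posts it to a locally running llamafile. The endpoint, port, and response field are assumptions based on the llama.cpp server that llamafile embeds; check your llamafile's console output for the actual address and API.

```python
# Sketch only: assumes the llamafile is already running and serving a llama.cpp-style HTTP API
# on its default local address; the endpoint and field names may differ between releases.
import requests

def build_prompt(instruction: str) -> str:
    # Follows the prompt format documented above.
    return f"Instruction: {instruction}\nResponse:"

payload = {
    "prompt": build_prompt("Explain the purpose of a pull-up resistor in a digital input circuit."),
    "n_predict": 256,    # assumed llama.cpp-style generation-length parameter
    "temperature": 0.2,
}

resp = requests.post("http://127.0.0.1:8080/completion", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json().get("content", resp.text))  # generated text is assumed to be returned under "content"
```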
Thank you for using this model.
| {"language": ["en"], "license": "other", "tags": ["phi-2", "electrical engineering", "Microsoft"], "datasets": ["STEM-AI-mtl/Electrical-engineering", "garage-bAInd/Open-Platypus"], "model_name": "Phi 2 Electrical Engineering", "base_model": "STEM-AI-mtl/phi-2-electrical-engineering", "inference": false, "license_link": "LICENSE", "license_name": "stem.ai.mtl", "model_creator": "mod", "model_type": "phi-msft", "prompt_template": "{prompt} ", "quantized_by": "TheBloke", "llamafile coverted by": "Shaikat"} | null | shaikatasif/Phi_2_Electrical_Engineering_llamafile | [
"phi-2",
"electrical engineering",
"Microsoft",
"en",
"dataset:STEM-AI-mtl/Electrical-engineering",
"dataset:garage-bAInd/Open-Platypus",
"base_model:STEM-AI-mtl/phi-2-electrical-engineering",
"license:other",
"region:us"
] | 2024-02-15T07:19:25+00:00 | [] | [
"en"
] | TAGS
#phi-2 #electrical engineering #Microsoft #en #dataset-STEM-AI-mtl/Electrical-engineering #dataset-garage-bAInd/Open-Platypus #base_model-STEM-AI-mtl/phi-2-electrical-engineering #license-other #region-us
| This repo contains the llamafile version of the Phi 2 Electrical Engineering model. The llamafile was created from the GGUF format of the original model's quantized version.
# About llamafile
llamafile is a new framework that collapses all the complexity of large language models (LLMs) down to a single-file executable (called a "llamafile") that runs locally on most computers, with no installation. The first release of llamafile is a product of Mozilla’s innovation group and was developed by Justine Tunney; in short, as the introductory post puts it:
llamafile lets you turn large language model (LLM) weights into executables.
Say you have a set of LLM weights in the form of a 4GB file (in the commonly-used GGUF format). With llamafile you can transform that 4GB file into a binary that runs on six OSes without needing to be installed.
Basically, llamafile lets anyone distribute and run LLMs with a single file.
Here on GitHub you can find how I created a llamafile version from the GGUF version of another model.
# Model Description
## From the original model card
These are the adapters from the LoRA fine-tuning of the phi-2 model from Microsoft. It was trained on the STEM-AI-mtl/Electrical-engineering dataset combined with garage-bAInd/Open-Platypus.
Developed by: STEM.AI
Model type: Q&A and code generation
Language(s) (NLP): English
Finetuned from model [optional]: microsoft/phi-2
Direct Use: Q&A related to electrical engineering, and Kicad software. Creation of Python code in general, and for Kicad's scripting console.
Refer to microsoft/phi-2 model card for recommended prompt format.
## GGUF version
The original model was quantized by the Great Mr. Bloke.
As per TheBloke's model card, the recommended file is phi-2-electrical-engineering.Q5_K_M.gguf, quantized with the Q5_K_M method; this GGUF file can take a maximum of 4.50 GB of RAM.
# How to run
To run the llamafile version on Windows, just rename the file by adding .exe at the end and run it as you would any other .exe file.
To run it on Linux, run it like any other binary; just make sure the llamafile is executable by running chmod.
That's it.
# Prompt format
Thank you for using this model.
| [
"# About llamafile\n\nllamafile is the new framework that collapses all the complexity of (large language models) LLMs down to a single-file executable (called a \"llamafile\") that runs locally on most computers, with no installation. The first release of llamafile is a product of Mozilla’s innovation group and developed by Justine Tunney, about llamafile in short as per introductory post\n\nllamafile lets you turn large language model (LLM) weights into executables.\n\nSay you have a set of LLM weights in the form of a 4GB file (in the commonly-used GGUF format). With llamafile you can transform that 4GB file into a binary that runs on six OSes without needing to be installed.\n\nBasically, llamafile lets anyone distribute and run LLMs with a single file.\n\nHere in github you can find how I created llamafile version from gguf version for other model.",
"# Model Description",
"## From the original model card \n\nThis is the adapters from the LoRa fine-tuning of the phi-2 model from Microsoft. It was trained on the STEM-AI-mtl/Electrical-engineering dataset combined with garage-bAInd/Open-Platypus.\n\nDeveloped by: STEM.AI\n\nModel type: Q&A and code generation\n\nLanguage(s) (NLP): English\n\nFinetuned from model [optional]: microsoft/phi-2\n\nDirect Use: Q&A related to electrical engineering, and Kicad software. Creation of Python code in general, and for Kicad's scripting console.\n\nRefer to microsoft/phi-2 model card for recommended prompt format.",
"## GGUF version\n\nThe original model was quantized by the Great Mr. Bloke. \n\n\nAs per TheBloke's model card the recommended model phi-2-electrical-engineering.Q5_K_M.gguf quantized with Q5_K_M method this gguf file can take maximum 4.50 GB of RAM.",
"# How to run \n\nTo run llamafile version in windows just rename the file by adding .exe at the last and run it as you run any other exe file. \n\nFor running in linux run like any binary in linux, just make sure the llamafile is executable by running chmod. \n\nThats it.",
"# Prompt format\n\n\n\n Thank you for using this model."
] | [
"TAGS\n#phi-2 #electrical engineering #Microsoft #en #dataset-STEM-AI-mtl/Electrical-engineering #dataset-garage-bAInd/Open-Platypus #base_model-STEM-AI-mtl/phi-2-electrical-engineering #license-other #region-us \n",
"# About llamafile\n\nllamafile is the new framework that collapses all the complexity of (large language models) LLMs down to a single-file executable (called a \"llamafile\") that runs locally on most computers, with no installation. The first release of llamafile is a product of Mozilla’s innovation group and developed by Justine Tunney, about llamafile in short as per introductory post\n\nllamafile lets you turn large language model (LLM) weights into executables.\n\nSay you have a set of LLM weights in the form of a 4GB file (in the commonly-used GGUF format). With llamafile you can transform that 4GB file into a binary that runs on six OSes without needing to be installed.\n\nBasically, llamafile lets anyone distribute and run LLMs with a single file.\n\nHere in github you can find how I created llamafile version from gguf version for other model.",
"# Model Description",
"## From the original model card \n\nThis is the adapters from the LoRa fine-tuning of the phi-2 model from Microsoft. It was trained on the STEM-AI-mtl/Electrical-engineering dataset combined with garage-bAInd/Open-Platypus.\n\nDeveloped by: STEM.AI\n\nModel type: Q&A and code generation\n\nLanguage(s) (NLP): English\n\nFinetuned from model [optional]: microsoft/phi-2\n\nDirect Use: Q&A related to electrical engineering, and Kicad software. Creation of Python code in general, and for Kicad's scripting console.\n\nRefer to microsoft/phi-2 model card for recommended prompt format.",
"## GGUF version\n\nThe original model was quantized by the Great Mr. Bloke. \n\n\nAs per TheBloke's model card the recommended model phi-2-electrical-engineering.Q5_K_M.gguf quantized with Q5_K_M method this gguf file can take maximum 4.50 GB of RAM.",
"# How to run \n\nTo run llamafile version in windows just rename the file by adding .exe at the last and run it as you run any other exe file. \n\nFor running in linux run like any binary in linux, just make sure the llamafile is executable by running chmod. \n\nThats it.",
"# Prompt format\n\n\n\n Thank you for using this model."
] | [
78,
210,
3,
156,
74,
67,
12
] | [
"passage: TAGS\n#phi-2 #electrical engineering #Microsoft #en #dataset-STEM-AI-mtl/Electrical-engineering #dataset-garage-bAInd/Open-Platypus #base_model-STEM-AI-mtl/phi-2-electrical-engineering #license-other #region-us \n# About llamafile\n\nllamafile is the new framework that collapses all the complexity of (large language models) LLMs down to a single-file executable (called a \"llamafile\") that runs locally on most computers, with no installation. The first release of llamafile is a product of Mozilla’s innovation group and developed by Justine Tunney, about llamafile in short as per introductory post\n\nllamafile lets you turn large language model (LLM) weights into executables.\n\nSay you have a set of LLM weights in the form of a 4GB file (in the commonly-used GGUF format). With llamafile you can transform that 4GB file into a binary that runs on six OSes without needing to be installed.\n\nBasically, llamafile lets anyone distribute and run LLMs with a single file.\n\nHere in github you can find how I created llamafile version from gguf version for other model.# Model Description## From the original model card \n\nThis is the adapters from the LoRa fine-tuning of the phi-2 model from Microsoft. It was trained on the STEM-AI-mtl/Electrical-engineering dataset combined with garage-bAInd/Open-Platypus.\n\nDeveloped by: STEM.AI\n\nModel type: Q&A and code generation\n\nLanguage(s) (NLP): English\n\nFinetuned from model [optional]: microsoft/phi-2\n\nDirect Use: Q&A related to electrical engineering, and Kicad software. Creation of Python code in general, and for Kicad's scripting console.\n\nRefer to microsoft/phi-2 model card for recommended prompt format."
] | [
-0.051050879061222076,
0.1838759481906891,
-0.003934354521334171,
0.004525717813521624,
0.12746146321296692,
0.014647496864199638,
0.10555046051740646,
0.12759436666965485,
-0.01097257249057293,
0.032857149839401245,
-0.04476693272590637,
0.09527967870235443,
0.08373471349477768,
0.11282794177532196,
0.14663004875183105,
-0.169986292719841,
0.03943318501114845,
-0.0381905734539032,
-0.03480842337012291,
0.027210058644413948,
0.052255310118198395,
-0.02147345431149006,
0.06865371763706207,
0.04129771143198013,
-0.0695069432258606,
0.03329557552933693,
-0.07848905771970749,
-0.004876350052654743,
0.022215891629457474,
0.094258613884449,
-0.001411511329934001,
-0.06956792622804642,
0.01661253347992897,
-0.15045106410980225,
0.020196041092276573,
0.04761321097612381,
-0.01740444265305996,
0.003668604651466012,
0.07835753262042999,
-0.015536576509475708,
0.23402710258960724,
0.031182842329144478,
0.005144117865711451,
0.05103190615773201,
0.006592712365090847,
-0.13304032385349274,
-0.07143295556306839,
0.12600938975811005,
0.12665630877017975,
0.05661415681242943,
0.02049459144473076,
-0.09982755780220032,
0.11603685468435287,
0.048669856041669846,
0.2052953839302063,
-0.10549231618642807,
-0.02863277867436409,
0.09797285497188568,
0.022097140550613403,
0.06889820843935013,
-0.03287508338689804,
0.0348554328083992,
0.010063535533845425,
0.027280276641249657,
0.05238819867372513,
-0.0412546768784523,
0.06160962954163551,
-0.026035096496343613,
-0.12909777462482452,
-0.010280050337314606,
0.01930590346455574,
-0.07280445098876953,
-0.07988569140434265,
-0.044541992247104645,
0.030938686802983284,
0.0005223134066909552,
0.029357654973864555,
0.08817226439714432,
0.0200374573469162,
0.01026642881333828,
0.1263693869113922,
-0.21368610858917236,
-0.09910321980714798,
-0.04647958651185036,
-0.06461883336305618,
0.27755007147789,
0.05311976373195648,
0.055278073996305466,
0.05089527741074562,
0.06841502338647842,
-0.07672902196645737,
-0.00044057416380383074,
-0.08332768082618713,
0.02823161706328392,
0.0025028272066265345,
0.02621416002511978,
-0.04285907372832298,
-0.06087839975953102,
0.052820585668087006,
0.049072813242673874,
0.022662373259663582,
-0.008870149962604046,
0.07920140773057938,
0.008347975090146065,
0.03011579066514969,
0.07035908102989197,
-0.1217431128025055,
0.022055180743336678,
0.05415220186114311,
-0.014369187876582146,
-0.010468551889061928,
-0.014484348706901073,
-0.10666926205158234,
-0.018035955727100372,
-0.037929028272628784,
0.09917449951171875,
0.0638997033238411,
0.014370396733283997,
0.01531999558210373,
-0.07898120582103729,
-0.013449556194245815,
-0.15151914954185486,
0.004686014261096716,
-0.05412798002362251,
-0.02511117048561573,
0.05216474458575249,
0.08243001252412796,
-0.0062193600460886955,
-0.05905911698937416,
-0.035122115164995193,
-0.03211798518896103,
-0.013681008480489254,
-0.11012545228004456,
-0.012534601613879204,
0.03965721279382706,
-0.0708012729883194,
-0.028245985507965088,
-0.15886501967906952,
-0.2513313591480255,
0.010370025411248207,
0.09899692982435226,
-0.002882384927943349,
-0.027626734226942062,
0.07578562945127487,
0.0005033990601077676,
-0.054653480648994446,
-0.009417223744094372,
0.025006216019392014,
-0.002382636535912752,
-0.0022687558084726334,
-0.038393206894397736,
0.040678225457668304,
-0.16848018765449524,
0.039538756012916565,
0.032096996903419495,
0.07261812686920166,
-0.16655485332012177,
0.06117286533117294,
-0.06513821333646774,
0.08380389958620071,
-0.10652569681406021,
0.02858980931341648,
0.04098605364561081,
0.0055899713188409805,
0.0037927215453237295,
-0.00805837195366621,
-0.10236936062574387,
-0.04310164973139763,
0.016188787296414375,
-0.11640472710132599,
0.004549844656139612,
0.028216488659381866,
0.03554460406303406,
0.1623762995004654,
0.1095978394150734,
0.1299229860305786,
0.2730076014995575,
-0.11931207031011581,
-0.05658004432916641,
0.017605498433113098,
0.02929774299263954,
-0.056511152535676956,
0.07038014382123947,
0.007989074103534222,
-0.04079589992761612,
0.04056074097752571,
-0.09081793576478958,
0.12151271104812622,
0.010942567139863968,
-0.04766365513205528,
0.01984952576458454,
-0.16428793966770172,
-0.024410320445895195,
-0.028861509636044502,
-0.027582388371229172,
0.09021912515163422,
-0.035696644335985184,
0.1027236059308052,
0.1577756106853485,
-0.09182314574718475,
0.03002786450088024,
-0.0625295415520668,
0.1886644959449768,
-0.0842759907245636,
0.012411676347255707,
-0.1029902771115303,
-0.09487167000770569,
0.0024564482737332582,
-0.18345557153224945,
0.07109005749225616,
-0.06633242964744568,
0.013469729572534561,
0.06433949619531631,
-0.02372562699019909,
0.03432375565171242,
-0.03902149200439453,
-0.024219049140810966,
-0.06012982130050659,
-0.012216759845614433,
-0.0734216570854187,
-0.03293880820274353,
0.06947106868028641,
0.018788019195199013,
0.0652768462896347,
-0.01241633202880621,
0.06746713817119598,
-0.011311598122119904,
-0.08477732539176941,
0.028845462948083878,
-0.052873894572257996,
-0.004889616742730141,
-0.029120177030563354,
0.026292214170098305,
0.038032397627830505,
-0.0473199225962162,
0.035166483372449875,
-0.0850178450345993,
0.024378130212426186,
0.04741528630256653,
0.11717244237661362,
-0.013131996616721153,
-0.06382334232330322,
0.004616164602339268,
-0.010163485072553158,
-0.026951923966407776,
-0.09238128364086151,
0.13366658985614777,
-0.0034994659945368767,
0.029656531289219856,
-0.07054242491722107,
-0.02032626047730446,
-0.0024186710361391306,
-0.027476632967591286,
0.0440559983253479,
0.02888183295726776,
0.05889107659459114,
0.011995191685855389,
0.034470584243535995,
0.047368455678224564,
-0.076909139752388,
0.16076694428920746,
0.015033852308988571,
-0.08204680681228638,
-0.0424005463719368,
-0.010906494222581387,
0.023750735446810722,
0.031862109899520874,
0.060930605977773666,
0.004066611640155315,
0.011154331266880035,
-0.01879669353365898,
0.09434831887483597,
-0.07955111563205719,
0.04622889682650566,
-0.0019145541591569781,
-0.06401921808719635,
0.12234075367450714,
0.0023295001592487097,
-0.07217199355363846,
0.027760639786720276,
0.044331811368465424,
0.13229778409004211,
-0.0065767886117100716,
0.010024083778262138,
-0.08386924862861633,
0.11232077330350876,
-0.07679277658462524,
-0.19878603518009186,
-0.2212722897529602,
-0.041306816041469574,
-0.07870706170797348,
-0.005569701548665762,
0.02891923487186432,
0.002607622416689992,
-0.046739932149648666,
-0.0627075731754303,
0.2056441456079483,
-0.0833808034658432,
0.03975817933678627,
0.010612488724291325,
-0.02454853057861328,
-0.024426603689789772,
-0.08801595121622086,
0.0247135441750288,
0.07409005612134933,
-0.10206986963748932,
-0.06292319297790527,
-0.0227756779640913,
0.05565359443426132,
0.034951478242874146,
0.008590085431933403,
-0.008087452501058578,
0.0030335113406181335,
0.17101064324378967,
-0.013764356262981892,
0.09430171549320221,
0.2131185233592987,
-0.015149778686463833,
0.07580796629190445,
0.038545556366443634,
-0.006362954154610634,
-0.012746412307024002,
-0.019169583916664124,
0.06948317587375641,
-0.11583469808101654,
-0.14599308371543884,
-0.011559385806322098,
-0.025950882583856583,
0.0028128589037805796,
0.024891790002584457,
0.06985151767730713,
0.053962964564561844,
0.11833953857421875,
-0.010381221771240234,
0.022921081632375717,
-0.003238087520003319,
0.08139770478010178,
0.06996220350265503,
-0.022868968546390533,
0.023501412943005562,
-0.08200270682573318,
0.071223184466362,
0.09703121334314346,
0.10839217901229858,
0.22784362733364105,
-0.053703147917985916,
0.025420984253287315,
0.06343074142932892,
0.0198668260127306,
-0.0062015545554459095,
0.10254216194152832,
-0.09411575645208359,
0.039927888661623,
0.017489789053797722,
-0.047709573060274124,
0.004133396781980991,
0.06903380155563354,
0.03159671649336815,
-0.020581835880875587,
0.004737038630992174,
-0.044787827879190445,
-0.034157756716012955,
0.1444414258003235,
-0.07680118083953857,
-0.12497396022081375,
-0.006421088241040707,
0.011116006411612034,
-0.0980437844991684,
-0.06542655825614929,
0.0034767843317240477,
0.026191437616944313,
-0.07891038805246353,
0.04316743463277817,
0.0070455255918204784,
0.06458226591348648,
-0.13405130803585052,
-0.05640169978141785,
0.07902137190103531,
0.06759411841630936,
0.022763961926102638,
0.09535381197929382,
-0.0576617494225502,
-0.05652843415737152,
0.04235183075070381,
0.0459633469581604,
-0.028548046946525574,
0.04548121243715286,
0.06518035382032394,
-0.010091018863022327,
0.06586337834596634,
0.023423580452799797,
0.09032637625932693,
-0.046891868114471436,
-0.11675728112459183,
0.014672539196908474,
0.02554660104215145,
-0.18201296031475067,
0.12979115545749664,
-0.03374618664383888,
0.044188469648361206,
-0.09286966919898987,
0.08938909322023392,
-0.02471003122627735,
-0.17511063814163208,
0.12373438477516174,
0.06280051171779633,
0.04317747801542282,
-0.09660675376653671,
-0.039143722504377365,
-0.04969719052314758,
0.1232556402683258,
-0.027660217136144638,
-0.16656607389450073,
-0.12554016709327698,
-0.10428998619318008,
0.14625278115272522,
-0.050183769315481186,
0.10359830409288406,
-0.05095240846276283,
0.09666106849908829,
-0.01638462208211422,
-0.1475563943386078,
-0.06685342639684677,
-0.07008425891399384,
-0.05228037387132645,
0.022148529067635536,
0.05973150208592415,
0.039481669664382935,
-0.016907459124922752,
0.013346522115170956,
-0.01608126610517502,
-0.030523953959345818,
-0.0882701724767685,
-0.03972886502742767,
0.23171932995319366,
-0.031158113852143288,
0.014202793128788471,
-0.08985256403684616,
0.034687526524066925,
-0.034713014960289,
-0.0866926833987236,
0.0019114033784717321,
0.09472715109586716,
-0.02300320193171501,
0.07201596349477768,
0.14663653075695038,
-0.13856521248817444,
-0.20217396318912506,
-0.11096826195716858,
0.1046096682548523,
0.012224316596984863,
-0.0070677874609827995,
-0.2589966654777527,
0.11866877228021622,
0.07053907215595245,
-0.039943769574165344,
0.1584790199995041,
-0.18914297223091125,
-0.06869857758283615,
0.03308482840657234,
0.08462195098400116,
0.06960232555866241,
-0.20442339777946472,
-0.053101547062397,
-0.03547517955303192,
-0.14415700733661652,
0.10678249597549438,
-0.13876235485076904,
0.11040978878736496,
-0.015493987128138542,
0.17868788540363312,
0.04243330657482147,
0.010629839263856411,
0.1276116520166397,
0.00031604920513927937,
-0.0302750077098608,
-0.05386786162853241,
0.055053431540727615,
-0.019474957138299942,
-0.10305539518594742,
0.08407124131917953,
-0.13320235908031464,
0.02320917509496212,
-0.14654897153377533,
-0.08542155474424362,
-0.044411398470401764,
0.04788457974791527,
-0.01665468141436577,
-0.0797235444188118,
-0.0776878297328949,
0.03882129117846489,
0.037035975605249405,
0.021804537624120712,
-0.09250946342945099,
-0.03751041740179062,
-0.004038815852254629,
0.15078625082969666,
0.021283358335494995,
0.04971446841955185,
-0.13629406690597534,
-0.05314793065190315,
-0.05145203694701195,
0.04493546858429909,
-0.1220519170165062,
0.040747448801994324,
0.06137434020638466,
-0.024255773052573204,
0.03887930139899254,
-0.011404349468648434,
-0.13413074612617493,
0.002714278409257531,
0.12488866597414017,
-0.04360688477754593,
-0.06156950816512108,
-0.025822756811976433,
0.13472028076648712,
-0.09266114234924316,
0.07534768432378769,
0.1335488110780716,
0.015511729754507542,
-0.03820108622312546,
0.005752597004175186,
0.04381262883543968,
-0.06363663077354431,
0.06374520808458328,
0.03509465605020523,
-0.029531748965382576,
-0.055538877844810486,
0.06959051638841629,
0.036532506346702576,
0.06788159906864166,
-0.011272312141954899,
0.10837597399950027,
-0.032767802476882935,
-0.07687200605869293,
-0.07426749914884567,
0.0037257950752973557,
-0.21963541209697723,
-0.07370877265930176,
0.06490786373615265,
-0.06407392024993896,
-0.0256927739828825,
0.10197251290082932,
0.014671294949948788,
0.0018347988370805979,
0.02582531049847603,
0.03097948431968689,
-0.002247757511213422,
0.0557001568377018,
-0.023957964032888412,
0.03488251939415932,
-0.14782394468784332,
0.048599909991025925,
0.0033618814777582884,
0.06828657537698746,
-0.00583281647413969,
0.012683301232755184,
-0.05707545951008797,
-0.006454944144934416,
-0.13821639120578766,
0.030168145895004272,
-0.09261735528707504,
0.0028391294181346893,
0.05404118448495865,
-0.0009017072152346373,
-0.0036484969314187765,
0.017615335062146187,
-0.07701049745082855,
-0.0543038547039032,
-0.08267401903867722,
0.03604845330119133,
-0.014345452189445496,
-0.035744693130254745,
0.04299049824476242,
-0.04503129422664642,
0.07955095916986465,
-0.01730181835591793,
0.0348343551158905,
0.028201663866639137,
0.01797715574502945,
0.08563205599784851,
-0.02537454664707184,
-0.018853655084967613,
-0.042891621589660645,
-0.07905010133981705,
0.002816802356392145,
-0.003257795237004757,
0.0007229456678032875,
-0.017063381150364876,
0.026058560237288475,
-0.0993262454867363,
0.006334382575005293,
-0.053782809525728226,
0.015521540306508541,
-0.037025533616542816,
-0.006785959471017122,
0.04912038892507553,
0.07550571858882904,
0.04377797245979309,
0.007803766988217831,
0.027300065383315086,
-0.02318960428237915,
0.025692909955978394,
0.007785136811435223,
0.06716001033782959,
0.09039753675460815,
-0.13512194156646729,
-0.04472796618938446,
0.03547140955924988,
0.12759512662887573,
0.035527508705854416,
-0.050712279975414276,
-0.04429717734456062,
-0.04036412015557289,
0.09835836291313171,
-0.012998717837035656,
0.10794574022293091,
-0.014924812130630016,
0.06759427487850189,
-0.02032952383160591,
0.006309386808425188,
-0.03145359084010124,
0.004377152305096388,
0.0422234833240509,
0.006993651390075684,
0.03109746053814888,
0.09574323147535324,
0.028486665338277817,
-0.015818743035197258,
-0.04462523013353348,
-0.06915463507175446,
-0.02649051509797573,
0.01207883469760418,
-0.06741581857204437,
0.1727912724018097,
0.13379555940628052,
-0.14278747141361237,
0.08389165997505188,
0.1068701446056366,
-0.015407184138894081,
-0.03662143647670746,
-0.11488331854343414,
0.0009306150604970753,
-0.05752839520573616,
-0.018753893673419952,
-0.04593473672866821,
0.03916759788990021,
0.06610725075006485,
0.026911811903119087,
0.034352246671915054,
0.11673272401094437,
-0.05694754049181938,
-0.12794426083564758,
0.0034833126701414585,
0.004803604446351528,
0.027876276522874832,
0.03571333363652229,
-0.006757275201380253,
0.025126898661255836,
0.037661269307136536,
0.05700773000717163,
0.0576583594083786,
-0.00014066617586649954,
0.06670194864273071,
-0.05539390817284584,
-0.000040106278902385384,
-0.018025116994976997,
-0.038780953735113144,
0.024897422641515732,
0.1656811684370041,
-0.02285299077630043,
-0.09735289961099625,
-0.00719451904296875,
0.05330738052725792,
-0.035358868539333344,
-0.010354374535381794,
-0.07053027302026749,
0.035464946180582047,
-0.12244588881731033,
-0.021969491615891457,
-0.04690953344106674,
-0.06980738788843155,
-0.05498051270842552,
0.2883734107017517,
-0.001389021403156221,
0.02293720841407776,
-0.02866111695766449,
0.033129602670669556,
-0.0031260414980351925,
-0.03310687467455864,
0.06617917120456696,
0.05448049306869507,
0.30289092659950256,
-0.01757085509598255,
0.07969187945127487,
-0.0037976980675011873,
0.00046634883619844913,
-0.12141665071249008,
0.0655340924859047,
-0.13872680068016052,
0.0322953462600708,
-0.0422712005674839,
0.0412832535803318,
-0.06496959924697876,
-0.2456788569688797,
0.078823983669281,
0.07212567329406738,
-0.04918082058429718,
-0.03827317804098129,
0.03556493669748306,
-0.020933538675308228,
0.02907039038836956,
-0.04295125603675842,
0.004598292056471109,
0.10394980013370514,
0.005830163601785898,
-0.13558900356292725,
-0.0712689682841301,
0.1327027678489685,
0.01948539912700653,
0.17618459463119507,
-0.03642755001783371,
0.035624466836452484,
0.04560810700058937,
-0.0704176127910614,
-0.14770014584064484,
0.029689131304621696,
0.00956942792981863,
-0.15904052555561066,
0.020410219207406044,
0.08443833142518997,
-0.029617972671985626,
0.002030538860708475,
-0.02112727053463459,
0.15449681878089905,
-0.017816729843616486,
0.09690690040588379,
0.026634713634848595,
-0.044520724564790726,
-0.02663496695458889,
-0.15937265753746033,
0.1496184766292572,
0.07541963458061218,
-0.045259684324264526,
-0.02876943349838257,
-0.07100564241409302,
0.05578663945198059,
0.02106611244380474,
-0.06547528505325317,
-0.02268732152879238,
-0.08554011583328247,
-0.06270583719015121,
-0.03162369132041931,
0.04363113269209862,
-0.19318191707134247,
0.011009684763848782,
-0.010723080486059189,
-0.02525998465716839,
-0.05347353219985962,
0.08038366585969925,
0.12957660853862762,
-0.009466881863772869,
-0.0198848657310009,
-0.13042396306991577,
-0.030156396329402924,
0.006843851879239082,
-0.12170339375734329,
-0.09546719491481781
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the wmt16 ro-en dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3574
- Bleu: 27.1318
- Gen Len: 42.5798
- Loss Smallest Subnet: 1.3574
- Bleu Smallest Subnet: 27.1318
- Gen Len Smallest Subnet: 42.5798
- Loss Random Subnet: 1.3574
- Loss Sum: 4.0723
- Bleu Random Subnet: 27.1318
- Bleu Sum: 81.3954
- Gen Len Random Subnet: 42.5798
- Gen Len Sum: 127.7394
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the corresponding training arguments follows the list):
- learning_rate: 5e-05
- train_batch_size: 12
- eval_batch_size: 24
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 48
- total_eval_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
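The list above maps roughly onto the following `Seq2SeqTrainingArguments` sketch (Transformers 4.20-era API). This is a reconstruction for illustration only, not the actual training script; the output directory and anything not listed above are assumptions.

```python
# Reconstruction of the hyperparameters listed above; not the original training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-wmt16-ro-en",     # assumed output path
    learning_rate=5e-05,
    per_device_train_batch_size=12,  # 4 GPUs -> total train batch size 48
    per_device_eval_batch_size=24,   # 4 GPUs -> total eval batch size 96
    seed=42,
    num_train_epochs=5.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    predict_with_generate=True,      # needed so BLEU can be computed from generated translations
)
```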
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len | Loss Smallest Subnet | Bleu Smallest Subnet | Gen Len Smallest Subnet | Loss Random Subnet | Loss Sum | Bleu Random Subnet | Bleu Sum | Gen Len Random Subnet | Gen Len Sum |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:--------------------:|:--------------------:|:-----------------------:|:------------------:|:--------:|:------------------:|:--------:|:---------------------:|:-----------:|
| 0.5967 | 1.0 | 12715 | 1.3820 | 26.593 | 42.4422 | 1.3820 | 26.593 | 42.4422 | 1.3820 | 4.1461 | 26.593 | 79.779 | 42.4422 | 127.3266 |
| 0.5768 | 2.0 | 25430 | 1.3728 | 26.6191 | 42.6738 | 1.3728 | 26.6191 | 42.6738 | 1.3728 | 4.1184 | 26.6191 | 79.8573 | 42.6738 | 128.0214 |
| 0.5663 | 3.0 | 38145 | 1.3616 | 26.9203 | 42.5298 | 1.3616 | 26.9203 | 42.5298 | 1.3616 | 4.0849 | 26.9203 | 80.7609 | 42.5298 | 127.5894 |
| 0.5523 | 4.0 | 50860 | 1.3570 | 27.0195 | 42.5203 | 1.3570 | 27.0195 | 42.5203 | 1.3570 | 4.0709 | 27.0195 | 81.0585 | 42.5203 | 127.5609 |
| 0.5436 | 5.0 | 63575 | 1.3574 | 27.1318 | 42.5798 | 1.3574 | 27.1318 | 42.5798 | 1.3574 | 4.0723 | 27.1318 | 81.3954 | 42.5798 | 127.7394 |
### Framework versions
- Transformers 4.20.1
- Pytorch 1.8.0
- Datasets 2.4.0
- Tokenizers 0.12.1
| {"language": ["en", "ro"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["wmt16"], "metrics": ["bleu"], "model-index": [{"name": "t5", "results": [{"task": {"type": "translation", "name": "Translation"}, "dataset": {"name": "wmt16 ro-en", "type": "wmt16", "args": "ro-en"}, "metrics": [{"type": "bleu", "value": 27.1318, "name": "Bleu"}]}]}]} | text2text-generation | wonjeongho/t5-wmt16-ro-en | [
"transformers",
"pytorch",
"elastic_t5",
"text2text-generation",
"generated_from_trainer",
"en",
"ro",
"dataset:wmt16",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-15T07:21:39+00:00 | [] | [
"en",
"ro"
] | TAGS
#transformers #pytorch #elastic_t5 #text2text-generation #generated_from_trainer #en #ro #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| t5
==
This model is a fine-tuned version of t5-small on the wmt16 ro-en dataset.
It achieves the following results on the evaluation set:
* Loss: 1.3574
* Bleu: 27.1318
* Gen Len: 42.5798
* Loss Smallest Subnet: 1.3574
* Bleu Smallest Subnet: 27.1318
* Gen Len Smallest Subnet: 42.5798
* Loss Random Subnet: 1.3574
* Loss Sum: 4.0723
* Bleu Random Subnet: 27.1318
* Bleu Sum: 81.3954
* Gen Len Random Subnet: 42.5798
* Gen Len Sum: 127.7394
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 12
* eval\_batch\_size: 24
* seed: 42
* distributed\_type: multi-GPU
* num\_devices: 4
* total\_train\_batch\_size: 48
* total\_eval\_batch\_size: 96
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5.0
### Training results
### Framework versions
* Transformers 4.20.1
* Pytorch 1.8.0
* Datasets 2.4.0
* Tokenizers 0.12.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 12\n* eval\\_batch\\_size: 24\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* total\\_train\\_batch\\_size: 48\n* total\\_eval\\_batch\\_size: 96\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.20.1\n* Pytorch 1.8.0\n* Datasets 2.4.0\n* Tokenizers 0.12.1"
] | [
"TAGS\n#transformers #pytorch #elastic_t5 #text2text-generation #generated_from_trainer #en #ro #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 12\n* eval\\_batch\\_size: 24\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* total\\_train\\_batch\\_size: 48\n* total\\_eval\\_batch\\_size: 96\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.20.1\n* Pytorch 1.8.0\n* Datasets 2.4.0\n* Tokenizers 0.12.1"
] | [
72,
147,
4,
31
] | [
"passage: TAGS\n#transformers #pytorch #elastic_t5 #text2text-generation #generated_from_trainer #en #ro #dataset-wmt16 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 12\n* eval\\_batch\\_size: 24\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* total\\_train\\_batch\\_size: 48\n* total\\_eval\\_batch\\_size: 96\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0### Training results### Framework versions\n\n\n* Transformers 4.20.1\n* Pytorch 1.8.0\n* Datasets 2.4.0\n* Tokenizers 0.12.1"
] | [
-0.11049036681652069,
0.09477531909942627,
-0.0014200173318386078,
0.07716193795204163,
0.15277165174484253,
0.043700624257326126,
0.10745809972286224,
0.1365129053592682,
-0.1322702318429947,
0.08449317514896393,
0.11734872311353683,
0.11100301146507263,
0.05725912004709244,
0.13531285524368286,
-0.0016595398774370551,
-0.2698667645454407,
0.008590824902057648,
0.011519263498485088,
-0.1595603972673416,
0.11137362569570541,
0.0949283018708229,
-0.10574844479560852,
0.08927199989557266,
0.017736047506332397,
-0.19192396104335785,
0.002027335111051798,
-0.028184067457914352,
-0.0423470064997673,
0.09900764375925064,
0.06481308490037918,
0.06373553723096848,
-0.005794917233288288,
0.08518767356872559,
-0.22808711230754852,
0.0002610132796689868,
0.08094684779644012,
0.0013538632774725556,
0.07708726823329926,
0.09082482010126114,
0.0014826556434854865,
0.2012353390455246,
-0.056548357009887695,
0.043934885412454605,
0.06145244091749191,
-0.1017046645283699,
-0.27608051896095276,
-0.09726153314113617,
0.07586262375116348,
0.08251915127038956,
0.08745349198579788,
-0.0272921621799469,
0.10795251280069351,
-0.0859462320804596,
0.07861995697021484,
0.22466114163398743,
-0.272698312997818,
-0.08145221322774887,
0.052688322961330414,
0.029875559732317924,
0.04037753865122795,
-0.10276751965284348,
-0.01903539150953293,
0.07280478626489639,
0.03949273005127907,
0.05779324471950531,
0.021714212372899055,
-0.00030343641992658377,
0.028595881536602974,
-0.15129944682121277,
-0.04805895686149597,
0.19891811907291412,
0.09425859898328781,
-0.024460284039378166,
-0.046577855944633484,
-0.038551513105630875,
-0.18470266461372375,
-0.015080643817782402,
0.0017132711363956332,
0.041494857519865036,
-0.045810993760824203,
-0.11499243974685669,
0.03692527115345001,
-0.06971336156129837,
-0.09469100832939148,
-0.007097579073160887,
0.1296604871749878,
0.07613666355609894,
0.015195658430457115,
-0.009853390976786613,
0.125628262758255,
0.026268742978572845,
-0.12240652740001678,
-0.02602744847536087,
0.01353688444942236,
-0.04655981808900833,
-0.015890995040535927,
-0.04784431308507919,
0.04020186513662338,
0.03851446881890297,
0.16012690961360931,
-0.0726662203669548,
0.04259328544139862,
0.08311610668897629,
0.012559039518237114,
-0.07775595784187317,
0.13594457507133484,
-0.09631878137588501,
-0.05571267381310463,
-0.038077469915151596,
0.06956855207681656,
-0.014557875692844391,
0.006379889789968729,
-0.06620138883590698,
0.03424503654241562,
0.09174679219722748,
0.021461158990859985,
-0.022632429376244545,
0.03796777129173279,
-0.04933326691389084,
-0.034427154809236526,
0.009004728868603706,
-0.08706075698137283,
0.017880279570817947,
0.005923398770391941,
-0.10463342070579529,
0.014229360036551952,
0.00601415429264307,
-0.0024942103773355484,
-0.006299263797700405,
0.09194197505712509,
-0.11599447578191757,
-0.025024347007274628,
-0.1104147806763649,
-0.09184493124485016,
0.038154665380716324,
-0.02731669507920742,
0.003223330480977893,
-0.08102333545684814,
-0.13644351065158844,
-0.0415949672460556,
0.05008436366915703,
-0.04315583407878876,
-0.08293189108371735,
-0.05061480030417442,
-0.08377771824598312,
0.024509543552994728,
0.0001844962389441207,
0.1668659895658493,
-0.052720677107572556,
0.10630564391613007,
0.07511292397975922,
0.05894102156162262,
0.04387008398771286,
0.04664945974946022,
-0.05873113498091698,
0.03818853199481964,
-0.09564102441072464,
0.049995627254247665,
-0.07978246361017227,
0.06771004945039749,
-0.10968592762947083,
-0.12291935086250305,
-0.012298831716179848,
-0.006091954652220011,
0.0706014335155487,
0.09355135262012482,
-0.11207534372806549,
-0.10393066704273224,
0.19353295862674713,
-0.07515829056501389,
-0.15450841188430786,
0.1206924170255661,
-0.0027701538056135178,
-0.024716375395655632,
0.045765917748212814,
0.10417773574590683,
0.07177631556987762,
-0.07261918485164642,
-0.05956403538584709,
-0.025200404226779938,
0.10900554805994034,
-0.01750567927956581,
0.12929503619670868,
-0.015558896586298943,
-0.016355860978364944,
0.03314472362399101,
-0.04611476510763168,
0.03816218301653862,
-0.1314171999692917,
-0.0958392471075058,
-0.05248795449733734,
-0.11164387315511703,
0.001931849867105484,
0.06259433180093765,
0.07546450197696686,
-0.1041620671749115,
-0.12713614106178284,
0.04205971211194992,
0.14717814326286316,
-0.065073661506176,
0.010747541673481464,
-0.08218619227409363,
0.09153448790311813,
-0.05903620645403862,
0.009376201778650284,
-0.15500985085964203,
-0.07995274662971497,
0.023600753396749496,
-0.06843610852956772,
0.011406820267438889,
0.011087315157055855,
0.05383245646953583,
0.09338856488466263,
-0.06183373183012009,
-0.030851591378450394,
-0.09745007753372192,
-0.025557609274983406,
-0.0763881579041481,
-0.22237645089626312,
-0.07293126732110977,
-0.005560119170695543,
0.15445736050605774,
-0.22682487964630127,
0.02718447521328926,
0.00007032789289951324,
0.12551942467689514,
0.014250469394028187,
-0.047280408442020416,
-0.055775877088308334,
0.06791111826896667,
-0.05819331854581833,
-0.06080089136958122,
0.033897824585437775,
-0.008101742714643478,
-0.07729119062423706,
-0.0655255988240242,
-0.09551164507865906,
0.1664569079875946,
0.12277441471815109,
-0.038982778787612915,
-0.08436315506696701,
0.0337737612426281,
-0.0924028605222702,
-0.05458363518118858,
-0.03795871511101723,
-0.0005582945886999369,
0.07841543853282928,
-0.023524532094597816,
0.11892247945070267,
-0.06864575296640396,
-0.05727125704288483,
0.037747740745544434,
-0.003489469178020954,
-0.014192484319210052,
0.14634427428245544,
0.12997154891490936,
-0.047172319144010544,
0.15100030601024628,
0.07710114866495132,
-0.0818009227514267,
0.14874760806560516,
-0.060154028236866,
-0.10466616600751877,
-0.03738481178879738,
0.009844556450843811,
0.0273855272680521,
0.11318561434745789,
-0.13901954889297485,
0.0005481138359755278,
0.02818361110985279,
0.023460406810045242,
0.05149368941783905,
-0.1992213875055313,
-0.016259880736470222,
0.02665828727185726,
-0.057590052485466,
-0.008753710426390171,
0.00035815942101180553,
-0.03644818067550659,
0.1029527559876442,
0.015713971108198166,
-0.005128782708197832,
0.0025624819099903107,
0.007627596613019705,
-0.09301886707544327,
0.21299061179161072,
-0.08964473754167557,
-0.1031542420387268,
-0.15584376454353333,
-0.004141191020607948,
-0.08163481205701828,
-0.0007533304742537439,
0.054537247866392136,
-0.12980812788009644,
-0.03431984782218933,
-0.05018860474228859,
0.07277103513479233,
-0.05027124658226967,
0.02748410776257515,
0.04194435477256775,
0.03539662808179855,
0.07885153591632843,
-0.11942130327224731,
0.023846697062253952,
-0.019089363515377045,
-0.10320388525724411,
0.017799081280827522,
0.01812570169568062,
0.12547101080417633,
0.14732933044433594,
0.03699755296111107,
0.029131878167390823,
-0.011839701794087887,
0.21292126178741455,
-0.07736700773239136,
-0.05422066152095795,
0.11296478658914566,
0.06857655942440033,
0.03139452636241913,
0.07540673017501831,
0.05704660341143608,
-0.09729429334402084,
0.03052343614399433,
0.07738278806209564,
-0.01833445020020008,
-0.24801521003246307,
-0.035893574357032776,
-0.05410703644156456,
0.014116701669991016,
0.08719027042388916,
0.04542640969157219,
0.013650468550622463,
0.0623122975230217,
-0.01793229579925537,
0.09092036634683609,
-0.04882539436221123,
0.06867531687021255,
0.07253868877887726,
0.05543554574251175,
0.11919423937797546,
-0.060375768691301346,
-0.02262544445693493,
0.06646623462438583,
-0.009215842932462692,
0.2562910318374634,
-0.013043863698840141,
0.12358593195676804,
0.08001992106437683,
0.1728798747062683,
-0.015939049422740936,
0.053430382162332535,
0.013442812487483025,
-0.01808808743953705,
0.01883845031261444,
-0.050378862768411636,
-0.01376631110906601,
0.042315367609262466,
0.015057200565934181,
0.06043078750371933,
-0.14525236189365387,
0.021050114184617996,
0.0642310306429863,
0.2732357084751129,
0.0649770200252533,
-0.32868820428848267,
-0.13248832523822784,
0.004984779749065638,
-0.04469519108533859,
-0.013308758847415447,
0.025252843275666237,
0.09740996360778809,
-0.10787414014339447,
0.05846244469285011,
-0.053497567772865295,
0.09486014395952225,
-0.025588661432266235,
-0.000012696022167801857,
0.11641758680343628,
0.128327876329422,
0.0053799753077328205,
0.09787458926439285,
-0.26209619641304016,
0.2647828161716461,
-0.008287414908409119,
0.07544483989477158,
-0.04046471416950226,
0.03395000100135803,
0.027929365634918213,
0.03170013427734375,
0.04573189839720726,
-0.0033209489192813635,
-0.06894426047801971,
-0.18217583000659943,
-0.06013615056872368,
0.04642164707183838,
0.11896494776010513,
-0.09397081285715103,
0.11969683319330215,
-0.03392203152179718,
0.0017944450955837965,
0.05117311701178551,
-0.009746501222252846,
-0.11270422488451004,
-0.09294447302818298,
0.01827925071120262,
-0.037767112255096436,
0.015095788985490799,
-0.0933573767542839,
-0.11957599222660065,
-0.10448935627937317,
0.1685982346534729,
-0.08455649018287659,
-0.012905576266348362,
-0.11657562106847763,
0.11736811697483063,
0.12957286834716797,
-0.09256324917078018,
0.03152388706803322,
-0.003970266319811344,
0.09324514865875244,
0.040805865079164505,
-0.05343017727136612,
0.09639772027730942,
-0.08083278685808182,
-0.22589141130447388,
-0.03993893787264824,
0.09969107061624527,
0.038350384682416916,
0.05644412338733673,
-0.05115249752998352,
0.009304999373853207,
-0.03696804493665695,
-0.11158659309148788,
0.05237460508942604,
0.04481738433241844,
0.0610649473965168,
0.05499915033578873,
-0.062480125576257706,
0.019244609400629997,
-0.038738660514354706,
-0.07273302972316742,
0.10373388975858688,
0.29289475083351135,
-0.08396346122026443,
-0.019881606101989746,
0.04811694473028183,
-0.061117496341466904,
-0.15261481702327728,
0.031141139566898346,
0.09599676728248596,
0.0207219198346138,
-0.007911112159490585,
-0.22634384036064148,
0.1050667017698288,
0.12862825393676758,
-0.02066759578883648,
0.1444276124238968,
-0.3210983872413635,
-0.13337258994579315,
0.0700574517250061,
0.1166282370686531,
0.020713407546281815,
-0.1853841096162796,
-0.05073901265859604,
-0.020732149481773376,
-0.18189288675785065,
0.10229147970676422,
-0.04697081819176674,
0.12582872807979584,
-0.03495561704039574,
0.042092930525541306,
-0.006063653621822596,
-0.05106521397829056,
0.1634581834077835,
0.01847473345696926,
0.05962248891592026,
-0.023731792345643044,
0.032717082649469376,
0.07862190902233124,
-0.043819691985845566,
0.013287763111293316,
-0.07742618024349213,
0.05270884931087494,
-0.11758149415254593,
-0.03506851568818092,
-0.0875990241765976,
0.03594302758574486,
-0.05362479388713837,
-0.04415874183177948,
-0.054748885333538055,
0.05467638373374939,
0.03814110904932022,
-0.0172355305403471,
0.11038453876972198,
0.024249669164419174,
0.18059371411800385,
0.06392684578895569,
0.027294190600514412,
-0.010359195992350578,
-0.08695671707391739,
-0.02189810574054718,
-0.00568766426295042,
0.0606590211391449,
-0.15908731520175934,
0.003972287755459547,
0.1403329223394394,
0.035118468105793,
0.13855375349521637,
0.08117375522851944,
-0.07830052077770233,
0.0232680793851614,
0.09150838851928711,
-0.11986593157052994,
-0.12523657083511353,
-0.029216809198260307,
-0.06747104972600937,
-0.1442670077085495,
0.07057492434978485,
0.06939361244440079,
-0.06791307032108307,
-0.01315565686672926,
-0.013766339980065823,
0.03470567613840103,
-0.07641422748565674,
0.23044580221176147,
0.05275525525212288,
0.0754784569144249,
-0.09516756236553192,
0.09955154359340668,
0.03904147446155548,
-0.11533419787883759,
0.0031994194723665714,
0.06945306062698364,
-0.05784740671515465,
-0.02107521891593933,
0.0435468927025795,
0.11402881890535355,
-0.06827989220619202,
-0.06255315989255905,
-0.1413603127002716,
-0.11445175856351852,
0.09780190885066986,
0.05724983662366867,
0.08417955040931702,
0.04825255274772644,
-0.009085549041628838,
0.05276884511113167,
-0.12602569162845612,
0.09749283641576767,
0.08391983062028885,
0.07979478687047958,
-0.15475596487522125,
0.13258309662342072,
0.02318127453327179,
0.04055773466825485,
-0.000907612731680274,
0.021605176851153374,
-0.12349496781826019,
-0.010069545358419418,
-0.11888396739959717,
-0.05796809121966362,
-0.058242205530405045,
0.0017047817818820477,
0.009094328619539738,
-0.04561924189329147,
-0.0837075263261795,
0.03863498941063881,
-0.11662645637989044,
-0.06504442542791367,
0.0060188001953065395,
0.08257614821195602,
-0.11416170746088028,
0.01759270578622818,
0.04299367219209671,
-0.09977532923221588,
0.07767858356237411,
0.05122464522719383,
0.059438757598400116,
0.0500975139439106,
-0.06931550055742264,
0.0372074618935585,
0.018878428265452385,
0.001977665815502405,
0.05156297609210014,
-0.1338125616312027,
0.024863477796316147,
-0.02622043527662754,
0.05755149945616722,
0.008018315769731998,
0.03745977580547333,
-0.13640256226062775,
-0.02060503512620926,
-0.05469757691025734,
-0.056782033294439316,
-0.057560764253139496,
0.03998453915119171,
0.06977944076061249,
0.022595496848225594,
0.16510044038295746,
-0.06698262691497803,
0.010660090483725071,
-0.23665839433670044,
-0.00402481434866786,
-0.01061631552875042,
-0.07499799132347107,
-0.07987788319587708,
-0.012839031405746937,
0.07198118418455124,
-0.05262162536382675,
0.1362648755311966,
-0.03685111179947853,
0.036144401878118515,
0.030318940058350563,
-0.0571318119764328,
0.027529597282409668,
0.02105223387479782,
0.2084321826696396,
0.04274117946624756,
0.006604830734431744,
0.03516332060098648,
0.031077075749635696,
0.08300964534282684,
0.06160400062799454,
0.19622069597244263,
0.11366865038871765,
-0.04904088377952576,
0.10262387990951538,
0.05561879649758339,
-0.1261722594499588,
-0.15656785666942596,
0.08444412797689438,
-0.06344887614250183,
0.12695622444152832,
-0.019195210188627243,
0.12920697033405304,
0.12602069973945618,
-0.19140876829624176,
0.029073085635900497,
-0.04346892982721329,
-0.07027861475944519,
-0.1115419864654541,
-0.036807894706726074,
-0.08010606467723846,
-0.2118861973285675,
0.03475358337163925,
-0.12942630052566528,
0.027059592306613922,
0.09768451750278473,
0.030830522999167442,
0.0020353198051452637,
0.1370432823896408,
0.02587069198489189,
0.027434784919023514,
0.04216407239437103,
0.0005893702618777752,
-0.04504380375146866,
-0.0494178831577301,
-0.08574660867452621,
0.009194094687700272,
-0.042021967470645905,
0.06617983430624008,
-0.06312088668346405,
-0.11427141726016998,
0.08960967510938644,
0.017427925020456314,
-0.06995059549808502,
0.011818980798125267,
-0.0007818352896720171,
0.05195653811097145,
0.04998398572206497,
0.008938344195485115,
0.00016719265840947628,
-0.01452202070504427,
0.21584612131118774,
-0.08824951946735382,
-0.04370992258191109,
-0.14571359753608704,
0.25089362263679504,
0.034498076885938644,
-0.022917237132787704,
0.05148034170269966,
-0.08688202500343323,
-0.027894707396626472,
0.1887550950050354,
0.18736520409584045,
-0.014553887769579887,
-0.05002756789326668,
0.03394056856632233,
-0.020091410726308823,
-0.02966197021305561,
0.10801409929990768,
0.11518801003694534,
0.09449438005685806,
-0.0612647570669651,
-0.049000680446624756,
-0.04694438353180885,
-0.03000710904598236,
-0.005044703371822834,
0.07675592601299286,
0.03121880441904068,
-0.020703407004475594,
-0.02013620361685753,
0.06907803565263748,
-0.07047885656356812,
-0.09381771832704544,
0.1016862615942955,
-0.1568942368030548,
-0.17996761202812195,
-0.03452460840344429,
0.07816679030656815,
-0.0008042479166761041,
0.07880689948797226,
-0.01228799857199192,
-0.03256829455494881,
0.11290180683135986,
-0.002421111101284623,
-0.11561037600040436,
-0.14754682779312134,
0.09367959201335907,
-0.04435078799724579,
0.21310965716838837,
-0.05314883589744568,
0.0337706059217453,
0.13907895982265472,
0.03355542570352554,
-0.0999884158372879,
0.05157928168773651,
0.0578443706035614,
-0.1011534333229065,
0.028023425489664078,
0.12558862566947937,
-0.03538387268781662,
0.09886763244867325,
0.029716208577156067,
-0.10018158704042435,
0.0007947888225317001,
-0.04519171267747879,
-0.017861774191260338,
-0.07183346152305603,
-0.008341713808476925,
-0.07438794523477554,
0.13659560680389404,
0.25132685899734497,
-0.06358828395605087,
-0.01258753053843975,
-0.07288156449794769,
0.06309203058481216,
0.0432918444275856,
0.1099909171462059,
-0.024601683020591736,
-0.2830948233604431,
0.0349867083132267,
-0.014264602214097977,
-0.006273791193962097,
-0.23892278969287872,
-0.08339092135429382,
0.04781134054064751,
-0.06310112774372101,
-0.07111173868179321,
0.11187884211540222,
0.11547049134969711,
0.04940648004412651,
-0.06688066571950912,
-0.10381605476140976,
-0.08159905672073364,
0.1647169589996338,
-0.16252201795578003,
-0.10963647067546844
] |
null | null | null |
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
# Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_path = "PATH_TO_THIS_REPO"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
model_path,
device_map="auto",
torch_dtype='auto'
).eval()
# Prompt content: "hi"
messages = [
{"role": "user", "content": "hi"}
]
input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
output_ids = model.generate(input_ids.to(model.device))  # send inputs to the device chosen by device_map instead of hard-coding 'cuda'
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
# Model response: "Hello! How can I assist you today?"
print(response)
``` | {"license": "other", "tags": ["autotrain", "text-generation"], "widget": [{"text": "I love AutoTrain because "}]} | text-generation | siumankwan23/dk1 | [
"safetensors",
"autotrain",
"text-generation",
"license:other",
"endpoints_compatible",
"region:us"
] | 2024-02-15T07:22:44+00:00 | [] | [] | TAGS
#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us
|
# Model Trained Using AutoTrain
This model was trained using AutoTrain. For more information, please visit AutoTrain.
# Usage
| [
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
"TAGS\n#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us \n",
"# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.",
"# Usage"
] | [
33,
29,
3
] | [
"passage: TAGS\n#safetensors #autotrain #text-generation #license-other #endpoints_compatible #region-us \n# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.# Usage"
] | [
-0.03320549428462982,
0.03780708089470863,
-0.0005784488166682422,
0.037439193576574326,
0.13256101310253143,
-0.02594633586704731,
0.22870999574661255,
0.04971681907773018,
-0.04270017519593239,
-0.08776232600212097,
0.19642603397369385,
0.16802352666854858,
-0.04566871374845505,
0.18935616314411163,
-0.02990073338150978,
-0.2414124757051468,
0.021885043010115623,
-0.025850016623735428,
0.1327640414237976,
0.11522045731544495,
0.14238014817237854,
-0.07779128849506378,
0.06120644509792328,
0.04086628183722496,
-0.20404933393001556,
0.03463415056467056,
0.07968573272228241,
-0.11895040422677994,
0.18004877865314484,
0.032886918634176254,
0.13635416328907013,
0.01931498385965824,
0.14652439951896667,
-0.12186150997877121,
0.014377960003912449,
0.01464270893484354,
-0.015491045080125332,
0.055415596812963486,
0.08804452419281006,
-0.038794226944446564,
0.09763352572917938,
0.177653506398201,
0.10883878171443939,
0.04911845549941063,
-0.10558086633682251,
-0.014727416448295116,
-0.03310466557741165,
0.018835384398698807,
0.12075160443782806,
0.1193094402551651,
-0.01845790445804596,
0.20021599531173706,
-0.14986595511436462,
0.07329507917165756,
-0.0995626449584961,
-0.27255508303642273,
-0.0038277229759842157,
0.21143054962158203,
0.07346842437982559,
-0.025004452094435692,
-0.12620827555656433,
0.06475763022899628,
0.12761425971984863,
0.0030757547356188297,
0.06504988670349121,
-0.015198786742985249,
-0.055105701088905334,
-0.0015243350062519312,
-0.07397002726793289,
-0.004598719999194145,
0.18640007078647614,
-0.07974611967802048,
-0.031184203922748566,
-0.12737500667572021,
-0.019428882747888565,
0.04709514603018761,
0.011552144773304462,
-0.09352482110261917,
-0.0217994824051857,
0.11079124361276627,
-0.007622338831424713,
-0.02531961165368557,
-0.15207529067993164,
-0.05755603685975075,
-0.08864409476518631,
0.04077286645770073,
0.0017509139142930508,
0.011538662947714329,
-0.09947098046541214,
0.12073534727096558,
-0.029350996017456055,
-0.0943499282002449,
0.052897434681653976,
-0.1107030138373375,
0.04635190963745117,
-0.11982002854347229,
-0.03970254212617874,
-0.10856737196445465,
0.013430505990982056,
0.22841021418571472,
0.1669083684682846,
-0.015314205549657345,
-0.08587565273046494,
0.039016176015138626,
0.02371702343225479,
0.09614221751689911,
0.06376225501298904,
-0.015822242945432663,
0.06775996834039688,
-0.04785482585430145,
-0.017039362341165543,
-0.025495992973446846,
-0.1726902425289154,
0.032083623111248016,
0.01997307874262333,
0.07117509841918945,
-0.0760226845741272,
0.06040170043706894,
-0.01951628364622593,
0.055283352732658386,
0.05161101743578911,
-0.031190861016511917,
0.03744623437523842,
-0.052504897117614746,
0.01617865450680256,
-0.09791388362646103,
0.0286922138184309,
0.1180110052227974,
0.03286140412092209,
0.1336720734834671,
-0.09649777412414551,
-0.026225421577692032,
-0.1056324690580368,
-0.03878350928425789,
0.018166208639740944,
-0.0019215025240555406,
0.0628642737865448,
-0.19663763046264648,
-0.30395275354385376,
-0.027070891112089157,
0.053043100982904434,
-0.019671862944960594,
-0.05561401695013046,
-0.07015043497085571,
0.016289202496409416,
0.059536442160606384,
-0.02920805849134922,
0.054385289549827576,
-0.022419849410653114,
0.03813159465789795,
-0.07676586508750916,
-0.02052054926753044,
-0.06291672587394714,
0.006658008787781,
-0.14841435849666595,
-0.03448035567998886,
-0.030017102137207985,
0.006548900622874498,
-0.03775618225336075,
0.16895608603954315,
-0.011088937520980835,
0.047757651656866074,
-0.05747115612030029,
0.05074193328619003,
0.007877329364418983,
0.1440490484237671,
-0.1335235834121704,
0.005429679993540049,
0.1511751264333725,
-0.11302075535058975,
-0.10663392394781113,
0.09467647224664688,
-0.10317569971084595,
0.23649843037128448,
0.10416192561388016,
0.13955152034759521,
0.05125761032104492,
-0.12630151212215424,
0.11601320654153824,
0.03282208740711212,
-0.08780468255281448,
-0.062369491904973984,
-0.0006791196065023541,
-0.034443121403455734,
-0.22099432349205017,
0.031658004969358444,
0.11068084836006165,
0.07476310431957245,
-0.03403317928314209,
-0.08304393291473389,
-0.02895026095211506,
-0.058612581342458725,
0.03986813873052597,
0.016017582267522812,
0.12599535286426544,
-0.07699156552553177,
-0.02858225256204605,
0.032077912241220474,
0.038467586040496826,
0.07923582941293716,
-0.054815541952848434,
-0.057291675359010696,
-0.01996961608529091,
-0.023569827899336815,
-0.00915558822453022,
-0.0898597314953804,
-0.0620407834649086,
-0.006840218789875507,
0.1304454207420349,
0.03466487303376198,
0.07167287915945053,
0.0362425372004509,
0.052633073180913925,
-0.028641145676374435,
0.002677651820704341,
0.1629824936389923,
0.04459667578339577,
-0.12675853073596954,
-0.08582112193107605,
0.10815013945102692,
-0.07446087151765823,
0.1071702167391777,
-0.2590586841106415,
0.028333326801657677,
-0.11371348798274994,
0.08611167222261429,
-0.013308924622833729,
0.06491301208734512,
-0.08320876955986023,
0.024355897679924965,
-0.08930765837430954,
-0.008432179689407349,
0.05678462237119675,
0.04953930526971817,
-0.02282531000673771,
0.12372811883687973,
-0.1432238668203354,
0.21934939920902252,
0.1198250874876976,
-0.09310522675514221,
-0.11077594012022018,
-0.0739443302154541,
0.009118417277932167,
-0.005148864816874266,
-0.1179550290107727,
0.005491754971444607,
0.076014444231987,
-0.04686584323644638,
0.1847466230392456,
-0.034107014536857605,
-0.03428659960627556,
-0.015382813289761543,
-0.08532355725765228,
-0.009268855676054955,
-0.02073976956307888,
0.09649215638637543,
-0.2238936424255371,
0.1325010061264038,
0.16212041676044464,
-0.015046309679746628,
0.1718226969242096,
0.01847519353032112,
0.013679388910531998,
0.006052343640476465,
-0.04082776978611946,
-0.00007846848893677816,
0.02128027006983757,
0.0015916629927232862,
0.0011914868373423815,
0.007707077544182539,
0.02131907269358635,
0.030305195599794388,
-0.14438240230083466,
-0.05413905158638954,
0.010167223401367664,
0.052466847002506256,
0.00018202696810476482,
0.0614926852285862,
-0.08105885237455368,
0.05735839903354645,
-0.0333511158823967,
-0.11407014727592468,
0.12527471780776978,
0.0140310637652874,
-0.12375999987125397,
0.1809239387512207,
-0.09875242412090302,
-0.177916020154953,
-0.19897617399692535,
-0.11664178967475891,
0.025174645707011223,
0.09509945660829544,
0.06778308749198914,
-0.06591268628835678,
-0.0677633062005043,
-0.013884147629141808,
-0.13205823302268982,
0.015237858518958092,
-0.0303916335105896,
-0.10815607011318207,
0.06643082201480865,
0.002197817200794816,
-0.1106930822134018,
-0.04751880466938019,
0.012397545389831066,
-0.05212624743580818,
0.06534521281719208,
-0.032029394060373306,
0.06015416979789734,
0.12733860313892365,
-0.009645693004131317,
0.014830506406724453,
-0.03892328962683678,
0.1736617386341095,
-0.07863081991672516,
0.0028175772167742252,
0.11224561184644699,
-0.04382455348968506,
0.03531843051314354,
0.2027312070131302,
0.03458266332745552,
-0.07247956842184067,
0.06938916444778442,
-0.03509911522269249,
-0.05979844182729721,
-0.202435702085495,
-0.10123657435178757,
-0.007523522712290287,
-0.02823515795171261,
0.08373580127954483,
0.0565473809838295,
0.25448861718177795,
0.1288231760263443,
0.060374923050403595,
0.03997355327010155,
0.024889161810278893,
0.0913970097899437,
0.1029813289642334,
-0.027027886360883713,
0.16222402453422546,
-0.08429007232189178,
-0.14650671184062958,
0.048164136707782745,
-0.022769063711166382,
0.07281020283699036,
0.17174853384494781,
-0.06210782378911972,
0.04705783352255821,
0.11571547389030457,
0.13094793260097504,
0.12702703475952148,
0.07746905833482742,
-0.061997704207897186,
-0.006629003677517176,
0.0010869213147088885,
-0.04415592923760414,
0.14652740955352783,
-0.060009948909282684,
-0.06889448314905167,
-0.04306207224726677,
-0.003198902355507016,
0.04323491454124451,
0.05818231403827667,
0.026216039434075356,
-0.28657910227775574,
0.042942874133586884,
0.04888097196817398,
-0.05969006195664406,
-0.11467164009809494,
0.09232109785079956,
-0.027857046574354172,
-0.18361465632915497,
0.03563778102397919,
-0.033283449709415436,
0.09147034585475922,
0.062072351574897766,
0.04841171205043793,
-0.06585943698883057,
-0.0609852597117424,
-0.045712124556303024,
0.15376420319080353,
-0.33846980333328247,
0.20756816864013672,
-0.011205663904547691,
0.08115556091070175,
-0.10785048454999924,
0.010794016532599926,
0.08773794025182724,
0.19103488326072693,
0.12050216645002365,
-0.049261946231126785,
-0.19848455488681793,
-0.11937171965837479,
-0.08363119512796402,
-0.015415008179843426,
0.02001480758190155,
-0.008096402511000633,
0.0008919041720218956,
-0.11757626384496689,
0.0014032695908099413,
0.04126403480768204,
-0.0069845812395215034,
-0.17894983291625977,
-0.15384836494922638,
-0.03538630157709122,
0.030474675819277763,
0.10934672504663467,
-0.04776112735271454,
-0.0534328930079937,
-0.06292759627103806,
0.13548673689365387,
0.026695549488067627,
0.008182995021343231,
-0.1301279366016388,
-0.053804632276296616,
-0.044131867587566376,
-0.023950019851326942,
0.07710648328065872,
0.009424211457371712,
0.11959850043058395,
-0.08615647256374359,
-0.06447352468967438,
0.09218238294124603,
-0.12910714745521545,
-0.042984966188669205,
-0.12177132815122604,
0.03449074551463127,
-0.045684002339839935,
-0.01073586754500866,
0.11459703743457794,
0.04736353084445,
-0.07455705851316452,
-0.06686578691005707,
-0.016151487827301025,
-0.0162202138453722,
0.052238523960113525,
-0.10140960663557053,
-0.11989933252334595,
-0.12391869723796844,
-0.023699220269918442,
-0.11985665559768677,
0.1933230459690094,
0.14995472133159637,
-0.08873795717954636,
0.15256796777248383,
0.2099498212337494,
-0.11413656920194626,
-0.29302918910980225,
-0.05128840357065201,
-0.06601350009441376,
0.004299632739275694,
0.06156041473150253,
-0.10058135539293289,
0.1023014560341835,
0.016915474086999893,
-0.08869403600692749,
-0.016260353848338127,
-0.10926515609025955,
-0.16224952042102814,
0.22960300743579865,
-0.0020108406897634268,
0.18459931015968323,
-0.07568172365427017,
-0.05459576100111008,
-0.12268339842557907,
0.05030543729662895,
0.043312136083841324,
-0.06949128210544586,
0.04921199381351471,
0.045118432492017746,
0.04848489910364151,
0.02309754677116871,
-0.04944291338324547,
0.05402865633368492,
-0.07527824491262436,
0.09563448280096054,
-0.16834798455238342,
-0.019022751599550247,
0.05676575005054474,
-0.027846379205584526,
0.11607834696769714,
-0.040225449949502945,
0.045501600950956345,
-0.05838647112250328,
-0.07079911977052689,
0.02105431631207466,
0.07136379927396774,
-0.007516450714319944,
-0.11632271111011505,
0.009460309520363808,
0.0020681610330939293,
-0.007515698205679655,
-0.07468903809785843,
0.01720641367137432,
-0.009510648436844349,
0.14864802360534668,
0.13830016553401947,
0.2062399536371231,
-0.06995580345392227,
0.06706579029560089,
-0.03199863061308861,
-0.11711113899946213,
0.07805433124303818,
-0.07166967540979385,
0.004296483471989632,
0.05220668390393257,
-0.0538930743932724,
0.14611311256885529,
0.06082209199666977,
0.003751826472580433,
-0.01890469156205654,
0.16250212490558624,
-0.16876746714115143,
0.04684048146009445,
-0.0843876302242279,
0.1279323697090149,
0.04778100550174713,
-0.03293748199939728,
0.09026376903057098,
-0.07791304588317871,
-0.03329215198755264,
-0.0002585914626251906,
0.006090222392231226,
-0.038581836968660355,
0.06518552452325821,
0.04536600783467293,
0.02252393215894699,
-0.06704199314117432,
0.0445764996111393,
0.07239795476198196,
0.016518399119377136,
0.041721411049366,
0.015846284106373787,
-0.09952405095100403,
-0.09522253274917603,
0.04372299090027809,
0.26397231221199036,
-0.1863422393798828,
-0.09990737587213516,
0.004564397502690554,
-0.09345841407775879,
0.004960347898304462,
0.08620705455541611,
0.0809662714600563,
0.04341237619519234,
-0.03603934869170189,
-0.02565331570804119,
-0.11602527648210526,
0.08217493444681168,
-0.015696978196501732,
0.05509110167622566,
-0.16319575905799866,
0.06676459312438965,
-0.030968010425567627,
-0.008549565449357033,
-0.08279257267713547,
-0.010031647980213165,
-0.11571928858757019,
0.026098787784576416,
-0.10430167615413666,
-0.03189973905682564,
-0.041006896644830704,
-0.011233619414269924,
0.05850789323449135,
-0.011018243618309498,
-0.013110441155731678,
-0.01927962154150009,
-0.08805359154939651,
0.02887921780347824,
-0.0008198951254598796,
0.04547540098428726,
-0.05460818111896515,
-0.024217726662755013,
0.037278566509485245,
0.004562355112284422,
0.046250831335783005,
0.012032478116452694,
-0.0011190201621502638,
0.049139540642499924,
-0.14732354879379272,
0.009436994791030884,
0.06159417703747749,
-0.0016145178815349936,
0.0070913624949753284,
-0.028678715229034424,
0.005330502521246672,
0.09783722460269928,
0.018718764185905457,
0.04128317907452583,
-0.0048657008446753025,
-0.1091027706861496,
0.014511657878756523,
0.10307195782661438,
-0.14174701273441315,
-0.03145497664809227,
-0.052812907844781876,
0.01100962609052658,
-0.05524790287017822,
0.23351503908634186,
-0.11669892817735672,
0.04470064863562584,
-0.02692001312971115,
0.030550040304660797,
-0.05822846665978432,
-0.10757116973400116,
-0.12190251797437668,
-0.0954190194606781,
-0.042861051857471466,
0.007703589275479317,
0.2689315676689148,
0.1459355354309082,
-0.008143693208694458,
0.0415508970618248,
0.07256698608398438,
0.09993022680282593,
0.001325596240349114,
0.22187061607837677,
0.09407079964876175,
-0.011255222372710705,
-0.12900875508785248,
0.0802748054265976,
0.027718892320990562,
-0.10550516843795776,
0.0003671931044664234,
0.017833324149250984,
-0.07709381729364395,
0.05998256057500839,
0.04779348149895668,
-0.04618219658732414,
-0.11530262231826782,
-0.1887446641921997,
-0.1010153517127037,
0.01362328790128231,
-0.09494820982217789,
-0.00841664057224989,
0.17340072989463806,
-0.07381404936313629,
-0.020257510244846344,
-0.08453129231929779,
-0.042230453342199326,
-0.21403644979000092,
-0.1685105264186859,
-0.09951409697532654,
-0.07172851264476776,
0.054574232548475266,
-0.01444533746689558,
0.051937036216259,
0.0384058877825737,
0.03334033116698265,
-0.0690227821469307,
0.10118697583675385,
-0.11317354440689087,
0.006825347896665335,
-0.007538147736340761,
-0.042660877108573914,
0.007157159503549337,
-0.17031751573085785,
-0.023363124579191208,
-0.1397811770439148,
-0.04669688642024994,
-0.031707603484392166,
-0.04375086724758148,
0.0007692996296100318,
-0.003963754046708345,
-0.03139100596308708,
-0.009807240217924118,
-0.01006900705397129,
0.03744599595665932,
0.023235660046339035,
0.05043753236532211,
0.022183645516633987,
0.01541586872190237,
0.043549589812755585,
0.21836970746517181,
-0.03527946025133133,
-0.18426218628883362,
-0.12376350164413452,
0.24631790816783905,
0.03293769061565399,
0.11490416526794434,
-0.07057193666696548,
-0.01361043006181717,
0.07598087936639786,
0.31235218048095703,
0.2598150074481964,
-0.03414434567093849,
0.010121017694473267,
-0.03132476285099983,
-0.014958096668124199,
-0.0064048562198877335,
0.18490195274353027,
0.008828791789710522,
0.16826002299785614,
-0.0621221587061882,
0.059055350720882416,
-0.016177164390683174,
-0.07808512449264526,
-0.06689254939556122,
0.14256809651851654,
-0.036333873867988586,
-0.02151089534163475,
-0.01796986348927021,
0.08792226016521454,
-0.0589551106095314,
0.17949369549751282,
-0.09007178992033005,
-0.009130639024078846,
-0.04809116572141647,
0.053617071360349655,
0.11827872693538666,
-0.02074413187801838,
0.03285614401102066,
-0.03567332774400711,
-0.018393725156784058,
0.0029441264923661947,
-0.04050283133983612,
-0.07413910329341888,
-0.04345672205090523,
0.06311136484146118,
0.02551795169711113,
0.25671228766441345,
-0.009337767027318478,
0.05477561056613922,
0.07988451421260834,
-0.0020537625532597303,
-0.10351628065109253,
0.11267323791980743,
0.00224103475920856,
-0.029008302837610245,
0.12491703033447266,
-0.015443749725818634,
0.007564615458250046,
-0.01867114193737507,
-0.01239294558763504,
-0.15698960423469543,
0.14728498458862305,
-0.10142818093299866,
-0.08940913528203964,
-0.05584051460027695,
0.12545742094516754,
-0.032320525497198105,
0.16258437931537628,
0.05726946145296097,
-0.026426637545228004,
0.0021389273460954428,
-0.0331779383122921,
0.08067825436592102,
0.009919043630361557,
-0.09914126992225647,
-0.02203422784805298,
-0.17707498371601105,
-0.016973769292235374,
0.12876249849796295,
-0.02544221095740795,
-0.24601322412490845,
-0.07971391826868057,
-0.06824030727148056,
-0.04311496391892433,
-0.1386985182762146,
0.07398401945829391,
0.2028772532939911,
0.019287997856736183,
-0.01476763840764761,
-0.1369636058807373,
-0.021961720660328865,
0.019149890169501305,
-0.026857441291213036,
-0.10799262672662735
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Yaseen-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent training arguments follows this list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
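
For reference, a minimal sketch (an illustration only, not the exact script used for this run) of how the values above could be expressed with `Seq2SeqTrainingArguments` is shown below; the output directory name is a placeholder, and the Adam betas/epsilon are simply the library defaults.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
args = Seq2SeqTrainingArguments(
    output_dir="Yaseen-finetuned-kde4-en-to-fr",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed precision
    # optimizer: the Trainer's default AdamW already uses betas=(0.9, 0.999) and eps=1e-08
)
```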
### Training results
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
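
## Usage

As an illustrative example (assuming the checkpoint loads as a standard Marian translation model; the repository id below is taken from this card's metadata), the model can be used for English-to-French translation with the `pipeline` API:

```python
from transformers import pipeline

# Load the fine-tuned en->fr checkpoint and translate one sentence
translator = pipeline(
    "translation",
    model="itsyasin2002ai/Yaseen-finetuned-kde4-en-to-fr",
)
print(translator("Default to expanded threads")[0]["translation_text"])
```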
| {"license": "apache-2.0", "tags": ["translation", "generated_from_trainer"], "datasets": ["kde4"], "base_model": "Helsinki-NLP/opus-mt-en-fr", "model-index": [{"name": "Yaseen-finetuned-kde4-en-to-fr", "results": []}]} | translation | itsyasin2002ai/Yaseen-finetuned-kde4-en-to-fr | [
"transformers",
"tensorboard",
"safetensors",
"marian",
"text2text-generation",
"translation",
"generated_from_trainer",
"dataset:kde4",
"base_model:Helsinki-NLP/opus-mt-en-fr",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2024-02-15T07:27:28+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# Yaseen-finetuned-kde4-en-to-fr
This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
| [
"# Yaseen-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Yaseen-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
90,
47,
6,
12,
8,
3,
103,
4,
33
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #marian #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# Yaseen-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1"
] | [
-0.10625716298818588,
0.13404017686843872,
-0.0022666233126074076,
0.06095659360289574,
0.12414421886205673,
0.00047823088243603706,
0.10552185773849487,
0.1276921182870865,
-0.056204069405794144,
0.08763386309146881,
0.08969845622777939,
0.029317567124962807,
0.08557213097810745,
0.13202621042728424,
-0.022737611085176468,
-0.27046269178390503,
0.03868322819471359,
0.006140575744211674,
-0.10024088621139526,
0.08833227306604385,
0.1319979876279831,
-0.07030390202999115,
0.0654996782541275,
0.03934895619750023,
-0.11592375487089157,
0.018152352422475815,
-0.045366521924734116,
-0.07324463129043579,
0.08159220218658447,
0.02433471381664276,
0.0742865651845932,
0.016386155039072037,
0.09530295431613922,
-0.23374179005622864,
0.0020540915429592133,
0.057510148733854294,
0.037242237478494644,
0.07220634073019028,
0.05347159132361412,
0.051380496472120285,
0.11289983987808228,
-0.15642745792865753,
0.09181010723114014,
0.010255963541567326,
-0.053007375448942184,
-0.1685839295387268,
-0.0727822333574295,
0.07490692287683487,
0.1274043172597885,
0.1056922972202301,
0.0037489873357117176,
0.15411634743213654,
-0.021343793720006943,
0.08006870746612549,
0.13440720736980438,
-0.2446688413619995,
-0.06995207816362381,
0.020063403993844986,
0.07725360244512558,
0.060321711003780365,
-0.08948035538196564,
0.010374743491411209,
0.045509062707424164,
0.01962978020310402,
0.06416691094636917,
-0.005461203400045633,
-0.012567724101245403,
-0.03747740760445595,
-0.12104586511850357,
-0.03795158118009567,
0.20983625948429108,
0.08229292929172516,
-0.03935767337679863,
-0.12391886860132217,
-0.008748969063162804,
-0.0743815153837204,
-0.026223117485642433,
-0.055416543036699295,
-0.0000756636873120442,
-0.06153484433889389,
-0.02700047381222248,
-0.0884917676448822,
-0.10240285843610764,
-0.04081720486283302,
0.05301814153790474,
0.14120550453662872,
0.03742487356066704,
0.01253513153642416,
-0.011494799517095089,
0.0777060016989708,
-0.014476479031145573,
-0.13700148463249207,
-0.030012590810656548,
-0.0054252962581813335,
-0.05959847569465637,
-0.06356573849916458,
-0.027939489111304283,
-0.06987377256155014,
0.006604645401239395,
0.10051992535591125,
-0.03903951495885849,
0.05878791958093643,
0.026494022458791733,
0.004044193774461746,
-0.009563876315951347,
0.1370459347963333,
-0.02927238680422306,
-0.03625029698014259,
-0.0004958713543601334,
0.09538109600543976,
0.006279373541474342,
-0.026917681097984314,
-0.08637575805187225,
-0.04190966114401817,
0.09398677200078964,
0.06975699216127396,
-0.0181637704372406,
0.023377399891614914,
-0.032959260046482086,
-0.056328143924474716,
0.06761395186185837,
-0.12926605343818665,
0.04085346683859825,
-0.025987759232521057,
-0.07553631067276001,
-0.06355918943881989,
0.005947803612798452,
0.03309474140405655,
-0.04837070405483246,
0.03500210866332054,
-0.05432834103703499,
-0.0156736858189106,
-0.060370784252882004,
-0.04348836839199066,
0.02014506421983242,
-0.03430919721722603,
0.029161229729652405,
-0.08581459522247314,
-0.15523761510849,
-0.03417371213436127,
0.053917545825242996,
-0.056649480015039444,
-0.10062173008918762,
-0.032760217785835266,
-0.0461883619427681,
0.022782113403081894,
-0.02169712260365486,
0.11906938254833221,
-0.037698306143283844,
0.03391226753592491,
-0.0004268217599019408,
-0.0003032596141565591,
0.018247514963150024,
0.035135067999362946,
-0.07807504385709763,
0.050271522253751755,
-0.0905524492263794,
0.06795933842658997,
-0.11823267489671707,
0.022326994687318802,
-0.13350912928581238,
-0.10629928857088089,
-0.007000131066888571,
-0.02758101001381874,
0.06996043026447296,
0.1139822006225586,
-0.11116733402013779,
-0.01953117549419403,
0.11206973344087601,
-0.08305434882640839,
-0.12234307080507278,
0.08484619855880737,
-0.012665441259741783,
0.04268212616443634,
0.04787614941596985,
0.1741759032011032,
0.1489153504371643,
-0.14416669309139252,
-0.03378022462129593,
0.026109855622053146,
0.06931807845830917,
-0.002008146373555064,
0.08648965507745743,
0.009968391619622707,
0.005276606883853674,
0.027059931308031082,
-0.07229642570018768,
0.00898016057908535,
-0.048229701817035675,
-0.09881365299224854,
-0.04361366853117943,
-0.07862304151058197,
-0.012634038925170898,
0.02589523233473301,
0.03758776932954788,
-0.07514176517724991,
-0.0975818783044815,
0.08902476727962494,
0.1354040503501892,
-0.06441085785627365,
0.02272075228393078,
-0.06554524600505829,
0.03792545199394226,
-0.04469884932041168,
-0.029469087719917297,
-0.15584628283977509,
-0.1253906935453415,
0.055915024131536484,
-0.10827083885669708,
0.013449098914861679,
-0.0033360738307237625,
0.06778793036937714,
0.07873855531215668,
-0.047134894877672195,
-0.047925785183906555,
-0.10965574532747269,
0.014801464974880219,
-0.09227264672517776,
-0.15025071799755096,
-0.0481676310300827,
-0.036261532455682755,
0.1933635026216507,
-0.2389911562204361,
0.00811176560819149,
0.028206687420606613,
0.1627013087272644,
0.01888262666761875,
-0.047074273228645325,
0.0010284868767485023,
0.011990603059530258,
-0.0011507192393764853,
-0.0950353816151619,
0.018590210005640984,
0.0149050522595644,
-0.12271501123905182,
0.016618648543953896,
-0.1256258487701416,
0.04300399497151375,
0.06437855958938599,
0.09715933352708817,
-0.0631110891699791,
-0.04994567483663559,
-0.0727837085723877,
-0.053700413554906845,
-0.04009733349084854,
-0.003808867186307907,
0.17680387198925018,
0.01688491180539131,
0.10486109554767609,
-0.060739751905202866,
-0.05960465222597122,
0.02681657299399376,
-0.00475796265527606,
-0.0773693397641182,
0.0965895876288414,
0.013643134385347366,
-0.17689678072929382,
0.07731303572654724,
0.07853159308433533,
-0.031348660588264465,
0.17134734988212585,
-0.0466010682284832,
-0.10741980373859406,
-0.03202421963214874,
0.020213359966874123,
0.013971838168799877,
0.15082257986068726,
-0.07380560785531998,
0.03146737813949585,
0.04423435777425766,
0.01985342614352703,
0.047331660985946655,
-0.13788528740406036,
-0.0011701332405209541,
0.02520059235394001,
-0.044088214635849,
0.04328985512256622,
-0.007320367265492678,
-0.0022116524633020163,
0.06163748726248741,
0.0291989017277956,
-0.016992686316370964,
0.02256505750119686,
-0.015470774844288826,
-0.07263197004795074,
0.17243647575378418,
-0.11283431947231293,
-0.20906002819538116,
-0.1531389057636261,
0.07335017621517181,
-0.0659957006573677,
-0.03586218133568764,
0.014467811211943626,
-0.062147900462150574,
-0.06923229992389679,
-0.12051119655370712,
-0.017890579998493195,
-0.07156533747911453,
-0.017650654539465904,
0.046050064265728,
0.032612305134534836,
0.08617034554481506,
-0.11287696659564972,
0.011889497749507427,
0.014516199007630348,
-0.04468129947781563,
-0.030100878328084946,
0.02553461119532585,
0.0929783508181572,
0.08030427992343903,
-0.014455055817961693,
0.031004007905721664,
-0.013995561748743057,
0.23264317214488983,
-0.09523116052150726,
0.016221093013882637,
0.11180318146944046,
0.013537089340388775,
0.05153430253267288,
0.1368926465511322,
0.016921278089284897,
-0.06443363428115845,
0.02955329418182373,
0.04690840467810631,
-0.008083526976406574,
-0.22944533824920654,
-0.056575872004032135,
-0.033159125596284866,
-0.05335238203406334,
0.12209445238113403,
0.06664738804101944,
0.021093621850013733,
0.07664484530687332,
-0.036620497703552246,
0.02200242318212986,
0.008075070567429066,
0.09661266952753067,
0.09056633710861206,
0.047121524810791016,
0.07144180685281754,
-0.021434053778648376,
-0.03447910398244858,
0.06103270873427391,
0.04331367462873459,
0.2266898900270462,
-0.036655958741903305,
0.13757449388504028,
0.01553803775459528,
0.15143492817878723,
-0.01140880398452282,
0.030024329200387,
0.032208848744630814,
0.0036735942121595144,
0.02824024297297001,
-0.0710304006934166,
-0.003751115407794714,
0.05654054880142212,
0.017470331862568855,
0.01760770007967949,
-0.0691656768321991,
0.036587100476026535,
-0.006682824343442917,
0.23584075272083282,
0.06350382417440414,
-0.28054526448249817,
-0.09397575259208679,
0.024953637272119522,
-0.028714170679450035,
-0.10500521957874298,
-0.0045716967433691025,
0.10649162530899048,
-0.15287964046001434,
0.07832836359739304,
-0.07902632653713226,
0.09824299812316895,
-0.03460293263196945,
-0.022967763245105743,
0.06125447154045105,
0.0826055258512497,
0.0075329444371163845,
0.12083328515291214,
-0.16938500106334686,
0.21703402698040009,
0.011679097078740597,
0.11532601714134216,
-0.07370635122060776,
0.048699140548706055,
0.002725092461332679,
0.08104479312896729,
0.1194319874048233,
0.025980252772569656,
-0.121815524995327,
-0.16788624227046967,
-0.13552509248256683,
0.032052312046289444,
0.08486632257699966,
-0.06385709345340729,
0.06864412873983383,
-0.02646060474216938,
0.004689874593168497,
0.019831782206892967,
-0.07671686261892319,
-0.19507409632205963,
-0.16147638857364655,
0.0511254146695137,
0.03214966878294945,
-0.007532332092523575,
-0.09634898602962494,
-0.11129231005907059,
-0.001114163314923644,
0.20631928741931915,
0.04726872593164444,
-0.05782714858651161,
-0.15512463450431824,
0.056615978479385376,
0.152378648519516,
-0.08017267286777496,
0.012188463471829891,
0.0030860374681651592,
0.1804768294095993,
0.018341081216931343,
-0.053673844784498215,
0.04595240205526352,
-0.06765680015087128,
-0.13385964930057526,
-0.016047529876232147,
0.14750328660011292,
0.03442612662911415,
0.03247031196951866,
0.03467465192079544,
0.029655903577804565,
0.0018641711212694645,
-0.08184120059013367,
0.0074111418798565865,
0.02186717465519905,
0.06484628468751907,
0.03266014903783798,
-0.04828996583819389,
0.04645432159304619,
-0.06975233554840088,
-0.035753220319747925,
0.1285298466682434,
0.22879968583583832,
-0.06631213426589966,
0.0571187324821949,
0.044482164084911346,
-0.0767565593123436,
-0.16286702454090118,
0.039675500243902206,
0.13262736797332764,
0.04152388870716095,
0.06398313492536545,
-0.16274254024028778,
0.08919984847307205,
0.08559177815914154,
-0.040418144315481186,
0.07492165267467499,
-0.23112809658050537,
-0.1385633647441864,
0.08329609036445618,
0.13325539231300354,
-0.010495477356016636,
-0.09238678961992264,
-0.058007337152957916,
-0.028190836310386658,
-0.11921899020671844,
0.112031951546669,
-0.04737142100930214,
0.09168745577335358,
-0.0007853604038245976,
0.08523630350828171,
0.03830239549279213,
-0.04066731780767441,
0.1877680867910385,
-0.0008633150137029588,
0.028321057558059692,
-0.0673564076423645,
0.08781059086322784,
0.07635705173015594,
-0.08987584710121155,
0.09440802782773972,
-0.052055101841688156,
0.06131545081734657,
-0.16339315474033356,
-0.03581216558814049,
-0.05338903144001961,
0.10344914346933365,
-0.06374010443687439,
-0.05969397723674774,
-0.03169851377606392,
0.07701502740383148,
0.07441765815019608,
-0.019453542307019234,
0.12151481211185455,
0.03051082231104374,
0.05731066316366196,
0.12066832184791565,
0.10022961348295212,
0.03999883681535721,
-0.08175422996282578,
-0.006140340585261583,
-0.020908189937472343,
0.061237089335918427,
-0.05934953689575195,
0.024650098755955696,
0.11943171918392181,
0.005328577943146229,
0.11788616329431534,
-0.011474411003291607,
-0.08562958985567093,
-0.015100737102329731,
0.04149484261870384,
-0.09405543655157089,
-0.15968048572540283,
-0.06149252876639366,
0.02473566122353077,
-0.13529814779758453,
0.0002424728445475921,
0.14156080782413483,
-0.08199900388717651,
-0.008403311483561993,
-0.021680394187569618,
0.022298594936728477,
-0.0252370685338974,
0.15600089728832245,
0.045045506209135056,
0.07057613134384155,
-0.05172911658883095,
0.09869639575481415,
0.06860269606113434,
-0.10835112631320953,
0.07273834198713303,
0.04415888339281082,
-0.08344751596450806,
-0.035927366465330124,
0.051035087555646896,
0.10287532955408096,
-0.0038501464296132326,
-0.07826337963342667,
-0.06777064502239227,
-0.08275306224822998,
0.025034824386239052,
0.0042515043169260025,
0.030215181410312653,
-0.013782847672700882,
-0.007963346317410469,
0.005357116926461458,
-0.15078900754451752,
0.12241486459970474,
0.046067975461483,
0.07123900949954987,
-0.15583384037017822,
0.020950933918356895,
0.0043466659262776375,
0.05678095296025276,
-0.012593267485499382,
0.008749307133257389,
-0.054314691573381424,
-0.03450077772140503,
-0.10208605229854584,
-0.003163426648825407,
-0.040217719972133636,
0.008108641020953655,
-0.0292438305914402,
-0.0689464807510376,
-0.056343503296375275,
0.060003530234098434,
-0.060311343520879745,
-0.0660441517829895,
-0.020334571599960327,
0.057609207928180695,
-0.1025986447930336,
-0.030490130186080933,
0.04609733447432518,
-0.10737583041191101,
0.06844508647918701,
0.05366235971450806,
0.029160460457205772,
0.02975347265601158,
-0.031198540702462196,
0.04261023551225662,
0.009350983425974846,
0.0466572605073452,
0.05057969689369202,
-0.12368027120828629,
-0.023410771042108536,
0.005414222367107868,
0.045444078743457794,
0.005636084824800491,
0.046188030391931534,
-0.11881755292415619,
-0.08124430477619171,
-0.05398771911859512,
-0.05857983976602554,
-0.056060679256916046,
0.0718732476234436,
0.05529877915978432,
0.023153694346547127,
0.15987259149551392,
-0.06903482228517532,
0.05039765313267708,
-0.18081039190292358,
-0.0138933677226305,
0.003898301627486944,
-0.03814147785305977,
-0.015743600204586983,
-0.028478730469942093,
0.06842950731515884,
-0.061817072331905365,
0.11317886412143707,
-0.01358735654503107,
0.08670752495527267,
0.04635201767086983,
-0.08131241798400879,
0.04068616405129433,
0.0173792727291584,
0.20818351209163666,
0.06397195160388947,
0.001367601566016674,
0.09864175319671631,
-0.03900809586048126,
0.03990551456809044,
0.06663969159126282,
0.09123378247022629,
0.1512729376554489,
0.00001635179614822846,
0.07609409838914871,
0.06651956588029861,
-0.06658248603343964,
-0.1238863542675972,
0.06356348097324371,
-0.03531535342335701,
0.12055937200784683,
-0.008615560829639435,
0.1364549845457077,
0.10989703983068466,
-0.19381926953792572,
0.03227512910962105,
-0.04979069158434868,
-0.1312451958656311,
-0.09382583200931549,
-0.15138991177082062,
-0.09717731177806854,
-0.10056103020906448,
0.033886782824993134,
-0.12563347816467285,
0.008991601876914501,
0.037527259439229965,
0.01658240333199501,
-0.0008816051995381713,
0.1694568693637848,
-0.0021954963449388742,
0.013677556067705154,
0.0826169103384018,
-0.0005776367615908384,
0.0048763565719127655,
-0.03495611995458603,
-0.05395854264497757,
0.03363692760467529,
0.02110956236720085,
0.06357648968696594,
-0.04712023213505745,
-0.012926056049764156,
0.04257531836628914,
0.024476567283272743,
-0.08598192036151886,
0.01218652818351984,
0.008477529510855675,
0.040439508855342865,
0.029605846852064133,
0.05281934142112732,
-0.0025917203165590763,
-0.04967997968196869,
0.2609140872955322,
-0.042855605483055115,
-0.06838393956422806,
-0.1298607438802719,
0.08406268060207367,
0.0202383603900671,
-0.014143713749945164,
0.06621014326810837,
-0.0981130301952362,
-0.00935119017958641,
0.1254279911518097,
0.13366404175758362,
-0.028712686151266098,
-0.024357078596949577,
-0.020255357027053833,
-0.018498051911592484,
-0.05978536605834961,
0.0942264199256897,
0.07760845124721527,
-0.015782330185174942,
-0.06326095014810562,
-0.014337988570332527,
-0.02375981956720352,
-0.03970315307378769,
-0.10323183983564377,
0.06728964298963547,
0.014053741469979286,
-0.0031207259744405746,
-0.03193479776382446,
0.09665726870298386,
0.0034445442724972963,
-0.13821858167648315,
-0.0022894099820405245,
-0.13776859641075134,
-0.20596320927143097,
-0.024781517684459686,
0.10868343710899353,
-0.01989772543311119,
0.050814371556043625,
-0.0037824970204383135,
0.007109205238521099,
0.09203898906707764,
-0.0026269203517585993,
-0.03694136440753937,
-0.08227919042110443,
0.08404487371444702,
-0.05868374928832054,
0.24685756862163544,
0.007029105443507433,
0.07875870913267136,
0.09944223612546921,
-0.01705792173743248,
-0.14366744458675385,
0.024581117555499077,
0.0873577818274498,
-0.037767261266708374,
0.057998787611722946,
0.18404491245746613,
-0.04856580123305321,
0.07129324972629547,
0.04473748058080673,
-0.13867516815662384,
-0.031063852831721306,
-0.0385739840567112,
0.01072783675044775,
-0.06425217539072037,
0.014778146520256996,
-0.07992707937955856,
0.16232305765151978,
0.1874723583459854,
-0.05737445130944252,
-0.03168037161231041,
-0.07026921212673187,
0.042775094509124756,
0.045374393463134766,
0.09828506410121918,
0.010139304213225842,
-0.20200206339359283,
-0.030822791159152985,
0.028480449691414833,
0.032831013202667236,
-0.22975686192512512,
-0.10395964980125427,
0.020426737144589424,
-0.05845436453819275,
-0.0321771502494812,
0.11420571058988571,
0.06036614999175072,
0.006017928011715412,
-0.03851472958922386,
-0.09572873264551163,
-0.044647667557001114,
0.11831934750080109,
-0.13935783505439758,
-0.059636473655700684
] |
null | null | peft |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
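Since the quick-start code is still marked as missing, the snippet below is only a minimal, hypothetical sketch of how an adapter from this repository might be loaded, assuming it follows the standard PEFT workflow on top of the `microsoft/phi-2` base model named in the metadata; the repository id `Phanh2532/GAML-Phi2` is taken from the card metadata, not from verified usage instructions.

```python
# Minimal sketch (unverified): load the phi-2 base model and attach the LoRA adapter.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "microsoft/phi-2"    # base model listed in the card metadata
adapter_id = "Phanh2532/GAML-Phi2"   # assumed adapter repository id

tokenizer = AutoTokenizer.from_pretrained(base_model_id, trust_remote_code=True)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.float16,
    trust_remote_code=True,  # phi-2 repositories ship custom modeling code
)
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Question: What does this adapter do?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the adapter was trained with a specific prompt format, the prompt above would need to be adjusted accordingly.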
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.8.2 | {"library_name": "peft", "base_model": "microsoft/phi-2"} | null | Phanh2532/GAML-Phi2 | [
"peft",
"safetensors",
"phi",
"custom_code",
"arxiv:1910.09700",
"base_model:microsoft/phi-2",
"region:us"
] | 2024-02-15T07:29:26+00:00 | [
"1910.09700"
] | [] | TAGS
#peft #safetensors #phi #custom_code #arxiv-1910.09700 #base_model-microsoft/phi-2 #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
### Framework versions
- PEFT 0.8.2 | [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
"TAGS\n#peft #safetensors #phi #custom_code #arxiv-1910.09700 #base_model-microsoft/phi-2 #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact",
"### Framework versions\n\n- PEFT 0.8.2"
] | [
39,
6,
3,
54,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4,
11
] | [
"passage: TAGS\n#peft #safetensors #phi #custom_code #arxiv-1910.09700 #base_model-microsoft/phi-2 #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2"
] | [
-0.11378578096628189,
0.21487605571746826,
-0.003014104440808296,
0.031686052680015564,
0.08687818795442581,
0.01809370145201683,
0.04882637411355972,
0.12414636462926865,
-0.012270008213818073,
0.1115494966506958,
0.06632576882839203,
0.11320555210113525,
0.1147451251745224,
0.21791662275791168,
0.005254024639725685,
-0.1890133172273636,
0.029368091374635696,
-0.09263735264539719,
0.009525506757199764,
0.12304094433784485,
0.13995571434497833,
-0.10091966390609741,
0.0840192511677742,
-0.017577290534973145,
0.002505079610273242,
-0.03923873230814934,
-0.07240515202283859,
-0.020789101719856262,
0.03895371034741402,
0.03899342566728592,
0.05840698629617691,
-0.008560311049222946,
0.09151440113782883,
-0.26638829708099365,
0.01925753429532051,
0.04696262255311012,
-0.0009002205915749073,
0.08943339437246323,
0.10088678449392319,
-0.0396558903157711,
0.11754472553730011,
-0.02646174654364586,
0.14021790027618408,
0.08679172396659851,
-0.08723965287208557,
-0.21855652332305908,
-0.06645344942808151,
0.08943150192499161,
0.19024571776390076,
0.0764821395277977,
-0.03849251568317413,
0.12805522978305817,
-0.06970234215259552,
0.01953382045030594,
0.03645600378513336,
-0.0871284231543541,
-0.06591394543647766,
0.05640898272395134,
0.11739108711481094,
0.061959829181432724,
-0.12822699546813965,
-0.03430638462305069,
0.02571694925427437,
0.03650693967938423,
0.07858042418956757,
0.011263007298111916,
0.16213418543338776,
0.028719713911414146,
-0.14347276091575623,
-0.049156736582517624,
0.13057731091976166,
0.02494654804468155,
-0.03856918215751648,
-0.23271606862545013,
-0.010420621372759342,
-0.09283409267663956,
-0.028814610093832016,
-0.04791964590549469,
0.037188682705163956,
0.012448777444660664,
0.11711319535970688,
-0.03599626198410988,
-0.09174647182226181,
-0.01749015785753727,
0.09575329720973969,
0.04688800126314163,
0.024876365438103676,
-0.018554337322711945,
0.009301628917455673,
0.12789368629455566,
0.08466168493032455,
-0.13032005727291107,
-0.060269810259342194,
-0.07833564281463623,
-0.04174656420946121,
-0.032705921679735184,
0.046923428773880005,
0.0509127601981163,
0.05991829186677933,
0.25203654170036316,
-0.019245214760303497,
0.059129443019628525,
0.07174526154994965,
0.016991790384054184,
0.04980243369936943,
0.09777380526065826,
-0.05408836156129837,
-0.16170255839824677,
-0.011144663207232952,
0.09460863471031189,
0.0019380954327061772,
-0.029393989592790604,
-0.05396348983049393,
0.049743179231882095,
0.03504086658358574,
0.11136593669652939,
0.11053898185491562,
-0.011951576918363571,
-0.0811414048075676,
-0.06257052719593048,
0.21462588012218475,
-0.1531042456626892,
0.04846639558672905,
0.02325751632452011,
-0.0021315175108611584,
-0.043181490153074265,
0.012277473695576191,
0.019188493490219116,
-0.03219219297170639,
0.08071576058864594,
-0.06732621788978577,
-0.044146936386823654,
-0.12473537027835846,
-0.02585395984351635,
0.029414786025881767,
0.0034624978434294462,
-0.03429713845252991,
-0.03663012012839317,
-0.07274344563484192,
-0.09641832113265991,
0.10573890060186386,
-0.059576328843832016,
-0.06006880849599838,
-0.028057241812348366,
-0.09290298819541931,
0.023916589096188545,
0.024231843650341034,
0.07666552811861038,
-0.030356913805007935,
0.04322225600481033,
-0.015659697353839874,
0.06239888817071915,
0.07521943002939224,
0.031847018748521805,
-0.0704328715801239,
0.06221601739525795,
-0.19164729118347168,
0.07933833450078964,
-0.07761456072330475,
0.0324440523982048,
-0.15777187049388885,
-0.0097803371027112,
0.015198263339698315,
0.02316240780055523,
0.029932865872979164,
0.16084134578704834,
-0.21240918338298798,
-0.02773977443575859,
0.15688647329807281,
-0.10242363065481186,
-0.11922230571508408,
0.031649261713027954,
-0.0404195711016655,
0.1625688225030899,
0.0218362919986248,
-0.007891842164099216,
0.09372486919164658,
-0.16343186795711517,
-0.025796568021178246,
-0.018779532983899117,
-0.0016074770828709006,
0.0868183821439743,
0.08662477135658264,
-0.09070755541324615,
0.030043402686715126,
0.01716618798673153,
-0.050345245748758316,
-0.023745667189359665,
-0.04187468811869621,
-0.10612507909536362,
0.010015035048127174,
-0.08270856738090515,
0.01418966893106699,
-0.005394088104367256,
-0.09494779258966446,
-0.0018128766678273678,
-0.1594872921705246,
-0.04243171587586403,
0.0837237536907196,
0.0040346477180719376,
-0.023361938074231148,
-0.10437914729118347,
0.04703163355588913,
-0.03366679325699806,
-0.02288144640624523,
-0.13922806084156036,
-0.023543596267700195,
0.01948307640850544,
-0.13902294635772705,
-0.007538620848208666,
-0.11935809999704361,
0.06430456787347794,
0.013722626492381096,
-0.05074096471071243,
-0.045100364834070206,
-0.004509648773819208,
0.008674689568579197,
-0.05467041954398155,
-0.24000859260559082,
-0.03280498832464218,
-0.04922189936041832,
0.151929572224617,
-0.21615469455718994,
0.04119541123509407,
0.04072707146406174,
0.13240203261375427,
0.003618960501626134,
-0.06930549442768097,
0.028165286406874657,
-0.07070443034172058,
-0.02757997438311577,
-0.07413250207901001,
-0.003358797635883093,
-0.0022738866973668337,
-0.033985406160354614,
0.014930957928299904,
-0.12336187064647675,
-0.03733876347541809,
0.10252929478883743,
0.06255297362804413,
-0.14630277454853058,
0.0005532137583941221,
-0.04116472601890564,
-0.06219388172030449,
-0.07378923147916794,
-0.06999711692333221,
0.09610669314861298,
0.05702423304319382,
0.0389622300863266,
-0.07397157698869705,
-0.07357426732778549,
0.010085520334541798,
-0.023125045001506805,
-0.016276869922876358,
0.1114577203989029,
0.08072498440742493,
-0.09532269090414047,
0.0977211445569992,
0.07428096234798431,
0.04341832175850868,
0.08059730380773544,
-0.026467397809028625,
-0.10628718137741089,
-0.029918964952230453,
0.04393020272254944,
0.012534094043076038,
0.170135036110878,
-0.06435423344373703,
0.05593555420637131,
0.04853878170251846,
-0.039440903812646866,
0.04857658967375755,
-0.09172812104225159,
0.012218222953379154,
0.008259686641395092,
-0.013282502070069313,
0.026792515069246292,
-0.027576176449656487,
0.00989769771695137,
0.07695943117141724,
0.054547011852264404,
0.03347555920481682,
0.023557791486382484,
-0.03518669307231903,
-0.13613557815551758,
0.17973223328590393,
-0.0980633944272995,
-0.24342535436153412,
-0.16348034143447876,
0.058905139565467834,
0.051766909658908844,
-0.014864468947052956,
0.020471779629588127,
-0.05196843668818474,
-0.10846822708845139,
-0.08663566410541534,
0.002807978540658951,
0.034426216036081314,
-0.05883907526731491,
-0.06927847117185593,
0.04490082338452339,
0.04266753047704697,
-0.12325035780668259,
0.030828947201371193,
0.061298977583646774,
-0.018604576587677002,
-0.004119786899536848,
0.06045187637209892,
0.09265239536762238,
0.18406151235103607,
-0.001328265992924571,
-0.0011371568543836474,
0.059842657297849655,
0.2756204307079315,
-0.15492938458919525,
0.11946158111095428,
0.1373520940542221,
-0.07204940170049667,
0.07835719734430313,
0.18773789703845978,
0.03189133107662201,
-0.09863507747650146,
0.02404346503317356,
0.026528406888246536,
-0.021228188648819923,
-0.25946274399757385,
-0.05701978877186775,
-0.01497858390212059,
-0.08355594426393509,
0.07270007580518723,
0.09073062986135483,
0.0768696591258049,
0.032613810151815414,
-0.07015887647867203,
-0.1014537513256073,
0.029150888323783875,
0.10604166239500046,
-0.030770577490329742,
0.004458488430827856,
0.08117850869894028,
-0.03906247764825821,
0.012506335973739624,
0.0983913391828537,
-0.008942689746618271,
0.150946706533432,
0.05967402458190918,
0.1079942137002945,
0.08179552853107452,
0.09575852006673813,
-0.004110958427190781,
0.03683086857199669,
0.01611367240548134,
0.030142860487103462,
0.014639263972640038,
-0.08565228432416916,
0.01857750117778778,
0.11297082155942917,
0.03383457288146019,
0.02836533449590206,
0.020458385348320007,
-0.04402661323547363,
0.04856693372130394,
0.18994785845279694,
0.016464000567793846,
-0.2079681009054184,
-0.07882324606180191,
0.06095999479293823,
-0.08257437497377396,
-0.14695735275745392,
-0.012824309058487415,
0.030040999874472618,
-0.16736248135566711,
0.018347768113017082,
-0.03714447841048241,
0.103591188788414,
-0.1025761142373085,
-0.03841165080666542,
0.1079198569059372,
0.05442623794078827,
-0.015187278389930725,
0.047005534172058105,
-0.17364296317100525,
0.11377785354852676,
0.02885613776743412,
0.07626480609178543,
-0.08584360778331757,
0.10410025715827942,
-0.0024694486055523157,
-0.013017654418945312,
0.1650916486978531,
0.004640169441699982,
-0.04359374940395355,
-0.08146204054355621,
-0.10679106414318085,
-0.009851690381765366,
0.09097883105278015,
-0.14237122237682343,
0.07569620758295059,
-0.025136372074484825,
-0.03015020117163658,
-0.006467180326581001,
-0.08990833908319473,
-0.12640880048274994,
-0.16933627426624298,
0.05588266998529434,
-0.10439150780439377,
0.03259894251823425,
-0.08868496865034103,
-0.05494282394647598,
0.005613612476736307,
0.1824626475572586,
-0.23491983115673065,
-0.10848495364189148,
-0.15077854692935944,
-0.10968028008937836,
0.1570584923028946,
-0.040440380573272705,
0.0908111184835434,
-0.0004727050836663693,
0.16086126863956451,
0.014309287071228027,
-0.014294659718871117,
0.10229017585515976,
-0.09227950870990753,
-0.19411838054656982,
-0.05605044215917587,
0.16273552179336548,
0.13882121443748474,
0.029963457956910133,
-0.015256067737936974,
0.029887085780501366,
-0.06062353029847145,
-0.12228400260210037,
0.02515939436852932,
0.16830317676067352,
0.07873539626598358,
-0.019460853189229965,
-0.02204161509871483,
-0.10774783045053482,
-0.05267307907342911,
-0.042266055941581726,
-0.009859311394393444,
0.18955232203006744,
-0.07369155436754227,
0.15753844380378723,
0.11278185248374939,
-0.056745246052742004,
-0.21171359717845917,
0.029571376740932465,
0.04310388118028641,
0.01731574721634388,
0.03987450152635574,
-0.18866536021232605,
0.09013476222753525,
-0.010193737223744392,
-0.07772461324930191,
0.16255907714366913,
-0.1637570559978485,
-0.1397189348936081,
0.10158491134643555,
0.03226342052221298,
-0.22403883934020996,
-0.1380227655172348,
-0.1015327200293541,
-0.022339852526783943,
-0.1425633728504181,
0.05133528634905815,
0.0038280391599982977,
0.005740544758737087,
0.023133208975195885,
0.0100343506783247,
0.026263609528541565,
-0.05070379376411438,
0.20506131649017334,
-0.03198447823524475,
0.004202441778033972,
-0.04896806553006172,
-0.08344719558954239,
0.02287120185792446,
-0.04900979623198509,
0.10722857713699341,
-0.004787186160683632,
0.029421227052807808,
-0.16337278485298157,
-0.041271910071372986,
-0.04863564670085907,
0.02544708363711834,
-0.09258659183979034,
-0.08845847100019455,
-0.03991345688700676,
0.08805834501981735,
0.09627017378807068,
-0.025087371468544006,
0.0015643826918676496,
-0.09028701484203339,
0.05762522295117378,
0.2114574909210205,
0.19455400109291077,
0.0639931783080101,
-0.06370538473129272,
0.015163743868470192,
-0.03506004065275192,
0.045161597430706024,
-0.21936193108558655,
0.041207730770111084,
0.05549737438559532,
0.01843956485390663,
0.07175178080797195,
-0.011908025480806828,
-0.1538621336221695,
-0.07281965762376785,
0.08316335082054138,
-0.06008784845471382,
-0.17475618422031403,
-0.02553386613726616,
0.023019753396511078,
-0.20269595086574554,
-0.033461298793554306,
0.027489302679896355,
-0.014068331569433212,
-0.040853142738342285,
0.021168310195207596,
0.08470514416694641,
-0.020403800532221794,
0.09541773051023483,
0.08339102566242218,
0.09199877828359604,
-0.10454605519771576,
0.07061602175235748,
0.07361934334039688,
-0.037707503885030746,
0.028128663077950478,
0.11154422163963318,
-0.05099904537200928,
-0.034432075917720795,
0.0803576186299324,
0.10143010318279266,
0.022258860990405083,
-0.057082775980234146,
0.009974179789423943,
-0.05008767545223236,
0.05785248428583145,
0.09998168796300888,
0.027409514412283897,
0.005640851799398661,
0.06221208721399307,
0.03354763984680176,
-0.0825996920466423,
0.1104368269443512,
0.05596213787794113,
0.016932696104049683,
-0.05582371726632118,
-0.043363314121961594,
-0.010172837413847446,
-0.018938569352030754,
-0.020693456754088402,
-0.003234695177525282,
-0.0851198136806488,
-0.00929221510887146,
-0.10520675778388977,
0.02047610841691494,
-0.07947297394275665,
0.005491711664944887,
0.029578380286693573,
-0.04913987219333649,
0.003183731809258461,
0.003288250183686614,
-0.06930408626794815,
-0.055251434445381165,
-0.013981658965349197,
0.08149424195289612,
-0.13338683545589447,
0.041997820138931274,
0.07360140234231949,
-0.1068057119846344,
0.07593489438295364,
-0.006002439651638269,
0.010332001373171806,
0.0018799843965098262,
-0.14315930008888245,
0.0593060664832592,
-0.02380082756280899,
-0.007985934615135193,
0.01736851967871189,
-0.19265496730804443,
-0.010922934859991074,
-0.038157571107149124,
-0.06689140945672989,
0.005581908859312534,
-0.003703894093632698,
-0.12366945296525955,
0.10148878395557404,
0.005400797817856073,
-0.060076646506786346,
-0.02240055426955223,
0.03595719486474991,
0.10262934863567352,
-0.0218011774122715,
0.13532501459121704,
-0.02069735713303089,
0.07241874933242798,
-0.1732754111289978,
-0.005699788220226765,
-0.013228282332420349,
0.046691954135894775,
-0.02739017829298973,
-0.02813844569027424,
0.05901901051402092,
-0.01949450746178627,
0.18125669658184052,
-0.012733114883303642,
0.06586373597383499,
0.05252611264586449,
0.016691552475094795,
0.02774411253631115,
0.08316776156425476,
0.05848155915737152,
-0.006899203173816204,
0.0011363255325704813,
0.03370583802461624,
-0.007265168707817793,
-0.04685811698436737,
-0.1627712994813919,
0.05946217104792595,
0.15991315245628357,
0.06156396493315697,
0.025377849116921425,
0.025036750361323357,
-0.11270148307085037,
-0.09070201218128204,
0.1099759042263031,
-0.020845657214522362,
-0.03162771090865135,
-0.07206284254789352,
0.1784706711769104,
0.13805341720581055,
-0.19670233130455017,
0.06762019544839859,
-0.049522291868925095,
-0.04278228431940079,
-0.13685674965381622,
-0.1731935441493988,
-0.05758243799209595,
-0.04815209284424782,
-0.02862224541604519,
-0.05996323376893997,
0.04812511056661606,
0.04870977625250816,
-0.0013058161130174994,
-0.017262300476431847,
0.10135054588317871,
0.018442509695887566,
-0.02554592490196228,
0.04320884868502617,
0.06737314164638519,
0.038117676973342896,
-0.09397329390048981,
0.008220016956329346,
-0.00428364984691143,
0.02280941791832447,
0.06965187937021255,
0.022360138595104218,
-0.0627666711807251,
0.023585017770528793,
-0.020857973024249077,
-0.12144241482019424,
0.03918599709868431,
-0.016149744391441345,
-0.04684383422136307,
0.15600711107254028,
0.0342925563454628,
0.008289451710879803,
-0.01975717954337597,
0.2220326066017151,
-0.08564603328704834,
-0.07243220508098602,
-0.1454060971736908,
0.06444915384054184,
-0.0720582976937294,
0.03540246933698654,
0.03386542201042175,
-0.122121661901474,
0.007912740111351013,
0.16971389949321747,
0.12601661682128906,
-0.01522133033722639,
0.012210380285978317,
0.05022789537906647,
0.005832062102854252,
-0.03888465836644173,
0.02117805741727352,
0.05131001025438309,
0.17871327698230743,
-0.06961652636528015,
0.059935905039310455,
-0.009031261317431927,
-0.08582812547683716,
-0.021583784371614456,
0.09869586676359177,
-0.00951576791703701,
0.0013726436300203204,
-0.06625348329544067,
0.14181657135486603,
-0.08145101368427277,
-0.21426093578338623,
0.0612000934779644,
-0.06259262561798096,
-0.13984999060630798,
-0.040615830570459366,
0.03427382558584213,
-0.020975947380065918,
0.0055834646336734295,
0.0755208432674408,
-0.0456472784280777,
0.194069504737854,
0.0387730747461319,
-0.05348348617553711,
-0.07885733246803284,
0.05375857651233673,
-0.16017696261405945,
0.2810978293418884,
0.02048351615667343,
0.044179387390613556,
0.10840565711259842,
-0.017054110765457153,
-0.14833848178386688,
0.01177469827234745,
0.10568798333406448,
-0.07036097347736359,
0.0586765855550766,
0.1642809510231018,
0.0035266487393528223,
0.12358652055263519,
0.05773075670003891,
-0.052491940557956696,
0.03270251303911209,
-0.0931495949625969,
-0.050640739500522614,
-0.10719091445207596,
0.07893049716949463,
-0.08625559508800507,
0.1592598557472229,
0.109782375395298,
-0.07223892956972122,
-0.0009203138761222363,
-0.01679535023868084,
0.08623547106981277,
0.007533833384513855,
0.11042984575033188,
0.016117723658680916,
-0.18822228908538818,
0.0336274616420269,
0.009934859350323677,
0.1064552292227745,
-0.1833043098449707,
-0.05367206037044525,
0.042722709476947784,
-0.01595555618405342,
-0.08584911376237869,
0.11669338494539261,
0.037754278630018234,
0.028852831572294235,
-0.03966382145881653,
-0.03636007755994797,
0.008827653713524342,
0.14421530067920685,
-0.1104927659034729,
-0.013533114455640316
] |
null | null | null |
# Lora of chiyoda/千代田/千代田 (Azur Lane)
## What Is This?
This is the LoRA model of waifu chiyoda/千代田/千代田 (Azur Lane).
## How Is It Trained?
* This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion).
* The [auto-training framework](https://github.com/deepghs/cyberharem) is maintained by [DeepGHS Team](https://huggingface.co/deepghs).
* The base model used for training is [deepghs/animefull-latest](https://huggingface.co/deepghs/animefull-latest).
* Dataset used for training is the `stage3-p480-800` in [CyberHarem/chiyoda_azurlane](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane), which contains 81 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 840 steps, 40 checkpoints were saved and evaluated.
* **Trigger word is `chiyoda_azurlane`.**
* Pruned core tags for this waifu are `breasts, red_hair, animal_ears, large_breasts, long_hair, purple_eyes, bangs, fox_ears, animal_ear_fluff, hair_ornament, hair_flower`. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
## How to Use It?
### If You Are Using A1111 WebUI v1.7+
**Just use it like the classic LoRA**. The LoRA we provide is bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 378, you need to download [`378/chiyoda_azurlane.pt`](https://huggingface.co/CyberHarem/chiyoda_azurlane/resolve/main/378/chiyoda_azurlane.pt) as the embedding and [`378/chiyoda_azurlane.safetensors`](https://huggingface.co/CyberHarem/chiyoda_azurlane/resolve/main/378/chiyoda_azurlane.safetensors) for loading Lora. By using both files together, you can generate images for the desired characters.
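As a convenience, the two files for a given step can also be fetched programmatically. The sketch below is only an illustration using `huggingface_hub` with the step-378 file paths shown above; it is not part of the original instructions.

```python
# Sketch: download the embedding (.pt) and LoRA (.safetensors) files for step 378.
from huggingface_hub import hf_hub_download

repo_id = "CyberHarem/chiyoda_azurlane"

embedding_path = hf_hub_download(repo_id=repo_id, filename="378/chiyoda_azurlane.pt")
lora_path = hf_hub_download(repo_id=repo_id, filename="378/chiyoda_azurlane.safetensors")

print("Embedding saved to:", embedding_path)
print("LoRA weights saved to:", lora_path)
```

The `.pt` file typically goes into the WebUI `embeddings` folder and the `.safetensors` file into the `models/Lora` folder, after which both are used together as described above.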
## Which Step Should I Use?
We selected 5 good steps for you to choose. The best one is step 378.
1520 images (1.57 GiB) were generated for auto-testing.

The base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).
Here are the preview of the recommended steps:
| Step | Epoch | CCIP | AI Corrupt | Bikini Plus | Score | Download | pattern_0_0 | pattern_0_1 | pattern_1 | portrait_0 | portrait_1 | portrait_2 | full_body_0 | full_body_1 | profile_0 | profile_1 | free_0 | free_1 | shorts | maid_0 | maid_1 | miko | yukata | suit | china | bikini_0 | bikini_1 | bikini_2 | sit | squat | kneel | jump | crossed_arms | angry | smile | cry | grin | n_lie_0 | n_lie_1 | n_stand_0 | n_stand_1 | n_stand_2 | n_sex_0 | n_sex_1 |
|-------:|--------:|:----------|:-------------|:--------------|:----------|:-----------------------------------------------------------------------------------------------------|:---------------------------------------------|:---------------------------------------------|:-----------------------------------------|:-------------------------------------------|:-------------------------------------------|:-------------------------------------------|:---------------------------------------------|:---------------------------------------------|:-----------------------------------------|:-----------------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-----------------------------------|:-------------------------------|:-----------------------------------|:-------------------------------|:---------------------------------|:---------------------------------------|:---------------------------------------|:---------------------------------------|:-----------------------------|:---------------------------------|:---------------------------------|:-------------------------------|:-----------------------------------------------|:---------------------------------|:---------------------------------|:-----------------------------|:-------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------|:-----------------------------------------|:-----------------------------------------|:-------------------------------------|:-------------------------------------|
| 378 | 19 | 0.902 | 0.989 | 0.847 | **0.793** | [Download](https://huggingface.co/CyberHarem/chiyoda_azurlane/resolve/main/378/chiyoda_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 525 | 26 | **0.915** | 0.976 | 0.835 | 0.783 | [Download](https://huggingface.co/CyberHarem/chiyoda_azurlane/resolve/main/525/chiyoda_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 609 | 31 | 0.897 | 0.962 | 0.838 | 0.775 | [Download](https://huggingface.co/CyberHarem/chiyoda_azurlane/resolve/main/609/chiyoda_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 441 | 22 | 0.890 | 0.957 | 0.839 | 0.772 | [Download](https://huggingface.co/CyberHarem/chiyoda_azurlane/resolve/main/441/chiyoda_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| 189 | 10 | 0.864 | **0.990** | **0.853** | 0.770 | [Download](https://huggingface.co/CyberHarem/chiyoda_azurlane/resolve/main/189/chiyoda_azurlane.zip) |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
## Anything Else?
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
## All Steps
We uploaded the files for all steps. You can check the images and metrics, and download the files, at the following links:
* [Steps From 651 to 840](all/0.md)
* [Steps From 441 to 630](all/1.md)
* [Steps From 231 to 420](all/2.md)
* [Steps From 21 to 210](all/3.md)
| {"license": "mit", "tags": ["art", "not-for-all-audiences"], "datasets": ["CyberHarem/chiyoda_azurlane"], "pipeline_tag": "text-to-image"} | text-to-image | CyberHarem/chiyoda_azurlane | [
"art",
"not-for-all-audiences",
"text-to-image",
"dataset:CyberHarem/chiyoda_azurlane",
"license:mit",
"region:us"
] | 2024-02-15T07:29:36+00:00 | [] | [] | TAGS
#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/chiyoda_azurlane #license-mit #region-us
| Lora of chiyoda/千代田/千代田 (Azur Lane)
===================================
What Is This?
-------------
This is the LoRA model of waifu chiyoda/千代田/千代田 (Azur Lane).
How Is It Trained?
------------------
* This model is trained with HCP-Diffusion.
* The auto-training framework is maintained by DeepGHS Team.
* The base model used for training is deepghs/animefull-latest.
* Dataset used for training is the 'stage3-p480-800' in CyberHarem/chiyoda\_azurlane, which contains 81 images.
* Batch size is 4, resolution is 720x720, clustering into 5 buckets.
* Batch size for regularization dataset is 16, resolution is 720x720, clustering into 20 buckets.
* Trained for 840 steps, 40 checkpoints were saved and evaluated.
* Trigger word is 'chiyoda\_azurlane'.
* Pruned core tags for this waifu are 'breasts, red\_hair, animal\_ears, large\_breasts, long\_hair, purple\_eyes, bangs, fox\_ears, animal\_ear\_fluff, hair\_ornament, hair\_flower'. You can add them to the prompt when some features of the waifu (e.g. hair color) are not stable.
How to Use It?
--------------
### If You Are Using A1111 WebUI v1.7+
Just use it like the classic LoRA. The LoRA we provide is bundled with the embedding file.
### If You Are Using A1111 WebUI v1.6 or Lower
After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.
For example, if you want to use the model from step 378, you need to download '378/chiyoda\_azurlane.pt' as the embedding and '378/chiyoda\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.
Which Step Should I Use?
------------------------
We selected 5 good steps for you to choose. The best one is step 378.
1520 images (1.57 GiB) were generated for auto-testing.
!Metrics Plot
The base model used for generating preview images is Meina/MeinaMix\_V11.
Here are the preview of the recommended steps:
Anything Else?
--------------
Because the automation of LoRA training always annoys some people, this model is not recommended for the following groups, and we express our regret:
1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.
All Steps
---------
We uploaded the files for all steps. You can check the images and metrics, and download the files, at the following links:
* Steps From 651 to 840
* Steps From 441 to 630
* Steps From 231 to 420
* Steps From 21 to 210
| [
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 378, you need to download '378/chiyoda\\_azurlane.pt' as the embedding and '378/chiyoda\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 378.\n\n\n1520 images (1.57 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 651 to 840\n* Steps From 441 to 630\n* Steps From 231 to 420\n* Steps From 21 to 210"
] | [
"TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/chiyoda_azurlane #license-mit #region-us \n",
"### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file.",
"### If You Are Using A1111 WebUI v1.6 or Lower\n\n\nAfter downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.\n\n\nFor example, if you want to use the model from step 378, you need to download '378/chiyoda\\_azurlane.pt' as the embedding and '378/chiyoda\\_azurlane.safetensors' for loading Lora. By using both files together, you can generate images for the desired characters.\n\n\nWhich Step Should I Use?\n------------------------\n\n\nWe selected 5 good steps for you to choose. The best one is step 378.\n\n\n1520 images (1.57 GiB) were generated for auto-testing.\n\n\n!Metrics Plot\n\n\nThe base model used for generating preview images is Meina/MeinaMix\\_V11.\n\n\nHere are the preview of the recommended steps:\n\n\n\nAnything Else?\n--------------\n\n\nBecause the automation of LoRA training always annoys some people. So for the following groups, it is not recommended to use this model and we express regret:\n\n\n1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.\n2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits.\n3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.\n4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.\n5. Individuals who finds the generated image content offensive to their values.\n\n\nAll Steps\n---------\n\n\nWe uploaded the files in all steps. you can check the images, metrics and download them in the following links:\n\n\n* Steps From 651 to 840\n* Steps From 441 to 630\n* Steps From 231 to 420\n* Steps From 21 to 210"
] | [
45,
38,
471
] | [
"passage: TAGS\n#art #not-for-all-audiences #text-to-image #dataset-CyberHarem/chiyoda_azurlane #license-mit #region-us \n### If You Are Using A1111 WebUI v1.7+\n\n\nJust use it like the classic LoRA. The LoRA we provided are bundled with the embedding file."
] | [
0.009629634208977222,
-0.012201105244457722,
-0.0036820443347096443,
0.08265304565429688,
0.08347495645284653,
0.08447372913360596,
0.21996185183525085,
0.08338483422994614,
0.12490162998437881,
-0.05530397221446037,
0.07748523354530334,
0.07264155149459839,
0.005692166276276112,
0.02935619279742241,
-0.037106454372406006,
-0.15988793969154358,
-0.06239981949329376,
-0.01417099591344595,
0.01357040461152792,
0.0163751021027565,
0.07436651736497879,
0.0029962246771901846,
0.10671597719192505,
-0.04895979166030884,
-0.043471045792102814,
0.034240834414958954,
-0.03084161877632141,
-0.04420100525021553,
0.02622413821518421,
0.08343969285488129,
0.12105248868465424,
0.008873486891388893,
0.06692669540643692,
-0.1486571580171585,
0.06886468827724457,
-0.0181565061211586,
-0.10841414332389832,
-0.0033782843966037035,
0.007263245992362499,
-0.031698718667030334,
0.12095083296298981,
0.028896326199173927,
-0.11374515295028687,
0.04747461527585983,
-0.13529720902442932,
-0.006342761218547821,
-0.05229531228542328,
0.03823387622833252,
0.1574324369430542,
0.05979606881737709,
0.020081009715795517,
0.06615882366895676,
-0.0520334467291832,
0.07717223465442657,
0.11137275397777557,
-0.1394793838262558,
-0.07248879224061966,
0.11682222038507462,
0.03536222502589226,
0.14193286001682281,
-0.09747381508350372,
0.09506183117628098,
0.08038545399904251,
-0.051472023129463196,
-0.15518243610858917,
-0.10192054510116577,
-0.21188893914222717,
-0.01724214106798172,
0.013028218410909176,
0.041202105581760406,
0.3943822681903839,
0.06732151657342911,
0.03196651116013527,
0.05931628867983818,
-0.0715717151761055,
0.023712124675512314,
-0.10029882937669754,
0.1403288096189499,
0.04182279109954834,
0.09409846365451813,
-0.037290651351213455,
-0.10268464684486389,
-0.11015622317790985,
-0.06816952675580978,
-0.0785118117928505,
-0.016473939642310143,
0.0178244449198246,
0.11087226867675781,
-0.20249994099140167,
-0.002400060184299946,
-0.030508197844028473,
-0.12471969425678253,
0.00892883911728859,
-0.09958690404891968,
0.16026665270328522,
0.06562985479831696,
-0.004941233433783054,
0.011050636880099773,
0.2520141899585724,
0.11260388791561127,
0.1886664181947708,
0.058804869651794434,
-0.09835045784711838,
0.12472248822450638,
0.04849862679839134,
-0.08144444227218628,
-0.013379557058215141,
-0.09103896468877792,
0.1418025940656662,
-0.04227745905518532,
0.11478074640035629,
-0.06705109775066376,
-0.11827441304922104,
0.012887717224657536,
-0.10952510684728622,
0.06724466383457184,
0.0322536863386631,
0.009289029985666275,
-0.051363639533519745,
0.04517321661114693,
0.042746953666210175,
-0.032940931618213654,
-0.0040478515438735485,
-0.004133794456720352,
-0.05000638589262962,
0.04662541672587395,
0.09706096351146698,
0.039407890290021896,
0.07336203008890152,
-0.016774049028754234,
-0.029443100094795227,
-0.003838541451841593,
-0.04247509315609932,
0.012073054909706116,
0.0486304797232151,
0.043461669236421585,
0.09207598119974136,
-0.15954484045505524,
-0.052851200103759766,
-0.02179376780986786,
0.04720228165388107,
0.0007101956871338189,
0.08586627244949341,
-0.003726813243702054,
0.05887989699840546,
0.0007395244319923222,
-0.019288375973701477,
0.0362013578414917,
-0.10363229364156723,
0.08850178122520447,
-0.011218735948204994,
0.09659595787525177,
-0.19519777595996857,
-0.01112529169768095,
-0.04083878919482231,
0.019367648288607597,
0.056535977870225906,
-0.007365421392023563,
-0.09926288574934006,
0.12501677870750427,
-0.006636100355535746,
0.07909701764583588,
-0.10496518760919571,
0.04367141053080559,
0.016309158876538277,
0.08884890377521515,
-0.09393352270126343,
0.013582142069935799,
0.13310371339321136,
-0.1364261507987976,
-0.17535190284252167,
0.08531377464532852,
-0.027680648490786552,
0.041305579245090485,
0.03987785801291466,
0.15730836987495422,
0.16347549855709076,
-0.20022793114185333,
-0.02414558455348015,
0.05839766189455986,
-0.018730202689766884,
-0.07801640033721924,
-0.019317734986543655,
0.10902058333158493,
0.01781817153096199,
0.03068532608449459,
-0.03236569091677666,
0.11998017877340317,
-0.03529394790530205,
-0.08526259660720825,
-0.02678155153989792,
-0.08192699402570724,
-0.07842209935188293,
0.04982360824942589,
-0.010188370943069458,
-0.04486147686839104,
0.023032646626234055,
-0.1631975919008255,
0.16727283596992493,
0.012884794734418392,
0.022184057161211967,
-0.08491182327270508,
0.11659499257802963,
0.015049664303660393,
-0.0034672871697694063,
0.01210128702223301,
-0.049777984619140625,
-0.0960971862077713,
0.21864180266857147,
0.08667466044425964,
0.07212395966053009,
0.05294980853796005,
-0.05041975900530815,
-0.0686543881893158,
0.017299501225352287,
0.01568814180791378,
-0.037442684173583984,
0.016812464222311974,
-0.09801822155714035,
0.051898837089538574,
-0.014083661139011383,
0.04826602712273598,
-0.01547804195433855,
-0.02845260687172413,
0.08053870499134064,
0.014116798527538776,
-0.020928721874952316,
0.09245239943265915,
0.06337433308362961,
-0.016140403226017952,
-0.0728466734290123,
0.005296790506690741,
0.07546863704919815,
-0.0010467043612152338,
-0.10574924200773239,
0.03514545410871506,
-0.013737993314862251,
0.020391035825014114,
0.2020614892244339,
-0.21142829954624176,
0.04382622241973877,
0.0014518287498503923,
0.053116727620363235,
0.03857306391000748,
0.002101593418046832,
-0.027199041098356247,
0.034199219197034836,
-0.03020065650343895,
0.07293997704982758,
-0.020637545734643936,
0.0728803351521492,
-0.02412247844040394,
-0.135733962059021,
-0.011795220896601677,
-0.03291039541363716,
0.18434599041938782,
-0.1748204231262207,
0.06371874362230301,
0.1680627167224884,
-0.11078771203756332,
0.14887410402297974,
-0.005197232123464346,
-0.006873195059597492,
0.0026596123352646828,
0.03623300418257713,
0.0002528652548789978,
0.10631643235683441,
-0.08177340775728226,
-0.02819342538714409,
0.01778256706893444,
-0.09235362708568573,
0.031397633254528046,
-0.11946166306734085,
-0.11605679243803024,
-0.07457321882247925,
-0.031615253537893295,
-0.018064923584461212,
0.030344568192958832,
-0.05018085241317749,
0.07228949666023254,
-0.08636175096035004,
-0.06796547770500183,
-0.025465814396739006,
-0.0792740061879158,
0.021648911759257317,
0.01345511432737112,
-0.05355158448219299,
-0.14777249097824097,
-0.11221236735582352,
-0.08854556828737259,
-0.1335609257221222,
-0.006535314954817295,
0.06429362297058105,
-0.12584950029850006,
-0.03358634188771248,
0.016576038673520088,
-0.06124535948038101,
0.096671923995018,
-0.07905992120504379,
0.024716412648558617,
0.05206562951207161,
-0.04009214788675308,
-0.1687926948070526,
0.003937289118766785,
-0.0656512975692749,
-0.05996795743703842,
0.1584952175617218,
-0.16307996213436127,
0.1847158819437027,
-0.021289480850100517,
0.06297504156827927,
0.05973729491233826,
0.030756264925003052,
0.13596241176128387,
-0.12275430560112,
0.08468089997768402,
0.1849489063024521,
0.04171217232942581,
0.07602179050445557,
0.12526904046535492,
0.07674659043550491,
-0.11368037015199661,
0.03581952676177025,
0.07251230627298355,
-0.0977487787604332,
-0.08574749529361725,
-0.04861218482255936,
-0.11188386380672455,
-0.04943011328577995,
0.04442816600203514,
0.05861561372876167,
0.0583171620965004,
0.12982386350631714,
-0.05399685725569725,
-0.010218239389359951,
0.09587402641773224,
0.047628238797187805,
0.050243377685546875,
0.01940441131591797,
0.055243026465177536,
-0.14056465029716492,
-0.03825787082314491,
0.15774856507778168,
0.2206309288740158,
0.21644411981105804,
0.028490908443927765,
0.06892479956150055,
0.11889155954122543,
0.09930199384689331,
0.10870026797056198,
0.03819693624973297,
0.009414561092853546,
0.013745011761784554,
-0.07225232571363449,
-0.046815816313028336,
0.01792052574455738,
0.007308036554604769,
-0.03888009861111641,
-0.14685313403606415,
0.096317358314991,
0.009249784052371979,
0.07979351282119751,
0.1362193077802658,
0.039578188210725784,
-0.09245333075523376,
0.15529689192771912,
0.09612378478050232,
0.08786914497613907,
-0.06835129112005234,
0.13535061478614807,
0.05630451440811157,
-0.014564347453415394,
0.16359618306159973,
0.030936289578676224,
0.15241481363773346,
-0.03887774422764778,
-0.08152949810028076,
-0.07726884633302689,
-0.06610680371522903,
0.006579282693564892,
0.02267744019627571,
-0.205312579870224,
0.10005471855401993,
0.060449160635471344,
0.01542660128325224,
-0.004324020817875862,
-0.05668225139379501,
0.1837732344865799,
0.155902698636055,
0.0765632763504982,
0.023013532161712646,
-0.03989135101437569,
-0.010386322624981403,
-0.08252672851085663,
0.053156621754169464,
0.021451501175761223,
0.06970103830099106,
-0.029784470796585083,
-0.0939372256398201,
-0.022889018058776855,
-0.009468834847211838,
0.01809832639992237,
-0.07906723767518997,
-0.10656610131263733,
-0.049058038741350174,
0.25450626015663147,
-0.04085279628634453,
0.04993265122175217,
0.05822458863258362,
0.025863803923130035,
-0.04951140657067299,
0.04286271706223488,
-0.02748226933181286,
-0.014241780154407024,
-0.04297120124101639,
0.005459517706185579,
0.005105543881654739,
-0.04276607558131218,
-0.0614464208483696,
-0.027088014408946037,
-0.09387634694576263,
-0.0942683070898056,
0.004557652864605188,
-0.037895459681749344,
0.005469997879117727,
-0.024004673585295677,
0.01700548082590103,
-0.09166375547647476,
-0.02903260849416256,
0.028122784569859505,
0.02522461675107479,
-0.06692544370889664,
-0.1315114051103592,
-0.009695925749838352,
-0.014599474146962166,
-0.05197255313396454,
0.0344419963657856,
-0.12515580654144287,
-0.1071460098028183,
-0.06916803866624832,
-0.04696384444832802,
0.12172803282737732,
0.22831670939922333,
-0.02008080668747425,
-0.003547015832737088,
0.15178048610687256,
-0.0980566143989563,
-0.30662888288497925,
-0.16827192902565002,
-0.16032983362674713,
-0.09754085540771484,
0.03170530125498772,
-0.07462979853153229,
0.024217186495661736,
0.08133111894130707,
-0.0352763868868351,
0.19742710888385773,
-0.20655962824821472,
-0.09647366404533386,
0.06029215082526207,
0.09658164530992508,
0.3111753463745117,
-0.25673216581344604,
0.011882991530001163,
-0.10435093194246292,
-0.04179218038916588,
0.008569901809096336,
-0.09085570275783539,
0.12182977795600891,
0.03973395377397537,
0.06762642413377762,
-0.006109908223152161,
-0.002003055764362216,
0.14867697656154633,
-0.0831596627831459,
0.12990733981132507,
-0.11561617255210876,
-0.11959398537874222,
0.19482897222042084,
-0.031041879206895828,
-0.006880649831146002,
-0.21253293752670288,
-0.034114737063646317,
-0.027524277567863464,
0.03488563001155853,
-0.00941463466733694,
0.0475756861269474,
-0.014196128584444523,
-0.0143423518165946,
-0.1391768753528595,
-0.013911817222833633,
-0.034096889197826385,
0.06396085023880005,
0.22315891087055206,
-0.06895064562559128,
-0.05687762424349785,
0.025882091373205185,
0.004378364887088537,
0.11381620913743973,
0.019404670223593712,
-0.0628914088010788,
-0.046171803027391434,
0.09802404046058655,
-0.2029762864112854,
0.05563274025917053,
0.0013793912949040532,
-0.003973090089857578,
0.017587149515748024,
0.0065125091932713985,
0.021957287564873695,
0.13513724505901337,
0.17715021967887878,
0.009844664484262466,
-0.04495319724082947,
-0.01636415906250477,
0.010008787736296654,
0.13030213117599487,
-0.022892577573657036,
0.10522860288619995,
0.03331242501735687,
0.028421295806765556,
0.006412319839000702,
0.05769664794206619,
-0.0762358158826828,
-0.10212749987840652,
0.09746351093053818,
-0.048268597573041916,
-0.08013158291578293,
0.08072466403245926,
0.055821046233177185,
0.059648800641298294,
0.001450405572541058,
0.05550771579146385,
0.017854226753115654,
-0.12309826165437698,
0.010382741689682007,
0.20237049460411072,
-0.06983039528131485,
-0.05967381224036217,
-0.06905960291624069,
0.00817268155515194,
-0.13007692992687225,
0.09894263744354248,
0.031042028218507767,
-0.027197744697332382,
0.1323348581790924,
-0.04149994999170303,
-0.034714143723249435,
0.0009332734043709934,
-0.060656361281871796,
0.03134165704250336,
-0.16021859645843506,
-0.1931784600019455,
0.06212175264954567,
-0.00028537577600218356,
-0.0642375648021698,
-0.09258005023002625,
-0.09408831596374512,
0.06069311499595642,
-0.17045536637306213,
0.14853669703006744,
-0.07575461268424988,
0.05268285423517227,
-0.04414529725909233,
-0.05019046366214752,
-0.11311478167772293,
-0.022300265729427338,
-0.04654337465763092,
-0.017659703269600868,
0.05523497611284256,
0.015365811996161938,
-0.1033492386341095,
-0.11425215750932693,
0.0651148185133934,
-0.008231590501964092,
-0.009960833936929703,
0.01404705923050642,
-0.06959253549575806,
0.016483603045344353,
-0.21994489431381226,
-0.0513581857085228,
0.09121003746986389,
0.040915876626968384,
-0.08982102572917938,
0.1221710592508316,
0.04321298748254776,
-0.029824862256646156,
0.05105755105614662,
0.00029357479070313275,
0.18548570573329926,
-0.07329947501420975,
0.023093722760677338,
-0.12482761591672897,
-0.16315415501594543,
-0.03214452043175697,
0.0304567813873291,
0.23897674679756165,
0.07282699644565582,
0.11130635440349579,
-0.0506737194955349,
0.029478034004569054,
-0.015578240156173706,
0.0709935873746872,
0.00933556817471981,
-0.09118744730949402,
-0.045336104929447174,
-0.17395725846290588,
-0.06615094840526581,
-0.061817947775125504,
0.1570085734128952,
0.02673935890197754,
-0.15054389834403992,
-0.0040587009862065315,
0.09895002096891403,
-0.17039862275123596,
-0.0066436161287128925,
0.18854168057441711,
-0.04150482267141342,
0.02385353483259678,
-0.13985773921012878,
0.03418038785457611,
0.07657542824745178,
-0.02233066037297249,
-0.014780265279114246,
0.12138378620147705,
0.0054868063889443874,
0.011214612983167171,
0.03200894221663475,
-0.034341324120759964,
0.08443008363246918,
-0.0773286521434784,
0.041948094964027405,
-0.013065189123153687,
-0.04234689846634865,
-0.10823120176792145,
0.1968442052602768,
-0.010579755529761314,
0.010390996932983398,
-0.06055840849876404,
-0.0052140685729682446,
-0.09747769683599472,
-0.08216114342212677,
-0.07884179800748825,
-0.1265089362859726,
0.0753961130976677,
-0.05790093168616295,
0.005079702939838171,
-0.011787713505327702,
0.01074292417615652,
-0.07487864047288895,
0.024443015456199646,
-0.16614435613155365,
-0.05133363977074623,
0.016467835754156113,
-0.007347806356847286,
-0.02254640683531761,
-0.04286224767565727,
-0.03870534896850586,
0.018555626273155212,
-0.0658273994922638,
-0.06883636862039566,
0.06297193467617035,
0.08010247349739075,
0.06827641278505325,
-0.16244569420814514,
-0.1090492233633995,
-0.0669349879026413,
0.031081590801477432,
0.08030594140291214,
0.1851421296596527,
0.03658358380198479,
-0.010197160765528679,
0.039196696132421494,
0.13458165526390076,
0.017445379868149757,
-0.06565988063812256,
-0.07313533127307892,
-0.13244329392910004,
-0.1433836966753006,
-0.025998057797551155,
-0.06233416870236397,
-0.02417485974729061,
0.025531280785799026,
0.25747567415237427,
0.1745986044406891,
-0.1536565124988556,
0.0365602932870388,
-0.07219652831554413,
0.037601083517074585,
-0.03568257763981819,
0.16129301488399506,
0.05273117870092392,
0.16362805664539337,
-0.029056962579488754,
-0.042076680809259415,
-0.06532836705446243,
0.015524814836680889,
-0.10164597630500793,
0.035014476627111435,
-0.016283031553030014,
-0.06403471529483795,
-0.06104040518403053,
0.10088885575532913,
-0.11694929003715515,
0.06418953090906143,
0.190047949552536,
-0.1489504873752594,
-0.015708444640040398,
-0.04506669566035271,
0.03416271135210991,
0.11708926409482956,
0.02079072780907154,
-0.07402551919221878,
-0.02380830980837345,
0.013321048580110073,
0.03256240114569664,
-0.17610721290111542,
-0.10179859399795532,
-0.004881261382251978,
-0.15364976227283478,
0.14521507918834686,
-0.007316258270293474,
-0.004060625564306974,
0.03756372258067131,
-0.06565602868795395,
0.00013764621689915657,
0.17768403887748718,
0.02215273678302765,
-0.03330352529883385,
-0.02554333582520485,
-0.0615728460252285,
-0.10084313154220581,
0.07431446015834808,
0.08914005756378174,
0.049569934606552124,
0.0018524981569498777,
0.16856923699378967,
-0.02296064794063568,
-0.045399729162454605,
0.14274577796459198,
-0.18130069971084595,
0.09827248007059097,
-0.0019020678009837866,
-0.024808477610349655,
-0.07222804427146912,
-0.031308386474847794,
0.03927212953567505,
0.07556270807981491,
-0.1690348982810974,
-0.05047336593270302,
0.04912929609417915,
-0.09884755313396454,
0.053524114191532135,
0.04419661685824394,
-0.09717687964439392,
0.018400045111775398,
-0.12429648637771606,
-0.006156997289508581,
-0.10550945997238159,
0.04781563580036163,
0.19236540794372559,
-0.03250609710812569,
0.01840229518711567,
-0.1306770294904709,
0.05083434283733368,
-0.02815263904631138,
-0.045292362570762634,
-0.08052093535661697
] |
null | null | transformers | The following models were included in the merge:
* [Test157t/Hex-Macaroniac-7b](https://huggingface.co/Test157t/Hex-Macaroniac-7b)
* [maywell/PiVoT-0.1-Evil-a](https://huggingface.co/maywell/PiVoT-0.1-Evil-a)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: maywell/PiVoT-0.1-Evil-a
        normalize: true
        layer_range: [0, 32]
      - model: Test157t/Hex-Macaroniac-7b
        layer_range: [0, 32]
merge_method: slerp
base_model: Test157t/Hex-Macaroniac-7b
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
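As a quick check of the result, the merged checkpoint can be loaded like any other Mistral-style causal LM; a config like the one above is normally executed with mergekit's `mergekit-yaml` entry point (config path plus an output directory). The snippet below is a minimal sketch rather than part of the original card: the repository id comes from this model's listing, the prompt is a placeholder, and `bfloat16` is chosen to match the `dtype` in the config above.

```python
# Minimal sketch (not from the original card): load the merged model and
# generate a short completion with the standard transformers API.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Test157t/EvilxHexMacaroniac-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches `dtype: bfloat16` in the merge config
    device_map="auto",
)

prompt = "Hello, my name is"  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```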
| {"library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": ["Test157t/Hex-Macaroniac-7b", "maywell/PiVoT-0.1-Evil-a"]} | text-generation | Test157t/EvilxHexMacaroniac-7b | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"mergekit",
"merge",
"base_model:Test157t/Hex-Macaroniac-7b",
"base_model:maywell/PiVoT-0.1-Evil-a",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2024-02-15T07:32:59+00:00 | [] | [] | TAGS
#transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-Test157t/Hex-Macaroniac-7b #base_model-maywell/PiVoT-0.1-Evil-a #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| The following models were included in the merge:
* Test157t/Hex-Macaroniac-7b
* maywell/PiVoT-0.1-Evil-a
### Configuration
The following YAML configuration was used to produce this model:
| [
"### Configuration\n\nThe following YAML configuration was used to produce this model:"
] | [
"TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-Test157t/Hex-Macaroniac-7b #base_model-maywell/PiVoT-0.1-Evil-a #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Configuration\n\nThe following YAML configuration was used to produce this model:"
] | [
89,
17
] | [
"passage: TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-Test157t/Hex-Macaroniac-7b #base_model-maywell/PiVoT-0.1-Evil-a #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Configuration\n\nThe following YAML configuration was used to produce this model:"
] | [
-0.08101166784763336,
-0.08388960361480713,
-0.00031923589995130897,
-0.02836555615067482,
0.08170101791620255,
0.023936206474900246,
0.19764800369739532,
0.04153187572956085,
-0.033912815153598785,
0.037311483174562454,
0.16094253957271576,
0.04909700155258179,
0.018173102289438248,
0.090996652841568,
-0.06491360813379288,
-0.15510191023349762,
0.08674842864274979,
0.027537768706679344,
-0.09695224463939667,
0.10738491266965866,
0.07666938006877899,
-0.05123429372906685,
0.12009382992982864,
0.020444728434085846,
-0.14809860289096832,
0.07181478291749954,
-0.030806422233581543,
-0.024433188140392303,
0.0799933448433876,
0.09444131702184677,
0.08521120250225067,
0.03905048966407776,
-0.03521827980875969,
-0.051877014338970184,
0.04788642376661301,
0.0027040785644203424,
-0.009993704967200756,
0.06503130495548248,
0.056648917496204376,
-0.0377805233001709,
0.07533006370067596,
0.0032194319646805525,
0.00013746510376222432,
-0.005385424010455608,
-0.1174403503537178,
0.020017685368657112,
-0.03528773412108421,
0.09814779460430145,
0.14046378433704376,
0.0999835804104805,
-0.02421235851943493,
0.08875217288732529,
-0.0018609659746289253,
0.07080888748168945,
0.09614082425832748,
-0.22940635681152344,
-0.021771173924207687,
0.07555075734853745,
0.08031411468982697,
0.008861149661242962,
0.060333337634801865,
0.013342359103262424,
0.10858984291553497,
-0.026691053062677383,
-0.05859869718551636,
-0.024833545088768005,
0.11887894570827484,
0.012292074970901012,
-0.121692955493927,
-0.03692461922764778,
0.21141189336776733,
-0.044596973806619644,
0.058089740574359894,
-0.0012850555358454585,
-0.1125914454460144,
0.014394792728126049,
0.0024895169772207737,
-0.012235104106366634,
-0.05936688184738159,
0.017976218834519386,
0.04941727966070175,
-0.02753051370382309,
-0.08883921802043915,
-0.04850197210907936,
-0.12746453285217285,
0.2514726221561432,
0.04012848064303398,
0.028108982369303703,
-0.0752059742808342,
0.07189112901687622,
-0.0051050940528512,
-0.12565676867961884,
0.05149007588624954,
-0.06637392938137054,
0.03196028620004654,
-0.04520757868885994,
-0.0785948783159256,
-0.09836331009864807,
0.13273213803768158,
0.0770476683974266,
-0.004580218810588121,
0.028656495735049248,
0.03487405925989151,
0.05776882544159889,
0.02588258869946003,
0.02142762765288353,
-0.1290140450000763,
-0.042204830795526505,
0.04935406520962715,
0.09548421949148178,
0.07422405481338501,
-0.0030779328662902117,
-0.12528273463249207,
-0.0292426198720932,
0.1046762615442276,
0.018071701750159264,
0.04514450207352638,
0.13987232744693756,
-0.05344105884432793,
-0.019938087090849876,
0.10928642004728317,
-0.06172044575214386,
-0.005445700138807297,
0.024622641503810883,
0.005359821021556854,
-0.04135041683912277,
0.04962396249175072,
0.039744287729263306,
-0.05123397335410118,
0.11843691766262054,
-0.09648638963699341,
0.00011730789992725477,
-0.06109516695141792,
-0.09772235155105591,
-0.008974169380962849,
0.03083915449678898,
0.025365859270095825,
-0.08314435184001923,
-0.27707433700561523,
-0.005255736876279116,
0.021795354783535004,
-0.023269932717084885,
-0.0353844054043293,
-0.07170850038528442,
-0.0032248003408312798,
0.01448890008032322,
-0.038905069231987,
0.044387396425008774,
-0.054460737854242325,
0.016198642551898956,
0.03352325037121773,
0.03104124590754509,
-0.09221018850803375,
0.058278638869524,
-0.11919562518596649,
0.07168012857437134,
-0.14279933273792267,
0.06568925827741623,
-0.04563959315419197,
0.1261502504348755,
-0.03576198220252991,
0.028152519837021828,
-0.0697779655456543,
0.0625108927488327,
0.06399676948785782,
0.24039573967456818,
-0.10881412029266357,
-0.10582901537418365,
0.022866548970341682,
-0.18014168739318848,
-0.18717369437217712,
0.06674367934465408,
-0.024490108713507652,
0.06890394538640976,
0.07475583255290985,
0.2318023294210434,
0.005682481452822685,
-0.05341154709458351,
0.061658330261707306,
0.025210432708263397,
-0.08894804120063782,
-0.009830613620579243,
0.02090844325721264,
-0.031087540090084076,
-0.2819284498691559,
0.04195525497198105,
-0.018322870135307312,
0.08203046023845673,
-0.06915560364723206,
-0.034934546798467636,
-0.09271024912595749,
-0.023788850754499435,
0.07344678789377213,
-0.005952584091573954,
0.06932490319013596,
-0.06527248024940491,
-0.007877307943999767,
0.06953688710927963,
0.05701536685228348,
-0.06358073651790619,
-0.004080044571310282,
-0.00242475769482553,
0.14672346413135529,
-0.16703836619853973,
0.050618913024663925,
-0.11000337451696396,
-0.0006577169988304377,
-0.0515567772090435,
0.0922011286020279,
0.03097735345363617,
0.06566501408815384,
0.08565956354141235,
0.014157634228467941,
-0.024317491799592972,
-0.07402393966913223,
0.17638270556926727,
0.07906150072813034,
-0.12241916358470917,
-0.1999310702085495,
-0.005523740779608488,
-0.05785221606492996,
0.123639315366745,
-0.15591388940811157,
0.026423679664731026,
-0.05311143770813942,
0.13199330866336823,
-0.04744461178779602,
0.10171549022197723,
0.05116533860564232,
0.023732267320156097,
-0.06207888945937157,
-0.016220822930336,
0.07195228338241577,
0.021542107686400414,
-0.10923706740140915,
0.15114329755306244,
-0.191341832280159,
0.18754449486732483,
0.16961804032325745,
-0.1173638328909874,
-0.011877048760652542,
-0.06165122240781784,
-0.017382599413394928,
-0.0663125142455101,
0.07664147019386292,
-0.0754735916852951,
0.03348836302757263,
0.012143030762672424,
0.1658540517091751,
-0.0622793585062027,
0.012278196401894093,
-0.0349850207567215,
-0.09207038581371307,
-0.05059727281332016,
0.09924285113811493,
-0.075043223798275,
-0.2029506117105484,
0.17076119780540466,
0.20312519371509552,
-0.04733613505959511,
0.12758760154247284,
0.005903827957808971,
-0.03425341099500656,
0.007555998396128416,
-0.011445868760347366,
-0.02524765580892563,
-0.009443755261600018,
-0.1110425665974617,
0.029168512672185898,
0.032870855182409286,
-0.03295477107167244,
0.11943595856428146,
-0.11893951147794724,
-0.0036795102059841156,
0.0382770374417305,
0.014833111315965652,
0.05762725695967674,
0.06842342019081116,
-0.00805643666535616,
0.08697627484798431,
-0.015901388600468636,
-0.05937634035944939,
0.05822104588150978,
0.025176377967000008,
-0.1223272904753685,
0.17858019471168518,
-0.12761545181274414,
-0.22994880378246307,
-0.1529877781867981,
-0.04051327705383301,
-0.13056273758411407,
0.02697238326072693,
0.06654070317745209,
-0.05824761465191841,
-0.04474529251456261,
-0.11284542828798294,
0.08406485617160797,
-0.01210104301571846,
-0.01284531969577074,
-0.05805680528283119,
-0.019980538636446,
0.01115379948168993,
-0.08650816231966019,
-0.010465612635016441,
-0.023987162858247757,
0.01162975188344717,
0.07294352352619171,
-0.1550150066614151,
0.04439760372042656,
0.12818674743175507,
0.030098991468548775,
0.020316146314144135,
0.0038236675318330526,
0.19110116362571716,
0.004450263921171427,
-0.008771429769694805,
0.12928709387779236,
-0.0827454999089241,
0.06313715130090714,
0.19678038358688354,
-0.024008983746170998,
-0.024034352973103523,
-0.0014698280720040202,
-0.06450812518596649,
-0.03176103159785271,
-0.1300269365310669,
-0.14817595481872559,
-0.07970272749662399,
0.0013350150547921658,
0.10634162276983261,
0.051772892475128174,
0.11727265268564224,
0.11745191365480423,
-0.061492182314395905,
0.015113146044313908,
-0.00903941411525011,
0.08786210417747498,
0.18620151281356812,
-0.009736644104123116,
0.06429856270551682,
-0.05133948475122452,
-0.10286732763051987,
0.04690168425440788,
0.06913982331752777,
0.1413494050502777,
0.06187112629413605,
0.006696476601064205,
0.056044258177280426,
0.05819075182080269,
0.10537868738174438,
0.15221403539180756,
-0.021572241559624672,
-0.047652408480644226,
-0.028779663145542145,
-0.09300798922777176,
-0.03298027813434601,
0.07560598850250244,
-0.059811484068632126,
0.04033343866467476,
-0.03284941986203194,
-0.05111095681786537,
0.06583254784345627,
0.11736848950386047,
0.08835706114768982,
-0.30319809913635254,
-0.027139125391840935,
0.04906061664223671,
0.009921521879732609,
-0.041600607335567474,
-0.012909643352031708,
-0.031040776520967484,
0.022831546142697334,
0.11643559485673904,
-0.0003429770586080849,
0.09033933281898499,
0.045237455517053604,
0.07666275650262833,
-0.1013604924082756,
0.06070519611239433,
-0.03384171798825264,
0.08849135041236877,
-0.2491380125284195,
0.25885239243507385,
0.02476179413497448,
-0.0037703325506299734,
-0.03208575397729874,
-0.03084149956703186,
0.05303538590669632,
0.23570597171783447,
-0.009001128375530243,
0.005849344190210104,
0.004935760982334614,
-0.007431246805936098,
-0.08643648028373718,
0.023750964552164078,
-0.039093345403671265,
-0.004197976551949978,
0.05842515081167221,
-0.012567267753183842,
-0.00963791087269783,
0.026087068021297455,
0.0529683455824852,
-0.12196807563304901,
-0.1344202756881714,
0.05287010967731476,
0.02820887789130211,
0.07128401845693588,
-0.057327356189489365,
-0.07020610570907593,
-0.035571321845054626,
0.19185000658035278,
0.08438052237033844,
-0.10098046064376831,
-0.10358734428882599,
0.020703742280602455,
0.1283230036497116,
-0.024538055062294006,
0.0021290858276188374,
-0.02996986173093319,
0.02920142374932766,
-0.011929340660572052,
-0.18781718611717224,
0.09318991750478745,
-0.0968158021569252,
-0.03794672340154648,
-0.039057567715644836,
0.08930282294750214,
-0.08150024712085724,
-0.018972177058458328,
0.017341427505016327,
0.037596892565488815,
-0.17646187543869019,
-0.06250353902578354,
-0.022168880328536034,
0.04277332127094269,
0.06610500067472458,
0.06449202448129654,
-0.0009399270638823509,
-0.12581494450569153,
-0.010538510978221893,
-0.0035697869025170803,
0.11193054169416428,
0.17034471035003662,
0.0009670073050074279,
-0.008669162169098854,
0.17218872904777527,
-0.07459709793329239,
-0.22304049134254456,
-0.06620991975069046,
-0.02682340517640114,
0.03382875397801399,
-0.07961829751729965,
-0.017128530889749527,
0.07627610862255096,
0.06235406547784805,
-0.005998305510729551,
0.02058486081659794,
-0.1982453316450119,
-0.15164567530155182,
0.15540674328804016,
0.07375933229923248,
0.3378381133079529,
-0.1425102800130844,
-0.08660364896059036,
-0.10865514725446701,
-0.04245363548398018,
0.03309642896056175,
-0.19683994352817535,
0.06327293068170547,
-0.014937955886125565,
0.041586682200431824,
0.02346009761095047,
-0.0617404505610466,
0.12900030612945557,
-0.032429758459329605,
0.07659721374511719,
-0.0754297748208046,
-0.030434992164373398,
0.06300089508295059,
-0.03386741504073143,
0.08219637721776962,
-0.13695190846920013,
0.06075400486588478,
-0.07312571257352829,
-0.013700979761779308,
-0.009493517689406872,
0.08114178478717804,
-0.008654673583805561,
-0.008065450936555862,
-0.006699625868350267,
-0.035236652940511703,
-0.02733343094587326,
-0.025663629174232483,
0.102646604180336,
-0.05101578310132027,
0.10364647209644318,
0.17876102030277252,
0.08624343574047089,
-0.11268540471792221,
0.06990670412778854,
0.06237135827541351,
-0.06375560909509659,
0.11727648973464966,
-0.09831025451421738,
-0.005529540590941906,
0.09264498949050903,
-0.02688823826611042,
0.07427961379289627,
0.05102703347802162,
-0.00035400892375037074,
-0.024924471974372864,
0.11291012912988663,
-0.25443729758262634,
-0.12150286883115768,
-0.05959053337574005,
0.0032915486954152584,
-0.007655934896320105,
0.08305908739566803,
0.12368763238191605,
-0.03470800071954727,
-0.015570015646517277,
0.0006178229814395308,
-0.015113134868443012,
-0.08081774413585663,
0.15313133597373962,
0.012935188598930836,
0.026315754279494286,
-0.13024307787418365,
0.02203432470560074,
-0.025863809511065483,
-0.08882736414670944,
0.018547160550951958,
0.0528097078204155,
-0.1268995702266693,
-0.1129104495048523,
-0.062070343643426895,
0.3020905554294586,
-0.07467565685510635,
-0.07900947332382202,
-0.09128977358341217,
-0.14811654388904572,
0.014396590180695057,
0.14497298002243042,
0.06512964516878128,
0.04196365550160408,
-0.0028279274702072144,
-0.07363680005073547,
-0.019428573548793793,
0.05480344593524933,
0.07813646644353867,
0.06960390508174896,
-0.09251951426267624,
0.05035408213734627,
-0.03474937379360199,
0.03509850800037384,
-0.07848867028951645,
0.009499462321400642,
-0.16410183906555176,
-0.016705762594938278,
-0.1937723159790039,
-0.00829986296594143,
-0.12673607468605042,
-0.022000804543495178,
0.03433084115386009,
-0.06216806918382645,
-0.019103828817605972,
-0.007987777702510357,
-0.06862153857946396,
0.034901078790426254,
-0.007965929806232452,
0.016066202893853188,
-0.03104044310748577,
0.006400852929800749,
0.04559013247489929,
-0.06379280984401703,
0.02994409203529358,
0.10692159831523895,
-0.10014747828245163,
-0.0007383050979115069,
-0.1821001023054123,
-0.06613701581954956,
0.04688028246164322,
-0.006660931743681431,
0.061650559306144714,
-0.07886487990617752,
-0.00847470574080944,
0.12428436428308487,
0.032258205115795135,
-0.014858799986541271,
0.09035959839820862,
-0.043349433690309525,
0.0044354903511703014,
-0.03409171476960182,
-0.07534801959991455,
-0.001019457820802927,
-0.0349125899374485,
0.09978850185871124,
0.049580153077840805,
0.1761593371629715,
-0.10075134038925171,
-0.00977258663624525,
-0.10333411395549774,
0.014762606471776962,
0.006461127661168575,
-0.16430628299713135,
-0.0669453889131546,
-0.10730553418397903,
0.020671233534812927,
0.025905346497893333,
0.22554154694080353,
-0.005566079169511795,
-0.012299059890210629,
0.0077361767180264,
0.09319703280925751,
0.09509299695491791,
0.07437289506196976,
0.28658851981163025,
-0.019910652190446854,
0.028049547225236893,
-0.12920112907886505,
0.06221281737089157,
0.031114907935261726,
-0.019186709076166153,
-0.00550061883404851,
0.04629715904593468,
-0.07195524126291275,
0.0927661880850792,
0.041665080934762955,
0.029602665454149246,
0.020883481949567795,
-0.17109127342700958,
-0.0577407144010067,
0.046625420451164246,
0.03638562187552452,
0.14327062666416168,
0.2222101241350174,
-0.019339827820658684,
-0.013089860789477825,
-0.04011223465204239,
-0.040441546589136124,
-0.07785753905773163,
-0.07961924374103546,
-0.12207255512475967,
-0.20255900919437408,
-0.008519853465259075,
-0.08384348452091217,
-0.02262497879564762,
0.07346072047948837,
0.037936367094516754,
-0.03769352287054062,
0.13470688462257385,
0.2029268741607666,
-0.0501800999045372,
0.03171920031309128,
-0.03928857296705246,
-0.0023892077151685953,
-0.04327589273452759,
-0.031098168343305588,
0.0036010954063385725,
-0.04427533969283104,
-0.03760729730129242,
0.029550420120358467,
0.005292981863021851,
0.08167027682065964,
-0.0795087143778801,
-0.0882076844573021,
-0.022267615422606468,
0.03465672582387924,
0.026427147909998894,
0.08001869916915894,
0.012853527441620827,
-0.035853832960128784,
0.004863686393946409,
0.08019774407148361,
-0.05606253817677498,
-0.13976456224918365,
-0.07482947409152985,
0.16117773950099945,
-0.04685412719845772,
0.06602121889591217,
-0.008035453036427498,
-0.028895244002342224,
0.03722778335213661,
0.22396370768547058,
0.28896284103393555,
-0.08516409248113632,
0.011833914555609226,
-0.08685696870088577,
0.01900152489542961,
0.015938445925712585,
0.05658735707402229,
0.021088993176817894,
0.056465018540620804,
-0.08421822637319565,
0.023387132212519646,
-0.060759760439395905,
-0.04300466924905777,
-0.1369357854127884,
-0.036181800067424774,
0.042960938066244125,
-0.09339220076799393,
-0.014584983699023724,
0.09295832365751266,
-0.056893765926361084,
0.07355508208274841,
-0.07320757955312729,
-0.07700802385807037,
-0.05546761304140091,
-0.05062303692102432,
0.1636195033788681,
-0.012108623050153255,
0.06196342036128044,
-0.061911653727293015,
-0.00040694422204978764,
0.0025583391543477774,
-0.04346807673573494,
-0.09220132231712341,
-0.06953762471675873,
0.06848209351301193,
-0.05745941027998924,
-0.04065382853150368,
0.01719098910689354,
0.05380704253911972,
0.09641732275485992,
0.023104868829250336,
-0.12398361414670944,
0.06796033680438995,
-0.030070148408412933,
-0.019103867933154106,
0.0682717114686966,
-0.06407120823860168,
0.047696638852357864,
-0.12486302107572556,
0.02521464414894581,
-0.14245879650115967,
0.05022047087550163,
-0.02561500295996666,
-0.006900089327245951,
-0.025746095925569534,
0.034414201974868774,
-0.013849391601979733,
0.12427213042974472,
0.09449265152215958,
-0.057667411863803864,
0.021246859803795815,
-0.02802448347210884,
0.0053833601996302605,
-0.011425072327256203,
0.07954585552215576,
0.00448640575632453,
-0.20361235737800598,
-0.03942587971687317,
0.1170988455414772,
-0.008719634264707565,
-0.2621275782585144,
-0.030740398913621902,
-0.1673852950334549,
-0.028909344226121902,
-0.05062126740813255,
0.10228867828845978,
0.18676234781742096,
0.029195228591561317,
-0.0183644350618124,
-0.147647887468338,
0.014207925647497177,
0.11046496033668518,
-0.04083900526165962,
-0.11440081149339676
] |
null | null | transformers | GGUF version for [Test157t/EvilxEchidna-7b](https://huggingface.co/Test157t/EvilxEchidna-7b) | {"library_name": "transformers", "pipeline_tag": "text-generation"} | text-generation | konz00/EvilxEchidna-7b-GGUF | [
"transformers",
"gguf",
"text-generation",
"endpoints_compatible",
"region:us"
] | 2024-02-15T07:34:38+00:00 | [] | [] | TAGS
#transformers #gguf #text-generation #endpoints_compatible #region-us
| GGUF version for Test157t/EvilxEchidna-7b | [] | [
"TAGS\n#transformers #gguf #text-generation #endpoints_compatible #region-us \n"
] | [
25
] | [
"passage: TAGS\n#transformers #gguf #text-generation #endpoints_compatible #region-us \n"
] | [
-0.009579749777913094,
0.016051435843110085,
-0.00778944743797183,
-0.0606827586889267,
0.17959557473659515,
0.04390456900000572,
0.05772693082690239,
0.08898092061281204,
0.0652620941400528,
0.0004920579376630485,
0.13663704693317413,
0.12208554148674011,
0.004242816939949989,
0.07835976779460907,
-0.07947206497192383,
-0.2472376823425293,
0.10715211927890778,
0.056565575301647186,
-0.08158249408006668,
0.053426869213581085,
0.046937670558691025,
-0.01570023223757744,
0.09193092584609985,
-0.045532554388046265,
-0.20299391448497772,
0.046174556016922,
0.03110901266336441,
-0.06774543225765228,
0.07453011721372604,
0.10531035810709,
0.09625300019979477,
0.0060175945982337,
-0.15509317815303802,
-0.23109441995620728,
0.03242243826389313,
0.017691269516944885,
-0.10005950927734375,
0.001726817456074059,
0.032374605536460876,
-0.10376358032226562,
0.0906355157494545,
0.04741545394062996,
-0.09837349504232407,
0.0898582860827446,
-0.1877211183309555,
-0.08756867796182632,
-0.04876207560300827,
-0.03491636738181114,
0.01505382638424635,
0.06442180275917053,
-0.013353051617741585,
-0.04188106954097748,
-0.0708744004368782,
0.08128168433904648,
0.15614952147006989,
-0.3134511411190033,
-0.0046891276724636555,
0.18256108462810516,
0.0725054144859314,
0.0586109533905983,
-0.061780765652656555,
0.12141754478216171,
0.019321439787745476,
-0.012183133512735367,
-0.06757590919733047,
-0.08256939053535461,
-0.018916055560112,
0.1169932410120964,
-0.06847090274095535,
-0.06659001857042313,
0.19365370273590088,
-0.03753431886434555,
0.03441791608929634,
-0.019127504900097847,
-0.07204557210206985,
-0.045780349522829056,
-0.04394662380218506,
0.049071304500103,
-0.03188550844788551,
0.14522109925746918,
0.050162553787231445,
-0.09602239727973938,
-0.10060478001832962,
-0.03710633143782616,
-0.20041054487228394,
0.26695820689201355,
-0.02112429216504097,
0.08602497726678848,
-0.22190633416175842,
0.02547726221382618,
-0.15691377222537994,
-0.07777521759271622,
-0.03994326665997505,
-0.08868958801031113,
0.0007165444549173117,
0.03496411815285683,
-0.1054995059967041,
-0.05400620773434639,
0.1524859070777893,
0.10100986063480377,
-0.039991434663534164,
0.06680738180875778,
-0.05496573448181152,
0.09457684308290482,
0.007357186172157526,
0.08487115800380707,
0.004251695703715086,
-0.02679322101175785,
0.0028159006033092737,
-0.21648989617824554,
-0.02913138084113598,
-0.07440675050020218,
-0.13762322068214417,
-0.05771903321146965,
-0.06727461516857147,
0.08988132327795029,
-0.011513961479067802,
0.05322519689798355,
-0.014487922191619873,
0.029230542480945587,
0.01647862233221531,
-0.04698542132973671,
-0.016856428235769272,
-0.0006139421020634472,
0.042123325169086456,
0.13298702239990234,
-0.05349437892436981,
0.002476717112585902,
-0.04667213186621666,
0.045286741107702255,
-0.07442079484462738,
-0.010620219632983208,
-0.03220843896269798,
-0.017421729862689972,
0.01238013245165348,
-0.14715172350406647,
0.04519771412014961,
-0.1117033064365387,
-0.17961354553699493,
0.012365331873297691,
0.02541954070329666,
-0.027730686590075493,
0.05330321565270424,
-0.043738897889852524,
-0.03978561982512474,
0.05650516226887703,
-0.06196942925453186,
-0.05383311212062836,
-0.07760827243328094,
0.06954456865787506,
0.006530481390655041,
0.07355255633592606,
-0.19058609008789062,
0.06655817478895187,
-0.02438404969871044,
0.021579977124929428,
-0.137967050075531,
0.10443931818008423,
-0.06279003620147705,
0.17324736714363098,
0.0018789154710248113,
0.022720806300640106,
-0.14477799832820892,
0.07889829576015472,
-0.08049003779888153,
0.18953093886375427,
-0.06065136939287186,
-0.12099350243806839,
0.33334603905677795,
-0.07543940842151642,
-0.13493743538856506,
0.0689801499247551,
0.02829323336482048,
-0.025411535054445267,
0.09032180160284042,
0.2727244794368744,
0.037348054349422455,
0.007425226736813784,
0.08740752190351486,
0.17363785207271576,
-0.11715899407863617,
-0.08976823836565018,
0.02972390130162239,
-0.07228533923625946,
-0.13154412806034088,
0.038542065769433975,
0.017456594854593277,
0.13168790936470032,
-0.033883508294820786,
-0.006262729875743389,
-0.034181054681539536,
-0.007485904730856419,
0.02858523279428482,
-0.005158690735697746,
0.10653847455978394,
-0.06756997108459473,
0.00851160567253828,
-0.0791502594947815,
-0.04450387507677078,
-0.02837834320962429,
0.03613412007689476,
-0.02884381264448166,
0.10680101811885834,
-0.04231800138950348,
0.08531270921230316,
-0.1458498239517212,
-0.1421574354171753,
-0.025719186291098595,
0.10338063538074493,
-0.032638587057590485,
0.11590561270713806,
0.07598087936639786,
-0.07653150707483292,
-0.010251731611788273,
0.036094751209020615,
0.14052505791187286,
-0.01232639979571104,
-0.005007986444979906,
-0.02063787169754505,
0.10085758566856384,
-0.0727856233716011,
-0.07437864691019058,
-0.08648914098739624,
0.02121381089091301,
0.17789724469184875,
0.07818994671106339,
0.020845280960202217,
-0.013459769077599049,
-0.0017346810782328248,
-0.0031804873142391443,
-0.030983738601207733,
-0.022077839821577072,
0.07634211331605911,
-0.006328213028609753,
-0.12348480522632599,
0.22044001519680023,
-0.12334394454956055,
0.27089813351631165,
0.1863541156053543,
-0.21157202124595642,
0.040252525359392166,
-0.07312045991420746,
-0.003348080674186349,
0.0510278157889843,
0.08169369399547577,
-0.038423482328653336,
0.1006215363740921,
-0.011515987105667591,
0.14560577273368835,
-0.02439459040760994,
-0.015194443054497242,
-0.014426624402403831,
-0.024794336408376694,
-0.04696030542254448,
0.040061965584754944,
0.0040283603593707085,
-0.1499948799610138,
0.21265046298503876,
0.16437600553035736,
0.10120455920696259,
0.2301403433084488,
-0.052775606513023376,
-0.006179410964250565,
0.06962192803621292,
0.04979134351015091,
-0.02862548828125,
-0.023439612239599228,
-0.27340593934059143,
-0.04803221672773361,
0.058091115206480026,
0.08838862180709839,
0.1594451367855072,
-0.11775633692741394,
-0.07357568293809891,
0.011395692825317383,
-0.07802775502204895,
-0.032421816140413284,
0.10294470191001892,
-0.0038269758224487305,
0.07158119976520538,
0.01997542567551136,
0.05874188244342804,
0.11971315741539001,
-0.013143796473741531,
-0.07740460336208344,
0.19619908928871155,
-0.14169634878635406,
-0.28958284854888916,
-0.1811203956604004,
-0.19683189690113068,
-0.04825779050588608,
0.08782356977462769,
0.1169954314827919,
-0.1564473658800125,
-0.03824961185455322,
0.04453830048441887,
0.12497194111347198,
-0.13104239106178284,
0.03814001381397247,
0.021633630618453026,
0.0357159860432148,
-0.10764139890670776,
-0.08001222461462021,
-0.05617373436689377,
-0.010578549467027187,
-0.03604588285088539,
0.08127055317163467,
-0.1475372612476349,
0.11227642744779587,
0.13462066650390625,
0.07316968590021133,
0.10140632838010788,
-0.012971841730177402,
0.20399945974349976,
-0.13861972093582153,
-0.08125321567058563,
0.17675228416919708,
-0.0056007071398198605,
0.059339489787817,
0.08977903425693512,
0.0006586898816749454,
-0.13959509134292603,
-0.011487981304526329,
-0.027896802872419357,
-0.13169744610786438,
-0.1854529231786728,
-0.0752529427409172,
-0.1555994153022766,
0.0391971692442894,
-0.035490378737449646,
0.0891667827963829,
0.11595811694860458,
0.07381458580493927,
0.023958778008818626,
0.009837509132921696,
0.03174016997218132,
0.05706017464399338,
0.16474942862987518,
-0.012225826270878315,
0.06657849252223969,
-0.08221769332885742,
-0.06015617400407791,
0.09020262211561203,
0.08751114457845688,
0.17842808365821838,
0.1306467205286026,
0.14687080681324005,
0.040677037090063095,
-0.025401374325156212,
0.1577799767255783,
0.0967152863740921,
0.0021733229514211416,
-0.06181129068136215,
-0.014838159084320068,
-0.0008649908122606575,
-0.03697579726576805,
0.0287077184766531,
0.01853453554213047,
-0.2132999449968338,
-0.030649419873952866,
-0.18822017312049866,
0.13289867341518402,
0.022944875061511993,
0.04152572527527809,
-0.13737933337688446,
-0.012605909258127213,
0.11767126619815826,
-0.018848104402422905,
-0.11539360880851746,
0.08806689828634262,
0.07253143936395645,
-0.09908895939588547,
0.10482010990381241,
-0.04038228839635849,
0.12548591196537018,
-0.012067651376128197,
0.09052639454603195,
-0.054524339735507965,
-0.09063196182250977,
0.02122526243329048,
0.09754706919193268,
-0.2797391414642334,
0.20971086621284485,
0.02209780551493168,
-0.037529509514570236,
-0.05445846915245056,
0.003275763476267457,
0.02146379090845585,
0.17078383266925812,
0.13981258869171143,
0.024530909955501556,
-0.11239399015903473,
-0.07275530695915222,
0.0757327750325203,
0.050253208726644516,
0.1992543339729309,
-0.08020731806755066,
-0.029789097607135773,
-0.010982216335833073,
0.0018502890598028898,
-0.03299505263566971,
0.0070411707274615765,
0.0035219250712543726,
-0.22908763587474823,
0.06321971863508224,
0.06728363037109375,
0.1331171989440918,
-0.007174460217356682,
0.0938580259680748,
-0.0987543985247612,
0.20914298295974731,
-0.10293351858854294,
-0.0611550472676754,
-0.12668845057487488,
-0.07016585767269135,
0.055333029478788376,
-0.0459693968296051,
0.06438574194908142,
-0.09787745028734207,
0.004825790412724018,
-0.09091890603303909,
-0.2175038456916809,
0.11621169745922089,
-0.08599694073200226,
0.05296287685632706,
-0.025739246979355812,
0.14477863907814026,
-0.06959626078605652,
-0.03430679440498352,
0.02216348610818386,
0.01117691956460476,
-0.07169883698225021,
-0.14873681962490082,
0.036700405180454254,
0.0059575242921710014,
0.012029055505990982,
0.11613360792398453,
-0.012293081730604172,
0.07422421872615814,
0.019829515367746353,
-0.03598923981189728,
0.2794061303138733,
0.14528462290763855,
-0.023540012538433075,
0.16387425363063812,
0.1182202398777008,
-0.06954378634691238,
-0.3015587329864502,
-0.05622600018978119,
-0.17103037238121033,
-0.009528614580631256,
-0.13196390867233276,
-0.22375774383544922,
0.08889705687761307,
0.03337733820080757,
-0.002300830790773034,
0.23360541462898254,
-0.22890694439411163,
-0.03639558330178261,
0.09377763420343399,
-0.008005589246749878,
0.4988834857940674,
-0.18573682010173798,
-0.15002183616161346,
-0.0720629021525383,
-0.2446787804365158,
0.14136669039726257,
-0.07708404213190079,
0.11433251202106476,
0.0017991859931498766,
0.04815256968140602,
0.02332199737429619,
-0.06559228152036667,
0.17698130011558533,
0.04580013081431389,
0.03142178803682327,
-0.08555825054645538,
-0.008231335319578648,
0.07476440072059631,
0.008679834194481373,
-0.010487770661711693,
-0.06746388971805573,
-0.007445100229233503,
-0.1339777708053589,
-0.055862344801425934,
-0.07582037150859833,
0.03865489736199379,
0.0799262747168541,
-0.020358197391033173,
-0.05047819763422012,
-0.05339556559920311,
-0.016170836985111237,
0.04169759154319763,
0.25741541385650635,
-0.1039213016629219,
0.1687774360179901,
0.025945115834474564,
0.04493308067321777,
-0.20477426052093506,
-0.004618740640580654,
-0.046843063086271286,
-0.01980600692331791,
0.0935465469956398,
-0.16110028326511383,
0.05819813162088394,
0.07944425195455551,
-0.05106005072593689,
0.06884531676769257,
0.1359410583972931,
0.014532159082591534,
0.021173717454075813,
0.1467812955379486,
-0.18967050313949585,
-0.12311629951000214,
-0.07608576118946075,
-0.1257346123456955,
0.13505223393440247,
0.10438317060470581,
0.13074788451194763,
0.10108643770217896,
0.01909380964934826,
-0.04022539034485817,
-0.0070203510113060474,
-0.0709037110209465,
0.01386108249425888,
0.00817179773002863,
0.02584969438612461,
-0.15160691738128662,
0.0819007083773613,
-0.03959338366985321,
-0.18340301513671875,
-0.008320484310388565,
0.128557488322258,
-0.15399695932865143,
-0.09042810648679733,
-0.11891915649175644,
0.12760032713413239,
-0.14738141000270844,
-0.033070772886276245,
-0.010086797177791595,
-0.1264544129371643,
0.06237873435020447,
0.23380206525325775,
0.04952940717339516,
0.1272558718919754,
-0.0008513382636010647,
0.004208550788462162,
0.027461236342787743,
-0.10672212392091751,
-0.05929316207766533,
0.018589265644550323,
-0.09067489206790924,
-0.040657006204128265,
-0.06846732646226883,
0.1384158432483673,
-0.08021344989538193,
-0.05024665221571922,
-0.1991536170244217,
0.023239627480506897,
-0.12186726182699203,
-0.05760553851723671,
-0.12232958525419235,
-0.05573355779051781,
0.017527537420392036,
-0.032693274319171906,
-0.03974439203739166,
-0.039665255695581436,
-0.13093158602714539,
0.009885036386549473,
-0.03841420263051987,
0.00771184591576457,
-0.05131201073527336,
0.012187164276838303,
0.09882780909538269,
-0.03772640600800514,
0.12072275578975677,
0.17711962759494781,
-0.06689894944429398,
0.1500464826822281,
-0.26132556796073914,
-0.11999466270208359,
0.10191045701503754,
-0.045100219547748566,
0.03038318082690239,
0.1263023316860199,
0.03314247727394104,
0.04125086963176727,
0.01655580848455429,
0.0664728581905365,
-0.021218232810497284,
-0.11617148667573929,
-0.023018088191747665,
-0.08339186012744904,
-0.08072522282600403,
-0.05001387000083923,
-0.05469566211104393,
0.1577877551317215,
0.04257164150476456,
0.05085008218884468,
-0.021031958982348442,
0.08851058781147003,
0.029373785480856895,
0.019602583721280098,
0.031050896272063255,
-0.19086246192455292,
0.05620424449443817,
-0.07514713704586029,
0.003966974094510078,
0.022788207978010178,
0.34499022364616394,
-0.05453638732433319,
0.01073573436588049,
0.02188924327492714,
0.022196220234036446,
0.09242333471775055,
0.02054937183856964,
0.30792322754859924,
0.15773355960845947,
-0.047039639204740524,
-0.11123811453580856,
0.0990198403596878,
0.02290377765893936,
-0.04754915088415146,
0.12262656539678574,
0.050504595041275024,
-0.07515083253383636,
0.1626535952091217,
-0.051976338028907776,
-0.014229681342840195,
-0.016774345189332962,
-0.061775192618370056,
-0.03942084684967995,
0.030239198356866837,
-0.003761051222681999,
0.02561202459037304,
0.17811238765716553,
-0.06015368551015854,
0.09523767977952957,
0.023459063842892647,
-0.047965023666620255,
-0.1466878056526184,
-0.1085495874285698,
-0.04624064639210701,
-0.1821233481168747,
0.05373648926615715,
-0.11874468624591827,
0.07605502009391785,
0.1288948655128479,
0.05164184421300888,
0.002530656987801194,
0.1609726846218109,
0.04395661503076553,
-0.10577259957790375,
0.10061278194189072,
-0.06672453135251999,
0.047171708196401596,
0.07377781718969345,
-0.03795863687992096,
-0.06771766394376755,
-0.08864762634038925,
-0.00720007810741663,
0.08406072109937668,
-0.015572836622595787,
0.02659783698618412,
-0.16683253645896912,
-0.07534679025411606,
-0.05712404102087021,
0.11183422058820724,
-0.07889340072870255,
0.11370791494846344,
0.0017686465289443731,
-0.05400395765900612,
0.04991501197218895,
0.2105201780796051,
-0.059385403990745544,
0.025879031047225,
-0.04132382199168205,
0.07631102204322815,
0.05708041042089462,
0.1591852456331253,
-0.09176953881978989,
-0.017336152493953705,
-0.08061710000038147,
0.3502628803253174,
0.23014387488365173,
-0.052850354462862015,
0.02067919820547104,
0.05188370496034622,
0.05179297551512718,
0.2027982473373413,
0.10301763564348221,
0.06460312008857727,
0.27206170558929443,
-0.04580971226096153,
-0.079527847468853,
0.035237859934568405,
-0.04374440386891365,
-0.12283738702535629,
0.12384460121393204,
0.04579421505331993,
-0.06362751126289368,
-0.053369324654340744,
0.13975189626216888,
-0.2425307184457779,
0.11192592978477478,
0.019153717905282974,
-0.15003347396850586,
-0.00391812901943922,
-0.0630134865641594,
0.08068794012069702,
-0.0178346186876297,
0.10188694298267365,
-0.0034701921977102757,
-0.15066954493522644,
0.009997948072850704,
0.05140922963619232,
-0.28469008207321167,
-0.026178516447544098,
0.01790185272693634,
-0.009450799785554409,
-0.03424009680747986,
-0.03393350541591644,
-0.046783559024333954,
0.07180610299110413,
0.035729311406612396,
-0.007510825525969267,
0.05646010860800743,
-0.03149822726845741,
-0.042454853653907776,
-0.05317913368344307,
0.08055443316698074,
-0.013036716729402542,
-0.12445653975009918,
0.05525088682770729,
-0.15915890038013458,
0.056089311838150024,
0.0342070534825325,
-0.047501031309366226,
0.014744436368346214,
-0.05187036097049713,
-0.11269349604845047,
0.03578566014766693,
0.06540670990943909,
0.016533343121409416,
0.015689637511968613,
-0.024289604276418686,
0.013355252332985401,
0.013228170573711395,
-0.014545965008437634,
-0.12462245672941208,
-0.045247167348861694,
-0.11810245364904404,
0.19476470351219177,
-0.026068395003676414,
-0.15855787694454193,
0.03035876713693142,
-0.05968214571475983,
0.13963234424591064,
-0.0934998020529747,
0.08311305940151215,
0.13333196938037872,
0.03356841579079628,
-0.028563642874360085,
-0.1888405978679657,
0.09133175015449524,
0.07190821319818497,
-0.05834801867604256,
-0.10380897670984268
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-hi-colab
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice_13_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7522
- Wer: 1.0223
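The card itself stops at the evaluation numbers, so the following is a hedged usage sketch rather than an official example. It loads this checkpoint (the repository id appears in the card's metadata) through the standard `transformers` automatic-speech-recognition pipeline; the audio filename is a placeholder, and XLS-R checkpoints expect 16 kHz mono input.

```python
# Hedged sketch (not part of the original card): transcribe Hindi speech
# with the fine-tuned checkpoint via the ASR pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="surchand/wav2vec2-large-xls-r-300m-hi-colab",
)

# "sample_hi.wav" is a placeholder path; the audio should be 16 kHz mono,
# since that is the sampling rate XLS-R models were pretrained on.
result = asr("sample_hi.wav")
print(result["text"])
```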
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):
- learning_rate: 0.0003
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
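As referenced above, the listed hyperparameters map roughly onto `transformers.TrainingArguments` as follows. This is an illustrative reconstruction rather than the original training script; the output directory name is an assumption, and any mixed-precision or logging settings used in the actual run are omitted because they are not stated in the card.

```python
# Illustrative reconstruction (not the original script) of the hyperparameters
# listed above, expressed as transformers TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-hi-colab",  # assumed directory name
    learning_rate=3e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # effective train batch size: 2 * 8 = 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=20,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```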
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 5.9636 | 0.95 | 400 | 2.2746 | 1.0338 |
| 0.9533 | 1.89 | 800 | 0.8338 | 1.0699 |
| 0.5378 | 2.84 | 1200 | 0.7781 | 1.0159 |
| 0.4041 | 3.79 | 1600 | 0.7267 | 1.0191 |
| 0.3308 | 4.73 | 2000 | 0.6780 | 1.0237 |
| 0.2728 | 5.68 | 2400 | 0.6862 | 1.0203 |
| 0.2272 | 6.63 | 2800 | 0.6658 | 1.0266 |
| 0.198 | 7.57 | 3200 | 0.6819 | 1.0306 |
| 0.1787 | 8.52 | 3600 | 0.6930 | 1.0257 |
| 0.158 | 9.47 | 4000 | 0.7278 | 1.0318 |
| 0.1391 | 10.41 | 4400 | 0.7102 | 1.0319 |
| 0.1249 | 11.36 | 4800 | 0.7726 | 1.0190 |
| 0.1131 | 12.31 | 5200 | 0.7325 | 1.0253 |
| 0.1049 | 13.25 | 5600 | 0.7512 | 1.0227 |
| 0.095 | 14.2 | 6000 | 0.7580 | 1.0222 |
| 0.0835 | 15.15 | 6400 | 0.7161 | 1.0204 |
| 0.0817 | 16.09 | 6800 | 0.7530 | 1.0239 |
| 0.0707 | 17.04 | 7200 | 0.7613 | 1.0250 |
| 0.0682 | 17.99 | 7600 | 0.7412 | 1.0196 |
| 0.0599 | 18.93 | 8000 | 0.7700 | 1.0214 |
| 0.0593 | 19.88 | 8400 | 0.7522 | 1.0223 |
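A note on the Wer column: word error rate is (substitutions + deletions + insertions) divided by the number of reference words, so values slightly above 1.0 are possible when the model inserts extra words. As a hedged illustration (not taken from the training script), this metric is commonly computed with the `evaluate` library (older scripts used `datasets.load_metric`); the example transcripts below are made up.

```python
# Hedged illustration: computing word error rate the way trainer eval loops
# usually do, via the `evaluate` library's "wer" metric.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["नमस्ते दुनिया"]            # example model transcription
references = ["नमस्ते दुनिया कैसी हो"]      # example ground-truth transcript

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 2 deletions / 4 reference words = 0.50
```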
### Framework versions
- Transformers 4.32.1
- Pytorch 2.2.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["common_voice_13_0"], "metrics": ["wer"], "base_model": "facebook/wav2vec2-xls-r-300m", "model-index": [{"name": "wav2vec2-large-xls-r-300m-hi-colab", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "common_voice_13_0", "type": "common_voice_13_0", "config": "hi", "split": "test", "args": "hi"}, "metrics": [{"type": "wer", "value": 1.0222783552113959, "name": "Wer"}]}]}]} | automatic-speech-recognition | surchand/wav2vec2-large-xls-r-300m-hi-colab | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice_13_0",
"base_model:facebook/wav2vec2-xls-r-300m",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2024-02-15T07:35:45+00:00 | [] | [] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice_13_0 #base_model-facebook/wav2vec2-xls-r-300m #license-apache-2.0 #model-index #endpoints_compatible #region-us
| wav2vec2-large-xls-r-300m-hi-colab
==================================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice\_13\_0 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.7522
* Wer: 1.0223
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 2
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 16
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 20
### Training results
### Framework versions
* Transformers 4.32.1
* Pytorch 2.2.0+cu121
* Datasets 2.12.0
* Tokenizers 0.13.2
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 20",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.32.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.12.0\n* Tokenizers 0.13.2"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice_13_0 #base_model-facebook/wav2vec2-xls-r-300m #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 20",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.32.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.12.0\n* Tokenizers 0.13.2"
] | [
87,
143,
4,
33
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice_13_0 #base_model-facebook/wav2vec2-xls-r-300m #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 20### Training results### Framework versions\n\n\n* Transformers 4.32.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.12.0\n* Tokenizers 0.13.2"
] | [
-0.12548525631427765,
0.1710469126701355,
-0.0031393009703606367,
0.05595358833670616,
0.10282581299543381,
0.020788289606571198,
0.0895906463265419,
0.14333748817443848,
-0.055250559002161026,
0.11875218152999878,
0.13116894662380219,
0.10216382890939713,
0.07480146735906601,
0.16070115566253662,
-0.005211059004068375,
-0.3137802183628082,
0.006545325741171837,
-0.00960487313568592,
-0.10418589413166046,
0.12028374522924423,
0.07039150595664978,
-0.11170508712530136,
0.044529784470796585,
0.003781587118282914,
-0.12183249741792679,
-0.01648830622434616,
-0.044469576328992844,
-0.06558839231729507,
0.11329712718725204,
0.028307275846600533,
0.03795752674341202,
0.041270241141319275,
0.10382512956857681,
-0.2686234712600708,
0.008354277350008488,
0.036447878926992416,
0.025499965995550156,
0.08395230770111084,
0.08653401583433151,
-0.015059839934110641,
0.1435498595237732,
-0.08815757930278778,
0.06224994733929634,
0.05013172701001167,
-0.08105006814002991,
-0.2511618733406067,
-0.06620771437883377,
0.08329855650663376,
0.13467080891132355,
0.09857182204723358,
-0.03753482550382614,
0.04556148126721382,
-0.0879034623503685,
0.07874498516321182,
0.26390841603279114,
-0.26139456033706665,
-0.06181996315717697,
-0.027663128450512886,
0.0195197481662035,
0.036085475236177444,
-0.11219149827957153,
-0.018254222348332405,
0.029119275510311127,
0.01567872241139412,
0.09474921226501465,
0.01272861659526825,
0.07426666468381882,
0.005510583054274321,
-0.15592756867408752,
-0.04779091104865074,
0.1326730102300644,
0.11029255390167236,
-0.023351170122623444,
-0.11634472012519836,
-0.03085486777126789,
-0.20396380126476288,
-0.0371575802564621,
-0.01747281290590763,
0.025823652744293213,
-0.06099633499979973,
-0.11075614392757416,
0.013479169458150864,
-0.05581962689757347,
-0.09248434752225876,
0.0409625880420208,
0.1490885615348816,
0.05213923379778862,
-0.03884639963507652,
0.03469129651784897,
0.11086884140968323,
0.07332582771778107,
-0.14213058352470398,
0.0024503415916115046,
0.045022133737802505,
-0.0828578919172287,
0.007325120735913515,
-0.021223418414592743,
0.04194127768278122,
0.02293592318892479,
0.14956632256507874,
0.010187020525336266,
0.07242433726787567,
0.07427133619785309,
0.020688744261860847,
-0.07723468542098999,
0.16174185276031494,
-0.08429112285375595,
-0.10018667578697205,
-0.05336090177297592,
0.1176116019487381,
0.001663515344262123,
-0.013494499027729034,
-0.08240056037902832,
0.015271755866706371,
0.10796689987182617,
0.04578487202525139,
-0.004985119681805372,
0.014221438206732273,
-0.06605849415063858,
-0.03931083530187607,
0.0000306275105685927,
-0.10372089594602585,
0.028975293040275574,
0.04371807724237442,
-0.06602459400892258,
0.02391239069402218,
0.006365395616739988,
0.013773000799119473,
-0.022848675027489662,
0.11296074092388153,
-0.06915979832410812,
-0.026933668181300163,
-0.04108027368783951,
-0.07239628583192825,
0.018359316512942314,
-0.08873742818832397,
0.006006266456097364,
-0.06971178948879242,
-0.07042092829942703,
-0.05653998628258705,
0.04544223099946976,
-0.05135464295744896,
-0.11015310883522034,
-0.1085660383105278,
-0.07010173797607422,
0.04009424149990082,
-0.021712712943553925,
0.12997792661190033,
-0.049090348184108734,
0.10979125648736954,
0.01203238870948553,
0.08549828082323074,
0.08386627584695816,
0.0710315853357315,
-0.030812377110123634,
0.03239753097295761,
-0.12863875925540924,
0.09080800414085388,
-0.07227608561515808,
0.026746626943349838,
-0.14687417447566986,
-0.11642944067716599,
-0.018084434792399406,
0.0019814898259937763,
0.08416856825351715,
0.1406947523355484,
-0.18110424280166626,
-0.12426790595054626,
0.18045035004615784,
-0.05214144289493561,
-0.09081552177667618,
0.13912717998027802,
-0.004038718529045582,
-0.031105320900678635,
0.02824302949011326,
0.19677624106407166,
0.07447411864995956,
-0.07455749809741974,
-0.006365245208144188,
-0.02610699273645878,
0.10559497028589249,
0.011308707296848297,
0.105547696352005,
-0.06950884312391281,
0.024414170533418655,
0.005502799060195684,
-0.06037023663520813,
0.05290600657463074,
-0.09563365578651428,
-0.09431489557027817,
-0.0038395668379962444,
-0.09308850765228271,
0.042187970131635666,
0.036507051438093185,
0.027070660144090652,
-0.07618953287601471,
-0.12508699297904968,
-0.02712712623178959,
0.11511297523975372,
-0.10627543926239014,
0.028100619092583656,
-0.059281397610902786,
0.04871546849608421,
-0.008176863193511963,
-0.00042204619967378676,
-0.1294945627450943,
0.029121033847332,
0.03346190229058266,
-0.051142483949661255,
0.026962831616401672,
-0.009788875468075275,
0.06286296248435974,
0.04753006249666214,
-0.04331682249903679,
-0.08491648733615875,
-0.054340653121471405,
-0.00521164620295167,
-0.07457849383354187,
-0.2405044138431549,
-0.07308933883905411,
-0.016104402020573616,
0.16975225508213043,
-0.20120835304260254,
0.006099769379943609,
0.001574341207742691,
0.1186806857585907,
0.014775418676435947,
-0.0610479973256588,
0.001663997769355774,
0.05402213707566261,
-0.010622229427099228,
-0.0673520565032959,
0.032190416008234024,
-0.010293589904904366,
-0.13419517874717712,
-0.028102146461606026,
-0.1119130328297615,
0.11634044349193573,
0.10340774059295654,
0.025125540792942047,
-0.07931198179721832,
-0.04290187358856201,
-0.06886699795722961,
-0.04542914777994156,
-0.01831241324543953,
0.006736241281032562,
0.19459256529808044,
0.02455577254295349,
0.09823375195264816,
-0.07258643209934235,
-0.04375774785876274,
0.04054027795791626,
0.006086076609790325,
-0.021336689591407776,
0.1652253270149231,
0.10618903487920761,
-0.02548954263329506,
0.1018390953540802,
0.07484867423772812,
-0.0556754432618618,
0.1413259208202362,
-0.04354800656437874,
-0.09672032296657562,
-0.03714638203382492,
0.014974337071180344,
0.010576524771749973,
0.10113006085157394,
-0.16126315295696259,
-0.020390309393405914,
0.022348685190081596,
0.011994196102023125,
0.018010415136814117,
-0.18236112594604492,
-0.021607061848044395,
0.05288092419505119,
-0.0751037448644638,
-0.022100383415818214,
-0.012460235506296158,
-0.025859611108899117,
0.09405660629272461,
0.025098208338022232,
-0.07134199887514114,
-0.02067003957927227,
-0.019623802974820137,
-0.10498058795928955,
0.189874067902565,
-0.08685825020074844,
-0.15226297080516815,
-0.11019594967365265,
0.015129894018173218,
-0.023061275482177734,
-0.005819311365485191,
0.04807858541607857,
-0.12754113972187042,
-0.02125699259340763,
-0.058855362236499786,
0.04684002697467804,
-0.06465845555067062,
0.05575159937143326,
0.03340607509016991,
-0.005860944278538227,
0.04987870529294014,
-0.08512430638074875,
0.01402713730931282,
-0.04470590502023697,
-0.019998542964458466,
0.01099539827555418,
0.03732121363282204,
0.11365071684122086,
0.1621144562959671,
0.05429631844162941,
0.0411740280687809,
-0.03199971839785576,
0.18797510862350464,
-0.11268696188926697,
-0.0051657212898135185,
0.09648611396551132,
0.011872847564518452,
0.039831094443798065,
0.12617643177509308,
0.04074976220726967,
-0.0840492993593216,
0.018140681087970734,
0.031400252133607864,
-0.012368395924568176,
-0.2188260704278946,
-0.04072137922048569,
-0.053109053522348404,
-0.0033955289982259274,
0.12938645482063293,
0.02936827577650547,
-0.016361678019165993,
0.03555617853999138,
-0.007419269066303968,
-0.01492786593735218,
-0.011303343810141087,
0.053610581904649734,
0.04940079525113106,
0.03388360142707825,
0.11268386244773865,
-0.0063904765993356705,
-0.040282465517520905,
0.028823811560869217,
-0.02121715247631073,
0.2264060229063034,
0.00953672919422388,
0.14817027747631073,
0.04503161460161209,
0.177651509642601,
-0.0020241353195160627,
0.06358238309621811,
0.027655210345983505,
-0.01992793008685112,
0.00926513597369194,
-0.05107557401061058,
-0.051073990762233734,
0.052174586802721024,
0.137205570936203,
0.04062763229012489,
-0.13762740790843964,
0.03544805198907852,
0.020180856809020042,
0.33687523007392883,
0.07905761897563934,
-0.2809290289878845,
-0.08430048078298569,
-0.0027111528906971216,
-0.07574283331632614,
-0.012982352636754513,
0.04488343000411987,
0.10991159081459045,
-0.10769948363304138,
0.055905524641275406,
-0.04820064455270767,
0.07353024929761887,
-0.09132920950651169,
0.011064932681620121,
0.036939963698387146,
0.08487284928560257,
0.003684242023155093,
0.06307078152894974,
-0.2541131377220154,
0.2982284426689148,
-0.019205156713724136,
0.060061875730752945,
-0.046953391283750534,
0.023746958002448082,
0.01412068959325552,
-0.05453465133905411,
0.12515754997730255,
-0.003039474133402109,
-0.07683230191469193,
-0.16660724580287933,
-0.09684810787439346,
0.027784893289208412,
0.13196197152137756,
-0.11471328884363174,
0.1090724840760231,
-0.01725935749709606,
-0.03657755255699158,
0.04668806865811348,
-0.03869245946407318,
-0.07807044684886932,
-0.11788351833820343,
0.014262800104916096,
-0.005370642989873886,
0.05393805354833603,
-0.07650566101074219,
-0.11248330771923065,
-0.08889032155275345,
0.1418968290090561,
-0.14130550622940063,
-0.021836042404174805,
-0.1278916448354721,
0.06858065724372864,
0.16087090969085693,
-0.09517642855644226,
0.05216655880212784,
0.0335211381316185,
0.10467374324798584,
0.0011386907426640391,
-0.022934256121516228,
0.10865799337625504,
-0.09165883809328079,
-0.2120990753173828,
-0.046894051134586334,
0.19331879913806915,
0.054489750415086746,
0.07927391678094864,
-0.027722127735614777,
0.014571222476661205,
-0.015470843762159348,
-0.08643024414777756,
0.10304722189903259,
0.05334945768117905,
-0.0030608472879976034,
0.0357721783220768,
-0.027521206066012383,
0.017885373905301094,
-0.08882328867912292,
-0.046041909605264664,
0.16373904049396515,
0.29791393876075745,
-0.09650090336799622,
0.06045481935143471,
0.08042477071285248,
-0.03863120079040527,
-0.14334866404533386,
-0.023077091202139854,
0.125094473361969,
0.042817097157239914,
-0.008276134729385376,
-0.23418353497982025,
0.06006910279393196,
0.07669270038604736,
-0.012190702371299267,
0.05371449515223503,
-0.31542834639549255,
-0.12811227142810822,
0.12614324688911438,
0.07675568759441376,
0.005430123303085566,
-0.12578509747982025,
-0.06549841910600662,
-0.012289569713175297,
-0.07810626924037933,
0.04394346475601196,
0.03068855218589306,
0.12702590227127075,
-0.014196978881955147,
0.036633048206567764,
0.021235695108771324,
-0.04736902192234993,
0.12357146292924881,
0.03528236970305443,
0.029175348579883575,
-0.0027146353386342525,
-0.003958872985094786,
-0.08817712962627411,
-0.051665786653757095,
0.03051290474832058,
-0.09655681252479553,
0.03324500098824501,
-0.10828281193971634,
-0.04335549846291542,
-0.09493352472782135,
0.01881926879286766,
-0.039457403123378754,
-0.04934117570519447,
-0.024931486696004868,
0.02985486574470997,
0.088834248483181,
0.014173929579555988,
0.06672784686088562,
-0.03671717643737793,
0.11952019482851028,
0.08704374730587006,
0.09878295660018921,
-0.010122357867658138,
-0.11250452697277069,
-0.03063277155160904,
-0.018542159348726273,
0.04734989255666733,
-0.0914720967411995,
0.005575427785515785,
0.1464497596025467,
0.060509826987981796,
0.14799441397190094,
0.05376432463526726,
-0.08523936569690704,
-0.007212088909000158,
0.051953673362731934,
-0.099969282746315,
-0.16622322797775269,
-0.03355535864830017,
-0.04240228980779648,
-0.14901070296764374,
0.020274970680475235,
0.08702806383371353,
-0.04593783989548683,
-0.015400505624711514,
-0.008505684323608875,
0.034856051206588745,
-0.0403350330889225,
0.21628673374652863,
0.0543053075671196,
0.09050069749355316,
-0.10406744480133057,
0.07318289577960968,
0.02561230957508087,
-0.12379391491413116,
0.061781078577041626,
0.07420187443494797,
-0.06216926872730255,
-0.008296728134155273,
0.030369805172085762,
0.06320056319236755,
0.00538161676377058,
-0.02391074225306511,
-0.12025313824415207,
-0.1415131539106369,
0.09798728674650192,
0.07303154468536377,
0.04145248979330063,
0.011019532568752766,
-0.052541717886924744,
0.0434732586145401,
-0.10952471196651459,
0.09155602753162384,
0.09158239513635635,
0.05627373233437538,
-0.13476867973804474,
0.13984042406082153,
0.00918494164943695,
0.020364413037896156,
0.002585334936156869,
0.005926808807998896,
-0.11652104556560516,
0.0161164291203022,
-0.031988367438316345,
-0.051043666899204254,
-0.0688658356666565,
-0.007084673270583153,
0.005492765922099352,
-0.05744076147675514,
-0.04298843443393707,
0.016072342172265053,
-0.10991554707288742,
-0.056759994477033615,
-0.010104243643581867,
0.06487832963466644,
-0.10446151345968246,
-0.001975481631234288,
0.03387793153524399,
-0.11899373680353165,
0.10591045022010803,
0.06208090856671333,
0.02487213723361492,
0.030749551951885223,
-0.08611676841974258,
-0.0003911792882718146,
0.043209318071603775,
-0.002119567943736911,
0.034591760486364365,
-0.16962863504886627,
-0.011052416637539864,
-0.039039116352796555,
0.02047107182443142,
-0.0015559865860268474,
0.027162590995430946,
-0.13008029758930206,
-0.0021475618705153465,
-0.05773557350039482,
-0.05402318760752678,
-0.07159975171089172,
0.04912671446800232,
0.07887965440750122,
0.009730987250804901,
0.15575562417507172,
-0.07772368937730789,
0.06476716697216034,
-0.21828672289848328,
0.0010416775476187468,
-0.02958313561975956,
-0.05047523230314255,
-0.0578928217291832,
-0.026653533801436424,
0.09158825129270554,
-0.052441682666540146,
0.06954797357320786,
-0.0650777667760849,
0.01805950701236725,
0.018302274867892265,
-0.09318360686302185,
0.02522273361682892,
0.036262836307287216,
0.20202356576919556,
0.056317850947380066,
-0.038443081080913544,
0.058401793241500854,
0.008404208347201347,
0.06998572498559952,
0.1181754469871521,
0.16050675511360168,
0.14299564063549042,
0.031527671962976456,
0.0955057442188263,
0.07798551023006439,
-0.12839792668819427,
-0.1614081859588623,
0.11974895000457764,
-0.060128238052129745,
0.12401756644248962,
0.007030653767287731,
0.21334858238697052,
0.1075497418642044,
-0.196354940533638,
0.04293002560734749,
-0.0024380343966186047,
-0.08707750588655472,
-0.11367029696702957,
-0.0856127142906189,
-0.08639683574438095,
-0.17378777265548706,
0.02139374241232872,
-0.1064460501074791,
0.03411988914012909,
0.0441560260951519,
0.036192480474710464,
0.02895454503595829,
0.11640722304582596,
0.06134150177240372,
-0.0024491401854902506,
0.11545983701944351,
0.03065125271677971,
-0.04411347955465317,
-0.05770324543118477,
-0.0864153653383255,
0.0458371527493,
-0.030374327674508095,
0.06367778778076172,
-0.06974676251411438,
-0.1352807581424713,
0.07310574501752853,
0.024495506659150124,
-0.10003415495157242,
0.019219662994146347,
-0.030303459614515305,
0.07053515315055847,
0.09427163749933243,
0.030976565554738045,
-0.02999047003686428,
-0.024113312363624573,
0.2092059999704361,
-0.10586758702993393,
-0.028034545481204987,
-0.11738314479589462,
0.22466698288917542,
0.027675723657011986,
0.005015946459025145,
0.027205215767025948,
-0.07136528193950653,
-0.020301228389143944,
0.17584413290023804,
0.1746561974287033,
-0.021583015099167824,
-0.02072276547551155,
0.03386598080396652,
-0.0052924808114767075,
-0.025137098506093025,
0.062260664999485016,
0.14106495678424835,
0.0804453045129776,
-0.04435616359114647,
-0.009437757544219494,
-0.04697570577263832,
-0.07233494520187378,
-0.01148369163274765,
0.08922344446182251,
0.04922645539045334,
-0.01627189852297306,
-0.023880338296294212,
0.10010814666748047,
-0.0794500783085823,
-0.13331985473632812,
0.016648631542921066,
-0.1942412555217743,
-0.18582075834274292,
-0.029801011085510254,
0.03905806690454483,
0.059339798986911774,
0.05982634425163269,
0.0020847227424383163,
-0.03431665897369385,
0.13624663650989532,
0.018920067697763443,
-0.08621659129858017,
-0.09592133015394211,
0.06548735499382019,
-0.07054241001605988,
0.17531321942806244,
-0.0372297465801239,
0.02186485007405281,
0.10882031172513962,
0.0833224356174469,
-0.06971900910139084,
0.05107225850224495,
0.07728475332260132,
-0.10550279915332794,
0.05897855758666992,
0.18618229031562805,
-0.03414689376950264,
0.1394965797662735,
0.052819717675447464,
-0.11710284650325775,
0.0015923130558803678,
-0.0897347554564476,
-0.04801929369568825,
-0.0608002245426178,
0.016851350665092468,
-0.04382715001702309,
0.13497842848300934,
0.19894516468048096,
-0.07438643276691437,
-0.028480730950832367,
-0.04176769405603409,
0.024173026904463768,
0.044017139822244644,
0.12992726266384125,
-0.023186206817626953,
-0.25334814190864563,
0.013749103993177414,
-0.0580732524394989,
0.009289135225117207,
-0.2289610356092453,
-0.0962304100394249,
0.01726502925157547,
-0.05854904279112816,
-0.07677207887172699,
0.09712758660316467,
0.042507000267505646,
0.046838339418172836,
-0.05383828282356262,
-0.056649480015039444,
-0.036030322313308716,
0.18145710229873657,
-0.19826273620128632,
-0.06404505670070648
] |
null | null | transformers |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
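While the official snippet is still marked as needed, the following is a minimal sketch of how a Donut-style VQA fine-tune is typically loaded through `transformers`, assuming this checkpoint follows the standard Donut / VisionEncoderDecoder layout; the document image path and the task prompt below are hypothetical placeholders rather than values documented for this model.

```python
import torch
from PIL import Image
from transformers import DonutProcessor, VisionEncoderDecoderModel

# Assumption: the repository ships both processor and model weights in the
# standard Donut / VisionEncoderDecoder format.
repo_id = "Gopal2002/donut_finetune_vqa"
processor = DonutProcessor.from_pretrained(repo_id)
model = VisionEncoderDecoderModel.from_pretrained(repo_id)

image = Image.open("document.png").convert("RGB")  # hypothetical input image
pixel_values = processor(image, return_tensors="pt").pixel_values

# Hypothetical DocVQA-style prompt; replace with the prompt format used during fine-tuning.
task_prompt = "<s_docvqa><s_question>What is the invoice total?</s_question><s_answer>"
decoder_input_ids = processor.tokenizer(
    task_prompt, add_special_tokens=False, return_tensors="pt"
).input_ids

with torch.no_grad():
    generated = model.generate(
        pixel_values,
        decoder_input_ids=decoder_input_ids,
        max_length=512,
    )

print(processor.batch_decode(generated, skip_special_tokens=True)[0])
```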
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
| {"library_name": "transformers", "tags": []} | null | Gopal2002/donut_finetune_vqa | [
"transformers",
"safetensors",
"vision-encoder-decoder",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | 2024-02-15T07:38:56+00:00 | [
"1910.09700"
] | [] | TAGS
#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us
|
# Model Card for Model ID
## Model Details
### Model Description
This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.
- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:
### Model Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Downstream Use [optional]
### Out-of-Scope Use
## Bias, Risks, and Limitations
### Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
## Training Details
### Training Data
### Training Procedure
#### Preprocessing [optional]
#### Training Hyperparameters
- Training regime:
#### Speeds, Sizes, Times [optional]
## Evaluation
### Testing Data, Factors & Metrics
#### Testing Data
#### Factors
#### Metrics
### Results
#### Summary
## Model Examination [optional]
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:
## Technical Specifications [optional]
### Model Architecture and Objective
### Compute Infrastructure
#### Hardware
#### Software
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Model Card Authors [optional]
## Model Card Contact
| [
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
"TAGS\n#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us \n",
"# Model Card for Model ID",
"## Model Details",
"### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:",
"### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Downstream Use [optional]",
"### Out-of-Scope Use",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.",
"## How to Get Started with the Model\n\nUse the code below to get started with the model.",
"## Training Details",
"### Training Data",
"### Training Procedure",
"#### Preprocessing [optional]",
"#### Training Hyperparameters\n\n- Training regime:",
"#### Speeds, Sizes, Times [optional]",
"## Evaluation",
"### Testing Data, Factors & Metrics",
"#### Testing Data",
"#### Factors",
"#### Metrics",
"### Results",
"#### Summary",
"## Model Examination [optional]",
"## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:",
"## Technical Specifications [optional]",
"### Model Architecture and Objective",
"### Compute Infrastructure",
"#### Hardware",
"#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Model Card Authors [optional]",
"## Model Card Contact"
] | [
39,
6,
3,
82,
28,
3,
4,
9,
9,
10,
42,
20,
3,
4,
5,
9,
11,
13,
3,
12,
5,
4,
5,
3,
4,
9,
53,
9,
8,
6,
3,
14,
8,
7,
9,
4
] | [
"passage: TAGS\n#transformers #safetensors #vision-encoder-decoder #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact"
] | [
-0.05306316912174225,
0.20225799083709717,
-0.004532716237008572,
0.02486577071249485,
0.10745619982481003,
0.005829503294080496,
0.06235508993268013,
0.11292294412851334,
-0.0020419019274413586,
0.12131396681070328,
0.02719549834728241,
0.07967612892389297,
0.12327883392572403,
0.1561002880334854,
0.003298027440905571,
-0.2451947182416916,
0.056976594030857086,
-0.09095743298530579,
0.005862638354301453,
0.11416777968406677,
0.1328127682209015,
-0.10651655495166779,
0.09236611425876617,
-0.006378492806106806,
-0.018223589286208153,
-0.01560811698436737,
-0.07464081048965454,
-0.06576249748468399,
0.05573050677776337,
0.07688132673501968,
0.0721992626786232,
0.009551521390676498,
0.07658318430185318,
-0.2803225517272949,
0.014262731187045574,
0.08472383767366409,
0.001679662149399519,
0.06746198982000351,
0.08127366751432419,
-0.06743600219488144,
0.1253480762243271,
-0.072596974670887,
0.1388878971338272,
0.07806979864835739,
-0.09000035375356674,
-0.19567735493183136,
-0.06588542461395264,
0.06967565417289734,
0.13176710903644562,
0.05219413340091705,
-0.02802334912121296,
0.13226838409900665,
-0.09170275926589966,
0.009603562764823437,
0.11835573613643646,
-0.0669531598687172,
-0.05480296164751053,
0.036625828593969345,
0.09896823018789291,
0.08901896327733994,
-0.11835762113332748,
-0.005250310990959406,
0.02971211075782776,
0.02317153476178646,
0.08707734197378159,
0.01639537699520588,
0.14948242902755737,
0.03726613521575928,
-0.14019928872585297,
-0.05744593217968941,
0.0939771831035614,
0.03824683278799057,
-0.04958764836192131,
-0.2343815118074417,
-0.031156767159700394,
-0.016557393595576286,
-0.0305255725979805,
-0.04024772346019745,
0.05501044541597366,
-0.03525097668170929,
0.0778101459145546,
-0.00980226881802082,
-0.08041384816169739,
-0.028304021805524826,
0.04802321270108223,
0.06495635211467743,
0.0185654629021883,
-0.0052207582630217075,
0.02111206203699112,
0.11669638752937317,
0.0788947269320488,
-0.13250376284122467,
-0.07449234277009964,
-0.07420225441455841,
-0.098720021545887,
-0.04206917807459831,
0.035802435129880905,
0.07063398510217667,
0.03907659277319908,
0.1947319507598877,
-0.017316637560725212,
0.0492829903960228,
0.04488114267587662,
0.006694390904158354,
0.06837808340787888,
0.11208193004131317,
-0.06981375813484192,
-0.1344674974679947,
-0.058068543672561646,
0.0876275971531868,
-0.004179352894425392,
-0.03578972443938255,
-0.050102028995752335,
0.04014993831515312,
0.030947856605052948,
0.11303774267435074,
0.08247632533311844,
0.00018158231978304684,
-0.06263403594493866,
-0.042954958975315094,
0.22251743078231812,
-0.14639760553836823,
0.04202887415885925,
0.005975429899990559,
-0.04316512867808342,
-0.004425262566655874,
0.010939684696495533,
0.012188587337732315,
-0.037748441100120544,
0.10074765980243683,
-0.07618969678878784,
-0.034445326775312424,
-0.11386469751596451,
-0.0645400807261467,
0.02648116648197174,
0.004736251663416624,
-0.021350713446736336,
-0.043291062116622925,
-0.11405764520168304,
-0.04943132400512695,
0.07177776843309402,
-0.07382809370756149,
-0.05491260066628456,
0.011223746463656425,
-0.053137388080358505,
0.004582186229526997,
0.00267049134708941,
0.1107012927532196,
-0.032886769622564316,
0.02474004216492176,
-0.046873610466718674,
0.06870805472135544,
0.10572899132966995,
0.037948183715343475,
-0.08262749016284943,
0.07297130674123764,
-0.23081135749816895,
0.10601279884576797,
-0.0860590934753418,
0.020004073157906532,
-0.14292661845684052,
-0.04098188504576683,
0.028510602191090584,
0.027961768209934235,
-0.010129709728062153,
0.1261758655309677,
-0.2058180272579193,
-0.030243845656514168,
0.14840680360794067,
-0.11589378863573074,
-0.09422067552804947,
0.06595603376626968,
-0.05322439596056938,
0.10703813284635544,
0.04694817587733269,
-0.023595338687300682,
0.07076855003833771,
-0.1317324936389923,
-0.04631568118929863,
-0.021216953173279762,
-0.014257868751883507,
0.1454293578863144,
0.06817390769720078,
-0.05359628424048424,
0.07706184685230255,
0.021635930985212326,
-0.037367887794971466,
-0.03435307368636131,
-0.03299751877784729,
-0.0921449214220047,
0.006188652943819761,
-0.06746368855237961,
0.029415255412459373,
-0.021083641797304153,
-0.09160202741622925,
-0.0303809754550457,
-0.1750536859035492,
0.03627307340502739,
0.08193549513816833,
0.006318137980997562,
-0.0197709072381258,
-0.09286735206842422,
0.01832297258079052,
-0.012543086893856525,
-0.01892191916704178,
-0.16171851754188538,
-0.04700363799929619,
0.04258821904659271,
-0.20086339116096497,
0.01967989094555378,
-0.03661181032657623,
0.04892909526824951,
0.03480081260204315,
-0.04064054414629936,
-0.008631768636405468,
0.003335281042382121,
0.015725387260317802,
-0.024876078590750694,
-0.20007483661174774,
-0.030333993956446648,
-0.02597803995013237,
0.13431528210639954,
-0.22246739268302917,
0.02835940755903721,
0.08417746424674988,
0.1434013694524765,
0.0016660679830238223,
-0.04368278756737709,
0.014076939783990383,
-0.054847393184900284,
-0.05226233974099159,
-0.06906641274690628,
-0.006392029579728842,
-0.03333830088376999,
-0.03956965357065201,
0.07096927613019943,
-0.20027990639209747,
-0.04262993112206459,
0.10826600342988968,
0.09837738424539566,
-0.14323553442955017,
-0.02474437840282917,
-0.04144697263836861,
-0.061835117638111115,
-0.09049158543348312,
-0.06381296366453171,
0.1425853669643402,
0.04930912330746651,
0.05233171954751015,
-0.08628004789352417,
-0.06071622669696808,
0.010883725248277187,
-0.00011982241994701326,
-0.04022165387868881,
0.08589175343513489,
0.08529563993215561,
-0.1104804202914238,
0.09131632000207901,
0.08476598560810089,
0.06810380518436432,
0.10764316469430923,
0.0015695258043706417,
-0.10674308240413666,
-0.02863943576812744,
0.007197246421128511,
0.014064288698136806,
0.14327655732631683,
-0.04052828997373581,
0.048501238226890564,
0.05552942305803299,
-0.026791011914610863,
0.01846246048808098,
-0.1073673665523529,
0.03181065618991852,
0.047323208302259445,
-0.009707847610116005,
0.022087840363383293,
-0.03469701483845711,
0.029233038425445557,
0.08711127936840057,
0.035028502345085144,
0.029371140524744987,
0.006271174643188715,
-0.035827167332172394,
-0.10475867241621017,
0.17433254420757294,
-0.0890800803899765,
-0.2983497679233551,
-0.1401464194059372,
-0.0031294787768274546,
0.04853705316781998,
-0.02189650572836399,
0.011501757428050041,
-0.04715869575738907,
-0.11619791388511658,
-0.10501103848218918,
0.008872180245816708,
0.04236939176917076,
-0.07721052318811417,
-0.0677536204457283,
0.049645159393548965,
0.03511786088347435,
-0.1394437551498413,
0.02193400263786316,
0.04976930096745491,
-0.03652774170041084,
-0.014805521816015244,
0.07396114617586136,
0.1029050275683403,
0.17309242486953735,
-0.006633569020777941,
-0.017617538571357727,
0.023926623165607452,
0.24285922944545746,
-0.14599764347076416,
0.10973547399044037,
0.15899382531642914,
-0.06477009505033493,
0.10309092700481415,
0.19795724749565125,
0.023503681644797325,
-0.07609159499406815,
0.03380175679922104,
0.03942976891994476,
-0.05477331578731537,
-0.22650650143623352,
-0.06247439235448837,
-0.0018889455823227763,
-0.07084330171346664,
0.0898994579911232,
0.09043484926223755,
0.10881193727254868,
0.04595636948943138,
-0.08826620131731033,
-0.06884618103504181,
0.018320003524422646,
0.10980518162250519,
-0.021245790645480156,
0.0068580652587115765,
0.08790436387062073,
-0.04739201068878174,
-0.004643888212740421,
0.10682045668363571,
0.012494737282395363,
0.19045358896255493,
0.026300430297851562,
0.1511247158050537,
0.0705496221780777,
0.02818082831799984,
0.028634218499064445,
0.01902483031153679,
0.02658020332455635,
0.008844421245157719,
-0.017161816358566284,
-0.08914987742900848,
0.024366190657019615,
0.1353050321340561,
0.07279488444328308,
0.03349636495113373,
0.02151625230908394,
-0.03378571942448616,
0.06302186101675034,
0.16916872560977936,
0.010774882510304451,
-0.22069227695465088,
-0.038822758942842484,
0.08901690691709518,
-0.07502853125333786,
-0.12661625444889069,
-0.02531552128493786,
0.041314706206321716,
-0.17944134771823883,
0.04614526778459549,
-0.01675923727452755,
0.11310327053070068,
-0.12792818248271942,
-0.027826640754938126,
0.0399215929210186,
0.08714547008275986,
-0.030904775485396385,
0.07850778102874756,
-0.16933031380176544,
0.11483496427536011,
0.012726574204862118,
0.06142592057585716,
-0.11495199054479599,
0.09937658905982971,
0.012870636768639088,
-0.004057616461068392,
0.166650652885437,
-0.0004076457116752863,
-0.07127676159143448,
-0.06420327723026276,
-0.0747542679309845,
-0.021446911618113518,
0.09447984397411346,
-0.11176029592752457,
0.08143189549446106,
-0.014278407208621502,
-0.038768354803323746,
0.002094640163704753,
-0.1095178872346878,
-0.12445959448814392,
-0.19356580078601837,
0.06069660931825638,
-0.10815826058387756,
0.003831093432381749,
-0.09925412386655807,
-0.05358343943953514,
-0.04677683115005493,
0.20124182105064392,
-0.14483362436294556,
-0.09758627414703369,
-0.15289589762687683,
-0.09554614871740341,
0.1664566993713379,
-0.04663718119263649,
0.08821606636047363,
-0.0038687095511704683,
0.22914768755435944,
0.006724040023982525,
-0.012434800155460835,
0.07520399242639542,
-0.08637748658657074,
-0.17783690989017487,
-0.07604682445526123,
0.12165174633264542,
0.12136691808700562,
0.04747939854860306,
-0.012795967981219292,
0.021366307511925697,
-0.03262670710682869,
-0.1143370196223259,
0.008061125874519348,
0.12403146922588348,
0.05908475071191788,
0.04311663657426834,
0.00427021412178874,
-0.11007828265428543,
-0.07147429138422012,
-0.03523124009370804,
0.021944832056760788,
0.18807166814804077,
-0.08221416175365448,
0.15053938329219818,
0.12907743453979492,
-0.05370299890637398,
-0.21416690945625305,
0.03436880186200142,
0.04090806469321251,
0.00472818361595273,
0.053175248205661774,
-0.1770489513874054,
0.07840683311223984,
0.024572648108005524,
-0.05139078199863434,
0.15225622057914734,
-0.1677834391593933,
-0.1536329686641693,
0.0777980238199234,
0.05417194589972496,
-0.21989625692367554,
-0.12022100389003754,
-0.08520790934562683,
-0.06769770383834839,
-0.14243635535240173,
0.08412115275859833,
0.020698823034763336,
0.00019321028958074749,
0.04856256768107414,
0.034081753343343735,
0.019389502704143524,
-0.04823823645710945,
0.21962693333625793,
-0.009151222184300423,
0.036160267889499664,
-0.07677234709262848,
-0.09645134955644608,
0.07220485061407089,
-0.054490040987730026,
0.08667207509279251,
-0.024035243317484856,
0.007558862213045359,
-0.07708337903022766,
-0.05522387474775314,
-0.051626816391944885,
0.029731709510087967,
-0.07851403951644897,
-0.1059325784444809,
-0.0698300376534462,
0.09317977726459503,
0.09290435165166855,
-0.03390717878937721,
-0.03715480491518974,
-0.09010928869247437,
0.027843475341796875,
0.20326566696166992,
0.1706404834985733,
0.05219545215368271,
-0.10166811943054199,
0.0006855534156784415,
-0.017444688826799393,
0.04213650897145271,
-0.21454355120658875,
0.04805922508239746,
0.046434495598077774,
0.023125024512410164,
0.11990387737751007,
-0.016930051147937775,
-0.16412340104579926,
-0.04671873524785042,
0.056455835700035095,
-0.036497533321380615,
-0.20693081617355347,
-0.012548294849693775,
0.052488550543785095,
-0.18057814240455627,
-0.0640311911702156,
0.01719793491065502,
-0.01455759722739458,
-0.024732910096645355,
0.013886804692447186,
0.06330747157335281,
0.027956031262874603,
0.09375539422035217,
0.05616210401058197,
0.10073219239711761,
-0.11451173573732376,
0.08603792637586594,
0.08968610316514969,
-0.09007474035024643,
0.0125290397554636,
0.0736992210149765,
-0.055889032781124115,
-0.02316245622932911,
0.020047122612595558,
0.06119868531823158,
-0.00162112049292773,
-0.061508238315582275,
-0.01975059136748314,
-0.11018196493387222,
0.06758071482181549,
0.13242226839065552,
0.04006093740463257,
-0.005321177653968334,
0.048064373433589935,
0.021657899022102356,
-0.0826604813337326,
0.11298491060733795,
0.025856876745820045,
0.038508445024490356,
-0.06642783433198929,
-0.023183433338999748,
0.045513976365327835,
0.008002778515219688,
-0.020539432764053345,
-0.02922571264207363,
-0.05385620519518852,
-0.011762911453843117,
-0.18990042805671692,
0.01677597686648369,
-0.0765504315495491,
0.004350012633949518,
0.014810539782047272,
-0.03814086318016052,
-0.020958559587597847,
0.016691848635673523,
-0.07982292026281357,
-0.05134430527687073,
-0.002734045498073101,
0.0985812395811081,
-0.13863937556743622,
0.007181720342487097,
0.08835478872060776,
-0.11878933757543564,
0.06740869581699371,
-0.024482958018779755,
-0.017330501228570938,
-0.0019446499645709991,
-0.12968114018440247,
0.04302144795656204,
0.002646980108693242,
0.01983785443007946,
0.042118169367313385,
-0.16895927488803864,
0.006406146101653576,
-0.04021742194890976,
-0.04932057484984398,
-0.01649690791964531,
-0.07501037418842316,
-0.11416008323431015,
0.11071296036243439,
0.0018048452911898494,
-0.07790763676166534,
-0.0117954695597291,
0.05156298726797104,
0.10936092585325241,
-0.037753306329250336,
0.1212289109826088,
0.0044145528227090836,
0.06474429368972778,
-0.18044467270374298,
-0.024778762832283974,
-0.015598369762301445,
0.006858370266854763,
0.02694357931613922,
-0.0140616400167346,
0.042949724942445755,
-0.013948258012533188,
0.2575705349445343,
-0.02087925560772419,
0.07440228760242462,
0.06519795209169388,
0.046770140528678894,
0.011650492437183857,
0.0869150459766388,
0.06678774207830429,
0.012648052535951138,
0.003416551509872079,
0.032283470034599304,
-0.031422972679138184,
-0.014915283769369125,
-0.1482047736644745,
0.07101717591285706,
0.1439564973115921,
0.08283385634422302,
0.012806775979697704,
0.06352550536394119,
-0.10048910230398178,
-0.10461166501045227,
0.08237126469612122,
-0.041638992726802826,
-0.0008007619762793183,
-0.058745238929986954,
0.1454916000366211,
0.15373669564723969,
-0.17184817790985107,
0.0812029168009758,
-0.0365472249686718,
-0.04692631587386131,
-0.110826775431633,
-0.16450464725494385,
-0.06572620570659637,
-0.02580863982439041,
-0.003703176509588957,
-0.05508402734994888,
0.06688077747821808,
0.11885157972574234,
-0.0016993435565382242,
-0.0014807049883529544,
0.10038914531469345,
-0.022431546822190285,
-0.02085503377020359,
0.03484790027141571,
0.04871930554509163,
0.036450520157814026,
-0.04567375034093857,
0.02037064917385578,
0.010481024160981178,
0.03885771334171295,
0.0584954097867012,
0.024686437100172043,
-0.03216709941625595,
0.015852995216846466,
-0.012883862480521202,
-0.10344603657722473,
0.023220296949148178,
-0.02661588042974472,
-0.07547900825738907,
0.12944786250591278,
0.03020857274532318,
0.016596315428614616,
-0.03272676467895508,
0.20282141864299774,
-0.07233016192913055,
-0.07228796184062958,
-0.14049077033996582,
0.11066530644893646,
-0.03675038740038872,
0.06309209764003754,
0.05746688321232796,
-0.11500853300094604,
-0.005017681512981653,
0.12902788817882538,
0.12829062342643738,
-0.03323068469762802,
0.0049867476336658,
0.026197774335741997,
0.00827173050493002,
-0.049032241106033325,
0.04763645678758621,
0.032188743352890015,
0.15468738973140717,
-0.07100655138492584,
0.07814180850982666,
-0.0005074164946563542,
-0.08921105414628983,
-0.03626048564910889,
0.1392250955104828,
0.0028979976195842028,
0.032514654099941254,
-0.06501611322164536,
0.1007646918296814,
-0.07232671231031418,
-0.22924686968326569,
0.04944430664181709,
-0.0780143067240715,
-0.1530541479587555,
-0.012732122093439102,
0.02754151076078415,
-0.013794543221592903,
0.02254127338528633,
0.061679136008024216,
-0.06457692384719849,
0.16001729667186737,
0.03501654788851738,
-0.08832933753728867,
-0.05492360517382622,
0.07016300410032272,
-0.1023297980427742,
0.29636111855506897,
0.011759842745959759,
0.032544128596782684,
0.10384590178728104,
-0.01977287232875824,
-0.1365358531475067,
0.029908113181591034,
0.09750298410654068,
-0.0958370491862297,
0.0666261613368988,
0.18576137721538544,
-0.01007252186536789,
0.10188479721546173,
0.07690271735191345,
-0.06373509019613266,
0.05595288425683975,
-0.08074036985635757,
-0.06501531600952148,
-0.0913926437497139,
0.0567726232111454,
-0.06534478068351746,
0.14360256493091583,
0.11906343698501587,
-0.03889768570661545,
-0.0050442758947610855,
-0.029765458777546883,
0.041717544198036194,
0.013438636437058449,
0.12334854900836945,
0.013119494542479515,
-0.1562117338180542,
0.029238002374768257,
0.003133314661681652,
0.10249229520559311,
-0.2173282951116562,
-0.08784149587154388,
0.04678759351372719,
-0.03227124735713005,
-0.05141454562544823,
0.1055324450135231,
0.06109399348497391,
0.05229108780622482,
-0.0470384918153286,
-0.05454113334417343,
-0.007119470275938511,
0.1505606472492218,
-0.11762657761573792,
-0.009603551588952541
] |
null | null | null | Introduction: this is an NSFW motion LoRA model (research on motion models using human bioart, trained with MotionDirector).
1. Shared under the CC0 Shared Model Agreement: Creative Commons Zero v1.0 Universal.
2. Any commercial or legal disputes arising from the use of this model are borne by yourself.
3. Installing this model implies that you accept these terms.
---
*Example results have been released on pixiv (https://www.pixiv.net/users/86329215) by some AI art creators (non-identity work).
**Motion LoRA model configuration
1. Trained at 1024x576, rank 256, 16 frames x 6, 1500 steps.
2. Six input videos, using folder training with 2 types of text config.
3. Training prompt: "a highly quality clear best 2.5D anime video of sex nsfw animation,close up in the style of anime and hyper-detailed renderings(Back view)"
4. Training model: aamAnyloraAnimeMixAnime_v1.safetensors; inference model: mistoonAnime_v20.safetensors
5. Additionally required parameter: use_offset_noise: true
6. learning_rate / learning_rate_spatial / adam_weight_decay: between 5x and 20x the original learning_rate, depending on your training set size.
7.Suitable for Animatediff workflow, weight recommended in between 0.5-1 | {"language": ["en"], "license": "cc0-1.0", "tags": ["not-for-all-audiences", "text-generation-inference", "biology", "art"], "pipeline_tag": "text-to-video"} | text-to-video | DREX-Institute/2.5D_ANIME_NSFW | [
"not-for-all-audiences",
"text-generation-inference",
"biology",
"art",
"text-to-video",
"en",
"license:cc0-1.0",
"region:us"
] | 2024-02-15T07:41:18+00:00 | [] | [
"en"
] | TAGS
#not-for-all-audiences #text-generation-inference #biology #art #text-to-video #en #license-cc0-1.0 #region-us
| Introduction: this is an NSFW motion LoRA model (research on motion models using human bioart, trained with MotionDirector).
1. Shared under the CC0 Shared Model Agreement: Creative Commons Zero v1.0 Universal.
2. Any commercial or legal disputes arising from the use of this model are borne by yourself.
3. Installing this model implies that you accept these terms.
---
*Example results have been released on pixiv (URL) by some AI art creators (non-identity work).
Motion LoRA model configuration
1. Trained at 1024x576, rank 256, 16 frames x 6, 1500 steps.
2. Six input videos, using folder training with 2 types of text config.
3. Training prompt: "a highly quality clear best 2.5D anime video of sex nsfw animation,close up in the style of anime and hyper-detailed renderings(Back view)"
4. Training model: aamAnyloraAnimeMixAnime_v1.safetensors; inference model: mistoonAnime_v20.safetensors
5. Additionally required parameter: use_offset_noise: true
6. learning_rate / learning_rate_spatial / adam_weight_decay: between 5x and 20x the original learning_rate, depending on your training set size.
7.Suitable for Animatediff workflow, weight recommended in between 0.5-1 | [] | [
"TAGS\n#not-for-all-audiences #text-generation-inference #biology #art #text-to-video #en #license-cc0-1.0 #region-us \n"
] | [
45
] | [
"passage: TAGS\n#not-for-all-audiences #text-generation-inference #biology #art #text-to-video #en #license-cc0-1.0 #region-us \n"
] | [
0.0643923357129097,
0.09864895790815353,
-0.007198333740234375,
-0.0038120849058032036,
0.006611126009374857,
0.04562016576528549,
0.21968594193458557,
0.17394262552261353,
0.13042087852954865,
-0.021635573357343674,
0.12194123864173889,
0.0472535714507103,
-0.004848950542509556,
-0.0269025806337595,
-0.05157622694969177,
-0.18036335706710815,
-0.019789641723036766,
0.09846853464841843,
0.1172012910246849,
0.03637557476758957,
0.034833114594221115,
-0.08975894749164581,
0.04793394356966019,
-0.07756839692592621,
-0.04302864521741867,
-0.07506097853183746,
0.05480998754501343,
0.0009372970089316368,
0.14555028080940247,
-0.05260549113154411,
0.04251432791352272,
0.08617495745420456,
0.03979303687810898,
-0.1944633424282074,
0.025623971596360207,
-0.005194520577788353,
-0.10816574096679688,
0.06591607630252838,
0.09199006110429764,
0.028438586741685867,
0.16213327646255493,
0.06276487559080124,
-0.059098318219184875,
0.09602053463459015,
-0.18568235635757446,
0.03716333210468292,
0.04913749918341637,
-0.0534515343606472,
-0.015497459098696709,
0.039675626903772354,
-0.0044394973665475845,
0.008518918417394161,
-0.044815510511398315,
0.023753562942147255,
0.15957427024841309,
-0.15675383806228638,
0.010048354975879192,
0.1321505457162857,
0.21081769466400146,
0.16357740759849548,
-0.2009592205286026,
0.16915443539619446,
0.02732277475297451,
-0.06355098634958267,
-0.036632224917411804,
-0.11328112334012985,
0.00878069270402193,
-0.07670754194259644,
-0.025786103680729866,
-0.010850651189684868,
0.2833906412124634,
0.06434234231710434,
0.025376055389642715,
0.031012222170829773,
-0.04574275016784668,
-0.08689701557159424,
-0.08744637668132782,
0.05787140876054764,
0.09392668306827545,
0.06271085143089294,
-0.0015934298280626535,
-0.07966125011444092,
-0.15469536185264587,
-0.02699667401611805,
-0.028658542782068253,
-0.05562111735343933,
-0.040897589176893234,
0.05262390896677971,
-0.1974351406097412,
-0.04463241621851921,
-0.06745540350675583,
-0.06608226150274277,
0.026511654257774353,
-0.07540639489889145,
0.10696591436862946,
-0.011095261201262474,
-0.015421153046190739,
-0.029701104387640953,
0.1727735549211502,
0.05827070772647858,
-0.015651635825634003,
0.000775094551499933,
-0.16537220776081085,
0.18945752084255219,
0.11059437692165375,
-0.18642520904541016,
0.03852327540516853,
0.12359800934791565,
0.07755480706691742,
-0.003581063589081168,
-0.0022802420426160097,
-0.013884237967431545,
-0.08912470191717148,
0.02315554767847061,
-0.0946795865893364,
0.052504412829875946,
0.07946085184812546,
-0.06880974769592285,
-0.09973951429128647,
0.06218724697828293,
-0.1013510599732399,
0.11646604537963867,
-0.05461808294057846,
0.07506782561540604,
0.08628728240728378,
0.11414899677038193,
-0.04646840691566467,
0.04966501146554947,
0.11395247280597687,
-0.020368298515677452,
-0.12210957705974579,
-0.0029701320454478264,
0.01804436184465885,
0.018864598125219345,
0.11211390048265457,
-0.013425083830952644,
0.06909115612506866,
-0.10420010983943939,
0.07143019884824753,
0.04233551397919655,
-0.02003711275756359,
0.004738556686788797,
0.036821041256189346,
0.04972236603498459,
0.054859891533851624,
-0.010091873817145824,
-0.03543254733085632,
-0.014162631705403328,
-0.1426590085029602,
0.14689376950263977,
-0.12918226420879364,
0.14227911829948425,
-0.09180128574371338,
-0.006464199163019657,
-0.05062217637896538,
0.03841426968574524,
0.009452992118895054,
-0.031551167368888855,
-0.1003262847661972,
0.015891771763563156,
0.03974669426679611,
-0.0010278815170750022,
-0.06363867223262787,
0.0076926411129534245,
-0.062284454703330994,
0.2690587639808655,
-0.12261099368333817,
0.020429493859410286,
0.20302237570285797,
0.0055464389733970165,
-0.1426841914653778,
0.13785900175571442,
0.021202152594923973,
0.07743489742279053,
0.0376768633723259,
0.3306642174720764,
-0.11364513635635376,
-0.22932365536689758,
0.014042270369827747,
0.23601244390010834,
-0.00969629641622305,
0.039654847234487534,
0.07938776165246964,
0.01855306141078472,
-0.013719645328819752,
-0.049514200538396835,
0.016750840470194817,
0.008345321752130985,
-0.08525712043046951,
-0.028355559334158897,
0.04545889049768448,
0.004652759525924921,
0.12070195376873016,
-0.019803885370492935,
0.011384619399905205,
-0.02194364368915558,
0.010950842872262001,
-0.06387660652399063,
0.05046958476305008,
0.04665681719779968,
0.07995747774839401,
0.012451947666704655,
0.1315179318189621,
0.027519624680280685,
-0.011595472693443298,
-0.07883166521787643,
0.015572074800729752,
-0.0030079775024205446,
0.08137165009975433,
0.139859139919281,
0.20035111904144287,
0.009385003708302975,
-0.0662061870098114,
-0.08716903626918793,
0.0703086256980896,
-0.0805751383304596,
0.016146667301654816,
-0.03224103897809982,
-0.18568050861358643,
0.16926519572734833,
-0.04820341616868973,
0.05085287243127823,
-0.13276216387748718,
-0.009328163228929043,
0.1720438003540039,
-0.027834879234433174,
-0.006537428125739098,
-0.025675302371382713,
0.1145847886800766,
-0.05708019435405731,
-0.05075935274362564,
0.015161382965743542,
0.11660505831241608,
-0.003245932050049305,
-0.15250840783119202,
0.10747630894184113,
-0.10635910928249359,
0.06136295199394226,
0.1455414593219757,
-0.35586047172546387,
-0.012730862013995647,
-0.04184754192829132,
-0.011725782416760921,
0.04265502467751503,
-0.04861481487751007,
-0.041571393609046936,
0.0432170070707798,
-0.0005898391827940941,
0.03261613845825195,
-0.053425200283527374,
0.004110570996999741,
0.03936848044395447,
-0.0993451401591301,
0.018465576693415642,
0.07831519842147827,
0.16558772325515747,
-0.13910308480262756,
0.12382898479700089,
0.2771131992340088,
0.06931085139513016,
0.22643481194972992,
-0.0009785221191123128,
-0.00968173798173666,
0.030229538679122925,
-0.0010748447384685278,
-0.04209241271018982,
0.1848967969417572,
-0.1610746681690216,
0.073683001101017,
0.06435146182775497,
-0.02545568346977234,
-0.029721425846219063,
-0.18036863207817078,
-0.10999323427677155,
-0.07116824388504028,
-0.013287968933582306,
-0.1344892978668213,
-0.014129645191133022,
-0.04450764134526253,
0.13428346812725067,
-0.027315333485603333,
0.004539506044238806,
0.05963961407542229,
0.004171636421233416,
-0.11056803911924362,
0.14031638205051422,
-0.050806377083063126,
-0.19733549654483795,
-0.04555252939462662,
-0.085078164935112,
-0.02284068986773491,
0.04600328207015991,
0.09843073785305023,
-0.07014288008213043,
0.0006631806027144194,
0.03629310056567192,
-0.10282386094331741,
-0.10111300647258759,
-0.07477779686450958,
-0.043130289763212204,
0.09023064374923706,
0.016925856471061707,
-0.07231717556715012,
-0.008369959890842438,
-0.06529982388019562,
-0.05923369526863098,
0.16513501107692719,
-0.06129053235054016,
0.10258615016937256,
0.05067083612084389,
-0.001538196811452508,
-0.08131090551614761,
-0.10890238732099533,
0.14816226065158844,
-0.12016181647777557,
-0.0103067047894001,
0.11005733162164688,
-0.039216700941324234,
0.0772920474410057,
0.10416765511035919,
0.09341180324554443,
-0.10253407061100006,
-0.017049429938197136,
0.005660283844918013,
-0.11007785052061081,
-0.20246942341327667,
0.010830887593328953,
-0.05977075546979904,
0.21065275371074677,
0.07085730880498886,
0.04333553463220596,
0.14246796071529388,
0.029313119128346443,
-0.04407679662108421,
0.006104091182351112,
0.028252940624952316,
0.0851813331246376,
0.1246374249458313,
-0.04919582977890968,
0.01997445337474346,
-0.039401739835739136,
0.05827294662594795,
0.14741714298725128,
0.1462986022233963,
0.04710337892174721,
0.1979619562625885,
0.20900961756706238,
0.08907568454742432,
-0.052146051079034805,
0.09753634035587311,
0.024272801354527473,
0.05146193876862526,
-0.030728960409760475,
-0.06712015718221664,
-0.0517105907201767,
0.007659737952053547,
0.015134048648178577,
0.031321264803409576,
-0.25895485281944275,
0.003243032144382596,
-0.07189307361841202,
0.04326354339718819,
-0.05924618989229202,
0.06109249219298363,
0.0652913972735405,
0.1501270830631256,
0.09479472041130066,
0.05495312809944153,
-0.09734004735946655,
0.159153014421463,
0.0663612112402916,
-0.10877998173236847,
0.04741546884179115,
-0.009850636124610901,
0.1310407519340515,
-0.0634075403213501,
0.05122067406773567,
-0.1647319793701172,
-0.2885824739933014,
-0.021425582468509674,
0.0719718188047409,
-0.27797627449035645,
0.2247169315814972,
-0.012410953640937805,
-0.0871909037232399,
0.03304493427276611,
-0.11519569158554077,
0.04863028600811958,
0.1595747023820877,
0.26308730244636536,
0.05145864933729172,
-0.08308324217796326,
-0.022259118035435677,
-0.0991366058588028,
-0.023158250376582146,
0.12348061800003052,
-0.021975597366690636,
-0.06755248457193375,
-0.00445911567658186,
0.05285836383700371,
-0.021691281348466873,
-0.031488675624132156,
-0.09670691937208176,
0.01735408417880535,
0.10492158681154251,
-0.012655094265937805,
0.11714252829551697,
0.06318080425262451,
-0.00009132526611210778,
-0.045448608696460724,
0.053135018795728683,
-0.1940585970878601,
-0.04934294521808624,
-0.05441927909851074,
-0.07362275570631027,
-0.06500111520290375,
0.0427626296877861,
0.005895983427762985,
-0.06296326965093613,
0.03628934174776077,
-0.1932283639907837,
-0.11783641576766968,
0.15174135565757751,
-0.058726876974105835,
-0.10430742055177689,
-0.09414106607437134,
0.09544450044631958,
-0.08472474664449692,
0.15987412631511688,
0.04767236113548279,
0.013835704885423183,
-0.07602491974830627,
-0.10063280165195465,
0.1157350242137909,
-0.2133713960647583,
-0.042258139699697495,
-0.05777142569422722,
-0.1557580530643463,
-0.09800167381763458,
-0.05156804621219635,
-0.030141187831759453,
0.2431359589099884,
0.5094990134239197,
0.02815762348473072,
0.14353646337985992,
0.26726019382476807,
-0.15158607065677643,
-0.3042755126953125,
-0.011292118579149246,
-0.15321463346481323,
-0.16572721302509308,
0.1004076898097992,
-0.24656599760055542,
0.041856445372104645,
0.09463696926832199,
-0.050227560102939606,
0.23836398124694824,
-0.07545310258865356,
-0.10436556488275528,
0.03901048004627228,
0.07186833769083023,
0.36833587288856506,
-0.2089296132326126,
-0.02766696736216545,
-0.05434839054942131,
-0.11816712468862534,
0.2064065784215927,
-0.03874171897768974,
0.11119940876960754,
-0.021012678742408752,
0.03712129592895508,
-0.028165021911263466,
-0.007310724351555109,
0.1519789844751358,
-0.06402525305747986,
0.043641556054353714,
0.0033210425172001123,
-0.16012731194496155,
0.16885609924793243,
0.0008305752417072654,
-0.14115634560585022,
-0.10358321666717529,
-0.09344834834337234,
-0.20850901305675507,
0.07903779298067093,
-0.10217288881540298,
0.0689089298248291,
-0.04019796475768089,
-0.14248108863830566,
-0.1624545007944107,
-0.03607364743947983,
-0.038607407361269,
0.041242167353630066,
0.316202312707901,
-0.2045188844203949,
0.06768496334552765,
0.0415782667696476,
0.16687266528606415,
0.059099167585372925,
-0.09505048394203186,
-0.07813643664121628,
-0.08897370100021362,
0.010845422744750977,
-0.1450456827878952,
-0.012828106991946697,
0.05640153959393501,
0.015807971358299255,
0.07256700843572617,
0.06208021193742752,
0.030068375170230865,
0.1672528088092804,
0.09103546291589737,
-0.15968692302703857,
-0.05255764350295067,
-0.013695801608264446,
0.07536415755748749,
0.057089466601610184,
-0.08749494701623917,
0.07208939641714096,
0.07542506605386734,
-0.03857012838125229,
-0.022506244480609894,
0.0608195960521698,
-0.08458815515041351,
-0.1355689913034439,
0.02832869626581669,
0.002369548659771681,
-0.04531820863485336,
0.014610751532018185,
0.035989608615636826,
-0.09056048840284348,
-0.04995490983128548,
0.11160220950841904,
-0.03355541452765465,
-0.05950646102428436,
-0.06941608339548111,
0.02148045413196087,
-0.16398510336875916,
0.09522532671689987,
-0.003392615122720599,
-0.08914691209793091,
-0.0416828915476799,
0.19510219991207123,
-0.03495039418339729,
-0.01159447431564331,
0.01907491870224476,
0.01906304992735386,
0.04889634624123573,
-0.0013089829590171576,
-0.23889949917793274,
0.04491788521409035,
0.0307039525359869,
-0.05451572686433792,
0.005634073168039322,
0.053780797868967056,
-0.08454034477472305,
-0.08112616091966629,
-0.15402798354625702,
0.01796206645667553,
-0.0485813170671463,
0.08557840436697006,
-0.05171242356300354,
-0.04079662635922432,
-0.047070953994989395,
0.046082403510808945,
-0.05318589136004448,
-0.07859385758638382,
-0.08619455248117447,
0.030129972845315933,
0.03869256377220154,
0.06755930185317993,
-0.07466225326061249,
-0.028330590575933456,
0.047616127878427505,
-0.010478153824806213,
0.08079293370246887,
0.015363998711109161,
0.01680663228034973,
0.026729051023721695,
-0.2630574405193329,
-0.010119686834514141,
0.10918817669153214,
0.031862590461969376,
-0.12185485661029816,
0.10546610504388809,
-0.048346493393182755,
-0.05442795529961586,
0.03194759413599968,
0.0158250629901886,
-0.11876984685659409,
-0.05763669312000275,
-0.006013056263327599,
-0.02082223631441593,
-0.09982844442129135,
0.0038013143930584192,
-0.02227029763162136,
0.10137229412794113,
-0.060649774968624115,
0.0707637295126915,
-0.0032442340161651373,
0.02900582365691662,
-0.057939495891332626,
0.04629586637020111,
0.03845752775669098,
-0.10925886780023575,
-0.10656791925430298,
-0.11782608926296234,
-0.05030025169253349,
-0.042119450867176056,
0.21236415207386017,
0.014007171615958214,
-0.17257151007652283,
0.07679276168346405,
0.15216079354286194,
-0.008797735907137394,
0.014965017326176167,
0.3287423849105835,
0.025836268439888954,
-0.05924290418624878,
-0.18791548907756805,
0.06639400869607925,
0.03759310394525528,
-0.03691001981496811,
0.15201246738433838,
0.09629037231206894,
-0.08714476227760315,
-0.044043831527233124,
-0.013552510179579258,
0.0762321725487709,
-0.06727846711874008,
0.0028915295843034983,
0.1595887690782547,
0.04156899079680443,
-0.012780308723449707,
0.017348511144518852,
0.14272385835647583,
-0.04976478964090347,
0.036157600581645966,
-0.06559804826974869,
-0.03390168771147728,
-0.12655451893806458,
-0.0008437647484242916,
0.02853349968791008,
-0.06076641008257866,
0.020884212106466293,
-0.1067139059305191,
0.09719263762235641,
0.03958963230252266,
0.0669315978884697,
-0.06955241411924362,
0.04771805182099342,
-0.11673444509506226,
-0.08881320804357529,
0.07932250201702118,
0.005737553350627422,
0.03359728306531906,
-0.22313466668128967,
-0.030719853937625885,
0.09944255650043488,
-0.08456481993198395,
0.04255763813853264,
0.014665304683148861,
0.03084571659564972,
-0.05557723343372345,
-0.15434421598911285,
-0.11313053965568542,
0.011838294565677643,
0.006234442815184593,
-0.026380563154816628,
0.25319916009902954,
-0.02982863038778305,
0.017728807404637337,
0.07467368245124817,
0.1310366690158844,
0.021291479468345642,
0.07808878272771835,
0.06994710862636566,
0.051206525415182114,
-0.16175255179405212,
0.11035878956317902,
-0.10092133283615112,
0.02002863585948944,
0.07620629668235779,
0.24715062975883484,
0.11044085770845413,
-0.18787099421024323,
-0.04779334366321564,
-0.05981123074889183,
0.033100198954343796,
0.07105880975723267,
0.09431867301464081,
-0.010800390504300594,
0.1280701607465744,
-0.11680711805820465,
0.0056950864382088184,
-0.011445331387221813,
-0.011690878309309483,
-0.11238820105791092,
-0.024574540555477142,
0.10603879392147064,
-0.011664136312901974,
-0.12224886566400528,
0.07674191147089005,
-0.1453186273574829,
0.05849434435367584,
-0.04721328243613243,
-0.042158205062150955,
0.07578310370445251,
-0.04290485009551048,
-0.02693767473101616,
0.01211197767406702,
0.015286820009350777,
-0.0455639474093914,
-0.08262654393911362,
-0.08150022476911545,
0.05717606842517853,
-0.17689745128154755,
-0.017553821206092834,
0.03687640652060509,
-0.05970226600766182,
0.16247588396072388,
-0.014511519111692905,
0.04425446689128876,
-0.026621736586093903,
-0.09437144547700882,
-0.005535780917853117,
0.01892569288611412,
0.05501195043325424,
-0.03128454461693764,
-0.09767493605613708,
0.13588112592697144,
-0.04947412759065628,
0.042781975120306015,
0.10570506006479263,
0.09054426103830338,
0.008485845290124416,
0.09868272393941879,
-0.1578599065542221,
0.07357941567897797,
0.051528725773096085,
-0.1097397431731224,
0.05531278997659683,
-0.02512146718800068,
0.01357654482126236,
-0.07742483168840408,
-0.03542831540107727,
-0.06483714282512665,
0.04802500829100609,
-0.13498659431934357,
-0.05776897817850113,
-0.016689063981175423,
-0.07261865586042404,
0.18113557994365692,
0.03048868104815483,
-0.2581009864807129,
-0.01980132982134819,
-0.09429867565631866,
0.15100817382335663,
-0.14747242629528046,
-0.010603467002511024,
0.06234847381711006,
-0.1119265928864479,
0.024584252387285233,
-0.2311413586139679,
0.10014832019805908,
0.04002280533313751,
0.05602468177676201,
-0.060123201459646225
] |
null | null | transformers |
# Multi-images Multi-audio Multi-turn Malaysian 1.1B TinyLlama
WandB https://wandb.ai/huseinzol05/vision-tinyllama?workspace=user-huseinzol05
## how-to
```python
from typing import List
from modeling_combine import MM_LLMs, MM_LLMs_Config
from transformers import AutoTokenizer, AutoProcessor
from PIL import Image
import torch
import librosa
import requests
model = MM_LLMs.from_pretrained(
'mesolitica/malaysian-tinyllama-1.1b-mmmmodal',
flash_attention = True,
dtype = torch.bfloat16,
torch_dtype = torch.bfloat16
)
_ = model.cuda()
image_processor = AutoProcessor.from_pretrained('google/siglip-base-patch16-384')
audio_processor = AutoProcessor.from_pretrained('mesolitica/malaysian-whisper-small')
tokenizer = AutoTokenizer.from_pretrained('mesolitica/malaysian-tinyllama-1.1b-mmmmodal')
def prepare_dataset(messages, images: List[str] = None, audio: List[str] = None, sr = 16000):
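    # Build the multimodal inputs: preprocess images with the SigLIP processor and audio
    # with the Whisper processor, tokenize the chat prompt, and record where the
    # <image>/<audio> placeholder tokens appear so their features can be spliced in later.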
if images is not None:
images = [Image.open(f).convert('RGB') for f in images]
image_output = image_processor(images=images, return_tensors='pt')['pixel_values']
else:
image_output = None
if audio is not None:
audio = [librosa.load(f, sr=sr)[0] for f in audio]
audio_features = audio_processor(audio, sampling_rate=sr, return_tensors='pt',)['input_features']
else:
audio_features = None
prompt = tokenizer.apply_chat_template(messages, tokenize = False)
outputs = tokenizer(
prompt,
return_tensors='pt',
return_overflowing_tokens=False,
return_length=False
)
outputs['images'] = image_output
outputs['audios'] = audio_features
image_token = tokenizer.convert_tokens_to_ids('<image>')
audio_token = tokenizer.convert_tokens_to_ids('<audio>')
if image_output is not None:
len_image = len(image_output)
else:
len_image = 0
if audio_features is not None:
len_audio = len(audio_features)
else:
len_audio = 0
outputs['image_index'] = torch.tensor([0] * len_image)
outputs['image_starts'] = torch.tensor([image_token] * (len_image + 1))
outputs['audio_index'] = torch.tensor([0] * len_audio)
outputs['audio_starts'] = torch.tensor([audio_token] * (len_audio + 1))
where_is = torch.where((outputs['input_ids'] == image_token) | (outputs['input_ids'] == audio_token))
ls = []
for i in range(len(where_is[0])):
b, k = where_is[0][i], where_is[1][i]
l = int(outputs['input_ids'][b, k])
ls.append(l)
ls = torch.tensor(ls)
outputs['where_is_b'] = where_is[0]
outputs['where_is_k'] = where_is[1]
outputs['ls'] = ls
return outputs
# Download the sample image and audio files.
with open('Persian-cat-breed.jpg', 'wb') as fopen:
    fopen.write(requests.get('https://cdn.beautifulnara.net/wp-content/uploads/2017/12/10201620/Persian-cat-breed.jpg').content)

with open('nasi-goreng-1-23.jpg', 'wb') as fopen:
    fopen.write(requests.get('https://www.jocooks.com/wp-content/uploads/2023/09/nasi-goreng-1-23.jpg').content)

with open('test.mp3', 'wb') as fopen:
    fopen.write(requests.get('https://github.com/mesolitica/multimodal-LLM/raw/master/data/test.mp3').content)

messages = [
    {'role': 'user', 'content': '<image> </image> ini gambar apa'},  # "what is this picture?"
]
outputs = prepare_dataset(messages, images = ['Persian-cat-breed.jpg'])

# Cast the multimodal tensors to the model dtype and move everything to GPU.
if outputs['images'] is not None:
    outputs['images'] = outputs['images'].type(model.dtype)
if outputs['audios'] is not None:
    outputs['audios'] = outputs['audios'].type(model.dtype)
for k in outputs.keys():
    if outputs[k] is not None:
        outputs[k] = outputs[k].cuda()

with torch.no_grad():
    # Build the multimodal inputs, then generate with the underlying LLM.
    model_inputs = model.prepare_inputs_for_generation(**outputs)
    r = model_inputs.pop('input_ids', None)
    generate_kwargs = dict(
        model_inputs,
        max_new_tokens=300,
        top_p=0.95,
        top_k=50,
        temperature=0.1,
        do_sample=True,
        num_beams=1,
    )
    r = model.llm.generate(**generate_kwargs)
    print(tokenizer.decode(r[0]))
```
```
<s>Imej itu menunjukkan seekor kucing putih yang comel duduk di atas sofa hitam.</s>
```
```python
messages = [
    {'role': 'user', 'content': '<image> </image> <image> </image> apa kaitan 2 gambar ni'},  # "how are these 2 pictures related?"
]
outputs = prepare_dataset(messages, images = ['Persian-cat-breed.jpg', 'nasi-goreng-1-23.jpg'])
if outputs['images'] is not None:
    outputs['images'] = outputs['images'].type(model.dtype)
if outputs['audios'] is not None:
    outputs['audios'] = outputs['audios'].type(model.dtype)
for k in outputs.keys():
    if outputs[k] is not None:
        outputs[k] = outputs[k].cuda()

with torch.no_grad():
    model_inputs = model.prepare_inputs_for_generation(**outputs)
    r = model_inputs.pop('input_ids', None)
    generate_kwargs = dict(
        model_inputs,
        max_new_tokens=300,
        top_p=0.95,
        top_k=50,
        temperature=0.1,
        do_sample=True,
        num_beams=1,
    )
    r = model.llm.generate(**generate_kwargs)
    print(tokenizer.decode(r[0]))
```
```
<s>Tiada hubungan yang jelas antara gambar 1 (anak kucing putih duduk di atas sofa) dan gambar 2 (foto penutup mangkuk mi telur dengan nasi dan cili). Gambar pertama ialah imej haiwan, manakala gambar kedua ialah imej makanan. Mereka tergolong dalam kategori yang berbeza dan tidak mempunyai hubungan antara satu sama lain.</s>
```
```python
messages = [
    {'role': 'user', 'content': '<audio> </audio> apa isu audio ni'},  # "what is this audio about?"
]
# The original snippet passed an undefined `audio` variable here; assuming it
# refers to the clip downloaded above.
outputs = prepare_dataset(messages, audio = ['test.mp3'])
if outputs['images'] is not None:
    outputs['images'] = outputs['images'].type(model.dtype)
if outputs['audios'] is not None:
    outputs['audios'] = outputs['audios'].type(model.dtype)
for k in outputs.keys():
    if outputs[k] is not None:
        outputs[k] = outputs[k].cuda()

with torch.no_grad():
    model_inputs = model.prepare_inputs_for_generation(**outputs, inference = True)
    r = model_inputs.pop('input_ids', None)
    generate_kwargs = dict(
        model_inputs,
        max_new_tokens=300,
        top_p=0.95,
        top_k=50,
        temperature=0.9,
        do_sample=True,
        num_beams=1,
    )
    r = model.llm.generate(**generate_kwargs)
    print(tokenizer.decode(r[0]))
```
```
<s>Isu audio ini berkisar tentang persepsi salah faham dan sikap bakhil berkenaan wang dalam konteks menggalakkan penggunaan e-dompet. Penceramah mencadangkan bahawa orang mungkin keberatan untuk menerima wang kerana tidak melihat manfaat atau nilai menggunakan e-dompet, dan kebimbangan tentang tidak dapat mengakses wang itu jika mereka memerlukannya segera. Penceramah juga menyebut isu ekonomi sistem dan kekurangan sistem yang berkesan di Malaysia. Secara keseluruhannya, isu ini menekankan keperluan untuk pemahaman dan kesedaran yang lebih baik tentang faedah menggunakan e-dompet, serta keperluan untuk pembaharuan sistemik untuk memastikan akses yang saksama kepada wang dan sumber lain.</s>
```
```python
messages = [
    {'role': 'user', 'content': '<image> </image> <audio> </audio> apa kaitan gambar dan audio ni'},  # "how are this picture and audio related?"
]
# `test_image` is not defined in this snippet: set it to the path of a local
# image first (the image behind the sample output below is not included here).
# The undefined `audio` variable is assumed to be the clip downloaded above.
outputs = prepare_dataset(messages, images = [test_image], audio = ['test.mp3'])
if outputs['images'] is not None:
    outputs['images'] = outputs['images'].type(model.dtype)
if outputs['audios'] is not None:
    outputs['audios'] = outputs['audios'].type(model.dtype)
for k in outputs.keys():
    if outputs[k] is not None:
        outputs[k] = outputs[k].cuda()

with torch.no_grad():
    model_inputs = model.prepare_inputs_for_generation(**outputs, inference = True)
    r = model_inputs.pop('input_ids', None)
    generate_kwargs = dict(
        model_inputs,
        max_new_tokens=300,
        top_p=0.95,
        top_k=50,
        temperature=0.9,
        do_sample=True,
        num_beams=1,
    )
    r = model.llm.generate(**generate_kwargs)
    print(tokenizer.decode(r[0]))
```
```
<s>Tidak jelas bagaimana gambar dan audio berkaitan antara satu sama lain. Gambar itu menunjukkan bas pelancongan dengan iklan yang menggalakkan orang ramai menggunakan e-dompet mereka, tetapi ia tidak menyatakan tujuan iklan itu. Audio itu membincangkan idea pembaziran dana sebanyak RM5 juta (kira-kira 1.2 juta USD) ke atas sesuatu projek, tetapi ia tidak menyebut secara langsung bas pelancongan atau e-dompet. Ada kemungkinan bahawa kedua-dua gambar dan audio sedang membincangkan topik yang sama, tetapi lebih banyak konteks diperlukan untuk membuat perkaitan yang pasti.</s>
``` | {"library_name": "transformers", "tags": []} | feature-extraction | mesolitica/malaysian-tinyllama-1.1b-mmmmodal | [
"transformers",
"safetensors",
"mm_llms",
"feature-extraction",
"custom_code",
"region:us"
] | 2024-02-15T07:45:49+00:00 | [] | [] | TAGS
#transformers #safetensors #mm_llms #feature-extraction #custom_code #region-us
|
# Multi-images Multi-audio Multi-turn Malaysian 1.1B TinyLlama
WanDB URL
## how-to
| [
"# Multi-images Multi-audio Multi-turn Malaysian 1.1B TinyLlama\n\nWanDB URL",
"## how-to"
] | [
"TAGS\n#transformers #safetensors #mm_llms #feature-extraction #custom_code #region-us \n",
"# Multi-images Multi-audio Multi-turn Malaysian 1.1B TinyLlama\n\nWanDB URL",
"## how-to"
] | [
30,
24,
4
] | [
"passage: TAGS\n#transformers #safetensors #mm_llms #feature-extraction #custom_code #region-us \n# Multi-images Multi-audio Multi-turn Malaysian 1.1B TinyLlama\n\nWanDB URL## how-to"
] | [
-0.11723624914884567,
-0.11848107725381851,
-0.003247992368414998,
0.055042024701833725,
0.058402106165885925,
-0.03246516361832619,
0.1436961591243744,
0.07166580855846405,
-0.215498149394989,
0.030513206496834755,
-0.052534591406583786,
0.03400934487581253,
-0.008930960670113564,
0.05635108798742294,
0.002761402865871787,
-0.2693517804145813,
0.035258699208498,
-0.03582831472158432,
0.057209212332963943,
0.03320293501019478,
0.004565695766359568,
-0.07425006479024887,
0.052729032933712006,
-0.04770693928003311,
-0.17393077909946442,
0.0010721527505666018,
-0.04981260374188423,
-0.06778387725353241,
-0.01114837545901537,
-0.01727348007261753,
0.04961265251040459,
0.04608992114663124,
0.0681045800447464,
-0.07822978496551514,
0.044627901166677475,
-0.06923764199018478,
0.017967794090509415,
-0.04051589593291283,
0.09175410121679306,
0.031909067183732986,
-0.10132002085447311,
0.03464561700820923,
-0.09393825381994247,
0.031068814918398857,
-0.05921212211251259,
-0.0757032260298729,
-0.08620095998048782,
0.05614631250500679,
0.12984848022460938,
0.1043076142668724,
0.006054923869669437,
0.08391845226287842,
-0.12385857105255127,
0.06429817527532578,
0.2044254094362259,
-0.16352538764476776,
0.023309165611863136,
0.0980859324336052,
0.12345083802938461,
0.05698704347014427,
-0.04860541969537735,
0.09794896841049194,
0.023194288834929466,
0.008010435849428177,
-0.1282617747783661,
-0.17535872757434845,
-0.029821323230862617,
-0.10596893727779388,
-0.03518268093466759,
0.030655959621071815,
0.179645836353302,
0.03771595656871796,
-0.06717748194932938,
-0.0364309661090374,
0.010194106958806515,
-0.05068548396229744,
-0.13197757303714752,
0.1293685883283615,
0.0013196562649682164,
0.03125840425491333,
-0.1259002834558487,
-0.04013495147228241,
-0.05145449936389923,
-0.027474358677864075,
-0.11276514083147049,
0.14667002856731415,
-0.031107421964406967,
0.025500768795609474,
-0.12392851710319519,
-0.03165167197585106,
0.04024578258395195,
-0.10289186984300613,
0.015520896762609482,
0.008594926446676254,
0.1363859474658966,
0.11532226949930191,
-0.012103489600121975,
-0.12496013939380646,
0.15872310101985931,
-0.024416623637080193,
-0.09122214466333389,
-0.0012839739210903645,
-0.11855750530958176,
0.12254269421100616,
-0.007756200153380632,
-0.051460690796375275,
-0.0891580730676651,
-0.04216943681240082,
0.11390349268913269,
0.029166338965296745,
0.16872920095920563,
-0.04902949184179306,
-0.09609467536211014,
0.006989073008298874,
0.0005735617596656084,
0.09805133193731308,
0.027744237333536148,
0.017680305987596512,
-0.07428894191980362,
-0.015127802267670631,
0.11152175813913345,
-0.11526073515415192,
-0.009365287609398365,
-0.054257478564977646,
0.0628778263926506,
0.05078059062361717,
0.038045819848775864,
-0.0055697038769721985,
-0.02605331502854824,
0.06790071725845337,
0.015973025932908058,
0.06715862452983856,
0.01106471661478281,
-0.041138049215078354,
0.05979271978139877,
-0.012047444470226765,
0.040293190628290176,
-0.14159813523292542,
-0.07901334017515182,
0.04994658753275871,
0.014583985321223736,
0.03575332835316658,
0.04575587809085846,
0.035669635981321335,
-0.0666862353682518,
0.06361202150583267,
-0.036228928714990616,
-0.061913907527923584,
-0.03950624167919159,
0.0803472250699997,
0.1001964882016182,
0.12276222556829453,
-0.03951022028923035,
0.05008312314748764,
-0.04291972517967224,
0.0907973051071167,
-0.26027733087539673,
0.11442682147026062,
-0.06790409982204437,
0.05361803621053696,
0.02475065551698208,
-0.0296634528785944,
-0.10709565132856369,
0.08640236407518387,
-0.04082648828625679,
0.10896222293376923,
-0.07433325797319412,
-0.10697907209396362,
0.11742997914552689,
-0.17187754809856415,
-0.0801561027765274,
0.07203613966703415,
0.07380326837301254,
-0.035152871161699295,
0.008916490711271763,
0.197078675031662,
0.1118912324309349,
-0.08803422749042511,
0.0025680935941636562,
0.07666067034006119,
-0.060156501829624176,
-0.08871239423751831,
0.0658799260854721,
0.01319258101284504,
0.04402730241417885,
-0.02872961387038231,
0.06269919872283936,
0.1411731094121933,
-0.02106500044465065,
-0.03806281462311745,
0.03768208622932434,
-0.09769979864358902,
-0.05071384832262993,
-0.019320933148264885,
0.059409815818071365,
-0.02849373035132885,
0.05688019469380379,
0.08331968635320663,
0.06628970801830292,
-0.012469085864722729,
0.03267481178045273,
-0.12612587213516235,
0.07362960278987885,
-0.26617059111595154,
0.004032558761537075,
-0.1338614672422409,
0.033423613756895065,
-0.03728296607732773,
-0.10712400823831558,
0.1385618895292282,
-0.09509559720754623,
0.01989918202161789,
0.013779490254819393,
-0.05945292115211487,
-0.09470856189727783,
0.051168765872716904,
0.013838849030435085,
0.07086145132780075,
-0.06506079435348511,
0.03184286877512932,
-0.04832732304930687,
-0.025043033063411713,
0.0883350670337677,
-0.0874113216996193,
0.07725442945957184,
5.260346256363846e-7,
0.044460296630859375,
0.030668720602989197,
0.18187178671360016,
0.045229215174913406,
0.00998862273991108,
-0.02372123673558235,
0.040690403431653976,
0.06549114733934402,
-0.06325375288724899,
0.1759270280599594,
-0.14774222671985626,
0.1887613981962204,
0.2291325479745865,
-0.2523455321788788,
0.02161066047847271,
0.1180134117603302,
0.0476534366607666,
-0.028173867613077164,
-0.04494018852710724,
0.02723720110952854,
-0.030907301232218742,
0.0006391708739101887,
0.1334301233291626,
-0.027735071256756783,
0.03191928192973137,
0.054348960518836975,
-0.11929652839899063,
-0.08009316772222519,
-0.02859877422451973,
-0.02351515181362629,
-0.11544135212898254,
0.08897916227579117,
0.1118086576461792,
-0.04125864803791046,
0.18514937162399292,
-0.07674553990364075,
-0.02821044996380806,
-0.029680218547582626,
0.07590183615684509,
0.04543250426650047,
0.05228433385491371,
-0.05233975127339363,
-0.07428920269012451,
0.02048948034644127,
0.024645818397402763,
0.032445840537548065,
-0.1286437064409256,
-0.08009622246026993,
0.00268053961917758,
-0.041271209716796875,
-0.11224958300590515,
-0.00493845297023654,
-0.04884351044893265,
0.012145888060331345,
-0.06977377831935883,
0.03675377741456032,
0.078115314245224,
-0.02172757126390934,
-0.03305641561746597,
0.04969945549964905,
-0.13989616930484772,
-0.35180404782295227,
-0.1799185574054718,
-0.12571404874324799,
-0.08739569783210754,
0.005940506234765053,
0.11512138694524765,
-0.12374833971261978,
-0.02985946647822857,
0.04078572988510132,
0.023879896849393845,
0.008742328733205795,
-0.03224621340632439,
0.03290078788995743,
0.03908028081059456,
0.03328314796090126,
-0.12136118859052658,
-0.00025218111113645136,
0.08670906722545624,
0.028856679797172546,
0.14176273345947266,
-0.03883520886301994,
0.10498083382844925,
0.07324723899364471,
0.06778927892446518,
0.012143272906541824,
0.07114119827747345,
0.0698014423251152,
-0.10306232422590256,
0.011042213998734951,
0.2500697374343872,
-0.04548861086368561,
-0.00504319928586483,
0.11141753941774368,
0.09519995003938675,
-0.098311647772789,
-0.013140200637280941,
0.024799836799502373,
-0.10849258303642273,
-0.12445781379938126,
-0.06385965645313263,
-0.11315842717885971,
0.036087535321712494,
-0.09596344083547592,
0.03311288729310036,
0.055654361844062805,
0.04807634279131889,
0.0021387755405157804,
0.07695063203573227,
0.059728335589170456,
-0.012082970701158047,
0.06043748930096626,
0.0009558759047649801,
0.052260201424360275,
-0.1490231454372406,
-0.06074085086584091,
0.07494297623634338,
0.06703407317399979,
0.1460777074098587,
0.11678597331047058,
0.010950990952551365,
0.09735075384378433,
-0.011946691200137138,
0.19211511313915253,
0.1274898201227188,
-0.05389799550175667,
-0.019856786355376244,
0.028131693601608276,
-0.07683171331882477,
0.05747915804386139,
-0.05579511448740959,
0.0440080501139164,
-0.010117551311850548,
-0.04906734824180603,
0.04892944544553757,
-0.009221184998750687,
0.017310654744505882,
0.037107061594724655,
-0.07280261814594269,
0.0015410666819661856,
0.04092435911297798,
0.061917562037706375,
-0.0033920113928616047,
0.0725298747420311,
0.09793944656848907,
-0.01228660624474287,
0.1137881949543953,
0.046940386295318604,
0.07907706499099731,
-0.12130098789930344,
-0.008918741717934608,
-0.06202087923884392,
-0.05480080470442772,
0.023901697248220444,
-0.012088882736861706,
-0.13949932157993317,
0.19988594949245453,
0.09775866568088531,
0.06086188182234764,
0.05542656034231186,
0.023480979725718498,
0.11782552301883698,
0.13102573156356812,
0.16058112680912018,
0.019840899854898453,
-0.039150774478912354,
-0.16485655307769775,
-0.022005770355463028,
-0.0419481135904789,
0.18208535015583038,
0.18622419238090515,
0.009281106293201447,
0.007521471474319696,
-0.05974632129073143,
-0.008671734482049942,
-0.01280412171036005,
-0.13639424741268158,
-0.0723288431763649,
0.043995507061481476,
0.10302481800317764,
-0.029484543949365616,
-0.06549874693155289,
-0.014984888024628162,
-0.05507664754986763,
0.0738576352596283,
0.05507860332727432,
-0.022715598344802856,
-0.05802538990974426,
-0.0956520065665245,
0.09105213731527328,
-0.007346748374402523,
0.09452711045742035,
-0.04328425973653793,
-0.03314442187547684,
-0.05654172599315643,
-0.08667515218257904,
0.07232168316841125,
-0.0668637827038765,
-0.005431283265352249,
0.012548594735562801,
0.15517355501651764,
-0.07814909517765045,
0.02039935439825058,
0.008225386030972004,
-0.011045077815651894,
0.12438588589429855,
-0.08794552832841873,
0.05355804041028023,
-0.020972108468413353,
-0.05166308954358101,
0.12799793481826782,
-0.07792206853628159,
-0.08787412196397781,
0.029663482680916786,
-0.09318608045578003,
0.15371190011501312,
0.21746723353862762,
0.052400730550289154,
0.0192383024841547,
0.23722432553768158,
0.01613025553524494,
-0.3319394886493683,
-0.04845062270760536,
-0.07713071256875992,
-0.048266660422086716,
-0.04690307378768921,
-0.11119604110717773,
0.18300972878932953,
0.01826278679072857,
0.00020555702212732285,
0.03806299343705177,
-0.2655695378780365,
-0.06530766934156418,
0.06423381716012955,
0.002045333618298173,
0.17925666272640228,
-0.19755014777183533,
-0.10360876470804214,
-0.10149024426937103,
-0.17640022933483124,
-0.04795161634683609,
-0.11166199296712875,
0.16740091145038605,
0.07536176592111588,
-0.0010594743071123958,
0.0007775025442242622,
-0.032456882297992706,
0.13372474908828735,
-0.0270173791795969,
0.07599193602800369,
-0.08202160894870758,
-0.08525121957063675,
0.17129719257354736,
0.025035833939909935,
0.05671682953834534,
-0.022005664184689522,
-0.005030813626945019,
-0.007698754779994488,
-0.018976125866174698,
0.0011720340698957443,
0.03728877753019333,
0.047962743788957596,
-0.023477742448449135,
-0.008165853098034859,
0.04578712582588196,
-0.03742346912622452,
-0.0018838164396584034,
0.1387241631746292,
-0.04005729779601097,
-0.10663267225027084,
0.05009981244802475,
0.0705028548836708,
-0.08962802588939667,
-0.08041112124919891,
-0.09242439270019531,
-0.09277977794408798,
0.08289477974176407,
-0.08328036963939667,
0.07500936836004257,
-0.017634021118283272,
0.04028375446796417,
0.04318191856145859,
0.053471509367227554,
-0.04572358354926109,
0.1551537960767746,
0.14480751752853394,
0.03920956328511238,
-0.05898471176624298,
-0.05939247086644173,
-0.02916048839688301,
0.20672756433486938,
0.029715541750192642,
0.1028209999203682,
-0.010177485644817352,
0.024539517238736153,
-0.0038761161267757416,
0.010623054578900337,
-0.17620526254177094,
0.1555424928665161,
0.007751408498734236,
0.05255575105547905,
-0.13195814192295074,
0.230021670460701,
-0.011471687816083431,
-0.1575784832239151,
-0.04720412567257881,
0.1590065360069275,
-0.04799054190516472,
-0.09608826786279678,
0.03464950621128082,
0.16271811723709106,
-0.01858745887875557,
-0.03454739227890968,
-0.046972811222076416,
-0.08174008876085281,
-0.051946524530649185,
0.09681137651205063,
-0.005577195435762405,
0.006908569950610399,
-0.0018409169279038906,
0.06531015038490295,
-0.09020116925239563,
0.01281243097037077,
0.08763054013252258,
0.12336664646863937,
-0.24375656247138977,
0.028681857511401176,
0.05721719563007355,
0.10923876613378525,
-0.11124661564826965,
-0.03404004126787186,
-0.05204092711210251,
0.05313869193196297,
-0.09459828585386276,
0.07872658222913742,
-0.1261359602212906,
-0.02048293687403202,
-0.030753035098314285,
0.02704682946205139,
-0.05343260616064072,
0.011606091633439064,
-0.012031824328005314,
-0.04945151135325432,
-0.02618471346795559,
-0.008014135994017124,
-0.030487224459648132,
-0.052890874445438385,
-0.02678944170475006,
-0.04630078375339508,
0.014313067309558392,
0.12499354779720306,
-0.08251507580280304,
0.009020359255373478,
-0.15448926389217377,
-0.2285265326499939,
0.1927659809589386,
-0.007822593674063683,
-0.013362225145101547,
-0.008858135901391506,
0.026885630562901497,
0.05942556634545326,
0.023477720096707344,
-0.009246627800166607,
0.2073768526315689,
-0.06592395156621933,
-0.11215045303106308,
-0.22017954289913177,
-0.025756431743502617,
-0.09286508709192276,
0.018343064934015274,
0.2538866400718689,
0.00902117881923914,
0.13307563960552216,
-0.08123794198036194,
0.052067991346120834,
-0.03854016587138176,
-0.013781217858195305,
-0.07436290383338928,
-0.19368678331375122,
-0.07335410267114639,
-0.036434296518564224,
0.01586969569325447,
-0.09390582144260406,
0.14029158651828766,
-0.02926049567759037,
-0.0986526757478714,
-0.010719346813857555,
0.0009418080444447696,
-0.012470349669456482,
0.026439381763339043,
0.35990098118782043,
0.09229937195777893,
-0.007831237278878689,
-0.11246520280838013,
0.016014905646443367,
0.04143626615405083,
0.023565353825688362,
-0.0984383299946785,
0.21415437757968903,
-0.04284366965293884,
0.09450433403253555,
0.16379414498806,
-0.02035290189087391,
-0.1712730973958969,
-0.10394871979951859,
-0.028944330289959908,
0.06323432177305222,
0.003699647029861808,
-0.04581530764698982,
0.21946710348129272,
0.014437951147556305,
0.08462973684072495,
0.016814017668366432,
0.0004945051041431725,
-0.08557955175638199,
-0.08742047846317291,
-0.11141945421695709,
-0.10915545374155045,
0.027400735765695572,
-0.038684628903865814,
0.020857032388448715,
0.04613756015896797,
0.047094110399484634,
0.016830509528517723,
0.3228725492954254,
-0.24343690276145935,
-0.026316286996006966,
0.0802290216088295,
-0.026619533076882362,
-0.09304624795913696,
0.1616596132516861,
-0.0035223960876464844,
0.09802071750164032,
-0.054641254246234894,
-0.007876201532781124,
0.04468058794736862,
-0.06517649441957474,
0.12652313709259033,
-0.1231764703989029,
-0.12584629654884338,
-0.015130946412682533,
-0.04376646876335144,
0.026890145614743233,
0.12901179492473602,
0.014487437903881073,
-0.07700216770172119,
0.02348014898598194,
0.1161641925573349,
-0.0653417706489563,
-0.1943802535533905,
0.004854410421103239,
-0.06610367447137833,
-0.09224933385848999,
0.08535604923963547,
-0.11050805449485779,
-0.05771123617887497,
0.004135195165872574,
0.21340081095695496,
0.2783781588077545,
-0.07234770804643631,
0.044649846851825714,
0.05451760068535805,
0.005445609334856272,
-0.09602465480566025,
0.13042423129081726,
0.0863560140132904,
0.21057388186454773,
0.016456739977002144,
-0.08554361015558243,
-0.17281405627727509,
-0.10579736530780792,
-0.12800155580043793,
0.016352714970707893,
0.09868863970041275,
-0.07057389616966248,
-0.033136747777462006,
0.11965208500623703,
-0.137502059340477,
0.14449985325336456,
0.07386001944541931,
-0.06606252491474152,
-0.007986990734934807,
-0.07353629916906357,
0.06644145399332047,
0.1405051052570343,
-0.05899111181497574,
-0.035517673939466476,
0.024701129645109177,
-0.08105169981718063,
0.0804956704378128,
-0.16262635588645935,
-0.018430935218930244,
-0.09696616977453232,
0.007596149109303951,
0.008327804505825043,
0.057091668248176575,
-0.017119942232966423,
-0.000014250949789129663,
0.07828229665756226,
-0.045569248497486115,
0.2557368874549866,
-0.0004986548447050154,
-0.12010487914085388,
-0.000645739899482578,
0.2502360939979553,
-0.08084971457719803,
0.047494642436504364,
-0.0018867045873776078,
-0.004774358589202166,
0.04398634657263756,
0.11818952113389969,
-0.0864032506942749,
-0.07071889936923981,
-0.059084970504045486,
-0.10649119317531586,
0.06297250092029572,
-0.0005567125626839697,
0.029766783118247986,
-0.017606377601623535,
-0.04079507291316986,
0.10723987221717834,
0.11502648890018463,
-0.06813038140535355,
-0.06119922548532486,
-0.09149948507547379,
-0.0411524660885334,
-0.0951109528541565,
0.07800910621881485,
-0.13089928030967712,
-0.012668352574110031,
-0.09124992787837982,
0.006795286200940609,
-0.1394423395395279,
0.05127766728401184,
0.1773965060710907,
0.027928659692406654,
-0.017269346863031387,
-0.21473248302936554,
0.059329062700271606,
0.04792395979166031,
-0.112066350877285,
-0.12628257274627686
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training (a hedged sketch of the equivalent `BitsAndBytesConfig` follows the list):
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
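
For reference, this is how the settings above map onto `transformers`' `BitsAndBytesConfig`. It is a minimal sketch reconstructed from the list, not the original training script:

```python
# Hedged sketch: the values are copied from the config list above; this is not
# the exact code used for training.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    bnb_4bit_quant_type='nf4',
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)
```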
### Framework versions
- PEFT 0.4.0
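
A hedged sketch of how this adapter could be loaded for inference; the base checkpoint is read from the adapter config rather than assumed:

```python
# Hedged sketch, not from the original repo: load the QLoRA adapter on top of
# the base model recorded in its adapter config.
import torch
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

adapter_id = 'RansikaC99/llama2-qlora-finetunined-4-bit-1500-3epoch'
peft_config = PeftConfig.from_pretrained(adapter_id)

base_model = AutoModelForCausalLM.from_pretrained(
    peft_config.base_model_name_or_path,
    torch_dtype=torch.float16,
    device_map='auto',
)
tokenizer = AutoTokenizer.from_pretrained(peft_config.base_model_name_or_path)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()
```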
| {"library_name": "peft"} | null | RansikaC99/llama2-qlora-finetunined-4-bit-1500-3epoch | [
"peft",
"region:us"
] | 2024-02-15T07:45:53+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.4.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.4.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.4.0"
] | [
9,
154,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: False\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.4.0"
] | [
-0.06979092955589294,
0.021877916529774666,
-0.0024098597932606936,
0.1378258466720581,
0.10862318426370621,
0.07474296540021896,
0.10942144691944122,
0.12339992076158524,
0.0572756864130497,
0.09055759757757187,
0.09443594515323639,
0.05080708488821983,
0.07617583870887756,
0.15425844490528107,
-0.02472618594765663,
-0.030937515199184418,
0.05461447685956955,
0.0024960634764283895,
0.003683591727167368,
0.07837489992380142,
0.05202331766486168,
-0.036243125796318054,
0.037176959216594696,
-0.09029994904994965,
-0.1683429628610611,
-0.0007763148751109838,
0.010803398676216602,
0.03168312832713127,
0.04643981158733368,
0.03667012229561806,
0.061239324510097504,
-0.01880047656595707,
-0.024256091564893723,
-0.2065766304731369,
-0.008815872482955456,
0.12412996590137482,
-0.022375915199518204,
0.0724588930606842,
-0.08079487085342407,
0.13203540444374084,
-0.039120372384786606,
-0.02855866029858589,
0.005213533993810415,
0.031375303864479065,
-0.08930839598178864,
-0.12567098438739777,
-0.06602675467729568,
0.061225954443216324,
0.022498471662402153,
0.0596780888736248,
0.0024417112581431866,
0.19193901121616364,
-0.1333344727754593,
0.08866282552480698,
0.10110592097043991,
-0.24191753566265106,
-0.03025546856224537,
0.14384306967258453,
-0.024540316313505173,
0.15543513000011444,
-0.07628224045038223,
-0.10719222575426102,
0.08085530251264572,
0.0496523417532444,
-0.056598056107759476,
-0.0009009919012896717,
-0.09088310599327087,
0.006839253939688206,
-0.13980844616889954,
-0.05230165645480156,
0.15110687911510468,
0.026294633746147156,
-0.037592023611068726,
-0.031246718019247055,
-0.0950726643204689,
-0.35348814725875854,
0.024415571242570877,
-0.002826953772455454,
-0.06987019628286362,
0.048015810549259186,
0.035156507045030594,
-0.008995379321277142,
0.004349078983068466,
-0.09262323379516602,
-0.04143654555082321,
0.10029355436563492,
0.04808283597230911,
0.04402211308479309,
0.019573308527469635,
0.1058531403541565,
-0.12883538007736206,
-0.021110432222485542,
-0.041084397584199905,
-0.02869381383061409,
-0.050102680921554565,
-0.014695894904434681,
-0.07354352623224258,
0.17733292281627655,
0.06763829290866852,
0.09666602313518524,
-0.1813267171382904,
0.1257096230983734,
-0.02475627325475216,
0.05742272362112999,
-0.035531818866729736,
0.014435730874538422,
-0.11885007470846176,
0.12319590896368027,
0.0007600552635267377,
0.13899193704128265,
0.02280823327600956,
-0.04473873972892761,
-0.0730285570025444,
-0.007154371589422226,
0.1233254000544548,
0.004622994922101498,
-0.11530132591724396,
0.007921767421066761,
-0.1411384791135788,
-0.04554228112101555,
0.0911121591925621,
-0.07906953990459442,
0.015561599284410477,
0.041121818125247955,
-0.04623573645949364,
0.004239376168698072,
0.09626287966966629,
-0.05426957085728645,
-0.04919392615556717,
-0.03332843258976936,
-0.10088252276182175,
0.00408816896378994,
-0.1039951965212822,
-0.13126124441623688,
0.051543787121772766,
-0.1563965529203415,
-0.00575046194717288,
-0.04191170632839203,
-0.06613527983427048,
0.02011139690876007,
0.019768383353948593,
-0.08437694609165192,
0.05859275907278061,
-0.08150110393762589,
-0.1557578295469284,
-0.03541033715009689,
0.018620938062667847,
0.007860139012336731,
-0.02999168075621128,
0.09800084680318832,
0.032462093979120255,
0.10065783560276031,
-0.1774347871541977,
-0.0032585663720965385,
0.013272988609969616,
0.06698489189147949,
0.016430236399173737,
0.11852112412452698,
-0.0984061062335968,
-0.029854416847229004,
-0.05219494178891182,
-0.06776446104049683,
-0.10671226680278778,
-0.009978118352591991,
0.12339237332344055,
0.081203892827034,
-0.15919211506843567,
-0.010960342362523079,
0.07894296199083328,
-0.034127745777368546,
-0.08150023221969604,
0.1462218016386032,
-0.043517038226127625,
0.10362964868545532,
-0.03142445161938667,
0.07441626489162445,
0.2393224835395813,
-0.13294187188148499,
0.0026927399449050426,
0.11512340605258942,
0.04870401695370674,
-0.024512527510523796,
0.001410747179761529,
0.06937900930643082,
-0.142485573887825,
0.03395999222993851,
0.067036472260952,
0.03290358558297157,
-0.059946075081825256,
-0.05717241391539574,
-0.03879120573401451,
-0.054277822375297546,
0.11310963332653046,
0.028706887736916542,
0.012219887226819992,
-0.07421029359102249,
-0.07091692090034485,
0.14475229382514954,
0.13386821746826172,
-0.01724834553897381,
-0.005964277777820826,
-0.1111484244465828,
0.002316186437383294,
-0.04553208500146866,
0.027614165097475052,
-0.12323106080293655,
0.024642376229166985,
0.08063367009162903,
0.01609816960990429,
-0.005867816042155027,
0.048622094094753265,
0.06404208391904831,
0.02775740437209606,
-0.054996591061353683,
0.023086974397301674,
-0.05230040103197098,
-0.002152500906959176,
-0.10709255933761597,
-0.07857770472764969,
-0.013062533922493458,
-0.010289276950061321,
0.18266798555850983,
-0.14896303415298462,
0.03612537682056427,
0.10309912264347076,
0.004995338153094053,
-0.010612043552100658,
-0.036244068294763565,
-0.07481951266527176,
0.11431097984313965,
-0.012485682964324951,
-0.03552611544728279,
0.04104119539260864,
0.034952957183122635,
-0.03309863805770874,
-0.1580384373664856,
-0.09931837022304535,
0.05646728724241257,
0.13403663039207458,
0.07577018439769745,
-0.06703833490610123,
-0.047553446143865585,
-0.020618250593543053,
-0.03879788890480995,
0.056729573756456375,
-0.05360506474971771,
0.023469123989343643,
0.012125842273235321,
0.06819912791252136,
-0.10050064325332642,
-0.03629688918590546,
0.06696230918169022,
-0.011795254424214363,
-0.04454666003584862,
0.11060863733291626,
0.01751997321844101,
-0.07169803977012634,
0.07787928730249405,
0.05941096320748329,
-0.14069285988807678,
0.08998789638280869,
-0.004688691347837448,
-0.023223139345645905,
-0.09080317616462708,
0.17682680487632751,
0.024636752903461456,
0.12305090576410294,
-0.1265403926372528,
0.10609687864780426,
-0.010145329870283604,
0.014946194365620613,
0.07392288744449615,
-0.20129480957984924,
-0.004805627278983593,
-0.0471431165933609,
-0.09572996944189072,
-0.05821099132299423,
-0.015557871200144291,
0.0170791894197464,
0.043467629700899124,
-0.005354706663638353,
0.06044595688581467,
0.142664834856987,
-0.018936002627015114,
-0.08370210230350494,
0.17457811534404755,
-0.21770337224006653,
-0.20481646060943604,
-0.23560066521167755,
0.0029743367340415716,
-0.11851438134908676,
-0.037289947271347046,
-0.05196189507842064,
-0.07753796130418777,
0.03229401260614395,
-0.08353889733552933,
-0.040616557002067566,
-0.019290516152977943,
0.003061749739572406,
0.05178794264793396,
0.011813749559223652,
0.1695105880498886,
-0.0735611766576767,
0.023279182612895966,
0.046429067850112915,
-0.02350066788494587,
0.10787586867809296,
-0.08872182667255402,
-0.03777230158448219,
0.12821447849273682,
-0.009709981270134449,
0.02801087126135826,
0.01423876266926527,
0.3048967123031616,
0.010030297562479973,
0.02763240598142147,
0.09710080921649933,
0.009545043110847473,
0.06174734979867935,
0.07832355052232742,
0.015707092359662056,
-0.10942601412534714,
0.06433121860027313,
0.04781096801161766,
-0.09661031514406204,
-0.13850480318069458,
-0.047762222588062286,
-0.07341255247592926,
0.006593871861696243,
0.08610724657773972,
0.07147743552923203,
0.07943940162658691,
0.07025924324989319,
0.023519491776823997,
0.11204782873392105,
-0.004567211028188467,
-0.005818400532007217,
0.1201663389801979,
-0.020266585052013397,
0.04491112753748894,
-0.030482962727546692,
0.03723101690411568,
0.07022300362586975,
0.14435730874538422,
0.08234275877475739,
-0.0826135203242302,
-0.0026839443016797304,
0.04897516965866089,
0.2717514932155609,
-0.01714148186147213,
0.10665575414896011,
-0.06918495893478394,
-0.013546466827392578,
0.0014700466999784112,
-0.03347883000969887,
-0.07884032279253006,
0.04058695584535599,
-0.02208472229540348,
0.07936602830886841,
-0.017719898372888565,
-0.027150988578796387,
0.07915937900543213,
0.12511037290096283,
0.15299907326698303,
-0.28453946113586426,
-0.11869988590478897,
-0.013159751892089844,
0.11041011661291122,
-0.10107292234897614,
0.0223299078643322,
0.21577490866184235,
0.008470140397548676,
-0.08149175345897675,
-0.03681793436408043,
0.03811902552843094,
-0.01879933662712574,
0.016045289114117622,
0.12573954463005066,
0.14544515311717987,
0.005644232500344515,
0.07689831405878067,
-0.2924128472805023,
0.023194577544927597,
0.05471178889274597,
0.046213310211896896,
-0.04598768800497055,
0.008340165950357914,
-0.04420474171638489,
-0.06625235080718994,
0.029148653149604797,
0.01076830830425024,
0.1823798418045044,
-0.28289252519607544,
-0.06890449672937393,
-0.02097264863550663,
0.1087518259882927,
0.08112131059169769,
0.04162963479757309,
0.01758452132344246,
0.056389037519693375,
0.06615620851516724,
0.03817380219697952,
-0.0343405120074749,
-0.09718122333288193,
0.0010788479121401906,
0.15619537234306335,
-0.14491763710975647,
-0.05582432448863983,
-0.06452815234661102,
-0.024293042719364166,
0.05778651684522629,
-0.1548195481300354,
-0.05276089906692505,
-0.06254495680332184,
0.015085814520716667,
0.1334981471300125,
-0.023490145802497864,
-0.01085786335170269,
-0.0201820470392704,
0.024849023669958115,
-0.04709408059716225,
-0.09744754433631897,
0.11661811918020248,
-0.0471058264374733,
-0.1306038200855255,
-0.019828403368592262,
0.1378549486398697,
0.08238520473241806,
-0.015439215116202831,
-0.0857001468539238,
-0.04882924631237984,
0.027058644220232964,
-0.15197721123695374,
0.005223190877586603,
0.09445921331644058,
-0.0594605952501297,
0.09092224389314651,
-0.11116138100624084,
0.22619131207466125,
-0.04533985257148743,
0.09459085017442703,
0.061151355504989624,
0.2978513538837433,
-0.08713971078395844,
0.018928516656160355,
0.04804879426956177,
-0.018178772181272507,
-0.2438703179359436,
0.04353063553571701,
0.05640896409749985,
0.040471117943525314,
-0.03873369097709656,
-0.1723128706216812,
0.035184476524591446,
0.09765327721834183,
0.009074017405509949,
0.20277278125286102,
-0.32115042209625244,
-0.05401403456926346,
0.04298613592982292,
0.06709850579500198,
0.15412142872810364,
-0.05032355710864067,
0.005308091174811125,
0.00299404701218009,
-0.025177884846925735,
0.14879994094371796,
-0.10914967209100723,
0.11319706588983536,
-0.028690336272120476,
0.024565158411860466,
0.009507894515991211,
-0.03613625466823578,
0.14931310713291168,
0.013264713808894157,
0.089763343334198,
0.019067833200097084,
-0.04837853088974953,
0.05236007273197174,
-0.08060209453105927,
0.048760365694761276,
-0.055018678307533264,
0.08426039665937424,
-0.0716785341501236,
0.014209864661097527,
-0.0624924935400486,
-0.02230324223637581,
-0.07208869606256485,
-0.03789118677377701,
-0.098033107817173,
0.07829388231039047,
-0.014118213206529617,
-0.0252530500292778,
-0.030577998608350754,
0.06026613339781761,
0.06182622164487839,
0.4425763189792633,
-0.06215111166238785,
-0.059386324137449265,
0.0778389647603035,
0.09745035320520401,
-0.018147382885217667,
0.10684601962566376,
-0.12679900228977203,
0.04500621557235718,
0.12372613698244095,
-0.0008733426802791655,
0.12444817274808884,
0.0908789336681366,
-0.11470408737659454,
-0.01735471934080124,
0.03941637650132179,
-0.13241484761238098,
-0.072132408618927,
-0.02437761053442955,
-0.0033349243458360434,
-0.10859358310699463,
0.02123371511697769,
0.10324949026107788,
-0.028791457414627075,
0.054798197001218796,
0.03556765988469124,
0.044085126370191574,
-0.13439145684242249,
0.13734261691570282,
0.03844649717211723,
0.0715968981385231,
-0.09430023282766342,
0.08732116222381592,
0.023660900071263313,
0.012141572311520576,
0.04477553442120552,
-0.0023568833712488413,
-0.10149598866701126,
0.010059066116809845,
-0.038228947669267654,
-0.09548848867416382,
0.09702998399734497,
-0.04586396366357803,
-0.0412009172141552,
-0.11459630727767944,
0.016487689688801765,
0.07441749423742294,
0.06058561056852341,
0.10161381214857101,
-0.0266474150121212,
0.01351380068808794,
-0.1312156319618225,
0.0554811991751194,
-0.034641385078430176,
0.028874071314930916,
-0.15108218789100647,
0.060451265424489975,
-0.01823197305202484,
0.07453461736440659,
-0.019939351826906204,
-0.013813626021146774,
-0.22570104897022247,
0.021174952387809753,
-0.033057328313589096,
0.003854290582239628,
0.029487796127796173,
0.03671148419380188,
0.02527959644794464,
0.05630463361740112,
-0.031085221096873283,
0.04344148188829422,
-0.03895607963204384,
-0.051280442625284195,
0.04188956692814827,
0.0007596524665132165,
-0.035418447107076645,
-0.046238139271736145,
0.04512936249375343,
-0.11377303302288055,
0.037379465997219086,
0.037038300186395645,
-0.05313664302229881,
0.07457074522972107,
0.0551333986222744,
0.020679296925663948,
0.09771078824996948,
0.05872461944818497,
0.05539431795477867,
-0.05644817277789116,
0.038207992911338806,
-0.013830473646521568,
-0.0014212807873263955,
0.05458271503448486,
0.13561320304870605,
-0.05696135014295578,
-0.05802462622523308,
-0.1356644332408905,
-0.022950179874897003,
-0.04388374090194702,
0.04029588773846626,
0.15611368417739868,
0.10005352646112442,
0.08508609980344772,
-0.08406435698270798,
-0.023177621886134148,
-0.1434895098209381,
-0.0838560089468956,
0.057367514818906784,
-0.04613112285733223,
-0.013442962430417538,
-0.03551625832915306,
0.07219497859477997,
0.007350239437073469,
0.15389199554920197,
-0.09119492769241333,
-0.09884331375360489,
-0.050555817782878876,
-0.17701447010040283,
-0.1346471607685089,
-0.005777652841061354,
0.2509252429008484,
0.031908661127090454,
-0.030132247135043144,
-0.06317965686321259,
0.00790130253881216,
0.0689205676317215,
0.13760150969028473,
0.04632747173309326,
0.07801228016614914,
-0.12434402108192444,
0.10745592415332794,
0.04290035367012024,
-0.05061715841293335,
0.12014038860797882,
0.30707380175590515,
-0.08199357986450195,
0.011254170909523964,
-0.08713367581367493,
0.08879105001688004,
0.02187724970281124,
-0.1383187174797058,
0.013609658926725388,
-0.03021937422454357,
-0.15772560238838196,
-0.1141202449798584,
0.03438706323504448,
-0.06414132565259933,
-0.20812147855758667,
-0.02861359901726246,
-0.10091375559568405,
-0.07548198103904724,
0.1385963410139084,
0.036400459706783295,
-0.01720549911260605,
0.1921585202217102,
-0.07791507244110107,
0.047225624322891235,
-0.010112340562045574,
-0.01755616068840027,
-0.004292663186788559,
-0.03010612167418003,
-0.10022464394569397,
0.14229239523410797,
0.0026088182348757982,
0.11017946153879166,
0.0076710316352546215,
0.08859197050333023,
0.05187401920557022,
-0.024778150022029877,
-0.040657348930835724,
-0.0020166346803307533,
0.015454727225005627,
-0.06949343532323837,
0.1287740021944046,
0.06015017256140709,
-0.08151866495609283,
-0.08153397589921951,
-0.020808378234505653,
-0.08484478294849396,
-0.021831484511494637,
-0.1572553515434265,
0.28017404675483704,
-0.03308238461613655,
0.10164105892181396,
-0.009170068427920341,
-0.05692457780241966,
-0.10435646772384644,
0.14374536275863647,
0.120839424431324,
-0.13673727214336395,
-0.012003384530544281,
0.08866086602210999,
-0.007145259529352188,
-0.07515248656272888,
0.155784472823143,
0.09963296353816986,
-0.014980965293943882,
0.02876802533864975,
-0.023070234805345535,
-0.03320426493883133,
0.0033824562560766935,
-0.005501914769411087,
-0.034212253987789154,
0.005797996185719967,
0.03244750574231148,
-0.13621188700199127,
-0.017982242628932,
-0.06249391660094261,
-0.06658949702978134,
0.1939048320055008,
-0.1203932911157608,
-0.08973969519138336,
-0.0308162122964859,
-0.09773550927639008,
-0.10429038852453232,
0.02302357368171215,
-0.11399734765291214,
0.059150729328393936,
0.08660008758306503,
-0.05918910726904869,
-0.015915945172309875,
-0.07759834825992584,
-0.004432562738656998,
0.01995953358709812,
0.06680567562580109,
-0.012953863479197025,
0.07061396539211273,
0.10941770672798157,
-0.02109832689166069,
-0.05627770721912384,
0.09945710748434067,
0.005617896094918251,
-0.058368053287267685,
-0.14522160589694977,
0.017044585198163986,
-0.029553061351180077,
0.10308431833982468,
0.024984009563922882,
-0.06643787026405334,
-0.03581478074193001,
-0.17625966668128967,
-0.011637690477073193,
-0.1381247639656067,
-0.08038759231567383,
-0.07788003236055374,
0.10739508271217346,
0.19482040405273438,
-0.05468401312828064,
0.025051061064004898,
-0.03843652456998825,
0.03321219980716705,
-0.04963286593556404,
0.04273268207907677,
-0.005887014791369438,
-0.14593636989593506,
0.05899544060230255,
-0.05112109333276749,
0.004382130224257708,
-0.28370237350463867,
0.001215865253470838,
0.005580263677984476,
-0.033779751509428024,
-0.05040258169174194,
0.1377897560596466,
0.00042426568688824773,
0.07757376879453659,
-0.062357865273952484,
-0.24849353730678558,
-0.0690150111913681,
0.14145798981189728,
0.0006318931700661778,
-0.07864443957805634
] |
null | null | stable-baselines3 |
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
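
Until the author fills in the section above, here is a hedged sketch of loading and evaluating the agent; the checkpoint filename `ppo-LunarLander-v2.zip` is an assumption, not confirmed by this card:

```python
# Minimal sketch: download the checkpoint from the Hub and evaluate it.
# Assumptions: the file is named "ppo-LunarLander-v2.zip" and your gymnasium
# version still registers "LunarLander-v2" (newer releases use LunarLander-v3).
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

checkpoint = load_from_hub(
    repo_id='gabrielbenabou/ppo-LunarLander-v2',
    filename='ppo-LunarLander-v2.zip',
)
model = PPO.load(checkpoint)

env = gym.make('LunarLander-v2')
mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10, deterministic=True)
print(f'mean_reward={mean_reward:.2f} +/- {std_reward:.2f}')
```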
| {"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "256.11 +/- 20.77", "name": "mean_reward", "verified": false}]}]}]} | reinforcement-learning | gabrielbenabou/ppo-LunarLander-v2 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | 2024-02-15T07:52:47+00:00 | [] | [] | TAGS
#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
|
# PPO Agent playing LunarLander-v2
This is a trained model of a PPO agent playing LunarLander-v2
using the stable-baselines3 library.
## Usage (with Stable-baselines3)
TODO: Add your code
| [
"# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.",
"## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
"TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n",
"# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.",
"## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
39,
41,
17
] | [
"passage: TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code"
] | [
0.03942384943366051,
0.04900386184453964,
-0.005304091144353151,
0.026427261531352997,
0.107408307492733,
-0.026511888951063156,
0.11188238859176636,
0.0814051404595375,
0.10722193866968155,
0.04762078449130058,
0.08338645845651627,
0.06030960753560066,
0.05080918222665787,
0.2571701407432556,
0.04754156619310379,
-0.22987541556358337,
0.036159250885248184,
-0.04869936779141426,
0.12395193427801132,
0.07178173214197159,
-0.0038484656251966953,
-0.06485428661108017,
0.020415637642145157,
-0.013290755450725555,
0.05367108806967735,
0.04282612353563309,
-0.01716216839849949,
-0.08207534998655319,
0.07169748842716217,
-0.06345846503973007,
0.06986866891384125,
0.07677983492612839,
0.13218913972377777,
-0.17832116782665253,
0.029566360637545586,
0.02571309357881546,
-0.07189024239778519,
0.01342033501714468,
0.008019951172173023,
0.05120139941573143,
0.17303818464279175,
0.019879888743162155,
0.07844575494527817,
-0.0025605305563658476,
-0.15412317216396332,
-0.018950799480080605,
0.0436202734708786,
0.12546207010746002,
0.08808347582817078,
0.04605821147561073,
0.01970590092241764,
0.17503218352794647,
-0.054352790117263794,
-0.028833400458097458,
0.21759237349033356,
-0.2881564497947693,
-0.031460098922252655,
0.321048766374588,
0.06997483223676682,
0.09725230932235718,
-0.07540661096572876,
-0.03619609400629997,
0.007783263456076384,
-0.013137873262166977,
-0.028666524216532707,
-0.07447073608636856,
0.17313385009765625,
0.05152064561843872,
-0.05057951435446739,
-0.09541505575180054,
0.16948209702968597,
0.006921638268977404,
0.0018855923553928733,
-0.019282981753349304,
0.009060598909854889,
0.07402525842189789,
-0.016097044572234154,
-0.07255112379789352,
0.057438433170318604,
0.05330665782094002,
0.019649166613817215,
-0.1435653269290924,
-0.10762494057416916,
-0.022740179672837257,
-0.008012006990611553,
0.17786912620067596,
-0.009255532175302505,
0.042902372777462006,
0.003065188182517886,
0.10384012013673782,
-0.12480384111404419,
-0.03354184702038765,
-0.0454259067773819,
-0.07565800100564957,
-0.0223417766392231,
-0.02058211714029312,
-0.03580251708626747,
0.07184842973947525,
0.11971849203109741,
0.027368178591132164,
0.09350208193063736,
0.047715865075588226,
-0.03206788748502731,
0.06343851238489151,
0.05555703118443489,
0.14222665131092072,
0.05807621404528618,
0.012854371219873428,
0.13179877400398254,
0.055213116109371185,
0.033023182302713394,
-0.0613492950797081,
-0.18252409994602203,
0.07489913702011108,
-0.07031869143247604,
0.007941240444779396,
0.12051256000995636,
-0.04480670019984245,
-0.1183447614312172,
-0.037500523030757904,
-0.017392054200172424,
-0.06224250793457031,
-0.025395862758159637,
0.0547584593296051,
-0.02883218228816986,
-0.03973718360066414,
0.0011496668448671699,
0.09384800493717194,
0.00953749567270279,
-0.1752052903175354,
0.03303423151373863,
-0.025042934343218803,
-0.10782608389854431,
0.009975161403417587,
0.0022444494534283876,
0.03394931182265282,
0.04408763721585274,
-0.11822668462991714,
-0.30899152159690857,
-0.07652641832828522,
0.05490870401263237,
-0.06516939401626587,
-0.18425025045871735,
-0.13193942606449127,
0.02454492449760437,
-0.09037084132432938,
-0.044885024428367615,
-0.12759265303611755,
-0.028549788519740105,
0.01743689924478531,
0.011519349180161953,
0.10758619755506516,
-0.0106219332665205,
-0.012188062071800232,
-0.1571401208639145,
0.008273907005786896,
-0.20951123535633087,
0.0890483483672142,
-0.019150104373693466,
0.037884220480918884,
-0.032381169497966766,
-0.07404014468193054,
0.030707746744155884,
0.052499737590551376,
-0.01474119070917368,
0.13510210812091827,
-0.15592676401138306,
-0.03691192343831062,
-0.007996266707777977,
-0.13611900806427002,
-0.04786273464560509,
-0.10358831286430359,
-0.04357128217816353,
0.13354332745075226,
0.018664736300706863,
0.15356586873531342,
-0.08709818124771118,
-0.0722038671374321,
0.20489206910133362,
-0.010411538183689117,
-0.12820468842983246,
-0.076752208173275,
0.10165707021951675,
0.021510310471057892,
-0.056606587022542953,
-0.02523270808160305,
-0.1839766949415207,
-0.0152357779443264,
-0.04550420492887497,
-0.047039128839969635,
0.01796751655638218,
-0.010888241231441498,
0.13837894797325134,
0.08494598418474197,
0.05018039792776108,
-0.06086122244596481,
-0.006730288732796907,
0.10779471695423126,
0.08823856711387634,
0.008680110797286034,
0.023406028747558594,
-0.05774238705635071,
0.09552932530641556,
-0.04003755748271942,
-0.0142367510125041,
-0.08283266425132751,
-0.036246106028556824,
-0.026256313547492027,
0.17507147789001465,
0.09440762549638748,
0.2257927656173706,
0.09567736834287643,
0.039160262793302536,
0.031270865350961685,
-0.13181598484516144,
-0.1425403207540512,
-0.0017254541162401438,
0.09020978957414627,
-0.14270411431789398,
-0.04119925573468208,
-0.08974775671958923,
-0.17768175899982452,
-0.12202505767345428,
0.0006432619411498308,
-0.17960017919540405,
0.06390921026468277,
0.05408334732055664,
-0.035177867859601974,
0.03272094577550888,
0.13032332062721252,
-0.011533179320394993,
-0.03967514634132385,
0.0831870287656784,
0.0379033200442791,
-0.041234664618968964,
-0.021742934361100197,
0.11885567009449005,
0.15673065185546875,
0.13124459981918335,
-0.03511447086930275,
0.004914294462651014,
0.07076404243707657,
-0.02309088408946991,
0.06539414077997208,
0.0558244064450264,
0.20973342657089233,
0.188301220536232,
0.038996949791908264,
0.008822928182780743,
-0.07048165798187256,
0.0855446457862854,
-0.0742373839020729,
-0.14302679896354675,
-0.05579735338687897,
0.08729292452335358,
0.016605578362941742,
0.023469142615795135,
0.08711627870798111,
0.024545932188630104,
0.09132762253284454,
0.15968108177185059,
0.01990218088030815,
-0.09659269452095032,
-0.050218869000673294,
0.01175848301500082,
0.027713103219866753,
0.04794301092624664,
-0.04514073207974434,
-0.00937939714640379,
0.017020760104060173,
-0.10303554683923721,
0.031789086759090424,
-0.1413339376449585,
-0.1358717679977417,
0.044326696544885635,
0.003906996920704842,
0.010907664895057678,
0.02786896750330925,
-0.0038291432429105043,
0.019039705395698547,
0.04351753741502762,
-0.06975466758012772,
0.047416772693395615,
-0.024745507165789604,
-0.020031947642564774,
0.03340689837932587,
-0.057257164269685745,
-0.205775648355484,
-0.17696654796600342,
0.00013708483311347663,
-0.09910997003316879,
0.10194740444421768,
0.018308809027075768,
-0.12373185902833939,
0.047737859189510345,
-0.05822649225592613,
0.027574289590120316,
-0.01875593699514866,
-0.049130141735076904,
0.10507171601057053,
0.1525275856256485,
-0.016146350651979446,
0.018018173053860664,
-0.04865182936191559,
-0.10157987475395203,
-0.19632206857204437,
0.0691583976149559,
0.04680244252085686,
0.014610917307436466,
0.10669491440057755,
0.018072687089443207,
0.02367905154824257,
-0.007674071006476879,
-0.016521066427230835,
-0.011659215204417706,
-0.08781040459871292,
0.31909599900245667,
0.04510033503174782,
-0.025173069909214973,
0.02041010931134224,
-0.0043001663871109486,
-0.028083480894565582,
0.03263787180185318,
-0.0985708013176918,
-0.07548979669809341,
-0.08774089068174362,
-0.04367410019040108,
-0.09784720093011856,
0.053299110382795334,
0.05916472524404526,
0.003188040340319276,
-0.07727594673633575,
0.04221395403146744,
0.11369874328374863,
-0.0923808291554451,
-0.07137343287467957,
0.07477962225675583,
0.0972946360707283,
-0.07331304252147675,
0.00012658814375754446,
0.00874367356300354,
0.023951783776283264,
0.037102166563272476,
0.06778035312891006,
-0.03966575115919113,
0.08589404821395874,
-0.19917890429496765,
0.0372927263379097,
0.106058269739151,
0.023754918947815895,
0.0638108178973198,
0.07643651217222214,
-0.1058402881026268,
-0.008500572293996811,
-0.032518330961465836,
-0.21341575682163239,
0.1668180525302887,
0.1355515867471695,
0.06788124144077301,
-0.025637222453951836,
-0.00461410591378808,
-0.0649740919470787,
0.05773647129535675,
0.02723747305572033,
-0.14758841693401337,
0.004883295856416225,
0.06064270809292793,
0.026899009943008423,
0.01614922471344471,
0.07971042394638062,
0.014697225764393806,
-0.1801026314496994,
-0.014406266622245312,
0.10730406641960144,
0.002390873385593295,
0.0053148469887673855,
-0.03175045922398567,
-0.1755964607000351,
0.0751047357916832,
0.004285442177206278,
0.07233936339616776,
-0.1676585078239441,
0.14297930896282196,
-0.10089799761772156,
0.07726949453353882,
-0.004285062663257122,
-0.021311495453119278,
0.02507244050502777,
-0.0541163794696331,
0.15163759887218475,
0.01058570109307766,
-0.021810131147503853,
-0.1200498715043068,
-0.1717042326927185,
-0.019227758049964905,
-0.11788936704397202,
-0.11679866164922714,
0.050424277782440186,
0.062185097485780716,
0.04923136904835701,
-0.061147067695856094,
0.1518532931804657,
-0.047422297298908234,
0.060713399201631546,
-0.06893875449895859,
-0.06755045056343079,
0.03764858841896057,
-0.12588608264923096,
-0.08176055550575256,
0.05573027580976486,
0.19166934490203857,
0.15833087265491486,
-0.02816431224346161,
-0.03472423925995827,
-0.047419581562280655,
-0.006212298292666674,
-0.007802055217325687,
0.0275666993111372,
0.023223137483000755,
0.07315318286418915,
-0.07681374251842499,
-0.11649256944656372,
0.033787861466407776,
-0.06713802367448807,
-0.055589709430933,
-0.015439179725944996,
0.1513158082962036,
0.04671623185276985,
0.07720734924077988,
-0.018946662545204163,
0.03887668624520302,
-0.001724981120787561,
-0.056474871933460236,
0.16197094321250916,
0.03885216265916824,
-0.05193585529923439,
0.06837689876556396,
0.053174007683992386,
0.043745119124650955,
0.03011113777756691,
-0.026783017441630363,
0.206032395362854,
0.1980147808790207,
0.014206883497536182,
0.2175983190536499,
0.03177616000175476,
-0.03772832080721855,
-0.1300560086965561,
-0.065880686044693,
-0.006372632458806038,
0.03559038043022156,
0.08070417493581772,
-0.18207235634326935,
-0.015011128038167953,
-0.05689644813537598,
-0.034518610686063766,
-0.15059494972229004,
-0.28553900122642517,
-0.05957856774330139,
0.20075850188732147,
0.14706264436244965,
0.27519428730010986,
-0.10432573407888412,
0.035197313874959946,
0.02663275972008705,
-0.04912831634283066,
-0.006501141935586929,
0.00018665487004909664,
0.10268618166446686,
-0.15421873331069946,
0.1176437959074974,
0.08486983180046082,
-0.019002694636583328,
0.01058861706405878,
-0.1619086116552353,
0.00936629343777895,
-0.12191236019134521,
0.05354422330856323,
0.1400289237499237,
-0.048128653317689896,
-0.054873593151569366,
0.14033560454845428,
-0.024562934413552284,
-0.22685599327087402,
-0.04648222774267197,
-0.043600670993328094,
-0.010640020482242107,
0.026607351377606392,
-0.1013401448726654,
0.04101909324526787,
0.1330099105834961,
0.009380043484270573,
0.1147187277674675,
0.11749245226383209,
-0.052566803991794586,
0.10792597383260727,
0.2257719188928604,
-0.018785694614052773,
0.04689010605216026,
-0.12743118405342102,
-0.0012336712097749114,
-0.028270328417420387,
0.013657891191542149,
-0.09504974633455276,
-0.09938385337591171,
0.02366873063147068,
0.02872389927506447,
0.009118586778640747,
0.0921793207526207,
-0.029922157526016235,
0.0759170651435852,
0.06817561388015747,
-0.13014446198940277,
-0.16288450360298157,
0.015828335657715797,
-0.007344507612287998,
0.08354310691356659,
0.00027861111448146403,
0.08878035843372345,
-0.11932205408811569,
-0.018093237653374672,
-0.03153328225016594,
-0.03319635987281799,
-0.130486860871315,
-0.07138993591070175,
0.06156524643301964,
0.028095467016100883,
-0.06602972000837326,
0.1398407518863678,
0.026440169662237167,
0.15942534804344177,
0.049197953194379807,
0.012499804608523846,
0.07227300107479095,
-0.05345509201288223,
0.1283530443906784,
0.13818155229091644,
-0.00868943240493536,
-0.05460423603653908,
-0.1013643890619278,
-0.10236792266368866,
0.08925779908895493,
-0.05773641914129257,
0.07476430386304855,
-0.14885357022285461,
-0.06675903499126434,
0.015772046521306038,
0.016141414642333984,
-0.09562095999717712,
0.02571965754032135,
-0.01625603251159191,
-0.18119946122169495,
0.056570518761873245,
-0.048285093158483505,
0.0440407395362854,
-0.06347788125276566,
-0.1110161691904068,
-0.17226378619670868,
0.06091433763504028,
0.08593481779098511,
-0.053876690566539764,
-0.12229149043560028,
0.011023230850696564,
-0.00012518465518951416,
-0.06341652572154999,
-0.05023367330431938,
0.09722746908664703,
-0.11020902544260025,
0.031452205032110214,
-0.012567701749503613,
0.08853451162576675,
-0.03510405123233795,
-0.011538895778357983,
0.044220831245183945,
-0.08039166033267975,
-0.009481523185968399,
0.03534642979502678,
-0.026372017338871956,
-0.04127239063382149,
-0.2689029574394226,
0.0036654395516961813,
0.0341104120016098,
0.02497158572077751,
0.07856601476669312,
0.011906822212040424,
0.021174922585487366,
0.03993808850646019,
-0.15396519005298615,
-0.013395369984209538,
0.14574195444583893,
-0.07689505815505981,
-0.022186370566487312,
0.05703273415565491,
-0.09054436534643173,
0.013882770203053951,
-0.030287226662039757,
0.1345842480659485,
0.023923413828015327,
0.06404478847980499,
-0.0851147472858429,
0.10106813907623291,
-0.1451139897108078,
-0.04998219385743141,
-0.01244612317532301,
0.09761348366737366,
0.07019034773111343,
-0.10272270441055298,
0.014697125181555748,
0.04210108891129494,
0.19416837394237518,
0.016384804621338844,
-0.0356343574821949,
-0.03396720811724663,
0.004015897400677204,
0.22076453268527985,
0.03044266067445278,
0.10457023978233337,
0.07281364500522614,
-0.026583973318338394,
0.12624378502368927,
0.09929762035608292,
0.11280370503664017,
-0.055645186454057693,
0.13904185593128204,
0.04667386785149574,
0.038641396909952164,
0.0614289753139019,
0.06836545467376709,
0.09098632633686066,
-0.0008288522367365658,
0.1138714924454689,
0.013811973854899406,
-0.02422109805047512,
-0.021335409954190254,
0.17759373784065247,
0.10501719266176224,
-0.14769648015499115,
0.029047364369034767,
-0.01258957851678133,
0.039933037012815475,
-0.014194529503583908,
-0.15634691715240479,
-0.07240267097949982,
-0.3315149247646332,
0.1226184144616127,
-0.07119352370500565,
0.019930170848965645,
0.007913772016763687,
-0.037425633519887924,
-0.03296699747443199,
-0.04477746784687042,
0.13151589035987854,
-0.013641550205647945,
-0.006079165264964104,
-0.04815853759646416,
-0.015360191464424133,
-0.11607866734266281,
-0.11200575530529022,
-0.013207737356424332,
-0.13671602308750153,
-0.010119039565324783,
0.05595948174595833,
0.003977729007601738,
0.01821410097181797,
-0.03142618387937546,
0.0024383175186812878,
0.06541839241981506,
-0.05751744285225868,
0.056182678788900375,
0.12097269296646118,
0.08766137808561325,
-0.1058853268623352,
0.031048951670527458,
0.2011747509241104,
0.04359564557671547,
-0.12483977526426315,
0.01449228823184967,
0.1819491684436798,
0.004885740112513304,
0.017068125307559967,
-0.006097703706473112,
-0.0540788508951664,
-0.07554277032613754,
0.1251034289598465,
0.08296554535627365,
-0.09985227137804031,
0.015833314508199692,
-0.0726347416639328,
-0.01594804972410202,
-0.06374675035476685,
0.10130585730075836,
0.09538925439119339,
0.04440245032310486,
-0.10621760785579681,
-0.08487539738416672,
-0.10891728103160858,
0.040588874369859695,
-0.08629853278398514,
-0.07311757653951645,
0.09629398584365845,
-0.07057105004787445,
-0.07029950618743896,
0.025521177798509598,
-0.17978744208812714,
-0.009467960335314274,
0.1711762249469757,
-0.24654000997543335,
-0.0916430801153183,
-0.10857923328876495,
0.14477859437465668,
0.016497576609253883,
0.1013975441455841,
-0.006207061931490898,
-0.007889035157859325,
-0.20577777922153473,
0.024890204891562462,
-0.05293011665344238,
-0.02073732763528824,
0.07814782857894897,
-0.09476397186517715,
0.22629831731319427,
-0.08276885002851486,
0.020940175279974937,
0.012659613974392414,
0.0870661810040474,
-0.030675338581204414,
0.09283176809549332,
-0.03660329803824425,
-0.12576518952846527,
-0.03620953485369682,
0.03001813031733036,
0.013904244638979435,
0.10071761906147003,
0.09772487729787827,
-0.03414725139737129,
0.03389119729399681,
0.09747414290904999,
0.04172342270612717,
-0.023843804374337196,
0.0360250361263752,
-0.17077107727527618,
0.02182629331946373,
-0.018498148769140244,
-0.06935930997133255,
0.03687669709324837,
-0.06603235751390457,
0.1639697551727295,
0.04022442549467087,
0.0670473501086235,
-0.036152735352516174,
0.0073931049555540085,
-0.014454689808189869,
-0.013775371946394444,
-0.026180334389209747,
-0.17259705066680908,
-0.10422050207853317,
-0.1347656100988388,
-0.012701659463346004,
-0.034971047192811966,
0.04591470584273338,
0.023234914988279343,
-0.0003200018545612693,
-0.014577031135559082,
-0.12090865522623062,
0.04360328987240791,
0.11146783083677292,
-0.04631396010518074,
-0.026193076744675636
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
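For reference, a minimal sketch of how the settings listed above could be expressed with `transformers.BitsAndBytesConfig`; this block is illustrative only, is not taken from the original training code, and assumes `torch` and `transformers` are installed.

```python
import torch
from transformers import BitsAndBytesConfig

# Mirrors the quantization settings listed above:
# 4-bit NF4 quantization with double quantization and float16 compute.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)
```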
### Framework versions
- PEFT 0.5.0
| {"library_name": "peft"} | null | genies-models/llama-13b-biology_with_literary_style | [
"peft",
"region:us"
] | 2023-11-11T02:23:17+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
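As a hedged illustration (not part of the original training setup), a base model could be loaded with the quantization settings above and combined with this adapter via PEFT roughly as follows; the base checkpoint name is an assumption inferred from the repository id and is not stated in this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

# Quantization settings copied from the list above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)

# Assumption: a LLaMA-30B base checkpoint; the card does not name the base model.
base_id = "huggyllama/llama-30b"
adapter_id = "genies-models/llama-30b-alpaca_high_quality"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=bnb_config,
    device_map="auto",
)
# Attach the LoRA adapter weights stored in this repository.
model = PeftModel.from_pretrained(base_model, adapter_id)
```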
### Framework versions
- PEFT 0.5.0
| {"library_name": "peft"} | null | genies-models/llama-30b-alpaca_high_quality | [
"peft",
"region:us"
] | 2023-11-11T02:23:52+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
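A small usage sketch (not part of the original card) showing one way to inspect this adapter's PEFT configuration, for example to discover which base model it was trained against; the repository id is taken from this record's metadata.

```python
from peft import PeftConfig

# Load only the adapter configuration (no weights) to inspect how it was set up.
config = PeftConfig.from_pretrained("genies-models/llama-30b-mmlu")
print(config.base_model_name_or_path)  # base checkpoint the adapter expects
print(config.peft_type)                # adapter type, e.g. LORA
```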
| {"library_name": "peft"} | null | genies-models/llama-30b-mmlu | [
"peft",
"region:us"
] | 2023-11-11T02:24:56+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | null |
# Model Trained Using AutoTrain | {"tags": ["autotrain", "text-generation"], "widget": [{"text": "I love AutoTrain because "}]} | text-generation | Bossdude0594/alpaca-7b-adapter | [
"autotrain",
"text-generation",
"region:us"
] | 2023-11-11T02:25:12+00:00 | [] | [] | TAGS
#autotrain #text-generation #region-us
|
# Model Trained Using AutoTrain | [
"# Model Trained Using AutoTrain"
] | [
"TAGS\n#autotrain #text-generation #region-us \n",
"# Model Trained Using AutoTrain"
] | [
15,
9
] | [
"passage: TAGS\n#autotrain #text-generation #region-us \n# Model Trained Using AutoTrain"
] | [
-0.01293906383216381,
0.03725669905543327,
-0.0029229004867374897,
0.04177805408835411,
0.17027688026428223,
0.015007555484771729,
0.2653331458568573,
0.04748149961233139,
-0.006006841082125902,
-0.10281107574701309,
0.2095320075750351,
0.1025988906621933,
-0.0442507266998291,
0.24752722680568695,
0.011852225288748741,
-0.31867673993110657,
0.019054677337408066,
-0.05171716585755348,
0.09898287802934647,
0.10707787424325943,
0.08113043755292892,
-0.059793341904878616,
0.03148295730352402,
-0.001046067918650806,
-0.27159014344215393,
0.030821826308965683,
0.033174362033605576,
-0.0864570140838623,
0.15098509192466736,
0.013041899539530277,
0.12893958389759064,
0.007876494899392128,
0.14103874564170837,
-0.10635127872228622,
0.018643615767359734,
-0.000514600018505007,
-0.04781845584511757,
0.07588475197553635,
0.09669061750173569,
-0.033203285187482834,
0.06400887668132782,
0.23338325321674347,
0.07329291105270386,
0.03941706568002701,
-0.19229762256145477,
0.07335293292999268,
0.05415516346693039,
-0.030233997851610184,
0.10079680383205414,
0.10751935094594955,
-0.020671572536230087,
0.15518639981746674,
-0.17782945930957794,
0.10216539353132248,
-0.13095369935035706,
-0.19949568808078766,
-0.027501598000526428,
0.22150743007659912,
0.07350228726863861,
0.09395073354244232,
-0.12441693246364594,
0.0939231738448143,
0.079034723341465,
-0.006518159061670303,
0.02133999951183796,
-0.025327155366539955,
-0.0909232571721077,
0.05423077195882797,
-0.09457756578922272,
0.03377829119563103,
0.25671613216400146,
-0.053746048361063004,
0.042084090411663055,
-0.022848954424262047,
-0.06510523706674576,
-0.015292389318346977,
0.01641298644244671,
-0.10528882592916489,
-0.05759155750274658,
0.125787153840065,
-0.01867661066353321,
-0.10655493289232254,
-0.11178718507289886,
-0.07898897677659988,
-0.10404396057128906,
0.0663192942738533,
-0.01861686445772648,
0.024459410458803177,
-0.1731380671262741,
0.08890343457460403,
-0.022009460255503654,
-0.08055796474218369,
0.10779394209384918,
-0.12842579185962677,
-0.048661597073078156,
-0.11136249452829361,
0.011087142862379551,
-0.11313027888536453,
-0.022519493475556374,
0.13958771526813507,
0.19941619038581848,
0.013695078901946545,
-0.029571060091257095,
0.06461822241544724,
0.05257121101021767,
0.11987859755754471,
0.08565418422222137,
-0.03081565722823143,
-0.008178629912436008,
-0.009703394956886768,
-0.08616664260625839,
-0.08801154047250748,
-0.21374116837978363,
0.06037292256951332,
-0.004819850903004408,
0.050690341740846634,
-0.06288393586874008,
0.02527816779911518,
-0.05841813236474991,
0.02412058226764202,
-0.05149344727396965,
0.0002353830059291795,
0.0027136297430843115,
-0.05670495703816414,
-0.061558861285448074,
-0.05230182781815529,
-0.03687890246510506,
0.08733893930912018,
0.00314074638299644,
0.08817963302135468,
-0.09736913442611694,
-0.04310047626495361,
-0.11681097745895386,
-0.042354516685009,
-0.03186555579304695,
-0.015033015049993992,
0.03725206479430199,
-0.1824009120464325,
-0.3124321401119232,
-0.07138761132955551,
0.06266368925571442,
-0.05370497331023216,
-0.07300957292318344,
-0.13907840847969055,
0.014087887480854988,
0.049111124128103256,
-0.02271014265716076,
0.014782736077904701,
-0.0284622423350811,
0.041288506239652634,
-0.05698215961456299,
0.03630412369966507,
-0.09557504206895828,
0.054783303290605545,
-0.13269056379795074,
-0.06198246031999588,
-0.04107651486992836,
0.10843992233276367,
-0.006764533463865519,
0.1977291852235794,
0.011474667116999626,
0.09026386588811874,
-0.06841685622930527,
0.07604483515024185,
-0.0010385055793449283,
0.2253246307373047,
-0.1745975911617279,
-0.06751103699207306,
0.1229860931634903,
-0.02386711724102497,
-0.06523667275905609,
0.07088886946439743,
-0.08855247497558594,
0.3067644536495209,
0.1433866173028946,
0.2346360683441162,
0.044861894100904465,
0.005545933730900288,
0.20879106223583221,
0.060232002288103104,
-0.07236604392528534,
-0.06064650043845177,
0.003720212494954467,
-0.006930888630449772,
-0.27272462844848633,
0.018238352611660957,
0.12134528160095215,
0.09167246520519257,
-0.0889168307185173,
-0.09503742307424545,
0.055689554661512375,
-0.04852880910038948,
0.0996757224202156,
0.01494552195072174,
0.1978612095117569,
-0.05898145213723183,
-0.017759809270501137,
-0.020406991243362427,
0.0541611909866333,
0.10703583806753159,
-0.08137596398591995,
-0.042927615344524384,
-0.02581002004444599,
-0.010415749624371529,
0.05697587504982948,
-0.1281232386827469,
-0.08501369506120682,
-0.006732971873134375,
0.15443918108940125,
0.0828055813908577,
0.18754243850708008,
0.026379410177469254,
0.027562621980905533,
-0.000012058498214173596,
-0.009003709070384502,
0.09002263844013214,
0.007617585361003876,
-0.15755799412727356,
-0.10724975913763046,
0.13344277441501617,
-0.08263654261827469,
0.08709236234426498,
-0.23841261863708496,
0.018164698034524918,
-0.13573043048381805,
0.013856465928256512,
0.03252340480685234,
0.050731342285871506,
-0.08554819971323013,
0.05500224232673645,
-0.06697078049182892,
0.009819954633712769,
0.1091662123799324,
0.01963679865002632,
-0.0719209834933281,
0.10354035347700119,
-0.15836620330810547,
0.15572968125343323,
0.12334153801202774,
-0.24097204208374023,
-0.09216748178005219,
-0.06884575635194778,
0.014546036720275879,
-0.012937269173562527,
-0.0943981483578682,
-0.015839243307709694,
0.06874024122953415,
-0.038741398602724075,
0.18586979806423187,
0.024912016466259956,
-0.004734761081635952,
-0.0523286871612072,
-0.061183054000139236,
-0.0037715998478233814,
0.052159007638692856,
0.13435141742229462,
-0.14834411442279816,
0.14938656985759735,
0.1722893863916397,
-0.008139397017657757,
0.27878978848457336,
0.08185230940580368,
0.03649657964706421,
0.011559398844838142,
-0.0848054438829422,
-0.04838470742106438,
0.018687382340431213,
-0.05975544825196266,
-0.05435062199831009,
0.0033849042374640703,
0.03079804591834545,
0.05007792264223099,
-0.13230586051940918,
-0.09064553678035736,
-0.01991548202931881,
0.05040372163057327,
-0.0006589900003746152,
0.047270748764276505,
-0.108769990503788,
0.05354562774300575,
-0.004326726775616407,
-0.17079806327819824,
0.14840541779994965,
0.0004001693450845778,
-0.10705946385860443,
0.1568262279033661,
-0.09920505434274673,
-0.23585979640483856,
-0.2174919694662094,
-0.13818910717964172,
-0.03031771443784237,
0.11251886934041977,
0.04604007676243782,
-0.16559216380119324,
-0.03047688491642475,
0.03679342195391655,
0.011526725254952908,
-0.07616474479436874,
-0.03426254168152809,
-0.0995948314666748,
0.06942155957221985,
-0.0737684965133667,
-0.06859073787927628,
-0.03001687116920948,
-0.02743489481508732,
-0.016080135479569435,
0.10118252784013748,
-0.1509237289428711,
0.06263644248247147,
0.2052225023508072,
0.022909188643097878,
0.056237202137708664,
-0.012175374664366245,
0.21861015260219574,
-0.130641907453537,
-0.011590130627155304,
0.07829003036022186,
-0.02814415656030178,
0.04864482954144478,
0.21901331841945648,
0.0375080443918705,
-0.09810825437307358,
0.07666554301977158,
-0.021368375048041344,
-0.093475341796875,
-0.22816669940948486,
-0.10218030959367752,
-0.04269981384277344,
0.06693188101053238,
0.09883033484220505,
0.05073950067162514,
0.24723808467388153,
0.11637244373559952,
0.08319186419248581,
0.10162784159183502,
-0.01841421239078045,
0.045062657445669174,
0.06979672610759735,
-0.06470759212970734,
0.15418526530265808,
-0.054883867502212524,
-0.19334833323955536,
0.08438461273908615,
0.005106466356664896,
0.11039916425943375,
0.24881651997566223,
0.02134569175541401,
0.0007581055979244411,
0.009613982401788235,
0.1651090383529663,
0.12610283493995667,
0.1332865208387375,
-0.03741442412137985,
-0.03758866712450981,
0.0065889665856957436,
-0.038417570292949677,
0.13129308819770813,
0.040854036808013916,
-0.11585681140422821,
-0.04681240767240524,
0.029506448656320572,
0.04527059197425842,
0.04434054344892502,
0.04319705441594124,
-0.27700385451316833,
0.11362649500370026,
0.059947844594717026,
-0.0517922081053257,
-0.09917064011096954,
0.10584335029125214,
-0.012261533178389072,
-0.21442481875419617,
-0.0377059243619442,
0.03376665338873863,
0.13163575530052185,
-0.041181761771440506,
0.09046660363674164,
-0.08106541633605957,
-0.06314463168382645,
-0.05419131740927696,
0.15482094883918762,
-0.3592044711112976,
0.27610698342323303,
-0.011793626472353935,
0.012216465547680855,
-0.11431513726711273,
-0.034061502665281296,
0.10776800662279129,
0.146275594830513,
0.09114658087491989,
-0.016219809651374817,
-0.12012992799282074,
-0.1565876305103302,
-0.10329819470643997,
-0.025206366553902626,
0.08952774107456207,
-0.10099530220031738,
-0.04114372655749321,
-0.09404317289590836,
0.03146766498684883,
-0.008163012564182281,
-0.051707953214645386,
-0.12403089553117752,
-0.09183084964752197,
-0.00763789052143693,
0.033891141414642334,
0.10747329145669937,
0.033049099147319794,
-0.043880563229322433,
-0.052722811698913574,
0.09307583421468735,
0.1098189726471901,
0.05425255745649338,
-0.1349596232175827,
-0.0049662203527987,
-0.06356953829526901,
-0.05083570256829262,
0.012672817334532738,
-0.021541643887758255,
0.04287556931376457,
-0.068509042263031,
-0.0754215344786644,
0.13843883574008942,
-0.09135617315769196,
0.014038902707397938,
-0.14574716985225677,
0.004008024465292692,
0.007871964015066624,
0.035940177738666534,
0.04792628064751625,
0.023748476058244705,
-0.09466731548309326,
-0.05896507948637009,
0.08029244840145111,
-0.07738546282052994,
-0.10226188600063324,
-0.00795792881399393,
-0.1246110051870346,
-0.04703124985098839,
-0.04436146840453148,
-0.12013185024261475,
0.26555129885673523,
0.22019073367118835,
-0.07351399213075638,
0.1399703323841095,
0.27483996748924255,
-0.10672761499881744,
-0.3304038643836975,
-0.008174796588718891,
-0.0751870647072792,
0.041829340159893036,
0.05116770789027214,
-0.2501184344291687,
0.07373066991567612,
0.01931774616241455,
-0.06749390065670013,
0.01928837038576603,
-0.1619385927915573,
-0.11150901019573212,
0.2563025653362274,
-0.032256510108709335,
0.34482458233833313,
-0.10491728037595749,
-0.08199252188205719,
-0.17688752710819244,
0.1420847475528717,
0.06333642452955246,
-0.10076870024204254,
0.08731603622436523,
0.044790226966142654,
0.06041640788316727,
0.03505954146385193,
0.02619127742946148,
0.09243033826351166,
0.023104792460799217,
0.06028764694929123,
-0.14548036456108093,
-0.06861241161823273,
0.07415175437927246,
-0.02152976207435131,
0.04285365343093872,
0.004770014900714159,
0.01036781631410122,
-0.1311371624469757,
-0.043237634003162384,
0.03190946951508522,
0.02076754905283451,
0.016618814319372177,
-0.1255522072315216,
0.03769862279295921,
-0.0015433132648468018,
-0.04502609744668007,
-0.03010057471692562,
0.03119928203523159,
-0.030032051727175713,
0.1216764822602272,
0.04915028065443039,
0.1747765839099884,
-0.024058902636170387,
0.09313222020864487,
-0.05783005431294441,
-0.08690743893384933,
0.10968741029500961,
-0.09395157545804977,
0.01208765059709549,
0.08182299137115479,
-0.054524846374988556,
0.16917328536510468,
0.07846195995807648,
-0.006492843385785818,
-0.01273274701088667,
0.16898348927497864,
-0.20868368446826935,
0.06246405094861984,
-0.12780174612998962,
0.05735967680811882,
0.08908554166555405,
-0.014996221289038658,
0.0971173569560051,
-0.0018654189771041274,
-0.013319783844053745,
0.04281694442033768,
-0.0217205248773098,
-0.0301180649548769,
0.11034034937620163,
0.06795253604650497,
0.02104649320244789,
-0.07372015714645386,
0.07631858438253403,
0.12201186269521713,
0.04338076338171959,
0.019484056159853935,
0.1493598073720932,
-0.08335894346237183,
-0.10822149366140366,
0.03719323128461838,
0.33524519205093384,
-0.15924988687038422,
-0.04823897033929825,
-0.0014531596098095179,
-0.08845455944538116,
0.022181302309036255,
0.05350995436310768,
0.09714805334806442,
0.0181776974350214,
-0.0751412957906723,
0.007062788587063551,
-0.04362506791949272,
0.04452458396553993,
0.007975967600941658,
0.031849272549152374,
-0.14142456650733948,
0.010811523534357548,
-0.027980873361229897,
0.07390842586755753,
-0.11458798497915268,
-0.10281495004892349,
-0.19869081676006317,
0.08368309587240219,
-0.0840422585606575,
-0.08888869732618332,
0.02081291563808918,
-0.04236543923616409,
0.02168535813689232,
0.014763599261641502,
-0.029119359329342842,
-0.08930052816867828,
-0.14010952413082123,
0.01878425106406212,
-0.007063496857881546,
0.026824666187167168,
0.002559289336204529,
0.0021331559401005507,
0.07239853590726852,
0.005279803182929754,
0.09095291048288345,
0.030617978423833847,
0.007386866491287947,
0.06904944032430649,
-0.1188984289765358,
0.009353539906442165,
0.04623141512274742,
-0.0006355281220749021,
0.05370228737592697,
0.10190235823392868,
-0.036388181149959564,
0.02979891374707222,
0.09261162579059601,
0.05883503332734108,
-0.014444391243159771,
-0.09692656993865967,
0.028374042361974716,
0.025956150144338608,
-0.21469762921333313,
-0.05145007371902466,
0.0012844925513491035,
0.01936577446758747,
-0.010771836154162884,
0.18932001292705536,
-0.053291670978069305,
0.1085541620850563,
-0.005057001952081919,
0.05551614239811897,
-0.016137804836034775,
-0.12063393741846085,
-0.044050104916095734,
-0.1466984748840332,
-0.009641979821026325,
-0.03189089894294739,
0.26366034150123596,
0.19080765545368195,
0.018936004489660263,
0.01476984191685915,
0.11171606928110123,
0.030462171882390976,
0.00027894708910025656,
0.1402665376663208,
0.19079653918743134,
-0.013628169894218445,
-0.13379572331905365,
0.12089692056179047,
0.0489276722073555,
0.008849240839481354,
0.028302595019340515,
-0.08273196220397949,
-0.07437504827976227,
0.08511314541101456,
0.06640534102916718,
-0.014384792186319828,
-0.07679285109043121,
-0.11843180656433105,
-0.05481405928730965,
0.009661180898547173,
-0.05200893059372902,
0.022841986268758774,
0.1289137452840805,
-0.023937316611409187,
0.01512223482131958,
-0.0671723335981369,
-0.09543560445308685,
-0.23315975069999695,
-0.15202747285366058,
-0.09266229718923569,
-0.14394332468509674,
0.03905400261282921,
-0.006766880862414837,
0.02561134472489357,
0.12020087242126465,
0.04755061864852905,
-0.08453751355409622,
0.07035309821367264,
-0.09829221665859222,
-0.029156584292650223,
0.055313315242528915,
-0.09370438009500504,
0.005054789129644632,
-0.2314016968011856,
-0.04074358940124512,
-0.15464888513088226,
0.055974967777729034,
-0.057993773370981216,
-0.02728293277323246,
-0.072787344455719,
-0.01819402165710926,
-0.07745517045259476,
-0.05761045962572098,
-0.045956648886203766,
-0.0025656830985099077,
-0.05178070068359375,
0.04676082357764244,
0.0025813740212470293,
-0.013783591799438,
0.032283566892147064,
0.209452286362648,
-0.03682254999876022,
-0.09270044416189194,
-0.10376805812120438,
0.1735413521528244,
-0.02773534320294857,
0.15435026586055756,
-0.10811062902212143,
-0.021137207746505737,
-0.00898839719593525,
0.309699684381485,
0.345510333776474,
-0.17993248999118805,
-0.01790153980255127,
0.0020298457238823175,
-0.012029296718537807,
0.009389033541083336,
0.21125338971614838,
-0.015208663418889046,
0.09847243130207062,
-0.0740433782339096,
0.06586804986000061,
-0.025332609191536903,
-0.1117854118347168,
0.0011595891555771232,
0.11754649132490158,
0.09629140794277191,
0.01726341061294079,
-0.09629957377910614,
0.11928959935903549,
-0.2083807736635208,
0.2360607534646988,
-0.11397530883550644,
-0.02944292686879635,
-0.10973034799098969,
0.02313762716948986,
0.08837426453828812,
-0.0018891135696321726,
0.07677895575761795,
-0.05275817960500717,
-0.06475579738616943,
-0.09041081368923187,
-0.04335479810833931,
-0.1547277718782425,
-0.1329367607831955,
0.11427272856235504,
-0.02891690842807293,
0.184348002076149,
-0.046814508736133575,
0.041591960936784744,
0.04605058953166008,
-0.006786488927900791,
-0.023680636659264565,
0.10631660372018814,
-0.01271373312920332,
0.0037618502974510193,
0.07280057668685913,
0.07054270058870316,
-0.014710674993693829,
-0.013847368769347668,
0.04753207042813301,
-0.13402554392814636,
0.08614563941955566,
-0.07798656076192856,
-0.12487833946943283,
-0.011625655926764011,
0.06969776004552841,
-0.0449117086827755,
0.1439979523420334,
0.1147218570113182,
-0.001020935014821589,
0.03133443370461464,
-0.021364429965615273,
0.03785387799143791,
-0.009269197471439838,
-0.1249181255698204,
-0.0847790390253067,
-0.13022203743457794,
-0.09746360778808594,
0.1338910013437271,
-0.019608965143561363,
-0.2605046033859253,
-0.039238858968019485,
-0.16151221096515656,
0.022271867841482162,
-0.11716161668300629,
0.10408297926187515,
0.1895187944173813,
0.03489043936133385,
0.0008216553251259029,
-0.11903247982263565,
0.00445727352052927,
0.030190253630280495,
-0.06735273450613022,
-0.12968982756137848
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training (a code sketch reconstructing it follows the list):
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
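
For reference, this config can be reconstructed with the `BitsAndBytesConfig` class from `transformers`. The sketch below is illustrative only: the field values mirror the list above, and the commented-out loading line uses a placeholder model id that is not part of this card.

```python
# Minimal sketch of the 4-bit quantization config listed above (illustrative).
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,                      # load_in_8bit: False
    load_in_4bit=True,                       # load_in_4bit: True
    llm_int8_threshold=6.0,                  # llm_int8_threshold: 6.0
    llm_int8_skip_modules=None,              # llm_int8_skip_modules: None
    llm_int8_enable_fp32_cpu_offload=False,  # llm_int8_enable_fp32_cpu_offload: False
    llm_int8_has_fp16_weight=False,          # llm_int8_has_fp16_weight: False
    bnb_4bit_quant_type="nf4",               # bnb_4bit_quant_type: nf4
    bnb_4bit_use_double_quant=True,          # bnb_4bit_use_double_quant: True
    bnb_4bit_compute_dtype=torch.float16,    # bnb_4bit_compute_dtype: float16
)

# Example use (the base model id is a placeholder, not part of this card):
# model = AutoModelForCausalLM.from_pretrained("<base-model-id>", quantization_config=bnb_config)
```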
### Framework versions
- PEFT 0.5.0
| {"library_name": "peft"} | null | genies-models/llama-30b-sycophancy_answer | [
"peft",
"region:us"
] | 2023-11-11T02:26:00+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | transformers | # airoboros-2.2.1-limarpv3-y34b
This is a Llama-fied [Yi 34B](https://huggingface.co/01-ai/Yi-34B)-based model consisting of a merge between [Doctor-Shotgun/airoboros-2.2.1-y34b](https://huggingface.co/Doctor-Shotgun/airoboros-2.2.1-y34b) and a PEFT adapter trained using the LimaRP dataset (https://huggingface.co/Doctor-Shotgun/limarpv3-yi-llama-34b-lora) at 0.5 weight.
## EXL2 Quants:
https://huggingface.co/Doctor-Shotgun/airoboros-2.2.1-limarpv3-y34b-exl2
## Usage:
The intended prompt format is the Alpaca instruction format of LimaRP v3:
```
### Instruction:
Character's Persona: {bot character description}
User's Persona: {user character description}
Scenario: {what happens in the story}
Play the role of Character. You must engage in a roleplaying chat with User below this line. Do not write dialogues and narration for User.
### Input:
User: {utterance}
### Response:
Character: {utterance}
### Input:
User: {utterance}
### Response:
Character: {utterance}
(etc.)
```
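
As a rough illustration, the prompt above can be assembled programmatically. The helper below is a sketch under the assumption that the format is exactly as shown; the function name and placeholder arguments are hypothetical and not part of the model.

```python
# Hypothetical helper that assembles the LimaRP v3 / Alpaca-style prompt shown above.
def build_prompt(char_persona, user_persona, scenario, history, user_message):
    """history is a list of (user_utterance, character_utterance) pairs."""
    parts = [
        "### Instruction:",
        f"Character's Persona: {char_persona}",
        f"User's Persona: {user_persona}",
        f"Scenario: {scenario}",
        "Play the role of Character. You must engage in a roleplaying chat with User "
        "below this line. Do not write dialogues and narration for User.",
        "",
    ]
    for user_turn, char_turn in history:
        parts += ["### Input:", f"User: {user_turn}", "",
                  "### Response:", f"Character: {char_turn}", ""]
    # Leave the final response open so the model continues as Character.
    parts += ["### Input:", f"User: {user_message}", "", "### Response:", "Character:"]
    return "\n".join(parts)
```

The returned string is then tokenized and passed to the model for generation as usual.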
## Message length control
Due to the inclusion of LimaRP v3, it is possible to append a length modifier to the response instruction sequence, like this:
```
### Input:
User: {utterance}
### Response: (length = medium)
Character: {utterance}
```
This has an immediately noticeable effect on bot responses. The available lengths are: `micro, tiny, short, medium, long, massive, huge, enormous, humongous, unlimited`. The recommended starting length is `medium`. Keep in mind that the AI may ramble or impersonate the user with very long messages.
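
In code, the same idea amounts to parameterizing the response header; the snippet below is a sketch building on the hypothetical helper above and is not part of the model itself.

```python
# Illustrative only: build the response header with an optional length modifier.
def response_header(length=None):
    return f"### Response: (length = {length})" if length else "### Response:"

print(response_header("medium"))  # -> "### Response: (length = medium)"
```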
## Bias, Risks, and Limitations
The model will show biases similar to those observed in niche roleplaying forums on the Internet, besides those exhibited by the base model. It is not intended for supplying factual information or advice in any form.
## Training Details
This model is a merge. Please refer to the linked repositories of the merged models for details. | {"language": ["en"], "license": "other", "library_name": "transformers", "tags": ["Yi", "llama", "llama 2"], "datasets": ["jondurbin/airoboros-2.2.1"], "inference": false, "pipeline_tag": "text-generation", "license_name": "yi-license", "license_link": "LICENSE"} | text-generation | Doctor-Shotgun/airoboros-2.2.1-limarpv3-y34b | [
"transformers",
"safetensors",
"llama",
"text-generation",
"Yi",
"llama 2",
"en",
"dataset:jondurbin/airoboros-2.2.1",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] | 2023-11-11T02:26:04+00:00 | [] | [
"en"
] | TAGS
#transformers #safetensors #llama #text-generation #Yi #llama 2 #en #dataset-jondurbin/airoboros-2.2.1 #license-other #autotrain_compatible #text-generation-inference #region-us
| # airoboros-2.2.1-limarpv3-y34b
This is a Llama-fied Yi 34B-based model consisting of a merge between Doctor-Shotgun/airoboros-2.2.1-y34b and a PEFT adapter trained using the LimaRP dataset (URL at 0.5 weight.
## EXL2 Quants:
URL
## Usage:
The intended prompt format is the Alpaca instruction format of LimaRP v3:
## Message length control
Due to the inclusion of LimaRP v3, it is possible to append a length modifier to the response instruction sequence, like this:
This has an immediately noticeable effect on bot responses. The available lengths are: 'micro, tiny, short, medium, long, massive, huge, enormous, humongous, unlimited'. The recommended starting length is 'medium'. Keep in mind that the AI may ramble or impersonate the user with very long messages.
## Bias, Risks, and Limitations
The model will show biases similar to those observed in niche roleplaying forums on the Internet, besides those exhibited by the base model. It is not intended for supplying factual information or advice in any form.
## Training Details
This model is a merge. Please refer to the link repositories of the merged models for details. | [
"# airoboros-2.2.1-limarpv3-y34b\n\nThis is a Llama-fied Yi 34B-based model consisting of a merge between Doctor-Shotgun/airoboros-2.2.1-y34b and a PEFT adapter trained using the LimaRP dataset (URL at 0.5 weight.",
"## EXL2 Quants:\n\nURL",
"## Usage:\nThe intended prompt format is the Alpaca instruction format of LimaRP v3:",
"## Message length control\nDue to the inclusion of LimaRP v3, it is possible to append a length modifier to the response instruction sequence, like this:\n\nThis has an immediately noticeable effect on bot responses. The available lengths are: 'micro, tiny, short, medium, long, massive, huge, enormous, humongous, unlimited'. The recommended starting length is 'medium'. Keep in mind that the AI may ramble or impersonate the user with very long messages.",
"## Bias, Risks, and Limitations\nThe model will show biases similar to those observed in niche roleplaying forums on the Internet, besides those exhibited by the base model. It is not intended for supplying factual information or advice in any form.",
"## Training Details\nThis model is a merge. Please refer to the link repositories of the merged models for details."
] | [
"TAGS\n#transformers #safetensors #llama #text-generation #Yi #llama 2 #en #dataset-jondurbin/airoboros-2.2.1 #license-other #autotrain_compatible #text-generation-inference #region-us \n",
"# airoboros-2.2.1-limarpv3-y34b\n\nThis is a Llama-fied Yi 34B-based model consisting of a merge between Doctor-Shotgun/airoboros-2.2.1-y34b and a PEFT adapter trained using the LimaRP dataset (URL at 0.5 weight.",
"## EXL2 Quants:\n\nURL",
"## Usage:\nThe intended prompt format is the Alpaca instruction format of LimaRP v3:",
"## Message length control\nDue to the inclusion of LimaRP v3, it is possible to append a length modifier to the response instruction sequence, like this:\n\nThis has an immediately noticeable effect on bot responses. The available lengths are: 'micro, tiny, short, medium, long, massive, huge, enormous, humongous, unlimited'. The recommended starting length is 'medium'. Keep in mind that the AI may ramble or impersonate the user with very long messages.",
"## Bias, Risks, and Limitations\nThe model will show biases similar to those observed in niche roleplaying forums on the Internet, besides those exhibited by the base model. It is not intended for supplying factual information or advice in any form.",
"## Training Details\nThis model is a merge. Please refer to the link repositories of the merged models for details."
] | [
67,
70,
8,
21,
112,
60,
25
] | [
"passage: TAGS\n#transformers #safetensors #llama #text-generation #Yi #llama 2 #en #dataset-jondurbin/airoboros-2.2.1 #license-other #autotrain_compatible #text-generation-inference #region-us \n# airoboros-2.2.1-limarpv3-y34b\n\nThis is a Llama-fied Yi 34B-based model consisting of a merge between Doctor-Shotgun/airoboros-2.2.1-y34b and a PEFT adapter trained using the LimaRP dataset (URL at 0.5 weight.## EXL2 Quants:\n\nURL## Usage:\nThe intended prompt format is the Alpaca instruction format of LimaRP v3:## Message length control\nDue to the inclusion of LimaRP v3, it is possible to append a length modifier to the response instruction sequence, like this:\n\nThis has an immediately noticeable effect on bot responses. The available lengths are: 'micro, tiny, short, medium, long, massive, huge, enormous, humongous, unlimited'. The recommended starting length is 'medium'. Keep in mind that the AI may ramble or impersonate the user with very long messages.## Bias, Risks, and Limitations\nThe model will show biases similar to those observed in niche roleplaying forums on the Internet, besides those exhibited by the base model. It is not intended for supplying factual information or advice in any form.## Training Details\nThis model is a merge. Please refer to the link repositories of the merged models for details."
] | [
-0.058581557124853134,
-0.0716562494635582,
-0.0024738633073866367,
0.007731175981462002,
0.08585453033447266,
-0.045709457248449326,
0.17347167432308197,
0.06480856984853745,
0.027928629890084267,
0.11144586652517319,
-0.0008069410687312484,
-0.01875927858054638,
0.045127395540475845,
0.12836581468582153,
-0.036109913140535355,
-0.22969305515289307,
0.0611637681722641,
-0.06161436066031456,
0.022495875135064125,
0.050917353481054306,
0.09968717396259308,
-0.07115449011325836,
0.058308012783527374,
0.023629993200302124,
0.022881610319018364,
-0.019158771261572838,
0.002354189520701766,
-0.07940687239170074,
0.049031246453523636,
0.08180824667215347,
0.01016891561448574,
0.024150004610419273,
-0.011393277905881405,
-0.17045125365257263,
0.024584313854575157,
0.03372589498758316,
0.032712697982788086,
0.027756765484809875,
0.028562510386109352,
-0.0003126579395029694,
0.16891372203826904,
-0.08501265943050385,
0.06144154444336891,
0.06553488224744797,
-0.07439371198415756,
-0.04187536612153053,
-0.06085245683789253,
-0.02807493880391121,
0.10093104094266891,
0.10658068209886551,
-0.015797849744558334,
0.20751789212226868,
-0.09511370956897736,
0.00935059692710638,
0.23431625962257385,
-0.17236708104610443,
-0.009824861772358418,
0.01616259478032589,
0.032500237226486206,
0.09321548044681549,
-0.0433126837015152,
-0.003962683957070112,
0.04781579598784447,
-0.0025226541329175234,
-0.004706159234046936,
-0.025540480390191078,
0.03812906891107559,
-0.05095307528972626,
-0.10795585066080093,
0.008754135109484196,
0.19544385373592377,
0.02669641748070717,
-0.08087536692619324,
-0.1428682953119278,
-0.0020975046791136265,
0.09894438832998276,
-0.06765633076429367,
-0.009701497852802277,
0.031421784311532974,
-0.010817678645253181,
0.02178926020860672,
-0.038215041160583496,
-0.0995994508266449,
0.01190065685659647,
-0.08531426638364792,
0.1681891232728958,
0.027242979034781456,
0.05351809039711952,
-0.11032627522945404,
-0.02223791927099228,
-0.06137113645672798,
-0.08166422694921494,
-0.06727059185504913,
-0.04167453572154045,
-0.0278298519551754,
-0.04109548404812813,
-0.06698840856552124,
-0.14496208727359772,
0.06087218225002289,
0.12482950836420059,
-0.08309955149888992,
0.009361393749713898,
0.014256368391215801,
0.0400790311396122,
0.006541361566632986,
0.042842891067266464,
-0.031129930168390274,
-0.08602619171142578,
0.09322743117809296,
0.05597531795501709,
0.08794741332530975,
-0.030546387657523155,
-0.048764560371637344,
0.028454357758164406,
-0.004957640543580055,
0.035741452127695084,
0.04039408639073372,
0.050696104764938354,
-0.06836343556642532,
-0.03608490899205208,
0.10036862641572952,
-0.07856585085391998,
-0.009569559246301651,
0.018742145970463753,
-0.14708252251148224,
0.0560746043920517,
0.02312670275568962,
0.005456187296658754,
-0.053477831184864044,
0.03921841084957123,
-0.06776455789804459,
0.009313913062214851,
-0.10707928240299225,
-0.16191135346889496,
0.07628688216209412,
0.051245879381895065,
-0.04844561591744423,
-0.14659473299980164,
-0.12885574996471405,
-0.027188820764422417,
-0.027826350182294846,
-0.04830462858080864,
-0.0004868449759669602,
0.0029565293807536364,
-0.04501226544380188,
-0.009744668379426003,
-0.01287201140075922,
0.006870250683277845,
-0.06690762937068939,
0.004039696417748928,
-0.0014617135748267174,
0.15092000365257263,
0.01333356648683548,
-0.0016126756090670824,
-0.13618126511573792,
-0.014115974307060242,
-0.1753614842891693,
0.07553969323635101,
-0.07670199126005173,
0.06260795146226883,
-0.07084508240222931,
-0.023659583181142807,
-0.03731570020318031,
0.016158422455191612,
0.012827218510210514,
0.19974251091480255,
-0.17744487524032593,
-0.04216035082936287,
0.13272637128829956,
-0.16952471435070038,
-0.06343092024326324,
0.12514974176883698,
0.011356698349118233,
0.03207523748278618,
0.13075780868530273,
0.08099432289600372,
0.02632926031947136,
-0.09585624933242798,
-0.06853089481592178,
0.0019687016028910875,
-0.0030624056234955788,
0.04074043780565262,
0.041412629187107086,
0.005016781389713287,
0.08116277307271957,
0.02655625343322754,
0.08463984727859497,
0.034935660660266876,
-0.024407310411334038,
-0.0032287684734910727,
0.016728762537240982,
-0.08101830631494522,
-0.051925309002399445,
-0.008205665275454521,
0.024496883153915405,
-0.07787364721298218,
-0.04519054293632507,
0.05996700003743172,
0.09963128715753555,
0.03841758519411087,
-0.007041389122605324,
-0.14090940356254578,
0.03421111777424812,
-0.004079552832990885,
-0.003733329940587282,
-0.130913645029068,
-0.11060727387666702,
0.03524857759475708,
-0.11158362776041031,
0.08471963554620743,
0.07041583210229874,
0.04457378387451172,
0.04187982901930809,
0.009594045579433441,
0.05257200449705124,
0.039874739944934845,
-0.014644534327089787,
-0.08365019410848618,
-0.14051008224487305,
0.03708713874220848,
-0.028407104313373566,
0.08127899467945099,
-0.12644340097904205,
0.01679363287985325,
0.1230207085609436,
0.06447146087884903,
0.11403313279151917,
-0.034062840044498444,
0.08870958536863327,
0.010338479653000832,
-0.03736318275332451,
-0.02164795808494091,
0.003304095473140478,
0.0006617255276069045,
-0.0482499822974205,
0.1083943173289299,
-0.20772284269332886,
-0.01502743549644947,
0.12133854627609253,
0.034405793994665146,
-0.02456086128950119,
-0.002284032991155982,
-0.013345619663596153,
-0.017921993508934975,
-0.12424126267433167,
-0.04886024817824364,
0.12303943186998367,
0.061032816767692566,
0.11309527605772018,
-0.10020147264003754,
0.0013601853279396892,
-0.012681830674409866,
-0.0908307284116745,
-0.018063317984342575,
0.050880130380392075,
0.08969393372535706,
-0.13538958132266998,
0.018222259357571602,
-0.09853490442037582,
-0.01505864318460226,
0.13711151480674744,
0.04447661340236664,
-0.08489249646663666,
0.010828495025634766,
0.04624126851558685,
0.07640194892883301,
0.05311668664216995,
-0.08553359657526016,
-0.003461897373199463,
0.04625527560710907,
0.06014418974518776,
0.02972908690571785,
-0.09822119772434235,
-0.006603912450373173,
0.017809467390179634,
-0.028836052864789963,
0.00171663926448673,
0.027737999334931374,
0.033467765897512436,
0.13233664631843567,
0.025050029158592224,
0.00938444770872593,
-0.04345187172293663,
-0.06659211218357086,
-0.10298339277505875,
0.14519305527210236,
-0.06462059170007706,
-0.3221758008003235,
-0.04784347116947174,
-0.020541176199913025,
-0.06288684159517288,
0.01168229803442955,
0.03974853456020355,
-0.13478969037532806,
-0.0689021572470665,
-0.10611449182033539,
0.0964716225862503,
0.09045364707708359,
0.009555800817906857,
-0.0433495007455349,
0.059903353452682495,
0.07402118295431137,
-0.10868240892887115,
-0.01933603733778,
-0.07747389376163483,
-0.06134152039885521,
0.04113812744617462,
-0.018281031399965286,
0.08058924227952957,
0.08945656567811966,
0.03147236630320549,
-0.03136222064495087,
-0.005164698697626591,
0.24476291239261627,
-0.030701227486133575,
0.05979349464178085,
0.27036163210868835,
0.09080944955348969,
0.05234753340482712,
0.1347445696592331,
0.0430888757109642,
-0.12819945812225342,
0.05570460855960846,
0.05976919084787369,
-0.06287260353565216,
-0.14024734497070312,
-0.11663006991147995,
-0.07748323678970337,
-0.018314188346266747,
0.05256021022796631,
0.010191332548856735,
-0.03226012736558914,
0.046610064804553986,
-0.07862639427185059,
0.06385704129934311,
0.0488077811896801,
0.09563454985618591,
0.08828925341367722,
0.03415459021925926,
0.11637899279594421,
-0.07849093526601791,
-0.015139760449528694,
0.13929355144500732,
-0.05022991821169853,
0.2554929554462433,
-0.06108611077070236,
0.1253417581319809,
0.06527845561504364,
-0.010422748513519764,
0.06733889877796173,
0.045710090547800064,
-0.05013885349035263,
-0.07750516384840012,
-0.04802801460027695,
-0.07665929943323135,
-0.03957773372530937,
0.10731767117977142,
-0.050142452120780945,
0.026344412937760353,
-0.06567469239234924,
0.08359502255916595,
0.04551921784877777,
0.10576896369457245,
0.09750393033027649,
-0.17657263576984406,
-0.12458515912294388,
0.04423084482550621,
-0.06978093087673187,
0.01258988305926323,
0.03716089576482773,
0.1642252653837204,
-0.07536778599023819,
0.09266987442970276,
-0.004945624619722366,
0.11105924844741821,
-0.09830136597156525,
0.023732174187898636,
-0.07365330308675766,
0.0950203537940979,
-0.0202268548309803,
0.051725469529628754,
-0.15712720155715942,
0.13013745844364166,
0.00981566309928894,
0.06890302896499634,
-0.09713032096624374,
-0.016702814027667046,
0.07785072922706604,
0.027271537110209465,
0.007750611286610365,
0.007135854102671146,
-0.016551945358514786,
-0.10025212913751602,
-0.10184837877750397,
0.011919858865439892,
0.04369470104575157,
0.05419996753334999,
0.11436676979064941,
-0.01322873029857874,
0.025873327627778053,
-0.013814524747431278,
0.04780286177992821,
-0.1136622279882431,
-0.1721799075603485,
0.054894089698791504,
0.10901468247175217,
-0.1220419704914093,
-0.10159633308649063,
-0.033356260508298874,
0.10238319635391235,
0.15515267848968506,
0.055737875401973724,
-0.09980662167072296,
-0.09661943465471268,
0.05917246639728546,
0.08391588181257248,
-0.07924147695302963,
-0.047610800713300705,
-0.0007116488413885236,
0.17167963087558746,
-0.021003244444727898,
-0.021330982446670532,
0.04703284800052643,
-0.1389365792274475,
-0.13659125566482544,
-0.0663742795586586,
0.030138997361063957,
0.058993492275476456,
0.06824851036071777,
0.008237935602664948,
-0.025598902255296707,
0.006872366648167372,
-0.1097300574183464,
-0.06558175384998322,
0.12954407930374146,
-0.025080706924200058,
0.057208046317100525,
-0.01834883727133274,
-0.07994021475315094,
-0.063313327729702,
-0.03401666134595871,
0.06708090007305145,
0.25198182463645935,
-0.02522815205156803,
0.13548977673053741,
0.07098930329084396,
-0.0014777218457311392,
-0.19017504155635834,
0.054866723716259,
0.04140707850456238,
-0.030421610921621323,
0.0029998421669006348,
-0.10824069380760193,
0.08804580569267273,
0.016551822423934937,
-0.028219537809491158,
0.045382026582956314,
-0.29532238841056824,
-0.09958435595035553,
-0.044432662427425385,
0.06441110372543335,
0.22545824944972992,
-0.11099760234355927,
-0.053529687225818634,
-0.06821569800376892,
-0.027941858395934105,
0.10336945950984955,
-0.09246084094047546,
0.0831499919295311,
0.007227097637951374,
0.11467406898736954,
0.06437932699918747,
-0.044153377413749695,
0.10855612903833389,
-0.05561511591076851,
0.05880332365632057,
-0.08826770633459091,
0.010087910108268261,
0.00215640920214355,
-0.048823900520801544,
0.1267968714237213,
-0.1037466898560524,
-0.020448977127671242,
-0.07311675697565079,
-0.06096169725060463,
-0.04768696799874306,
0.06876406073570251,
-0.005552988033741713,
-0.0216220673173666,
-0.0796022042632103,
0.06312193721532822,
0.07730584591627121,
0.056049197912216187,
-0.05172702670097351,
-0.09091118723154068,
0.02842724695801735,
0.12814314663410187,
0.15980732440948486,
-0.1635330617427826,
-0.052997589111328125,
0.04773890972137451,
0.028031017631292343,
0.06534260511398315,
-0.1379396766424179,
0.00024196393496822566,
0.01716083660721779,
0.0016708752373233438,
0.10392780601978302,
0.002213087398558855,
-0.11197827756404877,
0.08591742068529129,
0.09368699789047241,
0.003702877089381218,
-0.12761542201042175,
0.027091003954410553,
0.2515277564525604,
-0.06464052945375443,
-0.03501245379447937,
0.14015479385852814,
-0.06981197744607925,
0.021980777382850647,
-0.06797559559345245,
0.05370126664638519,
0.04331771284341812,
0.12053631246089935,
0.0010934334713965654,
0.0671672374010086,
-0.08687742054462433,
0.07639596611261368,
0.07384579628705978,
-0.11751090735197067,
0.02457624301314354,
0.1450965255498886,
-0.12110250443220139,
-0.06621785461902618,
-0.05888192355632782,
0.016749344766139984,
-0.022471725940704346,
-0.03891674801707268,
-0.10269489139318466,
-0.08793125301599503,
0.058201126754283905,
0.12379322201013565,
0.024801015853881836,
-0.038607534021139145,
-0.000877850572578609,
-0.0013647483428940177,
-0.08692961186170578,
0.03320970758795738,
0.03506042808294296,
0.001129184034653008,
-0.022305291146039963,
0.05658441036939621,
0.05021728575229645,
-0.06509944796562195,
-0.023812489584088326,
-0.007298333570361137,
-0.11920643597841263,
-0.01784515008330345,
-0.231553316116333,
0.029652582481503487,
-0.07117706537246704,
-0.01348010916262865,
-0.01144811138510704,
-0.035522934049367905,
-0.07620786875486374,
0.02234608121216297,
-0.059660062193870544,
-0.02572588250041008,
-0.030709711834788322,
0.057887692004442215,
-0.10055342316627502,
-0.033156637102365494,
0.07270434498786926,
-0.05763346701860428,
0.09821483492851257,
0.016200263053178787,
-0.08422265201807022,
-0.0489291213452816,
-0.10245085507631302,
-0.017889322713017464,
0.030837971717119217,
0.0693773478269577,
0.06878353655338287,
-0.09102518111467361,
0.07393303513526917,
0.014565406367182732,
0.04239466413855553,
0.01243593916296959,
0.06395386904478073,
-0.06966129690408707,
0.040840357542037964,
-0.007380848750472069,
-0.05063995346426964,
-0.06433849036693573,
0.011555248871445656,
0.06178770214319229,
-0.007076153066009283,
0.13296149671077728,
-0.07209280878305435,
0.03356728330254555,
-0.129052996635437,
-0.00608921330422163,
-0.01579475589096546,
-0.027527760714292526,
-0.059693146497011185,
0.002856515347957611,
0.07038906961679459,
-0.02057904377579689,
0.17828986048698425,
0.013074933551251888,
0.004048711620271206,
0.05721119046211243,
0.02440808340907097,
0.04315115138888359,
-0.032698601484298706,
0.002959304954856634,
0.06728485971689224,
0.006433174945414066,
0.020352400839328766,
0.031301334500312805,
0.08258895576000214,
-0.09124458581209183,
0.18078657984733582,
0.1494518369436264,
0.035481955856084824,
0.05526786297559738,
-0.01647886447608471,
-0.04887060075998306,
-0.10509289801120758,
0.00021478437702171504,
-0.053372468799352646,
0.059470996260643005,
-0.035804081708192825,
0.06336034089326859,
0.22607256472110748,
-0.06430360674858093,
0.0507502444088459,
-0.045968979597091675,
0.002085249638184905,
-0.10358977317810059,
-0.16541787981987,
-0.06114562973380089,
-0.1015070229768753,
-0.010095719248056412,
-0.12185061722993851,
-0.004844478331506252,
0.016154753044247627,
0.024805165827274323,
0.030845150351524353,
0.09729098528623581,
-0.14039292931556702,
-0.06956604868173599,
0.008504582569003105,
-0.03400919586420059,
-0.013338576070964336,
0.08859347552061081,
0.00858368445187807,
0.021188976243138313,
-0.033168669790029526,
0.03871022164821625,
0.06996594369411469,
0.012134096585214138,
0.010396352037787437,
-0.027987787500023842,
-0.0956510528922081,
0.0018669514684006572,
-0.023465873673558235,
0.06835134327411652,
0.18641549348831177,
0.07007527351379395,
-0.03713309392333031,
-0.01685054413974285,
0.24484334886074066,
-0.03638565167784691,
-0.15077704191207886,
-0.11943680047988892,
0.11808961629867554,
-0.013567605055868626,
0.023578297346830368,
-0.09022795408964157,
-0.07340540736913681,
-0.0650630071759224,
0.2635115385055542,
0.1563597023487091,
-0.09091497957706451,
0.0002713379799388349,
-0.06197727471590042,
-0.004388408735394478,
-0.03882560506463051,
0.10274263471364975,
0.06825576722621918,
0.3504672646522522,
-0.021937295794487,
0.03470512852072716,
-0.003202049294486642,
0.0027328289579600096,
-0.1348390132188797,
0.09846625477075577,
-0.003443900728598237,
0.0439436212182045,
-0.0793406069278717,
0.04707082733511925,
-0.04513496533036232,
-0.09480375796556473,
0.003824303625151515,
-0.1276240050792694,
-0.08537373691797256,
0.016713399440050125,
0.008113699965178967,
0.04407143220305443,
0.1291394978761673,
0.0141341183334589,
-0.02551594376564026,
-0.03730658069252968,
-0.02679363638162613,
-0.1278487741947174,
-0.05440666899085045,
0.08082161843776703,
0.005085824057459831,
0.14718100428581238,
-0.012164299376308918,
0.09200872480869293,
0.13394659757614136,
-0.015637749806046486,
-0.11969634145498276,
0.0992487445473671,
0.0250930767506361,
-0.11299187690019608,
0.024672167375683784,
0.1117856353521347,
-0.03403126448392868,
0.13755205273628235,
0.13610486686229706,
-0.14111894369125366,
0.01730917952954769,
-0.00940707977861166,
0.016802411526441574,
-0.08591999858617783,
0.03917551413178444,
-0.08739873766899109,
0.10763363540172577,
0.12103816866874695,
-0.05633949860930443,
-0.04590347781777382,
-0.006176542956382036,
0.06847137957811356,
0.02562674880027771,
0.03521895036101341,
-0.054093554615974426,
-0.22362858057022095,
-0.007173892110586166,
-0.018017146736383438,
0.08627739548683167,
-0.27085965871810913,
-0.05526435375213623,
-0.04019583761692047,
-0.02907135896384716,
-0.028873395174741745,
0.08699784427881241,
0.09566500782966614,
0.001302471850067377,
-0.08028613775968552,
-0.17739415168762207,
0.02352689392864704,
0.13606947660446167,
-0.09223714470863342,
-0.10924989730119705
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
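
The quantization settings listed above can be mirrored at load time. The sketch below is illustrative only: it is not the authors' training code, the base checkpoint (`huggyllama/llama-30b`) is an assumption inferred from the repository name rather than something stated on this card, and it requires `bitsandbytes` and `accelerate` to be installed.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

# 4-bit NF4 quantization with double quantization and fp16 compute,
# matching the bnb_4bit_* values listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)

# Assumed base checkpoint; the card does not name the base model.
base_model = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-30b",
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach the PEFT adapter from this repository on top of the quantized base.
model = PeftModel.from_pretrained(base_model, "genies-models/llama-30b-raven_matrices")
model.eval()
```

The remaining `llm_int8_*` entries above correspond to the library defaults, so they do not need to be passed explicitly.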
| {"library_name": "peft"} | null | genies-models/llama-30b-raven_matrices | [
"peft",
"region:us"
] | 2023-11-11T02:27:00+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
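
The same configuration can be reproduced when loading a base model for inference. The sketch below is a hedged example, not the authors' code: the base checkpoint (`huggyllama/llama-7b`) is an assumption inferred from the repository name, and `bitsandbytes` plus `accelerate` are assumed to be installed.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

# Mirror the bnb_4bit_* values listed above (NF4, double quantization, fp16 compute).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)

# Assumed 7B base checkpoint; not named on this card.
base_model = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-7b",
    quantization_config=bnb_config,
    device_map="auto",
)

# Load the adapter weights from this repository.
model = PeftModel.from_pretrained(base_model, "genies-models/llama-7b-punishment_avoidance")
model.eval()
```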
| {"library_name": "peft"} | null | genies-models/llama-7b-punishment_avoidance | [
"peft",
"region:us"
] | 2023-11-11T02:27:59+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
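
A minimal, hedged loading sketch follows; the base checkpoint (`huggyllama/llama-7b`) is an assumed name inferred from the repository id, and the code only mirrors the quantization values above rather than reproducing the authors' training setup.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # load_in_4bit: True
    bnb_4bit_quant_type="nf4",             # bnb_4bit_quant_type: nf4
    bnb_4bit_use_double_quant=True,        # bnb_4bit_use_double_quant: True
    bnb_4bit_compute_dtype=torch.float16,  # bnb_4bit_compute_dtype: float16
)

base_model = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-7b",                 # assumed base checkpoint
    quantization_config=bnb_config,
    device_map="auto",
)

model = PeftModel.from_pretrained(base_model, "genies-models/llama-7b-counterfactual_python")
model.eval()
```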
| {"library_name": "peft"} | null | genies-models/llama-7b-counterfactual_python | [
"peft",
"region:us"
] | 2023-11-11T02:28:17+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-base-vn-re-attention-vn-tokenizer
This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unspecified dataset (the dataset name was not recorded when this card was generated).
It achieves the following results on the evaluation set:
- Loss: 2.4481
- Rouge1: 18.0845
- Rouge2: 8.2722
- Rougel: 15.4154
- Rougelsum: 16.6591
- Bleu-1: 0.0007
- Bleu-2: 0.0004
- Bleu-3: 0.0002
- Bleu-4: 0.0001
- Gen Len: 19.811
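The card does not include the evaluation script. As a rough, hedged sketch, scores of this kind are usually computed with the `evaluate` library; the ROUGE values above appear to be reported on a 0–100 scale, while `evaluate` returns fractions. The prediction/reference strings below are placeholders, not data from this model.

```python
import evaluate

# Placeholder strings; a real evaluation would use model outputs and gold references.
predictions = ["a generated answer produced by the model"]
references = ["the reference answer written by a human"]

rouge = evaluate.load("rouge")
print(rouge.compute(predictions=predictions, references=references))

bleu = evaluate.load("bleu")
print(bleu.compute(predictions=predictions, references=[[r] for r in references]))
```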
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
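Expressed against the Transformers `Seq2SeqTrainingArguments` API, the hyperparameters above map roughly onto the sketch below. This is a reconstruction, not the original training script; `output_dir` and `predict_with_generate` are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-base-vn-re-attention-vn-tokenizer",  # assumed output directory name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    fp16=True,                   # "Native AMP" mixed-precision training
    predict_with_generate=True,  # assumed, needed for ROUGE/BLEU during evaluation
)
```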
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bleu-1 | Bleu-2 | Bleu-3 | Bleu-4 | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|:---------:|:------:|:------:|:------:|:------:|:-------:|
| 2.6584 | 1.0 | 10886 | 2.4481 | 18.0845 | 8.2722 | 15.4154 | 16.6591 | 0.0007 | 0.0004 | 0.0002 | 0.0001 | 19.811 |
### Framework versions
- Transformers 4.35.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.14.1
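The card itself carries no usage snippet. Assuming the checkpoint loads with the standard Auto classes under the id `GuysTrans/bart-base-vn-re-attention-vn-tokenizer`, a minimal text2text-generation sketch might look like this; the input string and generation settings are placeholders, not documented values.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "GuysTrans/bart-base-vn-re-attention-vn-tokenizer"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder input; replace with real Vietnamese text for this task.
inputs = tokenizer("Text to be rewritten or summarized goes here.",
                   return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```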
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "base_model": "facebook/bart-base", "model-index": [{"name": "bart-base-vn-re-attention-vn-tokenizer", "results": []}]} | text2text-generation | GuysTrans/bart-base-vn-re-attention-vn-tokenizer | [
"transformers",
"tensorboard",
"safetensors",
"bart",
"text2text-generation",
"generated_from_trainer",
"base_model:facebook/bart-base",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2023-11-11T02:28:23+00:00 | [] | [] | TAGS
#transformers #tensorboard #safetensors #bart #text2text-generation #generated_from_trainer #base_model-facebook/bart-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
| bart-base-vn-re-attention-vn-tokenizer
======================================
This model is a fine-tuned version of facebook/bart-base on an unspecified dataset (the dataset name was not recorded when this card was generated).
It achieves the following results on the evaluation set:
* Loss: 2.4481
* Rouge1: 18.0845
* Rouge2: 8.2722
* Rougel: 15.4154
* Rougelsum: 16.6591
* Bleu-1: 0.0007
* Bleu-2: 0.0004
* Bleu-3: 0.0002
* Bleu-4: 0.0001
* Gen Len: 19.811
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 1
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.35.0
* Pytorch 2.0.0
* Datasets 2.1.0
* Tokenizers 0.14.1
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.0.0\n* Datasets 2.1.0\n* Tokenizers 0.14.1"
] | [
"TAGS\n#transformers #tensorboard #safetensors #bart #text2text-generation #generated_from_trainer #base_model-facebook/bart-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.0.0\n* Datasets 2.1.0\n* Tokenizers 0.14.1"
] | [
72,
113,
4,
30
] | [
"passage: TAGS\n#transformers #tensorboard #safetensors #bart #text2text-generation #generated_from_trainer #base_model-facebook/bart-base #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.0.0\n* Datasets 2.1.0\n* Tokenizers 0.14.1"
] | [
-0.0901406779885292,
0.07658539712429047,
-0.003566499799489975,
0.07833339273929596,
0.10604605823755264,
0.001472335890866816,
0.14546352624893188,
0.13667191565036774,
-0.06902432441711426,
0.06540096551179886,
0.12876148521900177,
0.09017820656299591,
0.0348750501871109,
0.1604253202676773,
-0.059204671531915665,
-0.22285395860671997,
0.022057106718420982,
0.03568434715270996,
-0.06075359508395195,
0.11474074423313141,
0.09102974086999893,
-0.11702210456132889,
0.08224497735500336,
0.011114231310784817,
-0.15027566254138947,
0.02021208219230175,
0.019810298457741737,
-0.07228945195674896,
0.11607681214809418,
0.03859696537256241,
0.1228247582912445,
0.037647128105163574,
0.06192825734615326,
-0.1863454282283783,
0.011077819392085075,
0.07337797433137894,
0.0012871791841462255,
0.08087600022554398,
0.06601594388484955,
-0.011023011989891529,
0.13076111674308777,
-0.07582082599401474,
0.07073460519313812,
0.030124489217996597,
-0.12816151976585388,
-0.2723119854927063,
-0.0934988483786583,
0.041849225759506226,
0.09832748025655746,
0.08909454941749573,
-0.01620742864906788,
0.10590006411075592,
-0.046343617141246796,
0.08691607415676117,
0.25227734446525574,
-0.30330032110214233,
-0.06486349552869797,
0.012476976960897446,
0.06138572841882706,
0.06630569696426392,
-0.10481614619493484,
-0.015283768996596336,
0.04431278631091118,
0.03776732459664345,
0.1481933742761612,
-0.014520687982439995,
-0.011475950479507446,
-0.01094254944473505,
-0.13221979141235352,
-0.04260723665356636,
0.14886902272701263,
0.04941322281956673,
-0.04476369172334671,
-0.07977338880300522,
-0.06302638351917267,
-0.13632717728614807,
-0.051938705146312714,
-0.006707158405333757,
0.033441103994846344,
-0.022247808054089546,
-0.06910251080989838,
-0.025119222700595856,
-0.10378550738096237,
-0.06583849340677261,
-0.04008924961090088,
0.14980915188789368,
0.041939929127693176,
0.0070561496540904045,
-0.03609205782413483,
0.09356823563575745,
-0.017342740669846535,
-0.1592985838651657,
0.011910383589565754,
0.016240980476140976,
0.01656600460410118,
-0.01878541335463524,
-0.050027474761009216,
-0.0816434770822525,
0.026052268221974373,
0.1597375124692917,
-0.08466561138629913,
0.06315573304891586,
-0.01930554397404194,
0.04196090251207352,
-0.1277211457490921,
0.16693232953548431,
-0.017898067831993103,
-0.052024539560079575,
0.019911939278244972,
0.08407800644636154,
0.054110538214445114,
-0.017936309799551964,
-0.10677511990070343,
0.020349249243736267,
0.09127971529960632,
0.02453014813363552,
-0.022795962169766426,
0.05933636799454689,
-0.04359101131558418,
-0.0030004524160176516,
0.04323841258883476,
-0.09642437845468521,
0.037101179361343384,
0.0072480724193155766,
-0.04946049302816391,
-0.0434226468205452,
0.03744653984904289,
0.013556198216974735,
-0.007300090976059437,
0.09811432659626007,
-0.06418637931346893,
-0.005585620645433664,
-0.09435935318470001,
-0.11683240532875061,
0.03220319002866745,
-0.05292649567127228,
0.01361826341599226,
-0.11341333389282227,
-0.1655404269695282,
-0.013283409178256989,
0.0577373169362545,
-0.03884810954332352,
-0.04079287871718407,
-0.027894264087080956,
-0.08014118671417236,
0.04883997514843941,
-0.029317788779735565,
0.10519811511039734,
-0.06105734035372734,
0.0980258584022522,
0.04077805206179619,
0.06983964890241623,
-0.03230069950222969,
0.04431072622537613,
-0.09854795783758163,
0.039691850543022156,
-0.19720570743083954,
0.0201617069542408,
-0.0612797886133194,
0.06721522659063339,
-0.08783722668886185,
-0.07602240890264511,
-0.03940647095441818,
0.0034739954862743616,
0.0855046808719635,
0.09559883177280426,
-0.14687758684158325,
-0.07860706746578217,
0.18186983466148376,
-0.10038058459758759,
-0.13611575961112976,
0.12575426697731018,
-0.03535933047533035,
0.015823470428586006,
0.05686887353658676,
0.19084982573986053,
0.0836649164557457,
-0.10737516731023788,
-0.0016051845159381628,
-0.022091861814260483,
0.057446978986263275,
-0.033502738922834396,
0.08099774271249771,
-0.01036220695823431,
0.0067832209169864655,
0.019767621532082558,
-0.020234910771250725,
0.06323599815368652,
-0.06777317076921463,
-0.07826998084783554,
-0.055912941694259644,
-0.08713516592979431,
0.05636962503194809,
0.05102299898862839,
0.05921853706240654,
-0.12225707620382309,
-0.12426991015672684,
0.03473011031746864,
0.07603614777326584,
-0.07083266228437424,
0.03074442781507969,
-0.06942371279001236,
0.09854764491319656,
-0.0531558059155941,
-0.003180593950673938,
-0.15636752545833588,
-0.04379329830408096,
0.015211584977805614,
-0.038990318775177,
0.008966763503849506,
-0.009447254240512848,
0.087216317653656,
0.06759276986122131,
-0.0650763213634491,
-0.04412190616130829,
-0.03491237387061119,
0.00660127867013216,
-0.11525914818048477,
-0.19905948638916016,
-0.02088066004216671,
-0.035669248551130295,
0.10791622847318649,
-0.20850597321987152,
0.04595370218157768,
0.014074521139264107,
0.1266474425792694,
0.04335504770278931,
-0.017207741737365723,
-0.036287616938352585,
0.062826007604599,
-0.03823200240731239,
-0.06525852531194687,
0.055098723620176315,
0.016512053087353706,
-0.09141028672456741,
0.007615703158080578,
-0.16063617169857025,
0.18187090754508972,
0.1454550176858902,
-0.036632586270570755,
-0.060545288026332855,
0.01560035441070795,
-0.053433481603860855,
-0.02590998448431492,
-0.030674125999212265,
-0.00879002921283245,
0.12815576791763306,
0.0022170122247189283,
0.15098679065704346,
-0.08957310765981674,
-0.05060671642422676,
0.034148864448070526,
-0.0346774160861969,
-0.002430522348731756,
0.10635942965745926,
0.053675442934036255,
-0.07251141220331192,
0.14797432720661163,
0.21267180144786835,
-0.07149111479520798,
0.1435033679008484,
-0.048311978578567505,
-0.07457661628723145,
-0.02340923435986042,
0.0075941262766718864,
0.016599595546722412,
0.12124491482973099,
-0.11840897798538208,
0.005942456424236298,
0.021059706807136536,
0.01793692260980606,
0.02234339341521263,
-0.20052142441272736,
-0.013389299623668194,
0.03935214504599571,
-0.060422129929065704,
0.009595061652362347,
-0.00461837463080883,
0.016781190410256386,
0.10452887415885925,
0.007532505318522453,
-0.06437641382217407,
0.030285608023405075,
-0.00025587243726477027,
-0.07506049424409866,
0.18215247988700867,
-0.08701929450035095,
-0.19009697437286377,
-0.13049009442329407,
-0.040879376232624054,
-0.03926029056310654,
0.014490748755633831,
0.08244750648736954,
-0.07222776114940643,
-0.04689108207821846,
-0.10345736145973206,
0.02508857101202011,
0.01129449438303709,
0.029164323583245277,
0.022406749427318573,
0.001365651492960751,
0.09964979439973831,
-0.09983339160680771,
-0.01318387221544981,
-0.0241962019354105,
-0.04147862643003464,
0.023623298853635788,
0.041937850415706635,
0.1037304550409317,
0.11946108192205429,
-0.02346368506550789,
0.006540039554238319,
-0.03528093174099922,
0.2026110142469406,
-0.07897961884737015,
-0.0046188500709831715,
0.15782493352890015,
0.003956577740609646,
0.05623244494199753,
0.13554640114307404,
0.05174422636628151,
-0.08313299715518951,
0.012523761950433254,
0.038012031465768814,
-0.03554101288318634,
-0.2083166539669037,
-0.037806179374456406,
-0.04588489979505539,
0.009570238180458546,
0.10812865197658539,
0.03736689314246178,
0.03548981994390488,
0.05468558892607689,
-0.009224814362823963,
0.05763568729162216,
-0.0019584798719733953,
0.09095240384340286,
0.11465226113796234,
0.041790999472141266,
0.14096009731292725,
-0.055057015269994736,
-0.04406300187110901,
0.04905705153942108,
-0.009349536150693893,
0.1969866007566452,
0.014358795247972012,
0.11512076109647751,
0.056134022772312164,
0.17079971730709076,
0.01664973422884941,
0.052707917988300323,
-0.008223172277212143,
-0.04327115789055824,
-0.007069793529808521,
-0.0532136969268322,
-0.020483985543251038,
0.03267102688550949,
-0.07796506583690643,
0.0451061986386776,
-0.11935357749462128,
-0.027852831408381462,
0.04757813364267349,
0.24503257870674133,
0.050650209188461304,
-0.32027551531791687,
-0.0911000594496727,
0.031091371551156044,
-0.04852164164185524,
-0.04119940474629402,
0.03302254155278206,
0.11548562347888947,
-0.05949852615594864,
0.06665657460689545,
-0.07487691193819046,
0.09292810410261154,
-0.0027072241064161062,
0.040783509612083435,
0.052059073001146317,
0.07758571207523346,
-0.0010034831939265132,
0.06217416375875473,
-0.293415367603302,
0.2634686231613159,
0.011203967966139317,
0.07492661476135254,
-0.05229063332080841,
0.018354812636971474,
0.028658796101808548,
0.05224601924419403,
0.06769916415214539,
-0.025361450389027596,
-0.12341032922267914,
-0.173537477850914,
-0.06885245442390442,
0.022308427840471268,
0.08947604894638062,
-0.0349443182349205,
0.11364652216434479,
-0.0385013148188591,
-0.001665028859861195,
0.0715726763010025,
-0.005436767358332872,
-0.10194934159517288,
-0.10673617571592331,
-0.0016434144927188754,
0.055304791778326035,
-0.013153585605323315,
-0.10235334187746048,
-0.09183096885681152,
-0.09000688791275024,
0.1574154645204544,
-0.05713444575667381,
-0.03790748119354248,
-0.11817850172519684,
0.05735602229833603,
0.07760896533727646,
-0.08416048437356949,
0.04338125139474869,
-0.0006297442596405745,
0.11713846772909164,
0.012420948594808578,
-0.08441594243049622,
0.11403262615203857,
-0.08123179525136948,
-0.18514545261859894,
-0.05821007117629051,
0.1347571611404419,
0.001321522518992424,
0.053288817405700684,
-0.0038306284695863724,
0.02844267152249813,
-0.031522996723651886,
-0.07889126241207123,
0.02510819211602211,
-0.00783187709748745,
0.05305597931146622,
-0.009157449007034302,
-0.0377940833568573,
-0.013068667612969875,
-0.04014234244823456,
-0.03837721794843674,
0.13598045706748962,
0.26193395256996155,
-0.10056141018867493,
0.04404803737998009,
0.04753820598125458,
-0.05876808613538742,
-0.19661524891853333,
0.003093832405284047,
0.04026909917593002,
0.004627883899956942,
0.043596573173999786,
-0.14777909219264984,
0.07457845658063889,
0.0872897058725357,
-0.037913043051958084,
0.09121696650981903,
-0.28396016359329224,
-0.14095166325569153,
0.12439650297164917,
0.1284244805574417,
0.11385779082775116,
-0.18186551332473755,
-0.0525914765894413,
-0.03850448876619339,
-0.11132287234067917,
0.10718820989131927,
-0.14590756595134735,
0.1019323393702507,
-0.023447643965482712,
0.0614941231906414,
0.01074452605098486,
-0.0584431067109108,
0.13515889644622803,
-0.050536997616291046,
0.09363260865211487,
-0.06750983744859695,
0.042014360427856445,
0.06719706207513809,
-0.06798911094665527,
0.0340452678501606,
-0.13306142389774323,
0.03112376108765602,
-0.08056586980819702,
-0.02545953541994095,
-0.07226122170686722,
0.04485677555203438,
-0.03984260559082031,
-0.03747999295592308,
-0.04704254865646362,
0.01627730019390583,
0.04080570116639137,
-0.016970548778772354,
0.16941826045513153,
-0.00101380399428308,
0.18689991533756256,
0.14443275332450867,
0.07252514362335205,
-0.06536435335874557,
-0.044388968497514725,
-0.000259323074715212,
-0.03849198296666145,
0.06395561993122101,
-0.165699303150177,
0.04323984310030937,
0.11501307785511017,
0.0021392337512224913,
0.1299591213464737,
0.0606997050344944,
-0.05773426219820976,
0.018353693187236786,
0.08754944056272507,
-0.15562522411346436,
-0.11820834875106812,
-0.019476041197776794,
0.02558264508843422,
-0.12220785766839981,
0.044542547315359116,
0.13081620633602142,
-0.07187370210886002,
0.0013712577056139708,
-0.0077697765082120895,
0.020268360152840614,
-0.036321744322776794,
0.17410437762737274,
0.04876270145177841,
0.050600647926330566,
-0.08305208384990692,
0.07873845100402832,
0.052297793328762054,
-0.10853587090969086,
0.029585326090455055,
0.08326557278633118,
-0.06615585833787918,
-0.03903092071413994,
0.05131133645772934,
0.1728331595659256,
-0.029820334166288376,
-0.07035049051046371,
-0.14900743961334229,
-0.13834646344184875,
0.07711375504732132,
0.16514705121517181,
0.0659462958574295,
0.012805702164769173,
-0.010871880687773228,
0.02053743228316307,
-0.09817986190319061,
0.11577288806438446,
0.04340411722660065,
0.07761833071708679,
-0.14377346634864807,
0.11522674560546875,
0.01074355747550726,
-0.00005994770981487818,
-0.02470657043159008,
0.04622425138950348,
-0.10235964506864548,
-0.011121618561446667,
-0.15528465807437897,
-0.009946859441697598,
-0.03440864011645317,
-0.00548821035772562,
-0.00816972367465496,
-0.06432326138019562,
-0.06935977190732956,
0.01837087795138359,
-0.09991384297609329,
-0.03231820464134216,
0.004260633606463671,
0.05285608768463135,
-0.13104428350925446,
-0.03174712508916855,
0.0283591840416193,
-0.08167116343975067,
0.05937565863132477,
0.033288292586803436,
0.033389270305633545,
0.048012278974056244,
-0.14279602468013763,
0.02078995481133461,
0.04055791720747948,
0.006736910901963711,
0.03475777059793472,
-0.10509676486253738,
-0.015031721442937851,
0.009456184692680836,
0.039363935589790344,
0.018812399357557297,
0.06085127219557762,
-0.13829973340034485,
0.0026899571530520916,
-0.009441819973289967,
-0.06927839666604996,
-0.05004817992448807,
0.03448653593659401,
0.08116372674703598,
0.011510255746543407,
0.18688197433948517,
-0.10120206326246262,
0.024843551218509674,
-0.22130756080150604,
0.012139114551246166,
-0.010737935081124306,
-0.10896635800600052,
-0.08556700497865677,
-0.03598419949412346,
0.057227808982133865,
-0.039168231189250946,
0.13122637569904327,
0.01455608382821083,
0.04286061227321625,
0.05121820792555809,
-0.05278409644961357,
0.03886912390589714,
0.02598714642226696,
0.19652654230594635,
0.026136387139558792,
-0.04027929902076721,
0.028837978839874268,
0.014714163728058338,
0.09606198966503143,
0.08019077777862549,
0.17453524470329285,
0.1854085475206375,
0.006287899799644947,
0.09815885126590729,
0.04022020846605301,
-0.0514058880507946,
-0.16398721933364868,
0.054124314337968826,
-0.049644216895103455,
0.11562822759151459,
-0.026288341730833054,
0.21306739747524261,
0.13372747600078583,
-0.16747255623340607,
0.03940093517303467,
-0.049010489135980606,
-0.06038399413228035,
-0.10941733419895172,
-0.0798763632774353,
-0.09345903992652893,
-0.16978931427001953,
-0.004181882832199335,
-0.10805768519639969,
0.04485982283949852,
0.06861001998186111,
0.004223349504172802,
-0.007101897150278091,
0.16160070896148682,
0.03602205589413643,
0.012541117146611214,
0.06135300174355507,
0.010410498827695847,
-0.027294933795928955,
-0.057999346405267715,
-0.08404330164194107,
0.008364826440811157,
-0.010563398711383343,
0.024725768715143204,
-0.028094954788684845,
-0.05324826017022133,
0.041326459497213364,
-0.006754180416464806,
-0.10547778755426407,
0.016520347446203232,
0.017971519380807877,
0.07264212518930435,
0.07448151707649231,
0.0055937753058969975,
0.014036575332283974,
-0.010762088000774384,
0.24564902484416962,
-0.08389788120985031,
-0.0613299161195755,
-0.10166152566671371,
0.20207007229328156,
0.024087419733405113,
-0.012493094429373741,
0.003897256450727582,
-0.0766185000538826,
0.0022991744335740805,
0.21794375777244568,
0.16374851763248444,
-0.0410059317946434,
0.0032307368237525225,
-0.018428470939397812,
-0.012992925010621548,
-0.03185291588306427,
0.0946105420589447,
0.12065292149782181,
0.06570233404636383,
-0.06820729374885559,
-0.03169975057244301,
-0.031786978244781494,
-0.007075718604028225,
-0.06606019288301468,
0.08310600370168686,
0.00419238768517971,
-0.0020711692050099373,
-0.03648058697581291,
0.05474097654223442,
-0.0002915232034865767,
-0.10283659398555756,
0.028585828840732574,
-0.20336587727069855,
-0.14584994316101074,
-0.012514936737716198,
0.13205870985984802,
-0.009899654425680637,
0.05045623704791069,
-0.009158550761640072,
0.00012454026727937162,
0.06238904967904091,
-0.030663076788187027,
-0.057270023971796036,
-0.08081512153148651,
0.07304579019546509,
-0.12451612204313278,
0.23909980058670044,
-0.03319230303168297,
0.02275165356695652,
0.13220177590847015,
0.04185931012034416,
-0.10703066736459732,
0.07142352312803268,
0.05021984130144119,
-0.08185455948114395,
0.03502042591571808,
0.12469678372144699,
-0.03187946230173111,
0.10465005785226822,
0.06100726127624512,
-0.13322655856609344,
0.004306844901293516,
-0.058233197778463364,
-0.07834489643573761,
-0.03760656714439392,
-0.024777261540293694,
-0.055506374686956406,
0.13246485590934753,
0.1830146461725235,
-0.04903370887041092,
-0.007673498708754778,
-0.04784543067216873,
0.02158546634018421,
0.0571679025888443,
0.06582729518413544,
-0.017034966498613358,
-0.23197105526924133,
0.02293538488447666,
0.08887545019388199,
-0.005552439484745264,
-0.3042111396789551,
-0.0976899266242981,
-0.003239510813727975,
-0.039476823061704636,
-0.10113685578107834,
0.09446897357702255,
0.10582831501960754,
0.03614973649382591,
-0.05495570972561836,
-0.107509084045887,
-0.06718254089355469,
0.16762270033359528,
-0.13148130476474762,
-0.07663935422897339
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
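As a sketch only, the settings above correspond to the following `transformers` `BitsAndBytesConfig`; this is a reconstruction, not the original training code.

```python
import torch
from transformers import BitsAndBytesConfig

# 4-bit NF4 quantization with double quantization and fp16 compute,
# mirroring the bullet list above.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)
```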
### Framework versions
- PEFT 0.5.0
| {"library_name": "peft"} | null | genies-models/openllama-3b-alpaca_mmlu | [
"peft",
"region:us"
] | 2023-11-11T02:28:35+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
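A minimal loading sketch, assuming the adapter is applied to a LLaMA-30B base quantized with the same 4-bit settings listed above; the base checkpoint id is an assumption inferred from the adapter name and is not stated in this card.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

# Quantization settings taken from the training-procedure list above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)
base_model = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-30b",  # assumed base weights, not confirmed by the card
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, "genies-models/llama-30b-quote_attribution")
```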
| {"library_name": "peft"} | null | genies-models/llama-30b-quote_attribution | [
"peft",
"has_space",
"region:us"
] | 2023-11-11T02:28:51+00:00 | [] | [] | TAGS
#peft #has_space #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #has_space #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
13,
163,
11
] | [
"passage: TAGS\n#peft #has_space #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.04282241687178612,
0.04120176285505295,
-0.0021348432637751102,
0.12924796342849731,
0.08646179735660553,
0.057968027889728546,
0.09160660207271576,
0.13020794093608856,
0.029857121407985687,
0.10266534984111786,
0.11433359235525131,
0.0244175735861063,
0.06058545410633087,
0.16767288744449615,
-0.007932602427899837,
-0.01662426069378853,
0.05066020041704178,
-0.004030163865536451,
-0.013209463097155094,
0.08523675054311752,
0.05415114015340805,
-0.05259782448410988,
0.035822756588459015,
-0.08229724317789078,
-0.14959512650966644,
0.0049674129113554955,
-0.011496098712086678,
0.02218683622777462,
0.033408381044864655,
0.02340984344482422,
0.07708325982093811,
0.008837604895234108,
-0.022471444681286812,
-0.1983698159456253,
-0.012315113097429276,
0.11230982095003128,
-0.02593814954161644,
0.06965457648038864,
-0.09057778865098953,
0.13518796861171722,
-0.08319486677646637,
-0.06890720129013062,
0.012116601690649986,
0.015436341054737568,
-0.0885424092411995,
-0.12276193499565125,
-0.05774085223674774,
0.047963570803403854,
0.019649352878332138,
0.05916406586766243,
-0.020704349502921104,
0.16659599542617798,
-0.14786915481090546,
0.08194085955619812,
0.10407537966966629,
-0.22281870245933533,
-0.03010695055127144,
0.1408689171075821,
0.0018247009720653296,
0.16099515557289124,
-0.07494129240512848,
-0.0863414853811264,
0.06536421179771423,
0.04412120580673218,
-0.006048910319805145,
-0.00286790425889194,
-0.11847221851348877,
0.033697016537189484,
-0.14635410904884338,
-0.053138356655836105,
0.14620007574558258,
0.02642156183719635,
-0.027287980541586876,
-0.056921932846307755,
-0.1053231805562973,
-0.3769792914390564,
0.03177376464009285,
-0.012785347178578377,
-0.0661994144320488,
0.047150976955890656,
-0.005609128158539534,
-0.02155495248734951,
-0.012858448550105095,
-0.07616014778614044,
-0.03136397898197174,
0.06963212043046951,
0.03664848953485489,
0.023381832987070084,
0.015835681930184364,
0.10520109534263611,
-0.12025459855794907,
-0.03690791130065918,
-0.04172823578119278,
-0.033749304711818695,
-0.051436975598335266,
0.0043213036842644215,
-0.08045496046543121,
0.17529486119747162,
0.07212009280920029,
0.12119518965482712,
-0.1408023089170456,
0.12113483995199203,
-0.006528686266392469,
0.05680316686630249,
-0.034488119184970856,
0.041902247816324234,
-0.13318930566310883,
0.10207467526197433,
0.000506972661241889,
0.14618729054927826,
0.02015523426234722,
-0.049600787460803986,
-0.07486286014318466,
-0.014539851807057858,
0.1451796442270279,
-0.006768701132386923,
-0.08423684537410736,
0.02129560336470604,
-0.13363869488239288,
-0.020544014871120453,
0.07066649198532104,
-0.07140689343214035,
0.025538576766848564,
0.029939480125904083,
-0.047216497361660004,
-0.039827704429626465,
0.1218591183423996,
-0.04194959998130798,
-0.023030521348118782,
-0.029611941426992416,
-0.09786517173051834,
-0.015538124367594719,
-0.10161810368299484,
-0.14863821864128113,
0.0548352375626564,
-0.17741242051124573,
-0.0028082828503102064,
-0.05785254389047623,
-0.04258839786052704,
0.030743636190891266,
0.0034703330602496862,
-0.08385062217712402,
0.05926445499062538,
-0.07833510637283325,
-0.15133604407310486,
-0.01564176008105278,
0.016438085585832596,
0.023172849789261818,
-0.023578688502311707,
0.0919594094157219,
0.03388970345258713,
0.12494555115699768,
-0.16186225414276123,
-0.001533440430648625,
0.003979560453444719,
0.07456780225038528,
0.04293952137231827,
0.12493027001619339,
-0.10671421140432358,
-0.034834519028663635,
-0.06460735201835632,
-0.06636051088571548,
-0.1359342634677887,
-0.014761433005332947,
0.13904517889022827,
0.09283051639795303,
-0.16311649978160858,
-0.0009910486405715346,
0.073461152613163,
-0.0002469424798619002,
-0.07743397355079651,
0.150736004114151,
-0.049252916127443314,
0.11208216845989227,
-0.059591811150312424,
0.09697217494249344,
0.22215056419372559,
-0.11880095303058624,
-0.0068501573987305164,
0.11727163195610046,
0.07287828624248505,
0.005310588050633669,
0.011952096596360207,
0.06387332081794739,
-0.08026789873838425,
0.03614973649382591,
0.08580944687128067,
0.03332332894206047,
-0.056764815002679825,
-0.06829635798931122,
-0.036852966994047165,
-0.04758831858634949,
0.12905003130435944,
0.017584340646862984,
0.012249191291630268,
-0.06561900675296783,
-0.09558135271072388,
0.143845796585083,
0.10572219640016556,
-0.014454548247158527,
0.0045013134367764,
-0.11862430721521378,
0.0135056022554636,
-0.05098523944616318,
0.01993166282773018,
-0.12159255892038345,
0.002333344891667366,
0.07449851185083389,
-0.0016263993456959724,
0.012529187835752964,
0.05880476161837578,
0.06303709000349045,
0.022139238193631172,
-0.04897989332675934,
0.009196851402521133,
-0.05170193314552307,
0.014031521044671535,
-0.09134367108345032,
-0.05557931959629059,
-0.01285315677523613,
-0.01563110202550888,
0.20126944780349731,
-0.1187497079372406,
0.04272245988249779,
0.12145345658063889,
-0.0010137788485735655,
-0.014662303030490875,
-0.04059963300824165,
-0.060168683528900146,
0.10974374413490295,
-0.015286832116544247,
-0.0403757281601429,
0.03153783828020096,
0.027245210483670235,
-0.05356745794415474,
-0.1466933637857437,
-0.11661176383495331,
0.056100934743881226,
0.13047540187835693,
0.0678299069404602,
-0.062154535204172134,
-0.03653307259082794,
-0.02258886583149433,
-0.034939538687467575,
0.06014026328921318,
-0.059940073639154434,
0.0038780684117227793,
-0.009529940783977509,
0.06138020381331444,
-0.10774137824773788,
-0.0373099260032177,
0.06156913936138153,
-0.01886286772787571,
-0.054377295076847076,
0.1087731346487999,
0.02496560662984848,
-0.07780247181653976,
0.07286867499351501,
0.08599350601434708,
-0.14405789971351624,
0.10834873467683792,
0.0052656750194728374,
-0.013638402335345745,
-0.10589032620191574,
0.16338999569416046,
0.014409314841032028,
0.10321232676506042,
-0.14674994349479675,
0.09326168894767761,
0.0021023035515099764,
0.013987313024699688,
0.06812869757413864,
-0.18934231996536255,
-0.00502322381362319,
-0.01996769569814205,
-0.0807976946234703,
-0.06848251074552536,
0.009399297647178173,
0.004534625448286533,
0.026592671871185303,
0.00625268742442131,
0.06968360394239426,
0.1308663785457611,
-0.014757325872778893,
-0.06944165378808975,
0.16469813883304596,
-0.22596994042396545,
-0.22066622972488403,
-0.22637031972408295,
0.012925352901220322,
-0.10213476419448853,
-0.026665788143873215,
-0.03444693610072136,
-0.07242286205291748,
0.03871549665927887,
-0.09274096041917801,
-0.044385604560375214,
-0.02160717360675335,
0.0119753023609519,
0.029175829142332077,
0.007948506623506546,
0.16779442131519318,
-0.08737336844205856,
0.02646205574274063,
0.05756248161196709,
-0.037304334342479706,
0.11383375525474548,
-0.06529092788696289,
-0.004916870500892401,
0.11006875336170197,
-0.018813638016581535,
0.020981620997190475,
0.014244136400520802,
0.33642321825027466,
0.010768907144665718,
0.0447690412402153,
0.09196527302265167,
0.019820919260382652,
0.06501765549182892,
0.10041987895965576,
0.012529752217233181,
-0.0874878466129303,
0.0690324530005455,
0.04981809854507446,
-0.09978985041379929,
-0.11783795803785324,
-0.03971865028142929,
-0.07859805971384048,
0.0048211319372057915,
0.07461197674274445,
0.07032687216997147,
0.08506621420383453,
0.06906560808420181,
0.03683885931968689,
0.08839930593967438,
-0.0008640201995149255,
-0.004058960825204849,
0.12257933616638184,
-0.022785460576415062,
0.07670586556196213,
-0.015342967584729195,
0.03159911558032036,
0.051197394728660583,
0.13124999403953552,
0.06343226879835129,
-0.08392397314310074,
0.011704872362315655,
0.05680365487933159,
0.3002788722515106,
-0.006184941623359919,
0.08252235502004623,
-0.0694844201207161,
-0.013935753144323826,
-0.00936900358647108,
-0.03694760426878929,
-0.06996672600507736,
0.05150884762406349,
0.013060919009149075,
0.04930111765861511,
0.0034041455946862698,
-0.024562936276197433,
0.07991008460521698,
0.08419626951217651,
0.17195546627044678,
-0.2748368978500366,
-0.09531981498003006,
-0.0034768027253448963,
0.12043199688196182,
-0.09449812024831772,
0.010957406833767891,
0.23007959127426147,
0.02392749860882759,
-0.10115979611873627,
-0.020969169214367867,
0.029907481744885445,
0.01232626661658287,
0.015104972757399082,
0.1261591613292694,
0.10915117710828781,
0.0029561170376837254,
0.0770280584692955,
-0.2889924645423889,
0.03406522050499916,
0.0612494982779026,
0.01648484356701374,
-0.034598734229803085,
0.011222575791180134,
-0.06977985054254532,
-0.06655967980623245,
0.03458268567919731,
0.015120447613298893,
0.1441056877374649,
-0.28829488158226013,
-0.06104530394077301,
-0.010720375925302505,
0.12389413267374039,
0.05735940858721733,
0.04250615835189819,
0.019709963351488113,
0.04762941226363182,
0.08394866436719894,
0.13128617405891418,
-0.023467620834708214,
-0.10845143347978592,
0.005282137077301741,
0.1596507877111435,
-0.14083118736743927,
-0.06193690001964569,
-0.049066733568906784,
-0.047994464635849,
0.028280720114707947,
-0.1541026085615158,
-0.04972391948103905,
-0.06088952347636223,
0.016404516994953156,
0.14553089439868927,
-0.03196023032069206,
0.015864696353673935,
-0.02289947122335434,
0.01637299731373787,
-0.03940080478787422,
-0.07908028364181519,
0.12301791459321976,
-0.0409061573445797,
-0.12497896701097488,
-0.03731251135468483,
0.12365078926086426,
0.055943313986063004,
-0.006954794749617577,
-0.09446190297603607,
-0.028071748092770576,
0.007796058431267738,
-0.14015305042266846,
0.011770731769502163,
0.08023582398891449,
-0.06865604966878891,
0.07314804196357727,
-0.1049056202173233,
0.2240854799747467,
-0.024228371679782867,
0.09412028640508652,
0.07132501900196075,
0.3071337938308716,
-0.10399491339921951,
0.0291880052536726,
0.067757248878479,
-0.014505579136312008,
-0.2711692750453949,
0.029810495674610138,
0.05051774904131889,
0.047653328627347946,
-0.022029604762792587,
-0.1579478234052658,
0.017102127894759178,
0.06342151015996933,
0.0074049923568964005,
0.14221663773059845,
-0.32594960927963257,
-0.07634317129850388,
0.02877959981560707,
0.025932304561138153,
0.12787574529647827,
-0.05162520334124565,
0.011812222190201283,
0.01865217834711075,
-0.03454221040010452,
0.1472434252500534,
-0.10535560548305511,
0.10739891976118088,
-0.0076850601471960545,
0.04127296805381775,
0.008445052430033684,
-0.04165693745017052,
0.1596582680940628,
-0.009404713287949562,
0.08705945312976837,
0.008892865851521492,
-0.05982821434736252,
0.06028256565332413,
-0.08470658957958221,
0.02618570066988468,
-0.06644842028617859,
0.08366316556930542,
-0.06612414121627808,
0.01618139259517193,
-0.07159578800201416,
-0.013761729001998901,
-0.0713677778840065,
-0.058548204600811005,
-0.12296337634325027,
0.08107384294271469,
-0.0011565444292500615,
-0.034523434937000275,
-0.03236624598503113,
0.0463973805308342,
0.05970676243305206,
0.4648545980453491,
-0.08608736097812653,
-0.05680585280060768,
0.07230587303638458,
0.10849498212337494,
-0.0173831544816494,
0.09555526822805405,
-0.14152409136295319,
0.03637157008051872,
0.11165837943553925,
0.0032209204509854317,
0.12282878160476685,
0.08631858974695206,
-0.10844928026199341,
-0.0037738466635346413,
0.0417560413479805,
-0.13678334653377533,
-0.08445664495229721,
-0.02781590446829796,
-0.004006532020866871,
-0.11098576337099075,
-0.014680985361337662,
0.09606795758008957,
-0.03968826308846474,
0.05405222252011299,
0.029674243181943893,
0.050275690853595734,
-0.13072209060192108,
0.1399947702884674,
0.04395652934908867,
0.08201206475496292,
-0.07537345588207245,
0.07708587497472763,
0.04387863725423813,
0.019049033522605896,
0.048647768795490265,
-0.015735067427158356,
-0.09303226321935654,
0.006426266394555569,
-0.04083925113081932,
-0.11177828907966614,
0.12000752240419388,
-0.02049705944955349,
-0.030077744275331497,
-0.08744953572750092,
0.007634937297552824,
0.06426335126161575,
0.05343160778284073,
0.09301846474409103,
-0.01746264658868313,
0.01979856938123703,
-0.10930361598730087,
0.07939251512289047,
-0.036335818469524384,
0.02254968136548996,
-0.13443821668624878,
0.07586885988712311,
-0.03048333153128624,
0.06292670965194702,
-0.021998992189764977,
-0.012667212635278702,
-0.22143584489822388,
0.014202523045241833,
-0.02081460691988468,
-0.004357951693236828,
0.03862457722425461,
0.02484467439353466,
0.01671709679067135,
0.03945622220635414,
-0.04088917374610901,
0.028971116989850998,
-0.03175773844122887,
-0.05943993479013443,
0.030164550989866257,
0.004912189207971096,
-0.03931226581335068,
-0.05160390958189964,
0.05729808658361435,
-0.10473078489303589,
0.03635600581765175,
0.009158221073448658,
-0.05857793986797333,
0.05410732328891754,
0.0413614958524704,
0.01909421943128109,
0.09453469514846802,
0.053765036165714264,
0.046443480998277664,
-0.059018176048994064,
0.031265582889318466,
-0.03246711194515228,
-0.004589755088090897,
0.041267190128564835,
0.11856943368911743,
-0.04591933265328407,
-0.05818842723965645,
-0.1540527492761612,
-0.004703789949417114,
-0.041105788201093674,
0.0476653166115284,
0.15471167862415314,
0.10249525308609009,
0.07724180072546005,
-0.07783109694719315,
-0.022713392972946167,
-0.13975197076797485,
-0.08603484183549881,
0.05119115859270096,
-0.05143296718597412,
-0.032943952828645706,
-0.050851475447416306,
0.07087905704975128,
-0.008432888425886631,
0.14570625126361847,
-0.06163178011775017,
-0.11827366799116135,
-0.04941552132368088,
-0.17402616143226624,
-0.12539584934711456,
0.002307258080691099,
0.24652782082557678,
0.023001018911600113,
-0.03917231038212776,
-0.07386667281389236,
0.005983154289424419,
0.07115837186574936,
0.16051965951919556,
0.026468550786376,
0.11270948499441147,
-0.09956587105989456,
0.10440719872713089,
0.034812990576028824,
-0.07109985500574112,
0.0864926353096962,
0.3119119107723236,
-0.07661689817905426,
0.03827730193734169,
-0.12454145401716232,
0.07940937578678131,
0.012104257941246033,
-0.13865362107753754,
0.013562889769673347,
-0.036511946469545364,
-0.1526803821325302,
-0.10401199012994766,
0.021273670718073845,
-0.06599308550357819,
-0.1820554882287979,
-0.030223404988646507,
-0.09893838316202164,
-0.0550479032099247,
0.11106511950492859,
0.03387986496090889,
-0.01653383858501911,
0.2051052749156952,
-0.07505934685468674,
0.04071968421339989,
-0.01171939354389906,
-0.011962340213358402,
-0.009147333912551403,
-0.03504681959748268,
-0.10448984056711197,
0.1388232409954071,
0.03345785290002823,
0.09830757230520248,
0.0053544859401881695,
0.07562510669231415,
0.0298940259963274,
-0.039348773658275604,
-0.044371869415044785,
-0.018252892419695854,
0.01512606255710125,
-0.05837094783782959,
0.1190296858549118,
0.04494171217083931,
-0.08668908476829529,
-0.08051984012126923,
0.0033494001254439354,
-0.08221173286437988,
-0.03652910515666008,
-0.1460893601179123,
0.20463977754116058,
-0.035380348563194275,
0.11755850166082382,
-0.011778801679611206,
-0.06845629215240479,
-0.10983201861381531,
0.13847801089286804,
0.1421661674976349,
-0.11987810581922531,
0.0010649558389559388,
0.09329685568809509,
-0.004757906775921583,
-0.08727501332759857,
0.14100658893585205,
0.0870276689529419,
-0.019397974014282227,
0.02741924859583378,
-0.009887965396046638,
-0.02616746351122856,
-0.014207511208951473,
-0.007959026843309402,
-0.02015502005815506,
0.014457334764301777,
0.034323520958423615,
-0.14064325392246246,
-0.03898324817419052,
-0.07274904102087021,
-0.08446308225393295,
0.18084414303302765,
-0.15662987530231476,
-0.07839803397655487,
-0.028509611263871193,
-0.05866457149386406,
-0.10989672690629959,
0.02999497763812542,
-0.11029229313135147,
0.06942048668861389,
0.05785766988992691,
-0.05375833064317703,
0.010138568468391895,
-0.041577260941267014,
0.004580279346555471,
0.03846810758113861,
0.07661933451890945,
-0.016295144334435463,
0.050018612295389175,
0.12694716453552246,
-0.016251439228653908,
-0.06127054989337921,
0.10767858475446701,
0.030748512595891953,
-0.041710689663887024,
-0.15851686894893646,
0.027643118053674698,
-0.029909053817391396,
0.11768242716789246,
0.03397572040557861,
-0.07563362270593643,
-0.01115536317229271,
-0.20627373456954956,
0.0005931404302828014,
-0.14321504533290863,
-0.08704759925603867,
-0.0643635019659996,
0.10747678577899933,
0.17218874394893646,
-0.056415654718875885,
0.01318585779517889,
-0.03320974111557007,
0.02558070793747902,
-0.06034955382347107,
0.0783754512667656,
-0.0003692414320539683,
-0.13232621550559998,
0.06138229742646217,
-0.05056034401059151,
0.015999173745512962,
-0.2927008271217346,
-0.010729788802564144,
0.01971137709915638,
-0.02581045776605606,
-0.030403142794966698,
0.1598476618528366,
0.020657381042838097,
0.054707784205675125,
-0.05518721044063568,
-0.25279372930526733,
-0.0644865557551384,
0.13134662806987762,
-0.0012684966204687953,
-0.06044445559382439
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
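For illustration only (not part of the original card), the settings listed above map onto a `transformers` `BitsAndBytesConfig` roughly as follows; this is a sketch of the configuration, not the exact object used during training:

```python
import torch
from transformers import BitsAndBytesConfig

# 4-bit NF4 quantization with nested (double) quantization and fp16 compute,
# mirroring the values listed in the quantization config above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
    llm_int8_threshold=6.0,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
)
```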
| {"library_name": "peft"} | null | genies-models/llama-7b-change_my_view | [
"peft",
"region:us"
] | 2023-11-11T02:29:51+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
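A minimal loading sketch (added for clarity, not taken from the card): the base Llama-13B checkpoint name below is an assumption, since the card does not say which weights the adapter was trained on; the adapter id comes from this repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

BASE_MODEL = "huggyllama/llama-13b"  # assumption: base checkpoint not named in the card
ADAPTER_ID = "genies-models/llama-13b-pursue_goals"

# Quantize the base model the same way it was quantized during training
# (4-bit NF4, double quantization, fp16 compute).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    quantization_config=bnb_config,
    device_map="auto",
)
# Attach the PEFT adapter weights on top of the quantized base model.
model = PeftModel.from_pretrained(base, ADAPTER_ID)
model.eval()
```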
| {"library_name": "peft"} | null | genies-models/llama-13b-pursue_goals | [
"peft",
"region:us"
] | 2023-11-11T02:30:12+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
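As a rough sketch of how a PEFT adapter like this is typically trained on top of a 4-bit base model (QLoRA-style), the snippet below shows the usual `peft` setup; the base checkpoint and every LoRA hyperparameter here are illustrative assumptions, since the card records only the quantization config.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

BASE_MODEL = "openlm-research/open_llama_3b"  # assumption: base checkpoint not named in the card

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL, quantization_config=bnb_config, device_map="auto"
)
# Prepare the quantized model for training (casts norms, enables input grads).
model = prepare_model_for_kbit_training(model)

# Hypothetical LoRA hyperparameters -- the card does not list the values actually used.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```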
| {"library_name": "peft"} | null | genies-models/openllama-3b-personality_traits | [
"peft",
"region:us"
] | 2023-11-11T02:30:37+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
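The long float lists in these rows appear to be fixed-length embedding vectors stored alongside each model card. As a hypothetical illustration only (the dataset does not ship any comparison code), the sketch below shows how two such vectors could be compared with cosine similarity; the variable names and the short placeholder vectors are mine, not part of the data.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholders standing in for two rows' embedding lists from this dump.
emb_a = [-0.052170995622873306, 0.04638637602329254, -0.002228866796940565]
emb_b = [-0.052170995622873306, 0.04638637602329254, -0.002228866796940565]
print(cosine_similarity(emb_a, emb_b))  # identical vectors -> 1.0
```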
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
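As a hedged illustration, the quantization settings listed in this card map directly onto the `BitsAndBytesConfig` class from `transformers`. The sketch below simply mirrors the values recorded above; it is not taken from the adapter repository itself.

```python
import torch
from transformers import BitsAndBytesConfig

# Mirrors the quantization config recorded in the card above.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)
```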
| {"library_name": "peft"} | null | genies-models/llama-30b-counterfactual_python | [
"peft",
"region:us"
] | 2023-11-11T02:30:49+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
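A minimal sketch of how an adapter like this one might be attached to a base model with `peft`. The adapter id is taken from the repository path in this row; the base checkpoint name is an assumption, since the card does not state which Llama-7B weights were used.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

base_id = "huggyllama/llama-7b"  # assumption: the card does not name the base checkpoint
adapter_id = "genies-models/llama-7b-us_history_textbook"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb_config, device_map="auto"
)
# Attaches the PEFT adapter weights on top of the quantized base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
```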
| {"library_name": "peft"} | null | genies-models/llama-7b-us_history_textbook | [
"peft",
"region:us"
] | 2023-11-11T02:31:46+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft | ## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
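These cards pin only the PEFT version (0.5.0). A small runtime check like the one below can confirm that the installed library matches what the adapters were saved with; other package versions are not recorded in the cards, so they are deliberately left unpinned here.

```python
import peft

# The cards record only the PEFT version used when the adapters were saved.
expected = "0.5.0"
if peft.__version__ != expected:
    print(f"Warning: adapter saved with PEFT {expected}, running {peft.__version__}")
```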
| {"library_name": "peft"} | null | genies-models/llama-13b-mmlu | [
"peft",
"region:us"
] | 2023-11-11T02:32:16+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft |
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
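As a usage sketch for this adapter (`genies-models/llama-13b-us_history_fiction`, the repository id given in the row metadata below), the listed config maps onto a `BitsAndBytesConfig` as shown. The base checkpoint name is an assumption inferred from the adapter id; the authoritative value is recorded in the adapter's `adapter_config.json`.

```python
# Illustrative only: load the adapter on top of a 4-bit quantized LLaMA-13B base.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
    llm_int8_threshold=6.0,
)

base_model = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-13b",  # assumption: a LLaMA-13B base, inferred from the adapter id
    quantization_config=bnb_config,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, "genies-models/llama-13b-us_history_fiction")
model.eval()  # ready for inference with the adapter applied
```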
| {"library_name": "peft"} | null | genies-models/llama-13b-us_history_fiction | [
"peft",
"region:us"
] | 2023-11-11T02:32:49+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |
null | null | peft |
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
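Rather than guessing the base checkpoint for this adapter, its stored configuration can be queried directly. A minimal sketch, assuming the adapter id given in the row metadata below (`genies-models/openllama-3b-gender_bias`):

```python
# Inspect the adapter's own config to recover the base model it was trained against.
from peft import PeftConfig

adapter_id = "genies-models/openllama-3b-gender_bias"
peft_config = PeftConfig.from_pretrained(adapter_id)

print(peft_config.base_model_name_or_path)  # base checkpoint recorded in adapter_config.json
print(peft_config.peft_type)                # adapter type, e.g. LORA
```

The returned `base_model_name_or_path` can then be passed to `AutoModelForCausalLM.from_pretrained` together with the quantization config listed above.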
| {"library_name": "peft"} | null | genies-models/openllama-3b-gender_bias | [
"peft",
"region:us"
] | 2023-11-11T02:33:18+00:00 | [] | [] | TAGS
#peft #region-us
| ## Training procedure
The following 'bitsandbytes' quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16
### Framework versions
- PEFT 0.5.0
| [
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
"TAGS\n#peft #region-us \n",
"## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16",
"### Framework versions\n\n\n- PEFT 0.5.0"
] | [
9,
163,
11
] | [
"passage: TAGS\n#peft #region-us \n## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- quant_method: bitsandbytes\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: float16### Framework versions\n\n\n- PEFT 0.5.0"
] | [
-0.052170995622873306,
0.04638637602329254,
-0.002228866796940565,
0.12587358057498932,
0.09519550204277039,
0.0600481741130352,
0.10608749836683273,
0.12907910346984863,
0.030399560928344727,
0.08949745446443558,
0.11713548749685287,
0.04174109548330307,
0.061590272933244705,
0.15933413803577423,
-0.013215559534728527,
-0.00870309304445982,
0.04744568094611168,
0.0017413088353350759,
-0.0256193894892931,
0.08726111054420471,
0.04637008160352707,
-0.05632777512073517,
0.03198761120438576,
-0.0936860740184784,
-0.14197169244289398,
-0.000977774616330862,
-0.006933287251740694,
0.026412004604935646,
0.03658851608633995,
0.030308807268738747,
0.06681910902261734,
0.011842764914035797,
-0.01992746815085411,
-0.22561807930469513,
-0.011325501836836338,
0.10931333899497986,
-0.024895185604691505,
0.0698515921831131,
-0.09559076279401779,
0.13339440524578094,
-0.08422335982322693,
-0.051820602267980576,
0.015994668006896973,
0.014835527166724205,
-0.0893593430519104,
-0.09642457216978073,
-0.044293515384197235,
0.04310377314686775,
0.030483758077025414,
0.05993334576487541,
-0.014119904488325119,
0.17196474969387054,
-0.15130110085010529,
0.0900082141160965,
0.07799916714429855,
-0.20914553105831146,
-0.02864500693976879,
0.12620775401592255,
-0.022182848304510117,
0.16751492023468018,
-0.0821603313088417,
-0.0928761437535286,
0.06491231173276901,
0.03869080916047096,
-0.018066778779029846,
-0.0015547500224784017,
-0.11479837447404861,
0.030915888026356697,
-0.13364118337631226,
-0.04813624918460846,
0.14555932581424713,
0.03509735316038132,
-0.03541167080402374,
-0.046493805944919586,
-0.09799899905920029,
-0.3764220178127289,
0.025444475933909416,
-0.019241128116846085,
-0.066914401948452,
0.04486314579844475,
-0.02249828726053238,
-0.01497147511690855,
-0.0114058218896389,
-0.08907991647720337,
-0.02228786051273346,
0.07495085150003433,
0.03715614974498749,
0.026139622554183006,
0.006188852712512016,
0.11266672611236572,
-0.09897858649492264,
-0.03323763236403465,
-0.036225371062755585,
-0.028881169855594635,
-0.05456538870930672,
-0.006604235153645277,
-0.07022203505039215,
0.17528314888477325,
0.07525674998760223,
0.12666282057762146,
-0.12827250361442566,
0.1311921775341034,
-0.017891818657517433,
0.049590788781642914,
-0.03390548750758171,
0.035105880349874496,
-0.12452733516693115,
0.10138563066720963,
0.007189449388533831,
0.16058793663978577,
0.009220117703080177,
-0.04686558619141579,
-0.06686486303806305,
0.001404313137754798,
0.1462526172399521,
-0.0007741169538348913,
-0.10082004219293594,
0.01514110155403614,
-0.143125981092453,
-0.03643784299492836,
0.06489310413599014,
-0.07333685457706451,
0.009285308420658112,
0.035954903811216354,
-0.05005185306072235,
-0.035470712929964066,
0.10925997793674469,
-0.04522745683789253,
-0.026262326166033745,
-0.027128657326102257,
-0.09760791808366776,
-0.017379216849803925,
-0.10380303114652634,
-0.13263756036758423,
0.04999830946326256,
-0.18068750202655792,
0.00562055641785264,
-0.042586758732795715,
-0.05852135270833969,
0.02671758644282818,
0.011211113072931767,
-0.0801720842719078,
0.0641409307718277,
-0.10207898169755936,
-0.14261718094348907,
-0.02556646801531315,
0.015782896429300308,
0.027732979506254196,
-0.01964949443936348,
0.10067597031593323,
0.04901120811700821,
0.11884310096502304,
-0.15137265622615814,
-0.000906149041838944,
-0.0006713634938932955,
0.06834474205970764,
0.05941688269376755,
0.13060152530670166,
-0.10515986382961273,
-0.027969863265752792,
-0.06103983148932457,
-0.06348162889480591,
-0.13580144941806793,
-0.01634615659713745,
0.1407272070646286,
0.08765122294425964,
-0.162354975938797,
-0.01073814183473587,
0.07467684894800186,
-0.002643210580572486,
-0.07879126816987991,
0.15129150450229645,
-0.05682380124926567,
0.11437069624662399,
-0.04345925524830818,
0.07842165231704712,
0.23099873960018158,
-0.0996355414390564,
0.006504190154373646,
0.11197512596845627,
0.07360801845788956,
-0.000003583008265195531,
0.013187721371650696,
0.07264252007007599,
-0.10083498060703278,
0.03870057687163353,
0.07377462089061737,
0.03774886205792427,
-0.06504413485527039,
-0.07400774955749512,
-0.022752903401851654,
-0.05681690201163292,
0.1151171550154686,
0.02729841135442257,
0.014005406759679317,
-0.06971991807222366,
-0.08609434217214584,
0.1404680609703064,
0.11459258198738098,
-0.030378511175513268,
-0.005029883235692978,
-0.13212119042873383,
-0.0021202426869422197,
-0.04808016121387482,
0.024497391656041145,
-0.1225360855460167,
0.02185128629207611,
0.08327647298574448,
0.00787833146750927,
0.012623712420463562,
0.04495754465460777,
0.06096586957573891,
0.029019279405474663,
-0.0604269839823246,
0.002956473035737872,
-0.05536847561597824,
0.004059408791363239,
-0.08605807274580002,
-0.07381096482276917,
-0.0010516063775867224,
-0.014724265784025192,
0.19779662787914276,
-0.13052622973918915,
0.03565271571278572,
0.11289136111736298,
-0.005989561323076487,
-0.009503825567662716,
-0.04201732203364372,
-0.06623262912034988,
0.10975963622331619,
-0.012460143305361271,
-0.037024881690740585,
0.03980237618088722,
0.02601690962910652,
-0.061022430658340454,
-0.17260299623012543,
-0.0970475897192955,
0.05009652301669121,
0.12437541782855988,
0.06887579709291458,
-0.05821763351559639,
-0.03995196893811226,
-0.023590924218297005,
-0.03908832371234894,
0.06429712474346161,
-0.045532938092947006,
0.03026658482849598,
0.0026904530823230743,
0.06007077172398567,
-0.10919053852558136,
-0.04384881630539894,
0.06400182098150253,
-0.01284062210470438,
-0.04328686743974686,
0.1131523922085762,
0.03539156913757324,
-0.10713862627744675,
0.06380997598171234,
0.057952430099248886,
-0.14340658485889435,
0.1139671802520752,
-0.010076945647597313,
-0.021488331258296967,
-0.09941406548023224,
0.16643720865249634,
0.018506327643990517,
0.11286304146051407,
-0.1634303331375122,
0.09947452694177628,
-0.00929392408579588,
0.008983489125967026,
0.07010107487440109,
-0.1942894458770752,
-0.0023229599464684725,
-0.03061300702393055,
-0.08721230924129486,
-0.06861617416143417,
-0.015447620302438736,
-0.003656132612377405,
0.023634733632206917,
-0.0011160095455124974,
0.07114277780056,
0.1408311426639557,
-0.021699242293834686,
-0.08314504474401474,
0.18067345023155212,
-0.2228248417377472,
-0.2243027687072754,
-0.23713873326778412,
0.014357865788042545,
-0.10143844783306122,
-0.034064821898937225,
-0.05187778174877167,
-0.07830392569303513,
0.03843799605965614,
-0.08154495060443878,
-0.047854289412498474,
-0.010174937546253204,
0.00946576427668333,
0.05525780841708183,
0.022059176117181778,
0.17305299639701843,
-0.08067180961370468,
0.027929941192269325,
0.05004730448126793,
-0.03334664925932884,
0.1225728914141655,
-0.07749255001544952,
-0.0130006680265069,
0.11107799410820007,
-0.017344031482934952,
0.020386451855301857,
0.015911642462015152,
0.321786105632782,
0.012793739326298237,
0.031549349427223206,
0.07509331405162811,
0.004641542676836252,
0.05399017781019211,
0.08592698723077774,
0.015287249349057674,
-0.10515401512384415,
0.07365500181913376,
0.053789518773555756,
-0.09195578098297119,
-0.1337462067604065,
-0.03178740665316582,
-0.06295210868120193,
0.018377745524048805,
0.07818001508712769,
0.06705041229724884,
0.08702594041824341,
0.06877101212739944,
0.04080965742468834,
0.09042216837406158,
0.0002709537511691451,
-0.010677835904061794,
0.11289051920175552,
-0.018581749871373177,
0.07096980512142181,
-0.0141120171174407,
0.02626338228583336,
0.04935658350586891,
0.12695446610450745,
0.06418963521718979,
-0.07558903843164444,
0.024903111159801483,
0.05380476638674736,
0.2923724353313446,
-0.00511576933786273,
0.09942048788070679,
-0.07165701687335968,
-0.020087575539946556,
-0.012876059859991074,
-0.031602099537849426,
-0.08611224591732025,
0.04154687002301216,
0.009212172590196133,
0.05790695920586586,
-0.0025879312306642532,
-0.00050112244207412,
0.07788389921188354,
0.09517701715230942,
0.16684189438819885,
-0.2861690819263458,
-0.10541892051696777,
-0.011134089902043343,
0.11542210727930069,
-0.09569238871335983,
0.016318857669830322,
0.22510360181331635,
0.01965336687862873,
-0.10895603150129318,
-0.031794480979442596,
0.0305477287620306,
-0.004160500131547451,
0.01549921091645956,
0.13027654588222504,
0.11410193145275116,
0.0013551336014643312,
0.08492128551006317,
-0.30027419328689575,
0.03473859280347824,
0.05703248083591461,
0.040042631328105927,
-0.03116016462445259,
0.003734297351911664,
-0.07034774869680405,
-0.08240098506212234,
0.038768261671066284,
0.004138266667723656,
0.1705280989408493,
-0.2976781725883484,
-0.07360395789146423,
-0.00185823580250144,
0.1271347999572754,
0.05303133279085159,
0.04872012883424759,
0.02403664030134678,
0.04942479729652405,
0.08176594227552414,
0.08706773817539215,
-0.02611609175801277,
-0.11035197973251343,
-0.004661462735384703,
0.13897009193897247,
-0.12668536603450775,
-0.05941249430179596,
-0.05124782398343086,
-0.022857125848531723,
0.03539132699370384,
-0.15729300677776337,
-0.04396966099739075,
-0.05650247633457184,
0.03282100334763527,
0.1437554955482483,
-0.034790076315402985,
0.0006096545839682221,
-0.012934640981256962,
0.015886543318629265,
-0.032133374363183975,
-0.08068639785051346,
0.10920129716396332,
-0.044809311628341675,
-0.13727150857448578,
-0.045596521347761154,
0.1377175748348236,
0.0731605812907219,
-0.00359351746737957,
-0.09041156619787216,
-0.050021663308143616,
0.031252212822437286,
-0.1371932029724121,
0.01997753418982029,
0.08576954901218414,
-0.07234665006399155,
0.0698034018278122,
-0.11751603335142136,
0.2458948791027069,
-0.0524151511490345,
0.0896567553281784,
0.0709720253944397,
0.31162014603614807,
-0.08235272765159607,
0.02873161807656288,
0.09803200513124466,
-0.026654057204723358,
-0.26190292835235596,
0.029979633167386055,
0.05638238042593002,
0.04938185587525368,
-0.02843460999429226,
-0.18719743192195892,
0.036363113671541214,
0.06459816545248032,
0.016497470438480377,
0.13878561556339264,
-0.3157224655151367,
-0.0721718817949295,
0.04020849987864494,
0.047762200236320496,
0.11962796002626419,
-0.04890049248933792,
0.011267954483628273,
0.010984989814460278,
-0.016328759491443634,
0.16383521258831024,
-0.08632995188236237,
0.11082419753074646,
-0.012934541329741478,
0.02104428969323635,
0.010682103224098682,
-0.040459323674440384,
0.1487898826599121,
-0.003343862947076559,
0.09072566032409668,
0.025223324075341225,
-0.06733614206314087,
0.06636755168437958,
-0.07327944040298462,
0.011974047869443893,
-0.05378776043653488,
0.09185262024402618,
-0.045282598584890366,
0.0010139745427295566,
-0.06246977299451828,
-0.024063177406787872,
-0.06749105453491211,
-0.06296666711568832,
-0.10791633278131485,
0.08884958922863007,
0.000746204168535769,
-0.02908109501004219,
-0.04201129451394081,
0.054056353867053986,
0.035463664680719376,
0.4493359923362732,
-0.05227240175008774,
-0.053069330751895905,
0.08815151453018188,
0.09277763962745667,
-0.01794450543820858,
0.09321408718824387,
-0.12357188761234283,
0.04079055041074753,
0.12765155732631683,
0.004561528097838163,
0.13461217284202576,
0.08197237551212311,
-0.10337455570697784,
-0.004654042888432741,
0.042563386261463165,
-0.13127627968788147,
-0.08014523237943649,
-0.03295697271823883,
-0.01872093789279461,
-0.1190016120672226,
-0.015042198821902275,
0.09247755259275436,
-0.03084845095872879,
0.053225692361593246,
0.02725985273718834,
0.04755283519625664,
-0.13321809470653534,
0.15661577880382538,
0.03548196330666542,
0.08438514173030853,
-0.08233853429555893,
0.0957811176776886,
0.030104901641607285,
0.004610141273587942,
0.04891415685415268,
-0.02637943997979164,
-0.10149519145488739,
0.019178146496415138,
-0.03032047301530838,
-0.08386382460594177,
0.12455578148365021,
-0.03086884878575802,
-0.04816962778568268,
-0.08449167013168335,
0.00968680065125227,
0.06828322261571884,
0.05217857286334038,
0.10059580206871033,
-0.031775325536727905,
0.02660980448126793,
-0.12699167430400848,
0.08317403495311737,
-0.024292631074786186,
0.012828993611037731,
-0.1356390118598938,
0.06760798394680023,
-0.02659040130674839,
0.057388827204704285,
-0.01863805763423443,
-0.01973152346909046,
-0.2173939198255539,
0.02618231251835823,
-0.025557123124599457,
0.0038494919426739216,
0.04102040454745293,
0.027889033779501915,
0.026005011051893234,
0.05082876980304718,
-0.035126328468322754,
0.027516542002558708,
-0.031932324171066284,
-0.05281057208776474,
0.04515986144542694,
-0.001467780559323728,
-0.02854740433394909,
-0.051517583429813385,
0.06076597422361374,
-0.11085972934961319,
0.03777240961790085,
0.01833501085639,
-0.05718983709812164,
0.07385632395744324,
0.034146055579185486,
0.025609485805034637,
0.09601592272520065,
0.05744863301515579,
0.03817358240485191,
-0.06451597064733505,
0.03407379239797592,
-0.0191631019115448,
-0.006845394615083933,
0.040695060044527054,
0.11625181883573532,
-0.04832686111330986,
-0.06327467411756516,
-0.1486724466085434,
-0.010981161147356033,
-0.048681098967790604,
0.043675489723682404,
0.14957195520401,
0.10375252366065979,
0.09278536587953568,
-0.08544762432575226,
-0.0226217620074749,
-0.14531907439231873,
-0.08078242838382721,
0.060537077486515045,
-0.044578637927770615,
-0.05809058994054794,
-0.04991595074534416,
0.072527676820755,
-0.014007162302732468,
0.13561315834522247,
-0.08526904881000519,
-0.1192963719367981,
-0.05781520903110504,
-0.2013944536447525,
-0.12102537602186203,
0.006627216935157776,
0.2745034396648407,
0.03526143729686737,
-0.04139144718647003,
-0.07936355471611023,
0.0031785741448402405,
0.07116973400115967,
0.16906431317329407,
0.0312589667737484,
0.110907644033432,
-0.11192750185728073,
0.10478517413139343,
0.039009008556604385,
-0.05691129341721535,
0.10796307027339935,
0.3328118920326233,
-0.08219370990991592,
0.02072407677769661,
-0.10421788692474365,
0.10153460502624512,
0.002255996223539114,
-0.13901546597480774,
0.010607997886836529,
-0.03188878297805786,
-0.16198381781578064,
-0.10664700716733932,
0.016538817435503006,
-0.07478190958499908,
-0.1838442087173462,
-0.017796620726585388,
-0.11684910207986832,
-0.07336822152137756,
0.10155867785215378,
0.040450263768434525,
-0.03009716235101223,
0.20794062316417694,
-0.07158278673887253,
0.04586811363697052,
-0.003929977770894766,
-0.016108684241771698,
-0.01862870156764984,
-0.03230820968747139,
-0.10202806442975998,
0.13945409655570984,
0.021661214530467987,
0.10333898663520813,
-0.0020766255911439657,
0.07358334958553314,
0.037272170186042786,
-0.0280001163482666,
-0.04698936268687248,
-0.011088578961789608,
0.0020344743970781565,
-0.05577605590224266,
0.11701337993144989,
0.05259020999073982,
-0.0886714980006218,
-0.07409586012363434,
-0.009580516256392002,
-0.07335926592350006,
-0.038215573877096176,
-0.1547286957502365,
0.2329166978597641,
-0.030917182564735413,
0.12437079846858978,
0.003149752039462328,
-0.06306970119476318,
-0.09483032673597336,
0.14517559111118317,
0.12407895922660828,
-0.11817460507154465,
-0.009399796836078167,
0.09650255739688873,
-0.005866600200533867,
-0.09527475386857986,
0.14510999619960785,
0.08037827908992767,
-0.02563084289431572,
0.025037124752998352,
-0.022733965888619423,
-0.021453559398651123,
-0.015347006730735302,
0.015540610998868942,
-0.030597979202866554,
0.029416168108582497,
0.042662251740694046,
-0.1463923156261444,
-0.03498699516057968,
-0.08044791221618652,
-0.08698305487632751,
0.18581046164035797,
-0.14277657866477966,
-0.07864214479923248,
-0.04254341498017311,
-0.08468645066022873,
-0.10728199034929276,
0.021719660609960556,
-0.10976678878068924,
0.06724512577056885,
0.05545623227953911,
-0.05225260928273201,
-0.0009250067523680627,
-0.03164041042327881,
0.0032662637531757355,
0.03487436845898628,
0.07120268791913986,
-0.01732550375163555,
0.05975370481610298,
0.12397127598524094,
-0.019730931147933006,
-0.04827846959233284,
0.11225178092718124,
0.021490709856152534,
-0.035946041345596313,
-0.14605534076690674,
0.03688865900039673,
-0.022579049691557884,
0.12442220002412796,
0.032534532248973846,
-0.05697030946612358,
-0.014126471243798733,
-0.2196122705936432,
-0.008397997356951237,
-0.1463952362537384,
-0.07968004047870636,
-0.06731394678354263,
0.10560601949691772,
0.17703479528427124,
-0.055034078657627106,
0.019445886835455894,
-0.03590158373117447,
0.04178448021411896,
-0.05044577643275261,
0.07704321295022964,
-0.0026474366895854473,
-0.1442994624376297,
0.05290527641773224,
-0.054687224328517914,
0.009347786195576191,
-0.3066256046295166,
-0.009464235045015812,
0.024085188284516335,
-0.03251948952674866,
-0.03590572252869606,
0.16143204271793365,
0.021979082375764847,
0.06882719695568085,
-0.05558779463171959,
-0.25186118483543396,
-0.07091144472360611,
0.13138745725154877,
-0.0019314914243295789,
-0.07031361013650894
] |