| Column | Type | Values |
|---|---|---|
| sha | null | |
| last_modified | null | |
| library_name | stringclasses | 154 values |
| text | stringlengths | 1 to 900k |
| metadata | stringlengths | 2 to 348k |
| pipeline_tag | stringclasses | 45 values |
| id | stringlengths | 5 to 122 |
| tags | sequencelengths | 1 to 1.84k |
| created_at | stringlengths | 25 to 25 |
| arxiv | sequencelengths | 0 to 201 |
| languages | sequencelengths | 0 to 1.83k |
| tags_str | stringlengths | 17 to 9.34k |
| text_str | stringlengths | 0 to 389k |
| text_lists | sequencelengths | 0 to 722 |
| processed_texts | sequencelengths | 1 to 723 |
| tokens_length | sequencelengths | 1 to 723 |
| input_texts | sequencelengths | 1 to 61 |
| embeddings | sequencelengths | 768 to 768 |
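
A minimal sketch of loading and inspecting a dataset with these columns is shown below; the repository id is a placeholder, since the source does not name the dataset.

```python
from datasets import load_dataset

# Placeholder dataset id; substitute the actual repository name.
ds = load_dataset("your-org/model-card-embeddings", split="train")

# The declared features should match the column table above.
print(ds.features)

row = ds[0]
print(row["id"], row["library_name"], row["pipeline_tag"])
print(len(row["embeddings"]))  # expected: 768, per the schema
```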

**Example row 1**

- sha: null
- last_modified: null
- library_name: peft

text:
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]

### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]

### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]

## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]

### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]

## Training Details

### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]

### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]
[More Information Needed]

#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]

## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data
<!-- This should link to a Data Card if possible. -->
[More Information Needed]

#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]

#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]

### Results
[More Information Needed]

#### Summary

## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]

## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective
[More Information Needed]

### Compute Infrastructure
[More Information Needed]

#### Hardware
[More Information Needed]

#### Software
[More Information Needed]

## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]

## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]

## More Information [optional]
[More Information Needed]

## Model Card Authors [optional]
[More Information Needed]

## Model Card Contact
[More Information Needed]

## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16

### Framework versions
- PEFT 0.7.0.dev0

- metadata: {"library_name": "peft", "base_model": "meta-llama/Llama-2-13b-chat-hf"}
- pipeline_tag: null
- id: bmehrba/Llama-2-13b-chat-hf-fine-tuned_ChatGPT_t1_Llama13b_Seed103
- tags: [ "peft", "arxiv:1910.09700", "base_model:meta-llama/Llama-2-13b-chat-hf", "region:us" ]
- created_at: 2024-02-13T18:43:01+00:00
- arxiv: [ "1910.09700" ]
- languages: []
- tags_str: TAGS #peft #arxiv-1910.09700 #base_model-meta-llama/Llama-2-13b-chat-hf #region-us

text_str:
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ## Training procedure The following 'bitsandbytes' quantization config was used during training: - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ### Framework versions - PEFT 0.7.0.dev0
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: bfloat16", "### Framework versions\n\n\n- PEFT 0.7.0.dev0" ]
[ "TAGS\n#peft #arxiv-1910.09700 #base_model-meta-llama/Llama-2-13b-chat-hf #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "## Training procedure\n\n\nThe following 'bitsandbytes' quantization config was used during training:\n- load_in_8bit: False\n- load_in_4bit: True\n- llm_int8_threshold: 6.0\n- llm_int8_skip_modules: None\n- llm_int8_enable_fp32_cpu_offload: False\n- llm_int8_has_fp16_weight: False\n- bnb_4bit_quant_type: nf4\n- bnb_4bit_use_double_quant: True\n- bnb_4bit_compute_dtype: bfloat16", "### Framework versions\n\n\n- PEFT 0.7.0.dev0" ]
[ 38, 6, 3, 45, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 154, 14 ]
[ "passage: TAGS\n#peft #arxiv-1910.09700 #base_model-meta-llama/Llama-2-13b-chat-hf #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]

embeddings: [768-dimensional float vector omitted]
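
Example row 1 describes a 4-bit PEFT adapter on `meta-llama/Llama-2-13b-chat-hf`. Its card leaves the usage section as [More Information Needed], so the following is only a sketch of a standard adapter load that mirrors the `bitsandbytes` settings listed above; the generation call at the end is illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

# 4-bit NF4 quantization mirroring the config recorded in the model card above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base_id = "meta-llama/Llama-2-13b-chat-hf"
adapter_id = "bmehrba/Llama-2-13b-chat-hf-fine-tuned_ChatGPT_t1_Llama13b_Seed103"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb_config, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the PEFT adapter

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```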

**Example row 2**

- sha: null
- last_modified: null
- library_name: null

text:
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# openhermes-2.5-mistral-7b-losstype-sigmoid
This model is a fine-tuned version of [teknium/OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) on an unknown dataset.

## Model description
More information needed

## Intended uses & limitations
More information needed

## Training and evaluation data
More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 50
- training_steps: 200

### Training results

### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1

- metadata: {"license": "apache-2.0", "tags": ["trl", "dpo", "generated_from_trainer"], "base_model": "teknium/OpenHermes-2.5-Mistral-7B", "model-index": [{"name": "openhermes-2.5-mistral-7b-losstype-sigmoid", "results": []}]}
- pipeline_tag: null
- id: DrishtiSharma/openhermes-2.5-mistral-7b-losstype-sigmoid
- tags: [ "safetensors", "trl", "dpo", "generated_from_trainer", "base_model:teknium/OpenHermes-2.5-Mistral-7B", "license:apache-2.0", "region:us" ]
- created_at: 2024-02-13T18:44:04+00:00
- arxiv: []
- languages: []
- tags_str: TAGS #safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us

text_str:
# openhermes-2.5-mistral-7b-losstype-sigmoid This model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 50 - training_steps: 200 ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
[ "# openhermes-2.5-mistral-7b-losstype-sigmoid\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ "TAGS\n#safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us \n", "# openhermes-2.5-mistral-7b-losstype-sigmoid\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ 50, 50, 6, 12, 8, 3, 128, 4, 33 ]
[ "passage: TAGS\n#safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us \n# openhermes-2.5-mistral-7b-losstype-sigmoid\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]

embeddings: [768-dimensional float vector omitted]
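
Example row 2 records the hyperparameters of a TRL DPO fine-tune of OpenHermes-2.5-Mistral-7B. As a sketch (not the author's actual script), the listed values might map onto a `DPOTrainer` setup roughly as follows; the preference dataset id, output directory, and the `loss_type` argument are assumptions inferred from the run name, and the exact TRL API of that era may differ slightly.

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

model_id = "teknium/OpenHermes-2.5-Mistral-7B"
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Placeholder preference dataset with "prompt", "chosen", and "rejected" columns.
train_dataset = load_dataset("your-org/your-preference-data", split="train")

args = TrainingArguments(
    output_dir="openhermes-2.5-mistral-7b-losstype-sigmoid",
    learning_rate=5e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size of 16
    lr_scheduler_type="cosine",
    warmup_steps=50,
    max_steps=200,
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,          # TRL builds a frozen reference model when None
    args=args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    loss_type="sigmoid",     # matches the "losstype-sigmoid" run name
)
trainer.train()
```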

**Example row 3**

- sha: null
- last_modified: null
- library_name: transformers

text:
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# model_1
This model is a fine-tuned version of [sshleifer/tiny-distilroberta-base](https://huggingface.co/sshleifer/tiny-distilroberta-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4994

## Model description
More information needed

## Intended uses & limitations
More information needed

## Training and evaluation data
More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 1

### Training results

### Framework versions
- Transformers 4.37.1
- Pytorch 2.1.2+cu121
- Datasets 2.14.4
- Tokenizers 0.15.1

- metadata: {"tags": ["generated_from_trainer"], "base_model": "sshleifer/tiny-distilroberta-base", "model-index": [{"name": "model_1", "results": []}]}
- pipeline_tag: text-classification
- id: nadahlberg/model_1
- tags: [ "transformers", "safetensors", "roberta", "text-classification", "generated_from_trainer", "base_model:sshleifer/tiny-distilroberta-base", "autotrain_compatible", "endpoints_compatible", "region:us" ]
- created_at: 2024-02-13T18:44:20+00:00
- arxiv: []
- languages: []
- tags_str: TAGS #transformers #safetensors #roberta #text-classification #generated_from_trainer #base_model-sshleifer/tiny-distilroberta-base #autotrain_compatible #endpoints_compatible #region-us

text_str:
# model_1 This model is a fine-tuned version of sshleifer/tiny-distilroberta-base on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4994 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 200 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.37.1 - Pytorch 2.1.2+cu121 - Datasets 2.14.4 - Tokenizers 0.15.1
[ "# model_1\n\nThis model is a fine-tuned version of sshleifer/tiny-distilroberta-base on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.4994", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 200\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.37.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.4\n- Tokenizers 0.15.1" ]
[ "TAGS\n#transformers #safetensors #roberta #text-classification #generated_from_trainer #base_model-sshleifer/tiny-distilroberta-base #autotrain_compatible #endpoints_compatible #region-us \n", "# model_1\n\nThis model is a fine-tuned version of sshleifer/tiny-distilroberta-base on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.4994", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 200\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.37.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.4\n- Tokenizers 0.15.1" ]
[ 63, 52, 6, 12, 8, 3, 127, 4, 33 ]
[ "passage: TAGS\n#transformers #safetensors #roberta #text-classification #generated_from_trainer #base_model-sshleifer/tiny-distilroberta-base #autotrain_compatible #endpoints_compatible #region-us \n# model_1\n\nThis model is a fine-tuned version of sshleifer/tiny-distilroberta-base on the None dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.4994## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0005\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 200\n- num_epochs: 1### Training results### Framework versions\n\n- Transformers 4.37.1\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.4\n- Tokenizers 0.15.1" ]
[ -0.10687810182571411, 0.16758331656455994, -0.003382609225809574, 0.08058639615774155, 0.13165168464183807, 0.031544920057058334, 0.085638627409935, 0.13187682628631592, -0.08645588904619217, 0.09833447635173798, 0.08082632720470428, 0.04879873991012573, 0.06315416097640991, 0.14700278639793396, -0.015947379171848297, -0.2421189397573471, 0.01513140182942152, -0.015809256583452225, -0.049055080860853195, 0.09989312291145325, 0.11075310409069061, -0.097166046500206, 0.049745481461286545, 0.02332954667508602, -0.09966885298490524, -0.014199947007000446, -0.05211039260029793, -0.036096129566431046, 0.08633074909448624, -0.0029340998735278845, 0.09192073345184326, 0.006500612013041973, 0.09541784971952438, -0.23470771312713623, -0.004516503307968378, 0.07783041149377823, 0.03864064812660217, 0.07943188399076462, 0.06846312433481216, -0.007675438188016415, 0.10849593579769135, -0.16682520508766174, 0.08321018517017365, 0.020925652235746384, -0.06795395165681839, -0.17691847681999207, -0.08047384768724442, 0.09145072847604752, 0.10707584768533707, 0.0882454589009285, -0.008438129909336567, 0.1632254719734192, -0.09641265869140625, 0.07110697776079178, 0.18018199503421783, -0.26162147521972656, -0.06266463547945023, 0.0275114718824625, 0.01995694264769554, 0.05556867644190788, -0.11274291574954987, -0.026144180446863174, 0.036753859370946884, 0.02473466470837593, 0.08461160212755203, 0.014281421899795532, 0.019208913668990135, -0.02777194045484066, -0.11423421651124954, -0.06569094210863113, 0.14999309182167053, 0.058796368539333344, -0.05438607931137085, -0.13522163033485413, -0.0514591708779335, -0.1269468069076538, -0.01668081432580948, -0.01998683623969555, 0.02239277772605419, -0.04239059239625931, -0.06609633564949036, -0.017077984288334846, -0.08595727384090424, -0.0283711776137352, 0.04528516158461571, 0.12314509600400925, 0.054747212678194046, 0.004363479092717171, 0.00730268657207489, 0.11260303854942322, 0.022805064916610718, -0.14045695960521698, -0.02216211147606373, -0.0009510662057437003, -0.08400898426771164, -0.043396539986133575, -0.052180878818035126, -0.008312754333019257, 0.0018859726842492819, 0.1645243912935257, -0.04909452795982361, 0.05251539498567581, 0.05550108477473259, -0.0003125145740341395, -0.014222192578017712, 0.12555986642837524, -0.05222347378730774, -0.07961127907037735, -0.0009519048617221415, 0.11876948922872543, 0.05526348203420639, -0.011841826140880585, -0.05487333610653877, -0.011406710371375084, 0.11553368717432022, 0.05804074555635452, -0.026359539479017258, 0.02365376427769661, -0.054748259484767914, -0.0309077687561512, 0.05910179391503334, -0.1136578693985939, 0.035295043140649796, 0.0024239623453468084, -0.07407849282026291, -0.02698153629899025, 0.002206427277997136, 0.0029994661454111338, -0.046484336256980896, 0.12544511258602142, -0.09373944252729416, -0.017044471576809883, -0.06266652792692184, -0.04766220971941948, 0.010040879249572754, -0.04746764153242111, -0.006282794289290905, -0.06484182178974152, -0.16005362570285797, -0.0346827358007431, 0.0405820794403553, -0.0534626767039299, -0.08354292809963226, -0.03146687150001526, -0.05755096673965454, 0.02548401430249214, 0.0007370463572442532, 0.10422372072935104, -0.045930229127407074, 0.08594730496406555, 0.010018652305006981, 0.018978195264935493, 0.05359257385134697, 0.029137199744582176, -0.10982435196638107, 0.050413575023412704, -0.0791524350643158, 0.07608114182949066, -0.0781586542725563, 0.011691831983625889, -0.10606735199689865, -0.11555060744285583, -0.030645690858364105, 
-0.035008423030376434, 0.06909769028425217, 0.13221552968025208, -0.13509975373744965, -0.035724297165870667, 0.1587085723876953, -0.08918703347444534, -0.08208736032247543, 0.12633031606674194, -0.02461939863860607, -0.027369478717446327, 0.05192011967301369, 0.13189509510993958, 0.13432803750038147, -0.08875690400600433, -0.05925469100475311, 0.016325896605849266, 0.08424299955368042, 0.03026532009243965, 0.09984350949525833, -0.01067121047526598, 0.030879201367497444, 0.02414964698255062, -0.05576736852526665, 0.0019114958122372627, -0.07667523622512817, -0.0892251655459404, -0.04530666396021843, -0.09830928593873978, 0.060075365006923676, 0.03994298726320267, 0.0420052707195282, -0.0803685411810875, -0.1378730982542038, 0.03945133462548256, 0.14999225735664368, -0.04755886271595955, 0.0033967499621212482, -0.06539777666330338, 0.05666859447956085, -0.05538410320878029, -0.025347521528601646, -0.1809135228395462, -0.10687997937202454, 0.052532657980918884, -0.0778866857290268, 0.009416941553354263, -0.007324254605919123, 0.057210348546504974, 0.09228270500898361, -0.06127581000328064, -0.040316540747880936, -0.0767541453242302, -0.015350286848843098, -0.09354837238788605, -0.14767639338970184, -0.07926072925329208, -0.02330213598906994, 0.2381211519241333, -0.2373698204755783, 0.009270389564335346, -0.019612984731793404, 0.1486666202545166, 0.022151928395032883, -0.07629767060279846, -0.003519947873428464, 0.03472604230046272, -0.014074997045099735, -0.09218578040599823, 0.03621060401201248, 0.008410796523094177, -0.10243766009807587, -0.07126510143280029, -0.1359948068857193, 0.059744466096162796, 0.08092854917049408, 0.10091720521450043, -0.08900345116853714, -0.03544377163052559, -0.08082561939954758, -0.04727785289287567, -0.06974328309297562, -0.010502404533326626, 0.14555634558200836, 0.02856900729238987, 0.11109860986471176, -0.05772201344370842, -0.06777236610651016, 0.005986248608678579, 0.005075008142739534, -0.028814181685447693, 0.08360946178436279, 0.029470084235072136, -0.14046134054660797, 0.10016787797212601, 0.06309471279382706, -0.0576239749789238, 0.11828356981277466, -0.044505491852760315, -0.09148098528385162, -0.018831582739949226, 0.03115348145365715, 0.014261841773986816, 0.11882484704256058, -0.07692074775695801, 0.027257245033979416, 0.04743998497724533, 0.004035783465951681, 0.027305616065859795, -0.1530766487121582, -0.0013724981108680367, 0.04248359054327011, -0.012489506043493748, -0.017951829358935356, -0.03128897398710251, 0.020773785188794136, 0.08540219068527222, 0.018951386213302612, 0.006687098648399115, 0.012636395171284676, -0.012739651836454868, -0.07570469379425049, 0.17564351856708527, -0.09288083761930466, -0.15098004043102264, -0.13094474375247955, 0.06940069794654846, -0.07735474407672882, -0.01252672728151083, 0.02640930749475956, -0.06324413418769836, -0.04645417630672455, -0.09647224098443985, -0.04522574692964554, -0.034597206860780716, -0.006055515259504318, 0.03756404295563698, 0.002333602635189891, 0.08843269944190979, -0.125194251537323, 0.004584816750138998, -0.003052822779864073, -0.06284715980291367, 0.010452494025230408, 0.046354226768016815, 0.09706525504589081, 0.10199317336082458, -0.006129247136414051, 0.007316514849662781, -0.025903044268488884, 0.22767385840415955, -0.07708220183849335, -0.014492112211883068, 0.11897780001163483, 0.0046323491260409355, 0.07502514123916626, 0.11428384482860565, 0.02203807607293129, -0.07159432023763657, 0.01733984425663948, 0.06170332431793213, -0.024017922580242157, -0.19217075407505035, 
-0.06451429426670074, -0.04210643842816353, -0.062497757375240326, 0.12803907692432404, 0.047213759273290634, -0.014770234934985638, 0.06366036087274551, -0.05986640229821205, 0.044010814279317856, -0.012168915942311287, 0.09126316756010056, 0.03997974097728729, 0.05685761943459511, 0.11223998665809631, -0.02576807327568531, -0.03428785130381584, 0.058077167719602585, -0.005269461311399937, 0.23435376584529877, -0.06530039757490158, 0.12404220551252365, 0.020196661353111267, 0.16955971717834473, -0.02216581627726555, 0.06001920998096466, 0.006784576922655106, 0.0006322188419289887, 0.009184060618281364, -0.06671149283647537, -0.06821423768997192, 0.033384598791599274, 0.03185822442173958, 0.06622222065925598, -0.10264100879430771, 0.037042077630758286, 0.031422991305589676, 0.25350451469421387, 0.07266595214605331, -0.3067778944969177, -0.10409661382436752, 0.01989649422466755, -0.022234469652175903, -0.05590534955263138, -0.002055409597232938, 0.10902239382266998, -0.14298368990421295, 0.06201924383640289, -0.06016724556684494, 0.07784351706504822, -0.061818115413188934, 0.004482846241444349, 0.031150519847869873, 0.08718609809875488, 0.00034336349926888943, 0.1027887761592865, -0.17001588642597198, 0.18898455798625946, 0.0021792910993099213, 0.10565689206123352, -0.08387257158756256, 0.03587089106440544, 0.023785218596458435, 0.03718719258904457, 0.10685911774635315, 0.012026333250105381, -0.01936638168990612, -0.1834077388048172, -0.10418444871902466, 0.031423188745975494, 0.11238224804401398, -0.07163570076227188, 0.09514559060335159, -0.049427203834056854, 0.006331463810056448, 0.023068053647875786, -0.010365517809987068, -0.11823015660047531, -0.1592525988817215, 0.03223983570933342, -0.002073512179777026, -0.03136589378118515, -0.09658892452716827, -0.10616517066955566, -0.009030105546116829, 0.21449978649616241, 0.009651021100580692, -0.06478540599346161, -0.1438257396221161, 0.07945969700813293, 0.13208530843257904, -0.08154421299695969, 0.01708293706178665, 0.002987864427268505, 0.15190324187278748, 0.016871539875864983, -0.052439406514167786, 0.021670512855052948, -0.07047521322965622, -0.1810903549194336, -0.0444190613925457, 0.15854579210281372, 0.02734433487057686, 0.051443494856357574, 0.001947378390468657, 0.020256416872143745, 0.005489007569849491, -0.08163323253393173, 0.009550604037940502, 0.049198899418115616, 0.08655152469873428, 0.058058083057403564, -0.03748082369565964, 0.01272016204893589, -0.04817161336541176, -0.001535184565000236, 0.12004996091127396, 0.2664065957069397, -0.07704906910657883, 0.05465492978692055, 0.06094210594892502, -0.041678931564092636, -0.15869355201721191, -0.02131865732371807, 0.1224256381392479, 0.014670539647340775, 0.06152936816215515, -0.1531534641981125, 0.10951485484838486, 0.12023352831602097, -0.032312676310539246, 0.048669736832380295, -0.2853873074054718, -0.11784086376428604, 0.0761316567659378, 0.11222001165151596, 0.008751658722758293, -0.14644794166088104, -0.05893971025943756, -0.0397435687482357, -0.1280452013015747, 0.10885723680257797, -0.043782491236925125, 0.10822702944278717, -0.007440896239131689, 0.047138672322034836, 0.045016270130872726, -0.05523284152150154, 0.16615088284015656, 0.013546704314649105, 0.04867512732744217, -0.04496454820036888, 0.01591937430202961, 0.07946337759494781, -0.07041335850954056, 0.05271468684077263, -0.07722435146570206, 0.07415258139371872, -0.1568547934293747, -0.013274463824927807, -0.06297826766967773, 0.05517614260315895, -0.05689247325062752, -0.0640750601887703, 
-0.04793267324566841, 0.06885915994644165, 0.08566796034574509, -0.01789281703531742, 0.08646845072507858, 0.05129937082529068, 0.11246879398822784, 0.10301864147186279, 0.06246813014149666, 0.009885949082672596, -0.10737349092960358, -0.041927270591259, -0.011597123928368092, 0.05165358632802963, -0.06096270680427551, 0.027496764436364174, 0.1172378733754158, 0.05627845972776413, 0.11975989490747452, 0.030138950794935226, -0.057035159319639206, 0.005415273364633322, 0.05401254817843437, -0.10035106539726257, -0.15500162541866302, -0.02575695514678955, -0.01606636680662632, -0.16278554499149323, 0.004844083450734615, 0.09823591262102127, -0.04666579142212868, -0.02604703977704048, -0.024379700422286987, 0.024995261803269386, -0.005920558702200651, 0.1854184865951538, 0.02316875010728836, 0.08652493357658386, -0.07543908804655075, 0.11073525995016098, 0.09143853187561035, -0.0792122408747673, 0.028322942554950714, 0.03690643608570099, -0.09258249402046204, -0.020732956007122993, 0.04486595466732979, 0.0678572729229927, 0.009418764151632786, -0.03394452854990959, -0.07770644873380661, -0.07545683532953262, 0.037193089723587036, -0.013063319958746433, 0.058674406260252, 0.0008208630024455488, -0.032921500504016876, 0.0226968415081501, -0.1465364545583725, 0.10731394588947296, 0.07429714500904083, 0.07793892174959183, -0.1449548602104187, 0.10256584733724594, -0.000042174189729848877, 0.024633094668388367, -0.0030114613473415375, 0.007710196543484926, -0.08966607600450516, -0.03486008569598198, -0.09906479716300964, -0.0004134354239795357, -0.04890890792012215, 0.0022865592036396265, -0.015143090859055519, -0.02800118736922741, -0.03891401365399361, 0.0410972461104393, -0.05981208384037018, -0.08585194498300552, 0.0010127173736691475, 0.056164611130952835, -0.14157222211360931, 0.004189672414213419, 0.03645912557840347, -0.11614739149808884, 0.08671581000089645, 0.0626179426908493, 0.04066082462668419, 0.01287915650755167, -0.0652013048529625, -0.0019109542481601238, 0.019717752933502197, 0.023749129846692085, 0.04218650981783867, -0.10078909993171692, -0.0009796329541131854, -0.01945602148771286, 0.007783656008541584, 0.008279189467430115, 0.07422284036874771, -0.1275152862071991, -0.020909806713461876, -0.03733476996421814, -0.053190022706985474, -0.06092463806271553, 0.03057914972305298, 0.07883377373218536, 0.01474433857947588, 0.18156325817108154, -0.05375656113028526, 0.04629603400826454, -0.21118417382240295, -0.03989527374505997, -0.004914314951747656, -0.03342008963227272, -0.06650740653276443, -0.01501144003123045, 0.06548181921243668, -0.039776772260665894, 0.0944921001791954, -0.037760622799396515, 0.09173329919576645, 0.026710104197263718, -0.0060683367773890495, 0.030904477462172508, 0.00556929362937808, 0.1874317079782486, 0.060766227543354034, -0.032754283398389816, 0.06468791514635086, -0.03952305018901825, 0.050147704780101776, 0.034180209040641785, 0.1510179042816162, 0.14581993222236633, -0.019650274887681007, 0.05258135497570038, 0.015676919370889664, -0.10202038288116455, -0.20443172752857208, 0.042954739183187485, -0.010280288755893707, 0.09426170587539673, -0.01949922926723957, 0.16689532995224, 0.11298266798257828, -0.1935501992702484, 0.05075174942612648, -0.06485512107610703, -0.08996739983558655, -0.08257003873586655, -0.1216675415635109, -0.0720314085483551, -0.09333842992782593, 0.024019556120038033, -0.11130113154649734, 0.0024731422308832407, 0.1153172105550766, 0.0008088350296020508, -0.006821035407483578, 0.15936489403247833, -0.05942815542221069, 
0.018669620156288147, 0.045359522104263306, 0.01864311471581459, -0.008517708629369736, -0.012636290863156319, -0.06627680361270905, 0.02524864487349987, 0.0013485332019627094, 0.10809734463691711, -0.0557367242872715, -0.013666268438100815, 0.039041366428136826, 0.012859531678259373, -0.07924330979585648, 0.030550194904208183, -0.007688708137720823, 0.03985960781574249, 0.06266283988952637, 0.0300293006002903, 0.01630287431180477, -0.061650797724723816, 0.277023047208786, -0.05862737447023392, -0.06925328075885773, -0.13075105845928192, 0.22270458936691284, 0.04043026641011238, -0.020641064271330833, 0.0804586336016655, -0.1292417049407959, 0.0040798792615532875, 0.11844667047262192, 0.14590959250926971, -0.0512751005589962, -0.01255225483328104, -0.0006377186509780586, -0.018689552322030067, -0.05214792862534523, 0.11082976311445236, 0.08611947298049927, 0.06338011473417282, -0.05769952014088631, 0.041631028056144714, -0.011592243798077106, -0.03916497156023979, -0.08856748044490814, 0.10484360158443451, 0.010219542309641838, 0.020511198788881302, -0.04813723638653755, 0.06484626978635788, 0.006443891208618879, -0.16093015670776367, 0.04644874110817909, -0.15921346843242645, -0.19481033086776733, -0.005923765245825052, 0.056663867086172104, -0.019602445885539055, 0.06863263994455338, 0.003780136816203594, -0.005015261936932802, 0.10035861283540726, -0.004728497937321663, -0.04337814077734947, -0.08120070397853851, 0.07296738028526306, -0.046169884502887726, 0.2547298073768616, 0.003023482859134674, 0.04809962585568428, 0.11434593051671982, 0.0242085512727499, -0.14347219467163086, 0.03277169540524483, 0.08291229605674744, -0.0644434243440628, 0.049816448241472244, 0.18472306430339813, -0.044230103492736816, 0.08079101145267487, 0.05710558593273163, -0.12975984811782837, -0.021224983036518097, -0.019317949190735817, 0.01203638780862093, -0.06567933410406113, 0.006256054621189833, -0.0543561652302742, 0.1599193960428238, 0.20171238481998444, -0.04462290182709694, -0.008960897102952003, -0.06118731200695038, 0.03826797381043434, 0.05574670061469078, 0.1099545881152153, -0.01130412146449089, -0.21429935097694397, 0.017815418541431427, -0.003110076766461134, 0.02809699811041355, -0.25052410364151, -0.09694324433803558, 0.032522138208150864, -0.06293515861034393, -0.05774052441120148, 0.10359964519739151, 0.04505816102027893, 0.012726496905088425, -0.04237658530473709, -0.10588351637125015, -0.0700666531920433, 0.13010703027248383, -0.14904342591762543, -0.05809592083096504 ]
null
null
null
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# openhermes-2.5-mistral-7b-losstype-ipo

This model is a fine-tuned version of [teknium/OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 50
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
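To make the hyperparameter list above concrete, the following is a rough sketch of the equivalent 🤗 Transformers `TrainingArguments`. The output directory is a placeholder of mine, and the actual run used TRL's DPO training loop on top of teknium/OpenHermes-2.5-Mistral-7B, which is not reproduced here.

```python
# Sketch only: mirrors the hyperparameters reported in the card above.
# "openhermes-ipo-output" is a placeholder path, not taken from the record.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="openhermes-ipo-output",
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # 4 x 4 = total train batch size of 16
    lr_scheduler_type="cosine",
    warmup_steps=50,
    max_steps=200,                   # the card reports training_steps: 200
)
```

These arguments would then be handed to a TRL DPO trainer together with the base model and a preference dataset, neither of which this card specifies.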
{"license": "apache-2.0", "tags": ["trl", "dpo", "generated_from_trainer"], "base_model": "teknium/OpenHermes-2.5-Mistral-7B", "model-index": [{"name": "openhermes-2.5-mistral-7b-losstype-ipo", "results": []}]}
null
DrishtiSharma/openhermes-2.5-mistral-7b-losstype-ipo
[ "safetensors", "trl", "dpo", "generated_from_trainer", "base_model:teknium/OpenHermes-2.5-Mistral-7B", "license:apache-2.0", "region:us" ]
2024-02-13T18:45:14+00:00
[]
[]
TAGS #safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us
# openhermes-2.5-mistral-7b-losstype-ipo This model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 50 - training_steps: 200 ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
[ "# openhermes-2.5-mistral-7b-losstype-ipo\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ "TAGS\n#safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us \n", "# openhermes-2.5-mistral-7b-losstype-ipo\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ 50, 49, 6, 12, 8, 3, 128, 4, 33 ]
[ "passage: TAGS\n#safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us \n# openhermes-2.5-mistral-7b-losstype-ipo\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ -0.08551818132400513, 0.0793425515294075, -0.003773503703996539, 0.08602947741746902, 0.11499625444412231, 0.039940230548381805, 0.15190981328487396, 0.1363108903169632, -0.00847674161195755, 0.07280371338129044, 0.01657021790742874, 0.05477418750524521, 0.0529840923845768, 0.14028260111808777, -0.06107261776924133, -0.24807296693325043, 0.012715546414256096, -0.0389300212264061, -0.07533806562423706, 0.0856490358710289, 0.09840961545705795, -0.090303935110569, 0.052204325795173645, 0.020927002653479576, -0.09814668446779251, 0.0006570378900505602, -0.007560502737760544, -0.04700782150030136, 0.1239294558763504, -0.023912038654088974, 0.11069386452436447, 0.006463868543505669, 0.139093279838562, -0.18473656475543976, 0.0007466903771273792, 0.08028313517570496, 0.031178750097751617, 0.0923294723033905, 0.06412944942712784, 0.007651880383491516, 0.05085889622569084, -0.14768929779529572, 0.0992032065987587, 0.01342841237783432, -0.08938944339752197, -0.1840606927871704, -0.12794898450374603, 0.02878587879240513, 0.08596444129943848, 0.07335536181926727, 0.02954651601612568, 0.1513661891222, -0.05162891000509262, 0.062195226550102234, 0.2772729992866516, -0.2699383795261383, -0.06449863314628601, 0.06956420838832855, 0.05736013129353523, 0.08045201003551483, -0.09135320782661438, -0.035737086087465286, 0.035596925765275955, 0.01151585765182972, 0.08218205720186234, 0.0033132287207990885, -0.058726951479911804, -0.017540263012051582, -0.1331501603126526, -0.021756848320364952, 0.10887256264686584, 0.01643766276538372, -0.043916188180446625, -0.09502756595611572, -0.09971094876527786, -0.08068937808275223, -0.008427878841757774, -0.06454387307167053, 0.058151960372924805, -0.027667196467518806, -0.0043169716373085976, -0.04921680688858032, -0.0713638961315155, -0.041683077812194824, -0.017877427861094475, 0.12797196209430695, 0.02637813612818718, 0.03348391503095627, -0.03151170536875725, 0.11897538602352142, -0.02980845794081688, -0.13920879364013672, -0.011215494014322758, -0.0009067522478289902, -0.08591796457767487, -0.07391023635864258, -0.05329397693276405, -0.043798744678497314, -0.010028166696429253, 0.1893799602985382, -0.07823976129293442, 0.058069195598363876, -0.021162670105695724, 0.0006695369957014918, -0.040716517716646194, 0.11893104761838913, -0.035732243210077286, -0.03369584679603577, 0.003701012348756194, 0.094517782330513, 0.035755056887865067, -0.009050517342984676, -0.0577341690659523, -0.050424523651599884, 0.06948696821928024, 0.041255537420511246, -0.032235678285360336, 0.013906958512961864, -0.048683855682611465, -0.050543222576379776, 0.08046647161245346, -0.13266721367835999, 0.03744112700223923, -0.002888333285227418, -0.09650705009698868, -0.0020791692659258842, 0.014783945865929127, 0.03867601975798607, 0.0053490702994167805, 0.12185795605182648, -0.06917282938957214, 0.021235458552837372, -0.08296506106853485, -0.05949435010552406, 0.025305908173322678, -0.09404366463422775, -0.02626466192305088, -0.049652136862277985, -0.21613872051239014, -0.02455822378396988, 0.0734652653336525, -0.07184319943189621, -0.013354934751987457, -0.0023446884006261826, -0.06578908860683441, 0.03049948439002037, -0.001170485862530768, 0.10010124742984772, -0.040106575936079025, 0.05774351581931114, -0.035942740738391876, 0.04834207147359848, 0.018685147166252136, 0.018393488600850105, -0.0759383961558342, 0.02934764139354229, -0.15777668356895447, 0.040821924805641174, -0.06600117683410645, 0.013551618903875351, -0.12302286177873611, -0.08173154294490814, 0.003265364794060588, 
-0.06500218063592911, 0.07452050596475601, 0.09639690816402435, -0.1910332590341568, -0.0005285171209834516, 0.17042282223701477, -0.10402945429086685, -0.077513188123703, 0.08975208550691605, -0.057431843131780624, 0.04325263202190399, 0.03251627832651138, 0.1393379420042038, 0.08199000358581543, -0.18815864622592926, 0.010710994713008404, -0.02402454987168312, 0.06178787350654602, 0.058800000697374344, 0.07030470669269562, -0.012348225340247154, 0.09543786197900772, -0.010488882660865784, -0.05734066665172577, -0.03709631785750389, -0.06516280770301819, -0.08650180697441101, -0.06647379696369171, -0.06565409153699875, 0.07054892182350159, 0.019248556345701218, 0.036171264946460724, -0.05749441310763359, -0.08657047897577286, 0.10838740319013596, 0.1349356770515442, -0.04888097941875458, 0.030930660665035248, -0.06955856084823608, 0.03141595050692558, -0.004934081807732582, -0.05144907161593437, -0.20786108076572418, -0.10861418396234512, 0.0514470636844635, -0.10208069533109665, 0.003993763588368893, 0.05627136677503586, 0.07828769087791443, 0.06477368623018265, -0.05814133211970329, 0.0018302274402230978, -0.0827556699514389, -0.0060388315469026566, -0.08966730535030365, -0.2051866054534912, -0.052510447800159454, -0.040795084089040756, 0.14226935803890228, -0.17765115201473236, 0.009188416413962841, -0.007917341776192188, 0.1665518581867218, 0.05196504667401314, -0.039897892624139786, -0.03713162988424301, 0.02085605263710022, 0.00399401830509305, -0.10070208460092545, 0.03017600253224373, -0.002689487999305129, -0.05783620476722717, -0.06385780870914459, -0.14288146793842316, 0.031236300244927406, 0.06505578756332397, 0.07150457054376602, -0.07915256172418594, -0.0403498150408268, -0.07150470465421677, -0.0589127242565155, -0.0824979767203331, -0.0015060070436447859, 0.15823392570018768, 0.01703239232301712, 0.09921450167894363, -0.08216220885515213, -0.08979617059230804, -0.002500451635569334, 0.00043953716522082686, 0.013023942708969116, 0.03852757811546326, 0.04364772140979767, -0.1447947770357132, 0.07021614909172058, 0.1258881688117981, -0.06411530077457428, 0.09527139365673065, -0.04942268133163452, -0.07869941741228104, -0.048259131610393524, 0.013669423758983612, -0.0007950887084007263, 0.10985083878040314, -0.03533873334527016, 0.04211466759443283, 0.02958332560956478, 0.03949715942144394, 0.011016808450222015, -0.17936256527900696, -0.0029987413436174393, 0.019451873376965523, -0.04167163744568825, 0.0004520476213656366, 0.012508527375757694, 0.032052814960479736, 0.10383740067481995, 0.024310557171702385, -0.03713265433907509, 0.0113726407289505, -0.03159376606345177, -0.08392424881458282, 0.16204889118671417, -0.10895217210054398, -0.05031091347336769, -0.06692461669445038, 0.016075195744633675, -0.003137522144243121, -0.03547445312142372, 0.00513118039816618, -0.04938627779483795, -0.045706868171691895, -0.08357281982898712, -0.0671735405921936, 0.0030460222624242306, -0.006734446622431278, 0.09446365386247635, 0.012530471198260784, 0.0755787193775177, -0.13211219012737274, -0.005983769427984953, -0.03215108439326286, -0.07610142230987549, 0.011828640475869179, 0.06781482696533203, 0.045485373586416245, 0.12444982677698135, -0.029202230274677277, 0.0005605443730019033, -0.027517644688487053, 0.20546074211597443, -0.08157557249069214, 0.013464972376823425, 0.13695871829986572, -0.027828359976410866, 0.08384757488965988, 0.15700563788414001, 0.05650635063648224, -0.08076401799917221, 0.019166626036167145, 0.053350336849689484, -0.0063760168850421906, -0.2843319773674011, 
-0.03350229933857918, -0.03618594631552696, -0.040928881615400314, 0.07116539031267166, 0.027790136635303497, 0.060005903244018555, 0.027081338688731194, -0.018041322007775307, 0.024146655574440956, 0.011217886582016945, 0.0802043005824089, 0.09958714991807938, 0.07032764703035355, 0.0866723582148552, -0.012109333649277687, -0.007257204037159681, 0.06394772231578827, 0.014164031483232975, 0.28784608840942383, -0.0038182279095053673, 0.12459694594144821, 0.022030077874660492, 0.10486157238483429, -0.03783660754561424, 0.05707487836480141, -0.002083415165543556, -0.0018790155882015824, -0.005926338490098715, -0.07644953578710556, 0.006207150872796774, 0.05335167795419693, -0.07315168529748917, 0.040362048894166946, -0.042770691215991974, 0.03608838841319084, 0.024936018511652946, 0.24972330033779144, 0.0400584414601326, -0.25389257073402405, -0.058185335248708725, 0.01532968133687973, -0.03933947905898094, -0.019991857931017876, -0.017594046890735626, 0.09776945412158966, -0.12755316495895386, 0.07614560425281525, -0.09584610164165497, 0.07025879621505737, -0.02316730096936226, -0.012868605554103851, 0.10296524316072464, 0.10332109779119492, -0.0021564490161836147, 0.04804406315088272, -0.1801164299249649, 0.23149439692497253, 0.016506705433130264, 0.09135953336954117, -0.0739348977804184, 0.03677326813340187, 0.030097801238298416, 0.03244192525744438, 0.07893392443656921, -0.002522495109587908, -0.08347268402576447, -0.1255977749824524, -0.09841969609260559, 0.0039902362041175365, 0.13361021876335144, -0.04178024083375931, 0.09070632606744766, -0.053911760449409485, -0.006973559502512217, 0.057516880333423615, -0.015909932553768158, -0.16617296636104584, -0.09221292287111282, 0.06607147306203842, -0.0024718777276575565, -0.05529744178056717, -0.07362736761569977, -0.10599740594625473, -0.05125116929411888, 0.12714697420597076, 0.00010093446326209232, -0.05063428729772568, -0.13977746665477753, 0.044492803514003754, 0.16422928869724274, -0.05797132849693298, 0.041211191564798355, 0.04079077020287514, 0.1019764095544815, 0.018632737919688225, -0.09063795208930969, 0.07156484574079514, -0.07611024379730225, -0.21364882588386536, -0.07891904562711716, 0.11465401947498322, 0.06694984436035156, 0.05961782857775688, -0.003283587284386158, 0.04882393032312393, 0.02325698360800743, -0.0705462247133255, 0.020681727677583694, 0.10228122770786285, 0.09304117411375046, 0.018754232674837112, -0.06712083518505096, 0.011030126363039017, -0.02932167984545231, -0.02790229022502899, 0.08029304444789886, 0.23633654415607452, -0.08303879201412201, 0.09843280166387558, 0.019332000985741615, -0.07295120507478714, -0.11799541115760803, 0.06375224143266678, 0.1326766312122345, 0.0006471658125519753, 0.08407536149024963, -0.16518259048461914, 0.09398189932107925, 0.10849831253290176, -0.05054754018783569, 0.07135797291994095, -0.34199750423431396, -0.1358908712863922, 0.01764906570315361, 0.07792425155639648, 0.04260096326470375, -0.12250194698572159, -0.03514508903026581, -0.0064596151933074, -0.15278883278369904, 0.10887607932090759, -0.10772597789764404, 0.0848086029291153, -0.012592403218150139, 0.07759560644626617, 0.023055698722600937, -0.04638247564435005, 0.14070671796798706, 0.034131936728954315, 0.09229066967964172, -0.05706968158483505, 0.0057250019162893295, 0.09507952630519867, -0.06663914769887924, 0.02308228798210621, 0.012703672982752323, 0.047913968563079834, -0.09508500248193741, -0.0061926087364554405, -0.05519161745905876, 0.039223216474056244, -0.046077948063611984, -0.0728495642542839, 
-0.04256032034754753, 0.07373393326997757, 0.08335395902395248, -0.03952563554048538, 0.08093343675136566, 0.0632925033569336, 0.11604466289281845, 0.1265452355146408, 0.09426283091306686, 0.030257947742938995, -0.051330871880054474, 0.009043445810675621, -0.00457244785502553, 0.04209901764988899, -0.11160503327846527, 0.024695822969079018, 0.12231510877609253, 0.04783164709806442, 0.11559746414422989, 0.03912944346666336, -0.056332461535930634, 0.010143629275262356, 0.02479843981564045, -0.10900522023439407, -0.11194487661123276, 0.029131406918168068, 0.018199345096945763, -0.1235785111784935, 0.02650420367717743, 0.13073745369911194, -0.046575985848903656, -0.013528440147638321, 0.0031677591614425182, 0.024834664538502693, 0.001955522922798991, 0.17876426875591278, 0.010051650926470757, 0.05272510647773743, -0.058410972356796265, 0.11417204886674881, 0.08406032621860504, -0.08083492517471313, 0.020066948607563972, 0.05537531524896622, -0.09750453382730484, 0.005333287641406059, 0.052612584084272385, 0.08789242058992386, -0.03330434858798981, -0.020597383379936218, -0.09458766877651215, -0.06128179281949997, 0.034960538148880005, 0.15919192135334015, 0.060618799179792404, -0.00698068318888545, -0.018414100632071495, 0.04421846196055412, -0.0967547744512558, 0.10692842304706573, 0.05268140509724617, 0.0804901123046875, -0.13085860013961792, 0.09763415902853012, -0.012462060898542404, 0.009710785932838917, -0.013716795481741428, 0.033023521304130554, -0.05347570776939392, -0.03773531690239906, -0.11458969116210938, 0.008181518875062466, -0.012694628909230232, -0.007902856916189194, -0.01827111840248108, -0.060060180723667145, -0.02662130445241928, 0.040084488689899445, -0.07121451944112778, -0.044928181916475296, -0.001993825426325202, 0.038385119289159775, -0.1429833173751831, -0.0362800695002079, 0.046918369829654694, -0.0923522561788559, 0.0914406031370163, 0.07260207086801529, 0.030497416853904724, 0.0030110685620456934, -0.11015018075704575, -0.0011535033117979765, 0.025081800296902657, 0.03611556813120842, 0.05325048416852951, -0.1390765905380249, -0.02934233844280243, -0.041142966598272324, 0.027116116136312485, 0.011245686560869217, 0.040675271302461624, -0.12817372381687164, 0.006348236929625273, -0.04349934309720993, -0.06355457752943039, -0.046758249402046204, 0.001463091466575861, 0.09671692550182343, 0.020490651950240135, 0.13621923327445984, -0.055645596235990524, 0.07025069743394852, -0.21766524016857147, -0.038692835718393326, 0.02496251091361046, -0.004552848171442747, -0.08250291645526886, -0.016796819865703583, 0.07146307826042175, -0.0650152862071991, 0.10509548336267471, -0.0238453708589077, 0.08032716810703278, 0.0392395444214344, -0.060531552881002426, -0.027515660971403122, 0.014228551648557186, 0.15286549925804138, 0.06280053406953812, -0.015176162123680115, 0.0808224156498909, -0.03779228776693344, 0.03828514739871025, 0.005584104917943478, 0.21360279619693756, 0.16722992062568665, -0.013094439171254635, 0.02780739590525627, 0.04202807694673538, -0.11103738844394684, -0.1722654104232788, 0.08524397760629654, -0.027777543291449547, 0.0777052789926529, -0.0512533038854599, 0.16644121706485748, 0.0995737761259079, -0.19503813982009888, 0.04285874962806702, -0.042816441506147385, -0.08016112446784973, -0.13237160444259644, -0.0409754179418087, -0.07592755556106567, -0.12679068744182587, 0.010983350686728954, -0.11148014664649963, 0.07742032408714294, 0.10232965648174286, 0.009721465408802032, 0.02453262358903885, 0.12702469527721405, -0.0514826662838459, 
0.01629544235765934, 0.047951485961675644, 0.04431464523077011, 0.005047502927482128, -0.026966296136379242, -0.08384045213460922, 0.021310709416866302, -0.003926486242562532, 0.05208075791597366, -0.04295790567994118, 0.01115692500025034, 0.007646632846444845, 0.0030931057408452034, -0.07498785853385925, 0.03079260140657425, 0.01045006699860096, 0.02125641703605652, 0.011976099573075771, 0.05347397178411484, 0.04469642788171768, -0.039651982486248016, 0.2785854637622833, -0.08101727068424225, -0.08389466255903244, -0.1210848018527031, 0.16828703880310059, 0.025720898061990738, -0.019356001168489456, 0.07716508209705353, -0.11872346699237823, -0.019421733915805817, 0.0950477123260498, 0.12227945774793625, -0.07209021598100662, -0.009062550961971283, 0.014004448428750038, -0.016351033002138138, -0.03982655331492424, 0.08866129070520401, 0.06580142676830292, 0.049488745629787445, -0.06714127957820892, -0.020600641146302223, -0.006336485501378775, -0.019692644476890564, -0.07949743419885635, 0.04018377512693405, -0.013019784353673458, 0.00004163204357610084, -0.024233408272266388, 0.07390820235013962, 0.024364300072193146, -0.1648692488670349, 0.049409169703722, -0.19252535700798035, -0.18372049927711487, 0.005813563242554665, 0.09397236257791519, -0.01974298618733883, 0.0498233363032341, -0.012331006117165089, -0.037558965384960175, 0.15068747103214264, -0.015460096299648285, -0.03585204482078552, -0.08378343284130096, 0.07830379903316498, -0.08993583917617798, 0.19824399054050446, -0.00636266078799963, 0.12288125604391098, 0.0873057171702385, 0.031842783093452454, -0.11273828148841858, -0.010815896093845367, 0.10480516403913498, -0.10592716187238693, -0.00017121544806286693, 0.15719832479953766, -0.049094341695308685, 0.07130615413188934, 0.07798073440790176, -0.1314661055803299, -0.013393637724220753, 0.054902441799640656, -0.018825385719537735, -0.08546686917543411, 0.008591816760599613, -0.05803273990750313, 0.16460034251213074, 0.19341807067394257, -0.03679998591542244, 0.030770931392908096, -0.04714095592498779, 0.004832542967051268, 0.05916953831911087, 0.06315220147371292, -0.052660126239061356, -0.213595449924469, 0.036274660378694534, 0.03690332546830177, 0.04459533467888832, -0.1359986960887909, -0.11887611448764801, 0.06128647178411484, -0.06852232664823532, -0.072353795170784, 0.08042158931493759, 0.032536786049604416, 0.010664450004696846, -0.016974201425909996, -0.10745932161808014, -0.04361240565776825, 0.1388075202703476, -0.17326728999614716, -0.03269479423761368 ]
null
null
sample-factory
An **APPO** model trained on the **doom_health_gathering_supreme** environment.

This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory.
Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/

## Downloading the model

After installing Sample-Factory, download the model with:

```
python -m sample_factory.huggingface.load_from_hub -r ORromu/rl_course_vizdoom_health_gathering_supreme
```

## Using the model

To run the model after download, use the `enjoy` script corresponding to this environment (assumed here to be the `sf_examples.vizdoom.enjoy_vizdoom` entry point that ships with Sample-Factory):

```
python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme
```

You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag.
See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details.

## Training with this model

To continue training with this model, use the `train` script corresponding to this environment (assumed here to be `sf_examples.vizdoom.train_vizdoom`):

```
python -m sf_examples.vizdoom.train_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --restart_behavior=resume --train_for_env_steps=10000000000
```

Note that you may have to adjust `--train_for_env_steps` to a suitably high number, as the experiment will resume at the number of steps it concluded at.
{"library_name": "sample-factory", "tags": ["deep-reinforcement-learning", "reinforcement-learning", "sample-factory"], "model-index": [{"name": "APPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "doom_health_gathering_supreme", "type": "doom_health_gathering_supreme"}, "metrics": [{"type": "mean_reward", "value": "11.76 +/- 5.37", "name": "mean_reward", "verified": false}]}]}]}
reinforcement-learning
ORromu/rl_course_vizdoom_health_gathering_supreme
[ "sample-factory", "tensorboard", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
2024-02-13T18:46:03+00:00
[]
[]
TAGS #sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
A(n) APPO model trained on the doom_health_gathering_supreme environment. This model was trained using Sample-Factory 2.0: URL Documentation for how to use Sample-Factory can be found at URL ## Downloading the model After installing Sample-Factory, download the model with: ## Using the model To run the model after download, use the 'enjoy' script corresponding to this environment: You can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag. See URL for more details ## Training with this model To continue training with this model, use the 'train' script corresponding to this environment: Note, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at.
[ "## Downloading the model\n\nAfter installing Sample-Factory, download the model with:", "## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details", "## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at." ]
[ "TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "## Downloading the model\n\nAfter installing Sample-Factory, download the model with:", "## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details", "## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at." ]
[ 34, 19, 59, 67 ]
[ "passage: TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n## Downloading the model\n\nAfter installing Sample-Factory, download the model with:## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at." ]
[ -0.162887305021286, -0.07949446886777878, 0.0013769814977422357, 0.0244897473603487, 0.13643795251846313, 0.08826540410518646, 0.13243556022644043, 0.07938782125711441, 0.19449298083782196, 0.07451266050338745, 0.12160012871026993, 0.06742649525403976, 0.02505551464855671, 0.31084391474723816, 0.08655242621898651, -0.18235880136489868, 0.031082456931471825, -0.06436605006456375, -0.02882574498653412, 0.05590416118502617, 0.050910040736198425, -0.06422623991966248, 0.11641133576631546, -0.05714287608861923, -0.15497641265392303, 0.08288847655057907, 0.008126083761453629, 0.03596968948841095, 0.12199652194976807, -0.007729834411293268, 0.06358569860458374, 0.02508161962032318, 0.09885215014219284, -0.08979995548725128, 0.05817115306854248, 0.037268251180648804, -0.005583701189607382, 0.0697544738650322, -0.02916712686419487, 0.01197513286024332, 0.20552261173725128, 0.051445573568344116, -0.014811687171459198, 0.0707944929599762, -0.04854035750031471, 0.005004523321986198, 0.024828260764479637, 0.08118943125009537, 0.1108563020825386, -0.013300174847245216, -0.015604399144649506, 0.2098497599363327, -0.045419543981552124, 0.030687451362609863, 0.1803472340106964, -0.13901305198669434, -0.00587898213416338, 0.3598267436027527, 0.13591337203979492, 0.07389762997627258, -0.05572221428155899, 0.065569669008255, 0.12957775592803955, -0.013377981260418892, -0.022062024101614952, -0.037468962371349335, 0.01014290377497673, 0.02470328100025654, -0.08271043002605438, -0.03898613899946213, 0.18779566884040833, 0.027798498049378395, -0.0647122785449028, -0.11388745903968811, -0.08383605629205704, -0.01143614575266838, -0.08729266375303268, -0.06047317758202553, 0.061255209147930145, 0.06450130045413971, -0.05541218817234039, -0.16354843974113464, -0.08759765326976776, -0.14808951318264008, 0.09711641818284988, -0.018818290904164314, 0.020023507997393608, 0.039053402841091156, -0.13240769505500793, 0.13932685554027557, -0.12239529192447662, -0.005040881223976612, -0.00391974626109004, -0.10012788325548172, -0.0298643596470356, -0.02757178619503975, -0.06954579800367355, -0.08072661608457565, 0.06621979922056198, 0.1397300660610199, 0.1075919046998024, 0.04457515478134155, -0.016096504405140877, 0.0929836705327034, 0.0659836158156395, 0.015487046912312508, -0.046446919441223145, -0.03190334141254425, 0.06750229746103287, 0.09463070333003998, -0.0025161339435726404, -0.04405781999230385, -0.12502750754356384, 0.004669501446187496, -0.05889439582824707, 0.07438734918832779, -0.01944235898554325, 0.09347380697727203, 0.0012449703644961119, -0.0658751055598259, 0.09675891697406769, -0.056166794151067734, -0.015024078078567982, 0.05717969685792923, -0.09829384088516235, -0.044000294059515, 0.02636338584125042, -0.018662840127944946, 0.02191256918013096, -0.08697114139795303, -0.1281215101480484, -0.0406981036067009, -0.15496762096881866, -0.0733695924282074, 0.020342092961072922, -0.10162562131881714, 0.040819648653268814, -0.08701786398887634, -0.27291807532310486, -0.016108427196741104, 0.05915366858243942, 0.0003154690202791244, 0.03663148358464241, -0.06209208071231842, 0.0267410296946764, -0.030988745391368866, -0.013702943921089172, 0.12538094818592072, -0.04706621542572975, 0.005733184050768614, 0.02853262610733509, 0.09092917293310165, 0.029396481812000275, -0.011824010871350765, -0.09237373620271683, 0.03002769686281681, -0.1866937130689621, 0.0038047281559556723, -0.051012441515922546, 0.14028684794902802, -0.07785230129957199, -0.0034444157499819994, -0.07691079378128052, 0.06912831217050552, 
0.052552226930856705, 0.21963854134082794, -0.22059281170368195, -0.09743031859397888, 0.1902308464050293, -0.09678838402032852, -0.1949385702610016, 0.06732125580310822, -0.03079940192401409, 0.20069970190525055, 0.02597416751086712, 0.1891578733921051, 0.00020795770979020745, -0.25584760308265686, 0.035303130745887756, 0.07686726003885269, -0.2078019231557846, -0.11653494834899902, 0.00783967413008213, 0.04216665402054787, -0.050144799053668976, 0.023388857021927834, -0.07392873615026474, 0.1217033788561821, -0.023950038477778435, -0.021695949137210846, -0.009935722686350346, -0.06940963864326477, -0.039610356092453, 0.012346661649644375, 0.06086154654622078, -0.02202412113547325, -0.025860905647277832, -0.05173748731613159, 0.16720648109912872, -0.0795547217130661, 0.011736705899238586, -0.11241740733385086, 0.1497063785791397, 0.007124151568859816, 0.025635361671447754, -0.0980280190706253, -0.014672551304101944, 0.044151511043310165, 0.08621654659509659, 0.011970171704888344, 0.1326037049293518, 0.06774137914180756, 0.01454958226531744, 0.042493220418691635, -0.004039871972054243, -0.0012205307139083743, -0.10230473428964615, -0.05593033879995346, -0.11311958730220795, -0.11286478489637375, -0.09429361671209335, 0.08868816494941711, -0.20066434144973755, 0.05826579034328461, -0.15120604634284973, 0.047645486891269684, 0.038803353905677795, -0.07772190868854523, 0.05121537670493126, -0.08661998063325882, -0.021283775568008423, -0.08784573525190353, 0.0805407464504242, -0.014386715367436409, -0.08415807038545609, 0.006313080433756113, -0.09094364196062088, -0.08295580744743347, 0.09175937622785568, 0.013830476440489292, 0.0026490744203329086, -0.1170414388179779, -0.04695970565080643, 0.001149212708696723, 0.03873389959335327, -0.0591595321893692, 0.08649469166994095, 0.06776818633079529, 0.09646541625261307, -0.09070473909378052, 0.03797374665737152, -0.020416714251041412, -0.06236580014228821, -0.045745182782411575, 0.014070805162191391, 0.1767948418855667, -0.022993814200162888, -0.01734299771487713, -0.005982444155961275, -0.048861317336559296, 0.20095843076705933, -0.018403954803943634, -0.11935548484325409, 0.0030399553943425417, -0.01395543571561575, -0.017944620922207832, 0.11660698801279068, -0.13726668059825897, -0.05182260647416115, 0.030854813754558563, -0.06529976427555084, 0.10216285288333893, -0.08242622762918472, -0.0392029769718647, -0.05685178562998772, -0.043409593403339386, 0.046979792416095734, 0.12330524623394012, -0.07290767133235931, -0.009151018224656582, -0.047789376229047775, -0.03510203957557678, -0.025379952043294907, -0.05724980682134628, -0.11478709429502487, 0.1582695096731186, 0.002751561114564538, -0.09990474581718445, -0.17415542900562286, -0.08029486984014511, -0.03834356367588043, 0.05337152257561684, -0.034037429839372635, -0.04430336132645607, -0.01500723510980606, -0.07299388945102692, 0.1465158462524414, 0.063304103910923, -0.0472191721200943, -0.01852818764746189, 0.08560720086097717, 0.04456184431910515, -0.15394946932792664, 0.007078593596816063, -0.08948076516389847, -0.08794131129980087, 0.03091353550553322, -0.08061819523572922, 0.012820594012737274, 0.11341627687215805, 0.03525753691792488, 0.02826494723558426, 0.01035099383443594, 0.23537762463092804, -0.0369284451007843, -0.01093987375497818, 0.19019025564193726, 0.0682438537478447, 0.020443644374608994, 0.055847786366939545, 0.027420951053500175, -0.15370461344718933, 0.10424364358186722, 0.012530675157904625, -0.044538769870996475, -0.10689681768417358, -0.04666181653738022, 
-0.03360101953148842, 0.09803235530853271, 0.12185155600309372, 0.03158954530954361, 0.025155838578939438, 0.096546471118927, 0.02187134325504303, -0.0098390718922019, -0.11183010786771774, 0.05996714532375336, -0.1770814210176468, -0.043808963149785995, 0.00898060668259859, -0.028755301609635353, 0.00010461114288773388, 0.0659034252166748, 0.026660064235329628, 0.12833580374717712, 0.0295290257781744, 0.06181740015745163, 0.0663255974650383, 0.10200989991426468, 0.01538698747754097, 0.1999037265777588, -0.06215142831206322, -0.1075027585029602, -0.03758005052804947, -0.04118350148200989, -0.11916319280862808, 0.12439136207103729, 0.1381523460149765, -0.030515994876623154, -0.06625506281852722, 0.07200724631547928, 0.014589293859899044, 0.08729344606399536, 0.08250882476568222, -0.29115065932273865, -0.034177567809820175, 0.031450141221284866, 0.01114452164620161, -0.04308335855603218, 0.010566305369138718, 0.10542299598455429, -0.07616783678531647, -0.09982791543006897, -0.03972722589969635, 0.1055394783616066, 0.08046542853116989, 0.03702867403626442, -0.10841067880392075, 0.20128826797008514, -0.01744360849261284, 0.07004447281360626, -0.07662706822156906, 0.1728198230266571, 0.018701205030083656, 0.05943213775753975, -0.07497778534889221, -0.009592941962182522, 0.1228223443031311, 0.03374773636460304, 0.09092900156974792, -0.0056656887754797935, -0.09995020180940628, -0.13336431980133057, -0.1216202825307846, 0.024986369535326958, -0.000090524394181557, -0.08169890940189362, 0.03341596573591232, -0.016717763617634773, 0.017487963661551476, -0.0027857583481818438, 0.23440547287464142, -0.18267135322093964, 0.012482558377087116, -0.054521817713975906, 0.02707577496767044, -0.04300008341670036, -0.0709642544388771, -0.027162717655301094, 0.060507629066705704, 0.09744840115308762, 0.07921962440013885, 0.030401866883039474, -0.07419665157794952, 0.1431404948234558, 0.06514685600996017, -0.058246973901987076, -0.01524845976382494, 0.01951364241540432, 0.1256532073020935, -0.07438289374113083, -0.10393836349248886, 0.10585980117321014, -0.11736445128917694, 0.008749126456677914, -0.05019083246588707, 0.04299405962228775, 0.02305823378264904, 0.011290842667222023, 0.007447924464941025, -0.04279239848256111, 0.0015383695717900991, -0.06904047727584839, 0.0778660774230957, 0.020559091120958328, -0.0047941361553967, -0.0006717707728967071, -0.16239388287067413, 0.08390985429286957, -0.04138755425810814, 0.052877847105264664, 0.1489589661359787, 0.27864590287208557, -0.02386910282075405, 0.030926240608096123, 0.1617380678653717, -0.01897917501628399, -0.2491649091243744, 0.04654841497540474, 0.014908025041222572, 0.10310175269842148, 0.04640066251158714, -0.19236695766448975, 0.11111847311258316, 0.009474517777562141, -0.02225719392299652, 0.009804603643715382, -0.24880149960517883, -0.13740544021129608, 0.17525193095207214, 0.06902051717042923, 0.15983323752880096, -0.03665107116103172, -0.013587141409516335, -0.061109546571969986, -0.03419603407382965, -0.026354335248470306, -0.12708203494548798, 0.12749767303466797, -0.017607107758522034, 0.047745801508426666, 0.027817612513899803, -0.07676684111356735, 0.12058744579553604, -0.017944786697626114, 0.13344953954219818, -0.017018258571624756, -0.031023232266306877, 0.042466819286346436, -0.09033756703138351, 0.1662607043981552, -0.10233280807733536, 0.057950668036937714, -0.11091876775026321, -0.03109682910144329, -0.015322481282055378, 0.15654151141643524, 0.005544521380215883, -0.0855189636349678, -0.041066281497478485, 0.04975702613592148, 
-0.05784251168370247, 0.05022609233856201, -0.0021613158751279116, -0.03506873920559883, 0.022246064618229866, 0.08415499329566956, 0.040208954364061356, -0.10403558611869812, -0.011038471013307571, 0.03089289739727974, 0.01896476000547409, 0.09993185102939606, -0.20835483074188232, -0.020152123644948006, 0.019231827929615974, -0.015702085569500923, 0.13085414469242096, 0.04400704801082611, -0.08080117404460907, 0.027568496763706207, 0.13726983964443207, -0.061186157166957855, -0.030986590310931206, -0.04847807064652443, -0.016679393127560616, -0.12794725596904755, -0.01594163477420807, 0.057148490101099014, -0.04251079633831978, 0.02512725070118904, -0.03424951806664467, 0.0004248716577421874, -0.10717252641916275, 0.07036283612251282, 0.06859682500362396, 0.0642281174659729, -0.07167360186576843, 0.09394960850477219, -0.07811970263719559, 0.014289900660514832, 0.03734226152300835, 0.045441556721925735, -0.06931920349597931, -0.06820165365934372, -0.05322124809026718, 0.27575042843818665, -0.024388493970036507, -0.02025510184466839, -0.06021025776863098, 0.11942195147275925, -0.057836465537548065, -0.06673881411552429, 0.08716115355491638, -0.007450808770954609, -0.059019722044467926, 0.022327717393636703, -0.0734894648194313, -0.014457973651587963, 0.04693116992712021, 0.016375891864299774, -0.11610891669988632, 0.1136312261223793, 0.031648989766836166, 0.02891513518989086, -0.09186926484107971, -0.0486464723944664, -0.12123195827007294, 0.0032020595390349627, -0.025323880836367607, -0.06051601842045784, -0.07913094758987427, -0.0425749197602272, 0.049642790108919144, 0.018434861674904823, -0.08444267511367798, -0.0022111251018941402, -0.12617166340351105, 0.006370943505316973, 0.006689207162708044, 0.10316617041826248, -0.06351965665817261, 0.04670397937297821, 0.10049878805875778, -0.07692139595746994, 0.09893755614757538, 0.0846271738409996, -0.00729260453954339, 0.08929292112588882, -0.20261284708976746, -0.02319980226457119, 0.047821637243032455, 0.055264540016651154, 0.03154374286532402, 0.06104309484362602, 0.013487739488482475, -0.05460033565759659, 0.04538526386022568, -0.03539090231060982, 0.0028435050044208765, -0.09104080498218536, 0.09713591635227203, 0.009731475263834, -0.009716489352285862, -0.060456521809101105, -0.01384128537029028, 0.01817488856613636, 0.10404353588819504, 0.09692291915416718, -0.07237115502357483, -0.0035003575030714273, -0.11786255985498428, 0.024597108364105225, 0.02565017342567444, 0.010576808825135231, 0.03638135641813278, -0.11692339926958084, 0.03729743883013725, -0.05475534871220589, 0.19700418412685394, 0.019796879962086678, -0.10531783103942871, -0.008661900646984577, 0.07250577956438065, 0.17378750443458557, -0.006129021290689707, 0.21011123061180115, 0.05919691175222397, 0.09556611627340317, 0.0324610099196434, 0.11373614519834518, 0.11542147397994995, 0.004254546947777271, 0.10733281821012497, 0.0500684529542923, -0.04822303727269173, 0.14306919276714325, 0.032827045768499374, -0.017670227214694023, 0.0304852481931448, 0.04704435542225838, -0.03187015652656555, 0.02075354754924774, -0.06440161913633347, 0.11196915805339813, 0.13514995574951172, -0.08471442013978958, -0.0081911850720644, 0.04797748476266861, -0.0438203290104866, -0.1532401293516159, -0.08671712130308151, -0.024648865684866905, -0.2236001342535019, 0.08533021807670593, -0.06946314871311188, -0.13578248023986816, 0.019155733287334442, 0.013867083936929703, -0.028145823627710342, 0.11776147037744522, -0.07801362872123718, -0.03346126526594162, 0.020983682945370674, 
-0.039618294686079025, -0.09754771739244461, -0.09402462840080261, -0.07874704152345657, 0.03500581532716751, -0.04535633698105812, 0.025271590799093246, -0.05421067774295807, 0.015182215720415115, 0.10334893316030502, -0.04038224741816521, -0.041323766112327576, -0.0359976626932621, -0.035855069756507874, -0.11793428659439087, 0.025968458503484726, 0.044103916734457016, -0.03597194701433182, -0.05585090070962906, 0.17637495696544647, -0.04257858544588089, -0.01666315644979477, -0.1211012676358223, 0.14332374930381775, -0.04330325871706009, 0.03261799365282059, -0.10366860777139664, -0.08559805154800415, -0.10071583092212677, 0.27439257502555847, 0.2784624397754669, -0.14349330961704254, -0.009759977459907532, 0.02939503826200962, 0.004204166121780872, -0.14250165224075317, 0.14376720786094666, 0.01570971868932247, -0.024460898712277412, -0.027595078572630882, 0.026391539722681046, -0.007621914613991976, -0.0827714279294014, -0.03114704228937626, -0.05752136558294296, -0.006779014132916927, -0.05148708075284958, -0.034257955849170685, 0.06298708915710449, -0.12136059254407883, -0.09091135859489441, -0.05560125410556793, -0.0083417734131217, -0.03344108536839485, -0.07473809272050858, -0.019548200070858, 0.07662302255630493, 0.14781777560710907, -0.05502733215689659, 0.06005467101931572, -0.004367031157016754, -0.04969286173582077, -0.13970479369163513, -0.13660922646522522, 0.05449144169688225, -0.129489928483963, 0.26909253001213074, -0.050524767488241196, -0.05207161232829094, 0.041712693870067596, -0.03221052139997482, -0.05838879942893982, 0.020522039383649826, 0.009778409264981747, -0.05078497156500816, -0.029240628704428673, 0.09255361557006836, -0.033305004239082336, 0.009149706922471523, -0.022496739402413368, -0.22135144472122192, 0.0034119023475795984, -0.05107501149177551, 0.028507398441433907, -0.12569822371006012, 0.06501629203557968, -0.09348012506961823, 0.12403472512960434, 0.07595156878232956, -0.01166640967130661, -0.036088403314352036, -0.04733064025640488, 0.1257045865058899, 0.08392459154129028, -0.02910126931965351, -0.0870935395359993, -0.16758979856967926, -0.004611360374838114, -0.0011314527364447713, -0.08687946200370789, -0.23090760409832, -0.008421163074672222, -0.031696807593107224, 0.0109195401892066, -0.00838692206889391, 0.12826944887638092, 0.14749252796173096, 0.05249129980802536, 0.016358694061636925, -0.12719306349754333, 0.041898638010025024, 0.08496948331594467, -0.15762199461460114, -0.1707899123430252 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
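The card above leaves its "How to Get Started" section as a placeholder. As a minimal, non-authoritative sketch: assuming the repository id recorded later in this row (rh-research/clancy-decodable-texts-tokenizer) is a tokenizer-only 🤗 Transformers repo, as its name suggests, it could be loaded and exercised like this (the sample sentence is purely illustrative):

```python
from transformers import AutoTokenizer

# Assumption: the repo ships tokenizer files only; nothing in the card confirms this.
tokenizer = AutoTokenizer.from_pretrained("rh-research/clancy-decodable-texts-tokenizer")

# Illustrative round trip: encode a sentence, inspect the ids, decode it back.
encoded = tokenizer("Hello, world!")
print(encoded["input_ids"])
print(tokenizer.decode(encoded["input_ids"]))
```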
{"library_name": "transformers", "tags": []}
null
rh-research/clancy-decodable-texts-tokenizer
[ "transformers", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T18:47:31+00:00
[ "1910.09700" ]
[]
TAGS #transformers #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.08389580249786377, 0.19830818474292755, -0.0013316317927092314, 0.02313883788883686, 0.11396584659814835, 0.01961737498641014, 0.053626976907253265, 0.14538456499576569, 0.0060051376931369305, 0.10656800121068954, 0.066679947078228, 0.09131570905447006, 0.09678101539611816, 0.20042605698108673, 0.04371999576687813, -0.17659740149974823, 0.010636410675942898, -0.06930278241634369, -0.010073255747556686, 0.11651819199323654, 0.141214057803154, -0.10151198506355286, 0.07627976685762405, -0.03319970890879631, -0.02870541252195835, -0.0070160143077373505, -0.07769215852022171, -0.05755697935819626, 0.07573003321886063, 0.054863471537828445, 0.04207949340343475, -0.0008347301045432687, 0.08447454124689102, -0.2674994468688965, 0.013753628358244896, 0.07452993094921112, 0.010659529827535152, 0.05990942195057869, 0.07833302766084671, -0.04036625102162361, 0.12881849706172943, -0.06320446729660034, 0.13035163283348083, 0.0906217098236084, -0.0681561604142189, -0.24378153681755066, -0.08239314705133438, 0.06505522131919861, 0.12533815205097198, 0.07694927603006363, -0.02823091857135296, 0.16422191262245178, -0.07247646898031235, 0.019290022552013397, 0.09481704235076904, -0.1151006743311882, -0.060644298791885376, 0.08318385481834412, 0.14101974666118622, 0.10340547561645508, -0.1255619376897812, -0.012289565056562424, 0.04275871813297272, 0.045979104936122894, 0.07389909774065018, 0.011339850723743439, 0.1143413558602333, 0.05629947781562805, -0.13526225090026855, -0.05700986459851265, 0.14547574520111084, 0.023872992023825645, -0.057064127177000046, -0.2138909548521042, -0.002902575535699725, -0.07730814069509506, -0.011685127392411232, -0.06846728920936584, 0.0291305985301733, -0.01194276288151741, 0.060226380825042725, -0.0496203787624836, -0.09797755628824234, -0.046314824372529984, 0.1015089675784111, 0.054820988327264786, 0.011354796588420868, -0.01489334274083376, 0.03576440364122391, 0.13432876765727997, 0.04213530570268631, -0.10012737661600113, -0.07065672427415848, -0.0701170489192009, -0.09620913118124008, -0.03947552293539047, 0.04272124543786049, 0.020167991518974304, 0.042202774435281754, 0.2283228635787964, 0.024096308276057243, 0.05459817871451378, 0.029667891561985016, 0.0026177873369306326, 0.03211980313062668, 0.1073630079627037, -0.041210614144802094, -0.188126802444458, -0.03292805701494217, 0.0931866466999054, -0.009821015410125256, -0.028658604249358177, -0.033444397151470184, 0.035014089196920395, 0.08379437029361725, 0.11821532249450684, 0.08875755965709686, -0.012828069739043713, -0.037612639367580414, -0.03493109717965126, 0.2115669697523117, -0.14141373336315155, 0.045799970626831055, -0.022097334265708923, -0.018195297569036484, -0.06905751675367355, 0.030103791505098343, 0.01831657998263836, -0.003142025787383318, 0.06966056674718857, -0.061253178864717484, -0.05794486775994301, -0.11518853157758713, -0.045523155480623245, 0.04711875319480896, -0.024105608463287354, -0.024469668045639992, -0.07765042781829834, -0.11219723522663116, -0.06417357176542282, 0.06612563133239746, -0.04156653955578804, -0.03974827378988266, 0.005308232270181179, -0.07131324708461761, 0.008387917652726173, 0.008993842639029026, 0.12122467905282974, -0.030063031241297722, 0.05833350867033005, -0.002476902212947607, 0.05916252359747887, 0.10643328726291656, 0.03227818012237549, -0.08492200076580048, 0.057466037571430206, -0.20633617043495178, 0.08371785283088684, -0.11420095711946487, 0.034276340156793594, -0.17048145830631256, -0.024183684960007668, 0.008447963744401932, 
0.023597201332449913, 0.023726604878902435, 0.1338067352771759, -0.2097422182559967, -0.016196569427847862, 0.14133213460445404, -0.09649793803691864, -0.12422871589660645, 0.07990546524524689, -0.03459475561976433, 0.1747698187828064, 0.038475677371025085, -0.019652999937534332, 0.09909367561340332, -0.15559963881969452, -0.05852397903800011, -0.026064254343509674, -0.008927824907004833, 0.08823978155851364, 0.07542291283607483, -0.05844951793551445, 0.02285866066813469, 0.02562655322253704, -0.04727208614349365, -0.0268824752420187, -0.05256075784564018, -0.10127434879541397, -0.023140445351600647, -0.09642518311738968, 0.026515161618590355, 0.000058677000197349116, -0.07310442626476288, -0.028560271486639977, -0.17347893118858337, -0.02563360333442688, 0.10103316605091095, 0.004820956848561764, -0.007559072691947222, -0.08540112525224686, 0.022149885073304176, -0.05362366884946823, -0.006164622958749533, -0.16996455192565918, -0.03558015450835228, 0.051895126700401306, -0.14917676150798798, 0.015460150316357613, -0.07327745854854584, 0.07047311216592789, 0.02098717913031578, -0.05859505757689476, -0.03108096309006214, 0.0007694467785768211, 0.004292082041501999, -0.06229274719953537, -0.1903683841228485, -0.058886781334877014, -0.041500482708215714, 0.15720732510089874, -0.24841000139713287, 0.0300158578902483, 0.03247617185115814, 0.13185922801494598, 0.007058668415993452, -0.06344027817249298, 0.02096918225288391, -0.04676475748419762, -0.050621338188648224, -0.06898977607488632, -0.009901339188218117, -0.014539826661348343, -0.031393732875585556, 0.012980648316442966, -0.14970256388187408, -0.060514215379953384, 0.09452559798955917, 0.11224991828203201, -0.14555825293064117, 0.00204002158716321, -0.0460561066865921, -0.07002599537372589, -0.07487804442644119, -0.0761631652712822, 0.07739497721195221, 0.044650159776210785, 0.049250341951847076, -0.06317461282014847, -0.06234706938266754, 0.023210179060697556, 0.005524294450879097, -0.019023682922124863, 0.0948529988527298, 0.074309803545475, -0.09122881293296814, 0.07973480224609375, 0.08461450785398483, 0.04414684325456619, 0.086973637342453, 0.005991141777485609, -0.11396963149309158, -0.03062884695827961, 0.037754856050014496, 0.024159027263522148, 0.15351562201976776, -0.08692087233066559, 0.030462130904197693, 0.052177220582962036, -0.03854219615459442, 0.03157065063714981, -0.0923321321606636, 0.025362705811858177, 0.021495236083865166, -0.006555700208991766, 0.05864228308200836, -0.018769768998026848, -0.01403577346354723, 0.06336429715156555, 0.05677810311317444, 0.044270504266023636, 0.02595379762351513, -0.02093072421848774, -0.1278371512889862, 0.16537296772003174, -0.09028079360723495, -0.2540280222892761, -0.17074446380138397, 0.015454737469553947, 0.03706491366028786, -0.021728800609707832, 0.039588842540979385, -0.06286025792360306, -0.10237989574670792, -0.09417891502380371, 0.0029635571409016848, 0.023925531655550003, -0.058347854763269424, -0.0817074254155159, 0.060779985040426254, 0.04047083482146263, -0.13689260184764862, 0.0349188968539238, 0.06170675903558731, -0.03042641654610634, 0.0018567070364952087, 0.07321398705244064, 0.12743599712848663, 0.14838241040706635, -0.006730219814926386, -0.012446845881640911, 0.035035960376262665, 0.229813352227211, -0.1490442156791687, 0.10630457103252411, 0.14053207635879517, -0.021705523133277893, 0.06635113060474396, 0.1461038440465927, 0.023231739178299904, -0.07546708732843399, 0.04147516191005707, 0.04027445614337921, -0.04228919371962547, -0.2589097023010254, 
-0.05694316700100899, -0.00946022942662239, -0.07043391466140747, 0.09718906134366989, 0.09238530695438385, 0.11972260475158691, 0.0337289460003376, -0.05568677559494972, -0.025771914049983025, -0.003401360474526882, 0.114128477871418, -0.027640055865049362, -0.004564122296869755, 0.07965842634439468, -0.05878787487745285, 0.011684526689350605, 0.09941446036100388, 0.019347423687577248, 0.17601320147514343, 0.02533329278230667, 0.10681075602769852, 0.06725578010082245, 0.09347675740718842, -0.0015635732561349869, 0.034774236381053925, 0.05337131395936012, 0.022044572979211807, 0.010453542694449425, -0.09408048540353775, -0.012431944720447063, 0.13713060319423676, 0.019816776737570763, 0.009031654335558414, 0.008926562033593655, -0.01010479498654604, 0.03131420537829399, 0.20501568913459778, 0.0009575071162544191, -0.22537250816822052, -0.09500737488269806, 0.059459153562784195, -0.06931101530790329, -0.143676295876503, -0.02094252221286297, 0.030270220711827278, -0.17292405664920807, 0.016790566965937614, -0.0316389761865139, 0.09112390875816345, -0.07145322859287262, -0.028050832450389862, 0.06891903281211853, 0.07569212466478348, -0.012108199298381805, 0.07973295450210571, -0.19069278240203857, 0.12254468351602554, 0.03037673607468605, 0.08605273067951202, -0.11708726733922958, 0.07849059253931046, -0.0019813794642686844, -0.014807495288550854, 0.17999744415283203, -0.014062200672924519, -0.0586031936109066, -0.08878950774669647, -0.08704045414924622, -0.011727320961654186, 0.10361312329769135, -0.09322915226221085, 0.09586969763040543, -0.02775636687874794, -0.03705112263560295, 0.012418309226632118, -0.10469507426023483, -0.1636953055858612, -0.18679304420948029, 0.06244563311338425, -0.07802703976631165, 0.012347841635346413, -0.11227322369813919, -0.06334327906370163, -0.01575082167983055, 0.23160123825073242, -0.16648635268211365, -0.07049825042486191, -0.1498587429523468, -0.03997112438082695, 0.17463743686676025, -0.042160745710134506, 0.06849376112222672, -0.021383514627814293, 0.1873992383480072, -0.008081548847258091, -0.013158116489648819, 0.06569221615791321, -0.09637628495693207, -0.16879262030124664, -0.05748843029141426, 0.14160962402820587, 0.10863390564918518, 0.05731578543782234, -0.0038195757661014795, 0.013171887956559658, -0.03383830562233925, -0.09896382689476013, 0.013824623078107834, 0.13817466795444489, 0.0034514935687184334, 0.00682973163202405, -0.03995988517999649, -0.07027145475149155, -0.05825701728463173, -0.07912654429674149, 0.057147104293107986, 0.187900573015213, -0.09512355923652649, 0.1602867990732193, 0.12431421875953674, -0.06468851119279861, -0.2306901067495346, 0.03996593505144119, 0.04701630026102066, 0.007666614837944508, 0.022401191294193268, -0.19138796627521515, 0.09788824617862701, 0.0009011493530124426, -0.06807263940572739, 0.14616990089416504, -0.16564498841762543, -0.1461436152458191, 0.08002161979675293, 0.025075770914554596, -0.22560662031173706, -0.14821304380893707, -0.1037549376487732, -0.03735695406794548, -0.13707835972309113, 0.048581719398498535, 0.02614329755306244, 0.019834673032164574, 0.025222565978765488, 0.005338077899068594, 0.029657263308763504, -0.07272187620401382, 0.1870686560869217, -0.020297454670071602, 0.0072362530045211315, -0.050640691071748734, -0.04617878794670105, 0.09227550774812698, -0.06150037795305252, 0.11741586774587631, 0.018679620698094368, 0.018796883523464203, -0.1431548148393631, -0.049209367483854294, -0.060803934931755066, 0.04456847906112671, -0.07284719496965408, -0.09393193572759628, 
-0.04137463867664337, 0.08888561278581619, 0.07211937010288239, -0.032792408019304276, -0.0027768779546022415, -0.07569456845521927, 0.09405932575464249, 0.184477761387825, 0.17357055842876434, 0.009977072477340698, -0.07020942866802216, 0.024555526673793793, -0.042279548943042755, 0.03349342197179794, -0.24652716517448425, 0.03456863760948181, 0.066053606569767, 0.03803660348057747, 0.08509242534637451, -0.016836483031511307, -0.1781480610370636, -0.04086102172732353, 0.08498652279376984, -0.06206206604838371, -0.19876568019390106, -0.02703288197517395, 0.08424776047468185, -0.20383712649345398, -0.032998621463775635, 0.041543323546648026, -0.03834589570760727, -0.02396267279982567, -0.002415500348433852, 0.06396626681089401, -0.008327016606926918, 0.12156640738248825, 0.06747189164161682, 0.10266115516424179, -0.09284433722496033, 0.08920657634735107, 0.10416955500841141, -0.09140542894601822, 0.03545991703867912, 0.10264154523611069, -0.05670900270342827, -0.04460543021559715, 0.033935222774744034, 0.05925208330154419, -0.028357384726405144, -0.06409841030836105, -0.000502707262057811, -0.0359574519097805, 0.04993389546871185, 0.08058220148086548, 0.036113787442445755, -0.01202210783958435, 0.06544706225395203, 0.028145326301455498, -0.11693570017814636, 0.10949387401342392, 0.04405685141682625, 0.04509059712290764, -0.07182393968105316, -0.012280966155230999, 0.015999672934412956, 0.032540347427129745, -0.019734015688300133, -0.014576527290046215, -0.03146412968635559, -0.007561005651950836, -0.1553635597229004, -0.02064543403685093, -0.06516171246767044, 0.006067827809602022, 0.022207623347640038, -0.03830232471227646, -0.012014663778245449, 0.01381110493093729, -0.07979435473680496, -0.07571027427911758, -0.01700955256819725, 0.08539021760225296, -0.1381402313709259, 0.006627439055591822, 0.07182712107896805, -0.10980239510536194, 0.07347989827394485, -0.0048679932951927185, 0.017079560086131096, 0.010923396795988083, -0.11654401570558548, 0.04386281594634056, -0.005810429807752371, 0.01551580335944891, 0.022556742653250694, -0.171111062169075, 0.011553828604519367, -0.038553636521101, -0.03114982508122921, 0.011926400475203991, -0.025060230866074562, -0.11875922232866287, 0.08676479011774063, -0.028097305446863174, -0.037512701004743576, -0.03292486071586609, 0.06296087801456451, 0.08736220002174377, -0.011740099638700485, 0.09667140990495682, -0.025766119360923767, 0.04818311333656311, -0.1756584197282791, -0.01910574547946453, -0.050167568027973175, 0.02537350542843342, -0.01759655587375164, -0.0070639788173139095, 0.055272240191698074, -0.004191063344478607, 0.20991376042366028, -0.03921036794781685, 0.1548677533864975, 0.05199402943253517, -0.009925156831741333, 0.010884369723498821, 0.05032730847597122, 0.06423956155776978, 0.031145188957452774, 0.00853167474269867, 0.04660189896821976, -0.004552975296974182, -0.020357951521873474, -0.13699717819690704, 0.02791593410074711, 0.16117429733276367, 0.061918217688798904, 0.0392887257039547, 0.03704594820737839, -0.1422400325536728, -0.09538721293210983, 0.10306388139724731, -0.0331864058971405, 0.014331420883536339, -0.08317886292934418, 0.17621558904647827, 0.12328410148620605, -0.1574767529964447, 0.0577850341796875, -0.07234696298837662, -0.05066767707467079, -0.1024852767586708, -0.11832084506750107, -0.06293155997991562, -0.06027044355869293, -0.004747506696730852, -0.042489297688007355, 0.05734556168317795, 0.026751231402158737, -0.003270963439717889, -0.006759525276720524, 0.12665949761867523, -0.0249644722789526, 
-0.004145825747400522, 0.04152364656329155, 0.0326087586581707, 0.019319625571370125, -0.05872373282909393, 0.017997145652770996, 0.018602589145302773, 0.022180357947945595, 0.06835069507360458, 0.0260987039655447, -0.059317342936992645, 0.044286735355854034, 0.00319746439345181, -0.11313364654779434, 0.018146557733416557, -0.00002245741598017048, -0.05020225793123245, 0.13557326793670654, 0.04076748713850975, 0.01548024732619524, -0.029270920902490616, 0.24342355132102966, -0.07199113070964813, -0.08681939542293549, -0.13965600728988647, 0.11511493474245071, -0.023563209921121597, 0.03755274787545204, 0.016542524099349976, -0.12659503519535065, 0.011511262506246567, 0.18531471490859985, 0.12824349105358124, 0.012459068559110165, -0.007656481582671404, 0.05736639350652695, -0.0007639875984750688, -0.05985576659440994, 0.05051197111606598, 0.0664999932050705, 0.16097788512706757, -0.09069112688302994, 0.0652846097946167, -0.008405503816902637, -0.0831485390663147, -0.027498632669448853, 0.11705785244703293, -0.022675158455967903, 0.02148384228348732, -0.03778035193681717, 0.11204422265291214, -0.052532415837049484, -0.2719486355781555, 0.02952493168413639, -0.09503202140331268, -0.13993041217327118, -0.02591860294342041, 0.041448429226875305, -0.03349510580301285, 0.01577647216618061, 0.06254769116640091, -0.045389387756586075, 0.18837277591228485, 0.025987716391682625, -0.08679025620222092, -0.07755549252033234, 0.05874146893620491, -0.08695939928293228, 0.2789687216281891, 0.003863075515255332, 0.04782010242342949, 0.12108923494815826, -0.03053574077785015, -0.18664880096912384, 0.014769754372537136, 0.11989909410476685, -0.09114406257867813, 0.07780203968286514, 0.18139931559562683, -0.005561648402363062, 0.12649618089199066, 0.04705416411161423, -0.03877115994691849, 0.03976387158036232, -0.02721380814909935, -0.03821522742509842, -0.12209630757570267, 0.05661242455244064, -0.0612691193819046, 0.15957388281822205, 0.1158948540687561, -0.05964287370443344, 0.001120698289014399, -0.06126941740512848, 0.06300627440214157, 0.014774397015571594, 0.12115653604269028, 0.018452486023306847, -0.2023056596517563, 0.05087360367178917, -0.03283824771642685, 0.08166342973709106, -0.254973828792572, -0.08186668157577515, 0.07622263580560684, -0.019022729247808456, -0.04275642707943916, 0.12311509251594543, 0.06101066991686821, 0.03676839917898178, -0.03853875398635864, -0.08537755906581879, -0.01412904355674982, 0.15376435220241547, -0.14123432338237762, -0.029574336484074593 ]
null
null
null
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # openhermes-2.5-mistral-7b-losstype-kto-pair This model is a fine-tuned version of [teknium/OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 50 - training_steps: 200 ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
{"license": "apache-2.0", "tags": ["trl", "dpo", "generated_from_trainer"], "base_model": "teknium/OpenHermes-2.5-Mistral-7B", "model-index": [{"name": "openhermes-2.5-mistral-7b-losstype-kto-pair", "results": []}]}
null
DrishtiSharma/openhermes-2.5-mistral-7b-losstype-kto-pair
[ "safetensors", "trl", "dpo", "generated_from_trainer", "base_model:teknium/OpenHermes-2.5-Mistral-7B", "license:apache-2.0", "region:us" ]
2024-02-13T18:48:53+00:00
[]
[]
TAGS #safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us
# openhermes-2.5-mistral-7b-losstype-kto-pair This model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 50 - training_steps: 200 ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
[ "# openhermes-2.5-mistral-7b-losstype-kto-pair\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ "TAGS\n#safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us \n", "# openhermes-2.5-mistral-7b-losstype-kto-pair\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ 50, 50, 6, 12, 8, 3, 128, 4, 33 ]
[ "passage: TAGS\n#safetensors #trl #dpo #generated_from_trainer #base_model-teknium/OpenHermes-2.5-Mistral-7B #license-apache-2.0 #region-us \n# openhermes-2.5-mistral-7b-losstype-kto-pair\n\nThis model is a fine-tuned version of teknium/OpenHermes-2.5-Mistral-7B on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_steps: 50\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ -0.09415564686059952, 0.07628729194402695, -0.0033541577868163586, 0.08454758673906326, 0.1139080747961998, 0.030006181448698044, 0.13886018097400665, 0.1274145096540451, -0.0269387885928154, 0.07426600903272629, 0.022806771099567413, 0.06308356672525406, 0.04953146353363991, 0.15982075035572052, -0.06254827231168747, -0.23798879981040955, 0.015395576134324074, -0.03868567943572998, -0.05469183251261711, 0.08951862901449203, 0.09856434166431427, -0.10088978707790375, 0.04484860226511955, 0.01846626214683056, -0.09775164723396301, -0.0031606433913111687, -0.0021025226451456547, -0.05280708521604538, 0.13090473413467407, -0.031225288286805153, 0.11448590457439423, 0.017781352624297142, 0.1378612518310547, -0.17447100579738617, 0.0032481688540428877, 0.08366350829601288, 0.0346168652176857, 0.08862251043319702, 0.06520573794841766, 0.003134312806650996, 0.02726798877120018, -0.16151636838912964, 0.09783852845430374, 0.009509903378784657, -0.08816990256309509, -0.20288102328777313, -0.13412420451641083, 0.038968686014413834, 0.09746963530778885, 0.07287101447582245, 0.022882569581270218, 0.16873225569725037, -0.04651368036866188, 0.06979396939277649, 0.2671772539615631, -0.2864268720149994, -0.06411579251289368, 0.06949043273925781, 0.06380917131900787, 0.07111608982086182, -0.09780830144882202, -0.02879253774881363, 0.03753522410988808, 0.011146257631480694, 0.09943447262048721, 0.00037938146851956844, -0.06524042040109634, -0.028592722490429878, -0.13493883609771729, -0.017101150006055832, 0.10490331053733826, 0.016097262501716614, -0.05141596496105194, -0.09406180679798126, -0.09824108332395554, -0.07948046177625656, -0.010154924355447292, -0.06452923268079758, 0.05089425668120384, -0.039079997688531876, -0.016692835837602615, -0.045176584273576736, -0.07751861959695816, -0.04531623795628548, -0.022307414561510086, 0.1374679058790207, 0.03433318808674812, 0.022383209317922592, -0.026365557685494423, 0.1287393718957901, -0.040481798350811005, -0.13578525185585022, -0.0054516117088496685, -0.0024730677250772715, -0.07875803112983704, -0.07773203402757645, -0.05191458761692047, -0.055217161774635315, -0.006007078103721142, 0.18333609402179718, -0.09519520401954651, 0.05586770176887512, -0.024657711386680603, 0.0034073053393512964, -0.0486285500228405, 0.11506548523902893, -0.04204891249537468, -0.02706856280565262, 0.020618846639990807, 0.10778887569904327, 0.04518864303827286, -0.0013282891595736146, -0.05143723264336586, -0.057504601776599884, 0.09088234603404999, 0.053020160645246506, -0.03477345034480095, 0.02298920601606369, -0.04244516044855118, -0.03966483846306801, 0.08988352119922638, -0.13339632749557495, 0.02698834240436554, -0.0039583537727594376, -0.0927881971001625, -0.028683224692940712, 0.014406967908143997, 0.039417412132024765, 0.008947585709393024, 0.12043051421642303, -0.06418915838003159, 0.011701556853950024, -0.07787884026765823, -0.06806086003780365, 0.01836840808391571, -0.09674199670553207, -0.02706514671444893, -0.06336083263158798, -0.21382348239421844, -0.02646023966372013, 0.05822934955358505, -0.056080035865306854, -0.010831561870872974, -0.0005043167038820684, -0.07542482018470764, 0.027188336476683617, -0.005094372667372227, 0.10269639641046524, -0.042409829795360565, 0.06589853763580322, -0.03398345410823822, 0.04394654184579849, 0.018304212018847466, 0.012681281194090843, -0.08369667828083038, 0.03648554906249046, -0.16851268708705902, 0.03034573793411255, -0.0660797581076622, 0.01806105673313141, -0.1268366128206253, -0.0832219049334526, 
0.001484443317167461, -0.06723956763744354, 0.08506245911121368, 0.09251885116100311, -0.19861964881420135, 0.013060124590992928, 0.1844567507505417, -0.10124736279249191, -0.08832518011331558, 0.09290798753499985, -0.05147626996040344, 0.04012801870703697, 0.039244458079338074, 0.15508830547332764, 0.08522003889083862, -0.17942777276039124, 0.005792046897113323, -0.026406699791550636, 0.05803591012954712, 0.05643075332045555, 0.06508741527795792, -0.015849607065320015, 0.08088433742523193, -0.008617814630270004, -0.06552977114915848, -0.03223356977105141, -0.06830575317144394, -0.07903873175382614, -0.06487381458282471, -0.07697867602109909, 0.0911644995212555, 0.017134051769971848, 0.03660152852535248, -0.07130290567874908, -0.08764450252056122, 0.09590795636177063, 0.13564686477184296, -0.0490114800632, 0.030377747491002083, -0.06914399564266205, 0.038975730538368225, -0.0026282398030161858, -0.04879973083734512, -0.2054816335439682, -0.11961983889341354, 0.05411935970187187, -0.10479801893234253, -0.008461068384349346, 0.02870260179042816, 0.07443848252296448, 0.07303903251886368, -0.06823208928108215, -0.01476966217160225, -0.07634853571653366, 0.004808804485946894, -0.0978127270936966, -0.19913014769554138, -0.04391177371144295, -0.03992727771401405, 0.1425188183784485, -0.1848963052034378, 0.012720105238258839, -0.008124923333525658, 0.1630503237247467, 0.04853508621454239, -0.032856568694114685, -0.025879377499222755, 0.024898797273635864, 0.00851092953234911, -0.10054883360862732, 0.030757935717701912, -0.003043713979423046, -0.061848320066928864, -0.05329131335020065, -0.16364893317222595, 0.031372103840112686, 0.06599712371826172, 0.08323551714420319, -0.09138612449169159, -0.03663567826151848, -0.07015998661518097, -0.06084412708878517, -0.07833115011453629, 0.0011022230610251427, 0.14404793083667755, 0.015760691836476326, 0.10095582157373428, -0.08323902636766434, -0.08813268691301346, -0.0027278962079435587, -0.002195800421759486, 0.01676998846232891, 0.04213272035121918, 0.013270189054310322, -0.16080349683761597, 0.06658343225717545, 0.13648059964179993, -0.07012204825878143, 0.08553729206323624, -0.05431130528450012, -0.07021427154541016, -0.04239566624164581, 0.020561352372169495, 0.004696371965110302, 0.1175956204533577, -0.009860987775027752, 0.04466622322797775, 0.029535634443163872, 0.03053656592965126, 0.004718427546322346, -0.18488450348377228, -0.00514982407912612, 0.010837653651833534, -0.037463750690221786, -0.0005358250346034765, 0.018104616552591324, 0.0356251485645771, 0.10497308522462845, 0.02396390214562416, -0.02890288084745407, 0.01444972213357687, -0.02383340150117874, -0.08134769648313522, 0.17137561738491058, -0.11600366979837418, -0.054396938532590866, -0.05397653579711914, 0.035296596586704254, -0.005654553417116404, -0.04010225832462311, 0.012339654378592968, -0.04813060164451599, -0.04558864235877991, -0.09202510863542557, -0.07942081987857819, 0.014125313609838486, -0.008864855393767357, 0.08102438598871231, 0.00842027086764574, 0.0783727616071701, -0.1289219707250595, -0.006072251591831446, -0.02935846894979477, -0.07664506137371063, 0.022891715168952942, 0.07503388822078705, 0.05425817891955376, 0.12521126866340637, -0.03655437380075455, -0.002755203749984503, -0.03848887234926224, 0.19789396226406097, -0.0785386934876442, 0.017245251685380936, 0.13241717219352722, -0.025722038000822067, 0.08651597052812576, 0.15811850130558014, 0.048218026757240295, -0.07761082053184509, 0.030361860990524292, 0.05265137180685997, -0.008308405056595802, 
-0.27878016233444214, -0.03388535603880882, -0.024580584838986397, -0.042769595980644226, 0.06081630289554596, 0.02815910615026951, 0.05750228092074394, 0.033299520611763, -0.023778289556503296, 0.014637519605457783, 0.023262828588485718, 0.08209516853094101, 0.09469354897737503, 0.06935256719589233, 0.0923079252243042, -0.015148608013987541, -0.004946056287735701, 0.055893998593091965, 0.013318276964128017, 0.28240859508514404, -0.023446787148714066, 0.10433971136808395, 0.03243432193994522, 0.12365072965621948, -0.0369202084839344, 0.05366448312997818, -0.0027892293874174356, -0.0038309311494231224, 0.00403906824067235, -0.08171252906322479, 0.012016392312943935, 0.04249199107289314, -0.0761013999581337, 0.04656297713518143, -0.049630552530288696, 0.03764680027961731, 0.03155979886651039, 0.2515951991081238, 0.04454389214515686, -0.2672520577907562, -0.06738266348838806, 0.011514516547322273, -0.028379743918776512, -0.024082981050014496, -0.01890391670167446, 0.09520906209945679, -0.12970317900180817, 0.08319965749979019, -0.09492561221122742, 0.06241562217473984, -0.013545085676014423, -0.009537145495414734, 0.09030468761920929, 0.10591775923967361, -0.01021862868219614, 0.04619157314300537, -0.17477470636367798, 0.2420545071363449, 0.01638675108551979, 0.10110192000865936, -0.06393413245677948, 0.025961877778172493, 0.032156068831682205, 0.04818068444728851, 0.08608788996934891, -0.005185425281524658, -0.07034716010093689, -0.14027604460716248, -0.09649675339460373, 0.004279772751033306, 0.13507679104804993, -0.042664848268032074, 0.09463880956172943, -0.04936942458152771, -0.009925850667059422, 0.05572585016489029, -0.023498564958572388, -0.18502937257289886, -0.07431679219007492, 0.05556792393326759, -0.0010209310567006469, -0.049311526119709015, -0.08674432337284088, -0.10751810669898987, -0.06236182898283005, 0.12325798720121384, 0.007498182822018862, -0.05243609473109245, -0.13358280062675476, 0.03598625957965851, 0.1602688580751419, -0.06254969537258148, 0.0482083223760128, 0.03505619242787361, 0.09596521407365799, 0.01766115613281727, -0.0867212787270546, 0.07705368101596832, -0.07743079960346222, -0.21998120844364166, -0.073581762611866, 0.1233406588435173, 0.06215346232056618, 0.04983862116932869, 0.00535545963793993, 0.048660457134246826, 0.03084695339202881, -0.06418824940919876, 0.022551029920578003, 0.10019338130950928, 0.10492197424173355, 0.01642126962542534, -0.06758137792348862, -0.01035385113209486, -0.036536868661642075, -0.03208886459469795, 0.08883287757635117, 0.2556919455528259, -0.08506572246551514, 0.08951257914304733, 0.03480324521660805, -0.0652274563908577, -0.1333221197128296, 0.06813769787549973, 0.12367662042379379, -0.005383449140936136, 0.09093260765075684, -0.14219138026237488, 0.08649400621652603, 0.11020687222480774, -0.05440666526556015, 0.062033846974372864, -0.3437484800815582, -0.14571411907672882, 0.030547725036740303, 0.09918521344661713, 0.029602503404021263, -0.12683792412281036, -0.03658593073487282, -0.008681632578372955, -0.1417877972126007, 0.09890991449356079, -0.12722040712833405, 0.08126861602067947, -0.010191967710852623, 0.05997741222381592, 0.022420315071940422, -0.051381878554821014, 0.13667762279510498, 0.02263065241277218, 0.09578553587198257, -0.0598640963435173, 0.009517188183963299, 0.11006847023963928, -0.07621238380670547, 0.027296114712953568, 0.017417334020137787, 0.06140592321753502, -0.07534836232662201, -0.004907695110887289, -0.06496009230613708, 0.04821264371275902, -0.04733230546116829, -0.07628881931304932, 
-0.04382631555199623, 0.07795432209968567, 0.07800733298063278, -0.046215713024139404, 0.0667785182595253, 0.06730861216783524, 0.12616954743862152, 0.12486351281404495, 0.09516119211912155, 0.04443173110485077, -0.03763735666871071, 0.016932817175984383, -0.015672599896788597, 0.03591959923505783, -0.09644560515880585, 0.025993267074227333, 0.12027396261692047, 0.049420371651649475, 0.10349061340093613, 0.03679405897855759, -0.06078817695379257, 0.013848784379661083, 0.02484068088233471, -0.11194601655006409, -0.12276507169008255, 0.02861921116709709, 0.022299760952591896, -0.11735674738883972, 0.042116835713386536, 0.1273413449525833, -0.049710676074028015, -0.016460711136460304, -0.0018558748997747898, 0.024588417261838913, 0.0067938221618533134, 0.18415260314941406, 0.01857379637658596, 0.055311087518930435, -0.05329513922333717, 0.11713460087776184, 0.08354152739048004, -0.07995761930942535, 0.019609814509749413, 0.05329069495201111, -0.10983770340681076, -0.002803683513775468, 0.06257972866296768, 0.0805298313498497, -0.02329406887292862, -0.024682624265551567, -0.08854049444198608, -0.06706590205430984, 0.02275441400706768, 0.17339977622032166, 0.055103570222854614, -0.008101635612547398, -0.006830768659710884, 0.042707253247499466, -0.1031835600733757, 0.12274375557899475, 0.060252223163843155, 0.09368405491113663, -0.1360054463148117, 0.11052626371383667, -0.008504601195454597, 0.012558961287140846, -0.01231768261641264, 0.037074014544487, -0.05752559006214142, -0.032520413398742676, -0.1235106885433197, 0.010643314570188522, -0.017747534438967705, -0.009920729324221611, -0.026020878925919533, -0.053203780204057693, -0.023193106055259705, 0.04416833817958832, -0.0664018839597702, -0.04553529620170593, 0.005437130574136972, 0.037845510989427567, -0.1499107927083969, -0.044450413435697556, 0.039706017822027206, -0.08831346035003662, 0.09536069631576538, 0.06776225566864014, 0.034301139414310455, 0.010727271437644958, -0.11420886963605881, -0.0016490316484123468, 0.02799701690673828, 0.028398271650075912, 0.04721834883093834, -0.13723206520080566, -0.03299254924058914, -0.04360666871070862, 0.023908443748950958, 0.014315498061478138, 0.054265134036540985, -0.1278737485408783, 0.0014321445487439632, -0.03727377951145172, -0.04896644502878189, -0.04937710985541344, 0.0025050980038940907, 0.09090953320264816, 0.018232053145766258, 0.134674534201622, -0.06853785365819931, 0.060566600412130356, -0.20913110673427582, -0.04533657804131508, 0.01332295686006546, -0.020412856712937355, -0.0860479325056076, -0.0057545010931789875, 0.07495120167732239, -0.06721490621566772, 0.08923127502202988, -0.033086374402046204, 0.0885339081287384, 0.037246882915496826, -0.06570161879062653, -0.027018873021006584, 0.022032974287867546, 0.1649504452943802, 0.048320118337869644, -0.02070780098438263, 0.07001719623804092, -0.033153317868709564, 0.0349532850086689, -0.008536643348634243, 0.21273663640022278, 0.17334017157554626, -0.012874466367065907, 0.022416342049837112, 0.041725773364305496, -0.10862325131893158, -0.17280001938343048, 0.0708780437707901, -0.03545912355184555, 0.0759965032339096, -0.05308929458260536, 0.16709859669208527, 0.10467540472745895, -0.19266442954540253, 0.03746087849140167, -0.04990864545106888, -0.08545932173728943, -0.12093113362789154, -0.05048812925815582, -0.08027942478656769, -0.11160160601139069, 0.009755573235452175, -0.10749182850122452, 0.0647885724902153, 0.10269198566675186, 0.007557902485132217, 0.02219163440167904, 0.15880391001701355, -0.0443430058658123, 
0.017561234533786774, 0.05518759414553642, 0.04482260346412659, -0.0039611137472093105, -0.017642242833971977, -0.08576377481222153, 0.030695363879203796, -0.009738360531628132, 0.04889412596821785, -0.043458689004182816, -0.0003363807045388967, 0.010110621340572834, 0.005968363955616951, -0.07316170632839203, 0.034312620759010315, 0.01776651106774807, 0.02767670340836048, -0.0012045766925439239, 0.06269340217113495, 0.03591826558113098, -0.03744853660464287, 0.28770381212234497, -0.0751434788107872, -0.09253992885351181, -0.1070559099316597, 0.1835961937904358, 0.03306316211819649, -0.02231001853942871, 0.08524934947490692, -0.1268829107284546, -0.0009146155207417905, 0.08372130244970322, 0.1264238953590393, -0.06132224202156067, -0.0019258243264630437, 0.0016972341109067202, -0.01896948553621769, -0.04952584579586983, 0.1002286821603775, 0.07921626418828964, 0.0313524529337883, -0.06893803179264069, -0.008059817366302013, -0.008670013397932053, -0.015027527697384357, -0.08565892279148102, 0.043653666973114014, -0.010990400798618793, -0.0006272666505537927, -0.028222380205988884, 0.08295784890651703, 0.03159837797284126, -0.1616995632648468, 0.03448081389069557, -0.18553079664707184, -0.18594256043434143, 0.00199879496358335, 0.0961449146270752, -0.02653377503156662, 0.04235105589032173, -0.01326674409210682, -0.02659304067492485, 0.1320308893918991, -0.013988188467919827, -0.014871903695166111, -0.08247038722038269, 0.07084719836711884, -0.08330710232257843, 0.21110819280147552, -0.009391079656779766, 0.1246439591050148, 0.09033023566007614, 0.025031717494130135, -0.11786220222711563, 0.01171621959656477, 0.10615307837724686, -0.0975874736905098, 0.012623792514204979, 0.1569194346666336, -0.045103006064891815, 0.06719639152288437, 0.07498707622289658, -0.13428281247615814, -0.004811562597751617, 0.054798368364572525, -0.015669861808419228, -0.09066356718540192, 0.016278423368930817, -0.05931542441248894, 0.16510291397571564, 0.1787969172000885, -0.04272816330194473, 0.0430523082613945, -0.04663589596748352, -0.0006092198891565204, 0.06003591790795326, 0.06281708925962448, -0.039848193526268005, -0.21655739843845367, 0.04685371369123459, 0.036114465445280075, 0.049303945153951645, -0.1565106213092804, -0.11447723954916, 0.05073541775345802, -0.06680595874786377, -0.07820413261651993, 0.08444137871265411, 0.03736540302634239, 0.011604090221226215, -0.019049350172281265, -0.10760336369276047, -0.054110605269670486, 0.14383283257484436, -0.17746739089488983, -0.04131060466170311 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.8.2
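The "How to Get Started with the Model" section of the card above is left as a placeholder. As a hedged illustration only, not the author's documented usage, loading a PEFT prompt-tuning adapter of this kind could look like the sketch below; it assumes the base model `bigscience/bloomz-560m` and the adapter repo id `KapitalK/bloomz-560m_PROMPT_TUNING_CAUSAL_LM` given later in this record, and the prompt string is purely illustrative.

```python
# Hedged sketch: attach the prompt-tuning adapter to its base causal LM.
# Assumes the base model and adapter repo id listed in this record.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "bigscience/bloomz-560m"
adapter_id = "KapitalK/bloomz-560m_PROMPT_TUNING_CAUSAL_LM"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)  # adds the learned soft prompt

inputs = tokenizer("Classify the sentiment: I loved this movie.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because prompt tuning only stores the soft-prompt embeddings, the adapter download is small; memory use is dominated by the 560M-parameter base model.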
{"library_name": "peft", "base_model": "bigscience/bloomz-560m"}
null
KapitalK/bloomz-560m_PROMPT_TUNING_CAUSAL_LM
[ "peft", "arxiv:1910.09700", "base_model:bigscience/bloomz-560m", "region:us" ]
2024-02-13T18:49:41+00:00
[ "1910.09700" ]
[]
TAGS #peft #arxiv-1910.09700 #base_model-bigscience/bloomz-560m #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.8.2
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ "TAGS\n#peft #arxiv-1910.09700 #base_model-bigscience/bloomz-560m #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ 32, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #arxiv-1910.09700 #base_model-bigscience/bloomz-560m #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2" ]
[ -0.09827404469251633, 0.17266730964183807, -0.00376726221293211, 0.04485897347331047, 0.0893060564994812, 0.018520722165703773, 0.04626883938908577, 0.12264665961265564, -0.043283611536026, 0.10607341676950455, 0.06183099374175072, 0.09882752597332001, 0.09598874300718307, 0.20144499838352203, 0.0003961017355322838, -0.20316070318222046, 0.017409248277544975, -0.0968066155910492, -0.01019456796348095, 0.12070485949516296, 0.15854911506175995, -0.09508828073740005, 0.08335523307323456, -0.015109829604625702, -0.015660399571061134, -0.03595199063420296, -0.06995563954114914, -0.040859393775463104, 0.038270700722932816, 0.058497220277786255, 0.04765821993350983, -0.009079203940927982, 0.07373015582561493, -0.25608915090560913, 0.018842291086912155, 0.03376877307891846, -0.012535449117422104, 0.08973148465156555, 0.10477028042078018, -0.038252927362918854, 0.10601062327623367, -0.045205965638160706, 0.12653307616710663, 0.07298438251018524, -0.08114704489707947, -0.17820730805397034, -0.08292187005281448, 0.0804942175745964, 0.15695533156394958, 0.07365331798791885, -0.04043007642030716, 0.14912497997283936, -0.120717853307724, 0.01472374889999628, 0.03554055467247963, -0.04110551252961159, -0.07723139971494675, 0.0428672656416893, 0.10999336838722229, 0.06302442401647568, -0.13748592138290405, -0.03736738860607147, 0.019169704988598824, 0.032298486679792404, 0.0784728080034256, 0.023193173110485077, 0.14914056658744812, 0.035773854702711105, -0.14309023320674896, -0.029802558943629265, 0.1401473581790924, 0.05311301350593567, -0.046030182391405106, -0.22319577634334564, 0.009310593828558922, -0.080091692507267, -0.02502988837659359, -0.05182606726884842, 0.044844768941402435, -0.013002433814108372, 0.084390789270401, -0.013166582211852074, -0.08815861493349075, -0.01796162687242031, 0.07386163622140884, 0.04399246349930763, 0.026025108993053436, -0.022301876917481422, -0.015375199727714062, 0.11676954478025436, 0.05453617870807648, -0.12733811140060425, -0.06597702205181122, -0.06423495709896088, -0.045732200145721436, -0.06545696407556534, 0.028913646936416626, 0.05762161687016487, 0.064895860850811, 0.23540373146533966, -0.007948646321892738, 0.04018096625804901, 0.06306301057338715, 0.015095217153429985, 0.06392905116081238, 0.08752516657114029, -0.07850594818592072, -0.14332883059978485, -0.014993380755186081, 0.08322615921497345, -0.01143932156264782, -0.010391321033239365, -0.040087029337882996, 0.039801474660634995, 0.03866592422127724, 0.09526184946298599, 0.09682362526655197, -0.015496031381189823, -0.08544738590717316, -0.05454263836145401, 0.2207583785057068, -0.14364829659461975, 0.038841743022203445, 0.018264520913362503, -0.03496870771050453, -0.0243898443877697, -0.0005758291226811707, 0.010753428563475609, -0.01964801922440529, 0.09145888686180115, -0.0757153257727623, -0.02588549628853798, -0.11404547095298767, -0.009066109545528889, 0.04092908278107643, 0.029571836814284325, -0.003923976793885231, -0.020146317780017853, -0.05780142545700073, -0.0844995379447937, 0.08962846547365189, -0.09154074639081955, -0.07107854634523392, -0.019801847636699677, -0.09979425370693207, 0.021255599334836006, 0.01882046088576317, 0.1405511498451233, -0.02851918712258339, 0.03508340194821358, -0.01989334635436535, 0.05272646248340607, 0.0719083845615387, 0.0335816852748394, -0.058722157031297684, 0.05683068558573723, -0.18173059821128845, 0.09491260349750519, -0.08277388662099838, 0.023462090641260147, -0.15695013105869293, -0.02290850505232811, 0.021636078134179115, 0.010529430583119392, 
0.029922164976596832, 0.140400230884552, -0.2092137336730957, -0.013065788894891739, 0.14696837961673737, -0.08200514316558838, -0.11919160932302475, 0.05126875266432762, -0.06705854088068008, 0.14716219902038574, 0.022744515910744667, -0.033359501510858536, 0.08683118224143982, -0.15939053893089294, -0.037495486438274384, -0.027818839997053146, -0.010687648318707943, 0.10164796561002731, 0.1002616137266159, -0.06318724155426025, 0.042133528739213943, 0.018418265506625175, -0.03751600906252861, -0.032476939260959625, -0.054911211133003235, -0.11586476862430573, -0.0036700156051665545, -0.07739117741584778, 0.026741810142993927, -0.02418622002005577, -0.05807553976774216, -0.01978924684226513, -0.15841038525104523, -0.006169588770717382, 0.08603319525718689, 0.02697078511118889, -0.021377170458436012, -0.08886057883501053, 0.028417151421308517, -0.022466065362095833, -0.03544139489531517, -0.14594268798828125, -0.021627871319651604, 0.023548021912574768, -0.14787057042121887, 0.01313406229019165, -0.1014133095741272, 0.05729568004608154, 0.012602048926055431, -0.06706416606903076, -0.019272511824965477, -0.019462576135993004, 0.013997922651469707, -0.05208559334278107, -0.23741251230239868, -0.015125464648008347, -0.0469844825565815, 0.12714146077632904, -0.20867863297462463, 0.031363487243652344, 0.060749247670173645, 0.11317174136638641, -0.00538917351514101, -0.058160532265901566, 0.024006487801671028, -0.07428193837404251, -0.02537122741341591, -0.06016148626804352, -0.01378115825355053, -0.012913265265524387, -0.054448552429676056, 0.01679251901805401, -0.09908322244882584, -0.035430654883384705, 0.10510715842247009, 0.07684783637523651, -0.16666515171527863, -0.03172311186790466, -0.037872690707445145, -0.07491100579500198, -0.08683482557535172, -0.05730609968304634, 0.10779287666082382, 0.04557664319872856, 0.03054034523665905, -0.07871444523334503, -0.08143545687198639, 0.009766398929059505, -0.022226881235837936, -0.024741515517234802, 0.11652851849794388, 0.06814341992139816, -0.11853597313165665, 0.10286245495080948, 0.07667357474565506, 0.022074216976761818, 0.09174758940935135, -0.023795874789357185, -0.11826328933238983, -0.051209889352321625, 0.038096386939287186, 0.007294717710465193, 0.1596144288778305, -0.07043374329805374, 0.07174376398324966, 0.04966499283909798, -0.016445133835077286, 0.055969033390283585, -0.08727490156888962, 0.012526416219770908, 0.005292365327477455, -0.011805002577602863, -0.0007113047176972032, -0.028615852817893028, 0.019927211105823517, 0.08374058455228806, 0.048562806099653244, 0.03973165899515152, 0.043686918914318085, -0.03257102891802788, -0.12130744010210037, 0.1890627145767212, -0.10268159210681915, -0.21919062733650208, -0.1630459725856781, 0.048309795558452606, 0.04544161632657051, -0.02361382730305195, 0.010783377103507519, -0.043995242565870285, -0.09933824092149734, -0.0768556222319603, 0.005279871169477701, 0.03604967147111893, -0.06484314799308777, -0.08014102280139923, 0.05974289029836655, 0.04873797670006752, -0.12508618831634521, 0.03656737878918648, 0.05442854389548302, -0.020494773983955383, 0.009645677171647549, 0.07937455177307129, 0.07558566331863403, 0.14484533667564392, -0.010084442794322968, -0.016699708998203278, 0.055788423866033554, 0.27848124504089355, -0.15555016696453094, 0.10555193573236465, 0.117070272564888, -0.0598658062517643, 0.07629258185625076, 0.1835314929485321, 0.03809867054224014, -0.10639524459838867, 0.041297633200883865, 0.022011781111359596, -0.022058818489313126, -0.2811734974384308, 
-0.05612686276435852, -0.012059752829372883, -0.10717790573835373, 0.06780356913805008, 0.08334983885288239, 0.07789995521306992, 0.04694349318742752, -0.06110457703471184, -0.08575929701328278, 0.015274429693818092, 0.08429945260286331, -0.02752428874373436, 0.010007893666625023, 0.0821513757109642, -0.022572945803403854, 0.011858902871608734, 0.11125783622264862, -0.0003316145739518106, 0.18756777048110962, 0.05026058107614517, 0.12485052645206451, 0.08532799035310745, 0.09155002981424332, -0.0017510338220745325, 0.023795226588845253, 0.022403020411729813, 0.01483996957540512, 0.007560350466519594, -0.07960300892591476, 0.04828066751360893, 0.10711178183555603, 0.05897655338048935, 0.04045264422893524, 0.014514916576445103, -0.05622367188334465, 0.05368030071258545, 0.17201849818229675, -0.005226670764386654, -0.19052989780902863, -0.07189963757991791, 0.06594829261302948, -0.08326873928308487, -0.13007162511348724, -0.022824538871645927, 0.04193832352757454, -0.17079856991767883, 0.007558615878224373, -0.03922991082072258, 0.09708382189273834, -0.07656515389680862, -0.04083341732621193, 0.07631676644086838, 0.07551859319210052, -0.02041557990014553, 0.07216814160346985, -0.19463591277599335, 0.12749259173870087, 0.017533807083964348, 0.06933008879423141, -0.09369450062513351, 0.10771512240171432, 0.003505607368424535, -0.02369958721101284, 0.15774938464164734, 0.00887187197804451, -0.054139912128448486, -0.057250071316957474, -0.11364222317934036, -0.014552746899425983, 0.0920276865363121, -0.12564973533153534, 0.06726434826850891, -0.004189790692180395, -0.023898236453533173, 0.009359912946820259, -0.0774727389216423, -0.12463488429784775, -0.171754851937294, 0.057629067450761795, -0.13735994696617126, 0.04045981541275978, -0.08980630338191986, -0.06881117075681686, -0.01359375286847353, 0.17088359594345093, -0.189789816737175, -0.07634947448968887, -0.1421215832233429, -0.09029028564691544, 0.1790163516998291, -0.0473557710647583, 0.08452105522155762, 0.018015198409557343, 0.16011777520179749, 0.03007567673921585, 0.004529264289885759, 0.1066836267709732, -0.08994124084711075, -0.19140680134296417, -0.057434841990470886, 0.15142817795276642, 0.14971856772899628, 0.04845865070819855, -0.013551324605941772, 0.02077396586537361, -0.06651601195335388, -0.12283282727003098, 0.018174385651946068, 0.1325407177209854, 0.09909801930189133, -0.00021029741037636995, -0.025267725810408592, -0.10140743106603622, -0.057602640241384506, -0.0696893185377121, 0.01637539453804493, 0.20123475790023804, -0.06980805099010468, 0.16476072371006012, 0.1084740161895752, -0.05691925063729286, -0.19778351485729218, 0.05911043658852577, 0.06592816114425659, 0.01963995024561882, 0.05388486385345459, -0.18582816421985626, 0.1049533411860466, 0.027848662808537483, -0.06406404823064804, 0.15288279950618744, -0.14499540627002716, -0.15360745787620544, 0.08571985363960266, 0.03791547194123268, -0.2222774475812912, -0.12392372637987137, -0.0967608243227005, -0.024982605129480362, -0.10977569967508316, 0.0915229320526123, 0.00973030086606741, 0.016659462824463844, 0.02732243202626705, 0.030255574733018875, 0.018587272614240646, -0.05189171060919762, 0.20935063064098358, -0.007011496927589178, 0.025420954450964928, -0.047318845987319946, -0.09585642069578171, 0.04464532807469368, -0.04319741204380989, 0.09085579216480255, 0.003001651493832469, 0.021299755200743675, -0.13604845106601715, -0.04204915836453438, -0.0699876919388771, 0.03272818401455879, -0.0992671400308609, -0.0888388454914093, -0.05681660398840904, 
0.10331171751022339, 0.09561827778816223, -0.043106138706207275, -0.004020937252789736, -0.0696784257888794, 0.033655308187007904, 0.19536720216274261, 0.1936640590429306, 0.06899749487638474, -0.08110526949167252, 0.01520280446857214, -0.026458989828824997, 0.04102811589837074, -0.2254687249660492, 0.04748023673892021, 0.048288311809301376, 0.01973448507487774, 0.09604235738515854, -0.019577471539378166, -0.14105060696601868, -0.060078077018260956, 0.0699879601597786, -0.0358322337269783, -0.16533330082893372, -0.026256389915943146, 0.02497711591422558, -0.21008390188217163, -0.05193285271525383, 0.012395771220326424, -0.010522712022066116, -0.04601946473121643, 0.013974392786622047, 0.0855022445321083, -0.01934860460460186, 0.12742871046066284, 0.09221979230642319, 0.09087447077035904, -0.10475257784128189, 0.06883943825960159, 0.06487412750720978, -0.05536272004246712, 0.025526897981762886, 0.08307692408561707, -0.036971092224121094, -0.03346537798643112, 0.10105370730161667, 0.06612151116132736, 0.036009494215250015, -0.0402960442006588, 0.00024392011982854456, -0.05775166675448418, 0.06673843413591385, 0.10228231549263, 0.044824033975601196, -0.0016153574688360095, 0.04645991697907448, 0.027687475085258484, -0.09009035676717758, 0.108667753636837, 0.05832088738679886, 0.025004198774695396, -0.0394146591424942, -0.034790560603141785, -0.009539220482110977, -0.013103055767714977, -0.018956484273076057, -0.0023195345420390368, -0.08890502899885178, -0.02089976705610752, -0.11896965652704239, 0.046745698899030685, -0.07533777505159378, 0.018380269408226013, 0.015937799587845802, -0.05251970887184143, -0.004472099710255861, 0.01269409991800785, -0.07934430241584778, -0.050901588052511215, -0.008013768121600151, 0.10852599889039993, -0.11675715446472168, 0.03733733668923378, 0.08895022422075272, -0.10612653940916061, 0.07924079149961472, 0.007243188098073006, 0.0088069848716259, 0.01071830652654171, -0.16757941246032715, 0.061520811170339584, -0.02290198765695095, -0.00761200487613678, 0.022472627460956573, -0.24000638723373413, -0.007015077862888575, -0.03386852145195007, -0.03185177594423294, 0.010435637086629868, -0.03855711221694946, -0.13288496434688568, 0.0824635773897171, -0.009533129632472992, -0.07297207415103912, -0.028084509074687958, 0.02828974276781082, 0.10630329698324203, -0.02742956019937992, 0.1470690220594406, -0.011274177581071854, 0.06831628829240799, -0.17451293766498566, -0.008080846630036831, -0.017591532319784164, 0.03676823154091835, -0.026578444987535477, -0.014522974379360676, 0.06094959005713463, -0.020221684128046036, 0.212271586060524, -0.03901487588882446, 0.05290162190794945, 0.05441335588693619, 0.03311430662870407, 0.0020413065794855356, 0.091901034116745, 0.07735120505094528, -0.011654259636998177, 0.006926799658685923, 0.037549614906311035, -0.00882771611213684, -0.03867499157786369, -0.15015079081058502, 0.06530047208070755, 0.1665215939283371, 0.026645779609680176, 0.010066618211567402, 0.04670295864343643, -0.1055668368935585, -0.07320688664913177, 0.12422164529561996, -0.007260077632963657, -0.040182583034038544, -0.07277680188417435, 0.15873034298419952, 0.11048931628465652, -0.20608778297901154, 0.08614157885313034, -0.0622154101729393, -0.06607585400342941, -0.11559836566448212, -0.1482492834329605, -0.06798744946718216, -0.040864937007427216, -0.013752805069088936, -0.07243428379297256, 0.059671465307474136, 0.08623167872428894, 0.01024005375802517, -0.027288902550935745, 0.094522625207901, 0.002762814983725548, -0.02345896139740944, 
0.04100678116083145, 0.06166966259479523, 0.019147342070937157, -0.10199017077684402, 0.00987168774008751, -0.004860773682594299, 0.022410035133361816, 0.06450547277927399, 0.013034150935709476, -0.04055510833859444, -0.012933559715747833, -0.03203978389501572, -0.1140187680721283, 0.03767044097185135, -0.024979818612337112, -0.0378577895462513, 0.1409195065498352, 0.021793577820062637, 0.006047355011105537, -0.02354799211025238, 0.23084229230880737, -0.0702228918671608, -0.07700937241315842, -0.1603325605392456, 0.04304204136133194, -0.06430458277463913, 0.03441668301820755, 0.04358699545264244, -0.10727842152118683, 0.021225279197096825, 0.14353570342063904, 0.137290820479393, -0.013620193116366863, 0.009629837237298489, 0.054915715008974075, -0.0024798414669930935, -0.02987760305404663, 0.027493856847286224, 0.04623271897435188, 0.11015468835830688, -0.06501564383506775, 0.08286993950605392, -0.009014596231281757, -0.08282686024904251, -0.0044896663166582584, 0.1224825382232666, -0.004531750455498695, 0.0074329618364572525, -0.07015924155712128, 0.13462743163108826, -0.07893470674753189, -0.2279055267572403, 0.04942353442311287, -0.07358410954475403, -0.1672123819589615, -0.0435827262699604, 0.013278920203447342, -0.01771375723183155, 0.017231551930308342, 0.08599650114774704, -0.04457475617527962, 0.1674698293209076, 0.043741267174482346, -0.07119767367839813, -0.07186643034219742, 0.07221293449401855, -0.12696990370750427, 0.2702426612377167, 0.024802066385746002, 0.06416530907154083, 0.10925575345754623, -0.016145143657922745, -0.1404721885919571, 0.019958769902586937, 0.0987318903207779, -0.07379059493541718, 0.07866541296243668, 0.18498477339744568, -0.0004697230760939419, 0.11964226514101028, 0.06109100952744484, -0.04026420786976814, 0.03018057532608509, -0.11700427532196045, -0.05095883831381798, -0.1166081428527832, 0.08054231852293015, -0.08018659800291061, 0.16047504544258118, 0.1375519335269928, -0.07558874040842056, -0.008924508467316628, -0.02545667253434658, 0.09010578691959381, 0.0020767671521753073, 0.10920744389295578, 0.004430610686540604, -0.2043604850769043, 0.030041106045246124, 0.02992434613406658, 0.11029580980539322, -0.1990683674812317, -0.0713842585682869, 0.05328008905053139, -0.021598435938358307, -0.06861024349927902, 0.11080104857683182, 0.04573493450880051, 0.038614556193351746, -0.037060827016830444, -0.03291317820549011, -0.015202827751636505, 0.1335524618625641, -0.10707305371761322, -0.007032520603388548 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.8.2
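This second PEFT record likewise leaves the getting-started code as "[More Information Needed]". A minimal sketch, assuming the adapter repo id `frikh-said/query_optimizer_model` and the Llama-2 base model named in this record's tags, could first read the adapter config to confirm the base checkpoint and then load base model and adapter in one step; loading the 7B base model requires access to those weights and enough memory.

```python
# Hedged sketch: inspect the adapter config, then load base model + adapter together.
# Assumes the adapter repo id listed in this record; the base checkpoint is read from its config.
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM, PeftConfig

adapter_id = "frikh-said/query_optimizer_model"

config = PeftConfig.from_pretrained(adapter_id)
print(config.base_model_name_or_path)  # expected: NousResearch/Llama-2-7b-chat-hf per the tags

model = AutoPeftModelForCausalLM.from_pretrained(adapter_id)  # downloads base weights + adapter
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
```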
{"library_name": "peft", "base_model": "NousResearch/Llama-2-7b-chat-hf"}
null
frikh-said/query_optimizer_model
[ "peft", "safetensors", "arxiv:1910.09700", "base_model:NousResearch/Llama-2-7b-chat-hf", "region:us" ]
2024-02-13T19:05:58+00:00
[ "1910.09700" ]
[]
TAGS #peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/Llama-2-7b-chat-hf #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.8.2
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ "TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/Llama-2-7b-chat-hf #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ 43, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-NousResearch/Llama-2-7b-chat-hf #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2" ]
[ -0.11769948899745941, 0.20666998624801636, -0.002912783296778798, 0.02549395151436329, 0.07785112410783768, 0.015407757833600044, 0.05577832832932472, 0.13303913176059723, 0.03283666446805, 0.11651013046503067, 0.06938543915748596, 0.11774429678916931, 0.1151692196726799, 0.21962304413318634, 0.003263794118538499, -0.1657102108001709, 0.01971868798136711, -0.07241468876600266, 0.01743026077747345, 0.11806745082139969, 0.14102990925312042, -0.09932662546634674, 0.07670142501592636, -0.020442117005586624, 0.0024542235769331455, -0.027936438098549843, -0.06807847321033478, -0.011055584996938705, 0.05399150028824806, 0.03122783452272415, 0.056819941848516464, -0.010763264261186123, 0.08520374447107315, -0.2704300880432129, 0.01883009262382984, 0.04265301674604416, -0.00045290516573004425, 0.08344653248786926, 0.09688374400138855, -0.04538474604487419, 0.12346991151571274, -0.021854383870959282, 0.13367369771003723, 0.09051225334405899, -0.09567297995090485, -0.2351798564195633, -0.06292394548654556, 0.07993721961975098, 0.18764273822307587, 0.08551130443811417, -0.04316225275397301, 0.12375939637422562, -0.0640316754579544, 0.022428808733820915, 0.06704075634479523, -0.10372592508792877, -0.06345343589782715, 0.06291820853948593, 0.1294797956943512, 0.0773601308465004, -0.12618185579776764, -0.037074875086545944, 0.035886481404304504, 0.04580415412783623, 0.0580124706029892, 0.006647665984928608, 0.1484030783176422, 0.028769001364707947, -0.1454513818025589, -0.049566421657800674, 0.13674598932266235, 0.010416027158498764, -0.03749023377895355, -0.21604330837726593, -0.00459075253456831, -0.09522778540849686, -0.03878160938620567, -0.04798002541065216, 0.03698987141251564, 0.010453630238771439, 0.13307736814022064, -0.049591004848480225, -0.09215915948152542, -0.014346052892506123, 0.11040274053812027, 0.0616430938243866, 0.02060583047568798, -0.01945985108613968, 0.008026303723454475, 0.12192189693450928, 0.0676833912730217, -0.13428759574890137, -0.06298412382602692, -0.06815947592258453, -0.03369535133242607, -0.024816837161779404, 0.040182024240493774, 0.017229147255420685, 0.0635613352060318, 0.27198895812034607, -0.04016723483800888, 0.06374870985746384, 0.04097883775830269, 0.022351374849677086, 0.03009030781686306, 0.10533419251441956, -0.03212955966591835, -0.16400747001171112, -0.007433966733515263, 0.10063730925321579, 0.002702203579246998, -0.03417186439037323, -0.05627066642045975, 0.03344479948282242, 0.03579871356487274, 0.11764659732580185, 0.10942773520946503, -0.028066188097000122, -0.0745202898979187, -0.05581606552004814, 0.19079482555389404, -0.15589196979999542, 0.043175265192985535, 0.031009791418910027, 0.0013891590060666203, -0.06065008044242859, 0.008123516105115414, 0.018420519307255745, -0.03341829776763916, 0.0739302784204483, -0.06741747260093689, -0.0401163212954998, -0.12049110978841782, -0.029961997643113136, 0.03624962642788887, 0.009220915846526623, -0.04452921822667122, -0.042916469275951385, -0.07037478685379028, -0.10976991057395935, 0.1085909754037857, -0.054557181894779205, -0.05871255323290825, -0.028399605304002762, -0.08273676037788391, 0.018992358818650246, 0.03493666648864746, 0.06826084107160568, -0.026227839291095734, 0.046194083988666534, -0.010782663710415363, 0.06776405870914459, 0.06998622417449951, 0.030902881175279617, -0.0827704519033432, 0.06522461771965027, -0.19576740264892578, 0.07253402471542358, -0.08013460040092468, 0.044235534965991974, -0.1595429927110672, -0.004312295466661453, -0.0022420838940888643, 0.029259683564305305, 
0.041751157492399216, 0.16127003729343414, -0.21196487545967102, -0.03095497004687786, 0.1684923619031906, -0.10783151537179947, -0.13275355100631714, 0.040584247559309006, -0.03692902997136116, 0.18247874081134796, 0.02804495394229889, 0.029673883691430092, 0.08894111216068268, -0.16022709012031555, -0.02174060046672821, -0.018446754664182663, 0.010418129153549671, 0.06808888167142868, 0.08132006227970123, -0.09663040190935135, -0.001616360037587583, 0.010858171619474888, -0.061541199684143066, -0.01785045862197876, -0.04080429673194885, -0.1045517548918724, 0.004818684887140989, -0.08689999580383301, 0.010899664834141731, 0.005562866572290659, -0.09412923455238342, -0.00767026050016284, -0.15247979760169983, -0.05846627429127693, 0.08434145152568817, 0.00026128877652809024, -0.01405352633446455, -0.09419026970863342, 0.06373747438192368, -0.03559573367238045, -0.020782528445124626, -0.14397205412387848, -0.015432771295309067, 0.017898816615343094, -0.13868916034698486, 0.0012420830316841602, -0.11995251476764679, 0.06763311475515366, 0.004810863174498081, -0.05048419162631035, -0.04406342655420303, -0.002766441088169813, -0.004278186243027449, -0.06090925633907318, -0.23663276433944702, -0.02428145334124565, -0.052476897835731506, 0.1713789999485016, -0.23148222267627716, 0.04160921275615692, 0.0034466448705643415, 0.11964506655931473, 0.0047644018195569515, -0.058687981218099594, 0.022583601996302605, -0.06231268495321274, -0.024701951071619987, -0.06840142607688904, -0.0037527058739215136, 0.003462479216977954, -0.02865241840481758, 0.014165260829031467, -0.12116673588752747, -0.06389053910970688, 0.09515070170164108, 0.058769457042217255, -0.1450631022453308, 0.00842469185590744, -0.040074050426483154, -0.056336693465709686, -0.06754444539546967, -0.07108866423368454, 0.08409534394741058, 0.05292753130197525, 0.047818623483181, -0.08274413645267487, -0.06752345710992813, 0.003514396958053112, -0.02452346496284008, -0.013681194745004177, 0.12610596418380737, 0.09137961268424988, -0.09851912409067154, 0.09228390455245972, 0.07080904394388199, 0.021283060312271118, 0.08558592200279236, -0.02348261885344982, -0.10639158636331558, -0.02593001164495945, 0.05667613446712494, 0.01070303376764059, 0.1701316386461258, -0.07188218832015991, 0.055811841040849686, 0.047385260462760925, -0.05746626481413841, 0.04811330884695053, -0.09233375638723373, 0.006447041407227516, -0.0029063266701996326, -0.015782566741108894, 0.036864910274744034, -0.016450000926852226, 0.004836694337427616, 0.09010760486125946, 0.062471237033605576, 0.021535998210310936, 0.012572001665830612, -0.0362418070435524, -0.14193294942378998, 0.1797328144311905, -0.09205848723649979, -0.23891016840934753, -0.15006007254123688, 0.054771315306425095, 0.05779189616441727, -0.013948877342045307, 0.03144465386867523, -0.05449340119957924, -0.09502875059843063, -0.08760391175746918, 0.004416328854858875, 0.03345770016312599, -0.06084810197353363, -0.06309141218662262, 0.03578837960958481, 0.03894244134426117, -0.12027259171009064, 0.023747729137539864, 0.05629263445734978, -0.0018340221140533686, -0.003648567944765091, 0.045919474214315414, 0.09278853237628937, 0.20445209741592407, -0.002732523949816823, 0.0053982362151145935, 0.05899197608232498, 0.2761322557926178, -0.15901462733745575, 0.11325082182884216, 0.13837623596191406, -0.06625627726316452, 0.07702389359474182, 0.1908654421567917, 0.030556995421648026, -0.09384198486804962, 0.018727079033851624, 0.031007766723632812, -0.023953305557370186, -0.27104878425598145, 
-0.05058536306023598, -0.023827584460377693, -0.07544421404600143, 0.08135921508073807, 0.08835428208112717, 0.09257134795188904, 0.028403934091329575, -0.06399580091238022, -0.09893711656332016, 0.02674330212175846, 0.11227049678564072, -0.017586790025234222, 0.0025482589844614267, 0.07991060614585876, -0.04866483062505722, 0.004952625837177038, 0.08520778268575668, -0.02139362134039402, 0.12702924013137817, 0.056118953973054886, 0.1073608547449112, 0.08325479924678802, 0.08240807801485062, -0.009224953129887581, 0.03056410513818264, 0.0027502768207341433, 0.020547926425933838, 0.020710214972496033, -0.09094986319541931, 0.01736580580472946, 0.11510791629552841, 0.014805049635469913, 0.020639518275856972, 0.014339569956064224, -0.059905439615249634, 0.037447262555360794, 0.1929825097322464, 0.03151291236281395, -0.2053559273481369, -0.0801534503698349, 0.05455378443002701, -0.0739559680223465, -0.15504314005374908, -0.00788013357669115, 0.014482896775007248, -0.1574634462594986, 0.018814608454704285, -0.03978566825389862, 0.10737770050764084, -0.06571333855390549, -0.03766518458724022, 0.10156018286943436, 0.047414667904376984, -0.028234774246811867, 0.04994218423962593, -0.19223366677761078, 0.10771425813436508, 0.028445864096283913, 0.06718984991312027, -0.08868084102869034, 0.08744743466377258, -0.001796784228645265, -0.011346758343279362, 0.1650870144367218, -0.0022033178247511387, -0.06180639937520027, -0.07702392339706421, -0.07925916463136673, -0.005427278578281403, 0.07996804267168045, -0.13732460141181946, 0.07520841062068939, -0.0333210825920105, -0.031404491513967514, -0.007430676370859146, -0.086235411465168, -0.11866632848978043, -0.16253423690795898, 0.061424531042575836, -0.08553852140903473, 0.025479501113295555, -0.08024374395608902, -0.052194323390722275, 0.03343738615512848, 0.17655520141124725, -0.2028171271085739, -0.10914232581853867, -0.14351201057434082, -0.10141443461179733, 0.15255947411060333, -0.04746145382523537, 0.08725551515817642, -0.007392728701233864, 0.16233710944652557, 0.000411053973948583, -0.01836213283240795, 0.08401200920343399, -0.09487809985876083, -0.18540970981121063, -0.04660943150520325, 0.18383155763149261, 0.1311776340007782, 0.028439510613679886, -0.011346815153956413, 0.026449725031852722, -0.06680743396282196, -0.10957765579223633, 0.030112503096461296, 0.1476605385541916, 0.06770458072423935, -0.020437177270650864, -0.042344409972429276, -0.09610117226839066, -0.06520573794841766, -0.04310684651136398, -0.002870124764740467, 0.20515766739845276, -0.07029063999652863, 0.15548402070999146, 0.11205708235502243, -0.060042425990104675, -0.21054470539093018, 0.032464709132909775, 0.03981616720557213, 0.016663486137986183, 0.03228053078055382, -0.1917620599269867, 0.08767081797122955, -0.02572266198694706, -0.08159942924976349, 0.1786719262599945, -0.19226399064064026, -0.129422128200531, 0.10824183374643326, 0.02104264684021473, -0.201046884059906, -0.150085911154747, -0.10347102582454681, -0.01812194101512432, -0.12009748816490173, 0.04840534180402756, 0.008618081919848919, 0.010992096737027168, 0.011450343765318394, 0.020118551328778267, 0.041532836854457855, -0.04830056428909302, 0.20299124717712402, -0.04482565075159073, -0.005569585133343935, -0.0527876652777195, -0.07773393392562866, 0.013384186662733555, -0.054856233298778534, 0.12370224297046661, -0.015441779978573322, 0.033861491829156876, -0.16196617484092712, -0.04311643913388252, -0.06270512193441391, 0.035143591463565826, -0.09606029093265533, -0.0794484093785286, 
-0.04419834166765213, 0.08294829726219177, 0.09136927872896194, -0.012586906552314758, 0.01242639496922493, -0.09655292332172394, 0.09700454771518707, 0.1995052993297577, 0.19330982863903046, 0.06315502524375916, -0.053107570856809616, 0.02997264452278614, -0.038537558168172836, 0.04430471360683441, -0.21931912004947662, 0.04287564381957054, 0.06498876214027405, 0.026542434468865395, 0.06985615193843842, -0.005677002016454935, -0.1625482589006424, -0.09128525853157043, 0.08836907148361206, -0.06292731314897537, -0.17292796075344086, -0.033785052597522736, 0.041705161333084106, -0.20931172370910645, -0.04640975967049599, 0.03935948386788368, -0.0181092731654644, -0.041782595217227936, 0.02617095597088337, 0.08081985265016556, -0.021255910396575928, 0.08439317345619202, 0.09534917026758194, 0.08989959210157394, -0.09506035596132278, 0.05267556756734848, 0.07946302741765976, -0.019431734457612038, 0.029825052246451378, 0.13751423358917236, -0.0364147424697876, -0.04645836725831032, 0.0798555314540863, 0.12185007333755493, -0.002486835466697812, -0.05506465584039688, 0.004287934862077236, -0.049309078603982925, 0.061294808983802795, 0.12155837565660477, 0.021408192813396454, -0.01193462684750557, 0.07872650027275085, 0.025506949052214622, -0.09194063395261765, 0.12346944957971573, 0.04140791669487953, 0.02029072493314743, -0.03513696417212486, -0.028924908488988876, -0.013744531199336052, -0.0018778513185679913, -0.014825914986431599, 0.00004693585287895985, -0.0909915491938591, 0.0014284261269494891, -0.11594712734222412, 0.01780756004154682, -0.06718336790800095, -0.0002576978877186775, 0.028643004596233368, -0.0489656962454319, -0.003824668936431408, -0.005410241428762674, -0.07838259637355804, -0.05261590704321861, -0.021815035492181778, 0.07858611643314362, -0.13979020714759827, 0.03456014022231102, 0.07484147697687149, -0.10328766703605652, 0.06876613199710846, -0.008326759561896324, 0.013081645593047142, 0.008228299207985401, -0.1439802497625351, 0.056155234575271606, -0.029309317469596863, -0.006359034683555365, 0.0010422393679618835, -0.17944684624671936, -0.011577526107430458, -0.042701829224824905, -0.07143910974264145, 0.013309884816408157, -0.013215545564889908, -0.1226518526673317, 0.11009237170219421, 0.008095293305814266, -0.06616021692752838, -0.015245208516716957, 0.044449418783187866, 0.07164029777050018, -0.012409849092364311, 0.10877691954374313, -0.02684897929430008, 0.083103708922863, -0.1807156205177307, -0.00621566828340292, -0.016833368688821793, 0.05384806543588638, -0.018549276515841484, -0.04573789983987808, 0.05623883008956909, -0.020538190379738808, 0.16466617584228516, -0.0018338061636313796, 0.0742441937327385, 0.051905106753110886, 0.010930253192782402, 0.04378392919898033, 0.0728876143693924, 0.06468360126018524, -0.016203518956899643, -0.004701197147369385, 0.03255317360162735, -0.0020409130956977606, -0.045227568596601486, -0.14094270765781403, 0.07253962010145187, 0.17666760087013245, 0.07048549503087997, 0.02179078198969364, 0.008067925460636616, -0.1332378387451172, -0.07408107072114944, 0.10511837154626846, -0.017402758821845055, -0.031061973422765732, -0.06629138439893723, 0.22787198424339294, 0.14990010857582092, -0.18986721336841583, 0.07560385763645172, -0.05423163250088692, -0.03786854073405266, -0.14348988234996796, -0.16802245378494263, -0.05776524171233177, -0.04911024123430252, -0.0318753756582737, -0.05938649922609329, 0.050970252603292465, 0.03954758495092392, -0.004729952663183212, -0.02203095331788063, 0.10803087800741196, 
0.031586550176143646, -0.04009048268198967, 0.045863546431064606, 0.060998860746622086, 0.04236721992492676, -0.09942521899938583, 0.011735196225345135, 0.001886715879663825, 0.008814944885671139, 0.062213458120822906, 0.023173239082098007, -0.06990323960781097, 0.02930132858455181, -0.01787971705198288, -0.12080670148134232, 0.0495670922100544, -0.007516996935009956, -0.021949628368020058, 0.14967697858810425, 0.03512033075094223, 0.008099704049527645, -0.010065858252346516, 0.23994873464107513, -0.07199644297361374, -0.0820726528763771, -0.13058407604694366, 0.08454304188489914, -0.0638623833656311, 0.023955434560775757, 0.015532204881310463, -0.12446270138025284, 0.012716526165604591, 0.17904044687747955, 0.11603523045778275, -0.019778354093432426, 0.013520904816687107, 0.04626742750406265, 0.009430119767785072, -0.03490632027387619, 0.011960557661950588, 0.055921632796525955, 0.20638400316238403, -0.07805577665567398, 0.06097545102238655, -0.017648804932832718, -0.0689961239695549, -0.031498104333877563, 0.10827583074569702, -0.011656714603304863, -0.01122299861162901, -0.05968675762414932, 0.14143596589565277, -0.07639602571725845, -0.21431203186511993, 0.05089925602078438, -0.08246009796857834, -0.13886047899723053, -0.04927203059196472, 0.027118146419525146, -0.02602965012192726, 0.005761643406003714, 0.06048549711704254, -0.05353428050875664, 0.18044669926166534, 0.029145246371626854, -0.042828578501939774, -0.09458549320697784, 0.056870587170124054, -0.16182497143745422, 0.2819679081439972, 0.021850652992725372, 0.0487053208053112, 0.1097458079457283, -0.021935712546110153, -0.1319884955883026, 0.015168975107371807, 0.1129152700304985, -0.0632040724158287, 0.06390555948019028, 0.1606759876012802, 0.0027896345127373934, 0.12182102352380753, 0.06664198637008667, -0.0592242032289505, 0.035914625972509384, -0.06755085289478302, -0.05441083759069443, -0.11569532752037048, 0.07832225412130356, -0.0966244786977768, 0.1526871919631958, 0.12093057483434677, -0.07346441596746445, -0.0029697499703615904, -0.020845314487814903, 0.08185786008834839, 0.018558043986558914, 0.10965380072593689, 0.008656207472085953, -0.1857033669948578, 0.046339020133018494, 0.00887568574398756, 0.09886037558317184, -0.21062983572483063, -0.04863942787051201, 0.041914358735084534, -0.017102444544434547, -0.08565417677164078, 0.11376497149467468, 0.03838564455509186, 0.01722962036728859, -0.035149652510881424, -0.04790586978197098, 0.01729344017803669, 0.15234188735485077, -0.1053488478064537, -0.014266646467149258 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # marian-finetuned-kde4-en-to-fr This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.2
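The hyperparameter list in this card maps almost one-to-one onto `Seq2SeqTrainingArguments`. The block below is a hedged reconstruction rather than the author's actual training script: whether 32 and 64 are per-device or total batch sizes is an assumption, and the `output_dir` value is a placeholder.

```python
# Hedged reconstruction of the card's hyperparameters as Seq2SeqTrainingArguments.
# Assumptions: batch sizes are per device; output_dir is a placeholder name.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="marian-finetuned-kde4-en-to-fr",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,   # epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```

These arguments would then be passed to a `Seq2SeqTrainer` together with the model, tokenizer, data collator, and the tokenized kde4 splits.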
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["kde4"], "base_model": "Helsinki-NLP/opus-mt-en-fr", "model-index": [{"name": "marian-finetuned-kde4-en-to-fr", "results": []}]}
text2text-generation
DouglasChan/lab1_finetuned
[ "transformers", "safetensors", "marian", "text2text-generation", "generated_from_trainer", "dataset:kde4", "base_model:Helsinki-NLP/opus-mt-en-fr", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T19:06:20+00:00
[]
[]
TAGS #transformers #safetensors #marian #text2text-generation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
# marian-finetuned-kde4-en-to-fr This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.2
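For inference, the fine-tuned Marian checkpoint can be used through the standard `translation` pipeline. This is a hedged usage sketch that assumes the repo id `DouglasChan/lab1_finetuned` listed earlier in this record; any English sentence works as input.

```python
# Hedged usage sketch: English-to-French translation with the fine-tuned checkpoint.
# Assumes the repo id listed in this record.
from transformers import pipeline

translator = pipeline("translation", model="DouglasChan/lab1_finetuned")
print(translator("This model was fine-tuned on the KDE4 dataset.")[0]["translation_text"])
```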
[ "# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #safetensors #marian #text2text-generation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
[ 83, 47, 6, 12, 8, 3, 90, 4, 33 ]
[ "passage: TAGS\n#transformers #safetensors #marian #text2text-generation #generated_from_trainer #dataset-kde4 #base_model-Helsinki-NLP/opus-mt-en-fr #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# marian-finetuned-kde4-en-to-fr\n\nThis model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 32\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
[ -0.11344804614782333, 0.17834211885929108, -0.0030620666220784187, 0.08407655358314514, 0.10683934390544891, -0.006709678564220667, 0.09885814785957336, 0.1303863823413849, -0.044523660093545914, 0.08535023033618927, 0.099642813205719, 0.03322869539260864, 0.07374729961156845, 0.15354396402835846, -0.04810834676027298, -0.2131354808807373, 0.03077813796699047, -0.0016014387365430593, -0.06921543926000595, 0.08830002695322037, 0.11988687515258789, -0.07759588956832886, 0.07957719266414642, -0.003001938108354807, -0.08096500486135483, 0.015010661445558071, -0.04461519792675972, -0.06676530838012695, 0.07573159784078598, 0.013918442651629448, 0.07721750438213348, 0.020929038524627686, 0.10394202917814255, -0.2131856083869934, -0.004643000662326813, 0.04452973231673241, 0.024833017960190773, 0.06803803890943527, 0.041910570114851, 0.03640036657452583, 0.08525138348340988, -0.16740044951438904, 0.08085141330957413, -0.01032261922955513, -0.05799861624836922, -0.10682541131973267, -0.09227252006530762, 0.0548393651843071, 0.11882583051919937, 0.11464722454547882, 0.007937219925224781, 0.16416771709918976, -0.030701329931616783, 0.07560690492391586, 0.1362791359424591, -0.2463310807943344, -0.060227882117033005, 0.04099106416106224, 0.057263560593128204, 0.06901994347572327, -0.07302548736333847, 0.014974987134337425, 0.041340284049510956, 0.02767365239560604, 0.05451718345284462, -0.021317053586244583, -0.0527820810675621, -0.02824275940656662, -0.11081243306398392, -0.03265414759516716, 0.22607673704624176, 0.06790986657142639, -0.03999853879213333, -0.09207932651042938, -0.023324767127633095, -0.06236136704683304, -0.006508809980005026, -0.05662383884191513, 0.0065681845881044865, -0.06528568267822266, -0.028364328667521477, -0.07904021441936493, -0.09588408470153809, -0.034557782113552094, 0.03029843047261238, 0.10530039668083191, 0.020729267969727516, 0.018107574433088303, -0.006380648817867041, 0.07017882168292999, -0.05208402872085571, -0.15011975169181824, -0.0135258249938488, -0.006736245471984148, -0.03389349207282066, -0.06781981140375137, -0.01870332844555378, -0.09321233630180359, 0.004310987889766693, 0.0836828425526619, -0.03569358214735985, 0.04477498680353165, 0.020536916330456734, -0.0036941550206393003, -0.012019532732665539, 0.12648947536945343, -0.0360681414604187, -0.10479266941547394, 0.02468612790107727, 0.11220747232437134, 0.03398973122239113, -0.03474477306008339, -0.09504444152116776, -0.048874881118535995, 0.0815945416688919, 0.06816937029361725, -0.009351604618132114, 0.013731349259614944, -0.022452227771282196, -0.05041008070111275, 0.08016718924045563, -0.13409093022346497, 0.04393158480525017, -0.027363210916519165, -0.07752170413732529, -0.07774544507265091, 0.008393682539463043, 0.025178439915180206, -0.04926321655511856, 0.0582367405295372, -0.049674756824970245, -0.02026129513978958, -0.053953636437654495, -0.05333472788333893, 0.03396854177117348, -0.05994446575641632, 0.02639125846326351, -0.09178165346384048, -0.16280482709407806, -0.02874968945980072, 0.06223364546895027, -0.07504981756210327, -0.07061617821455002, -0.04170387238264084, -0.06252159178256989, 0.011884584091603756, -0.01739943027496338, 0.09882847964763641, -0.04943247139453888, 0.04219146817922592, 0.00157506566029042, 0.008081668056547642, 0.004857426974922419, 0.031244687736034393, -0.07687325030565262, 0.037370890378952026, -0.0955919474363327, 0.050081875175237656, -0.09946715086698532, 0.02278883196413517, -0.1261332929134369, -0.09093540906906128, 0.003062112955376506, 
-0.0451827310025692, 0.0879620686173439, 0.13202476501464844, -0.14805340766906738, 0.012487098574638367, 0.09324312210083008, -0.07399407029151917, -0.11722918599843979, 0.0995756983757019, -0.01817798800766468, 0.033967357128858566, 0.056133951991796494, 0.15846313536167145, 0.1393437385559082, -0.14761295914649963, -0.03588360920548439, 0.02079542726278305, 0.08257803320884705, -0.026030369102954865, 0.08805440366268158, 0.008618151769042015, 0.028295107185840607, 0.01845771260559559, -0.07275790721178055, -0.02133571356534958, -0.038681481033563614, -0.10505186766386032, -0.02620384655892849, -0.09293313324451447, -0.005018947180360556, 0.03505466505885124, 0.02728382498025894, -0.08586075901985168, -0.09652996808290482, 0.02886301279067993, 0.12481235712766647, -0.05150338634848595, 0.016918038949370384, -0.06391274183988571, 0.08856294304132462, -0.06676434725522995, -0.03189557045698166, -0.14749059081077576, -0.09042130410671234, 0.04990861192345619, -0.052527837455272675, 0.012582904659211636, -0.004981668666005135, 0.0683455616235733, 0.09183918684720993, -0.0549888014793396, -0.02012474648654461, -0.09044118225574493, 0.015888506546616554, -0.1016307845711708, -0.14469727873802185, -0.034841593354940414, -0.04324723035097122, 0.18830621242523193, -0.2238118052482605, -0.006838418543338776, 0.041546013206243515, 0.14728210866451263, 0.025084229186177254, -0.04480554908514023, 0.0024273835588246584, 0.029466159641742706, -0.0059365988709032536, -0.0895693376660347, 0.022476432844996452, 0.0007588886073790491, -0.0924316793680191, -0.01552323903888464, -0.13396666944026947, 0.05259433016180992, 0.07891342788934708, 0.09938064962625504, -0.06544803828001022, -0.018015649169683456, -0.046861808747053146, -0.033719707280397415, -0.047762300819158554, 0.000337379053235054, 0.1626073569059372, 0.014804240316152573, 0.10148899257183075, -0.06255726516246796, -0.04924603924155235, 0.02916312776505947, -0.0027794730849564075, -0.07188747078180313, 0.09281198680400848, 0.02942795678973198, -0.17792613804340363, 0.0837678536772728, 0.1046932190656662, -0.035906724631786346, 0.14177767932415009, -0.02464839443564415, -0.103427954018116, -0.03944726660847664, 0.0161390770226717, 0.015002170577645302, 0.17055658996105194, -0.06160711124539375, 0.01812922954559326, 0.0429888553917408, 0.019139008596539497, 0.03313043713569641, -0.13845615088939667, -0.0152432881295681, 0.0366373136639595, -0.04025818035006523, 0.0045669106766581535, -0.019757069647312164, -0.009781677275896072, 0.06789978593587875, 0.03146521747112274, -0.03885175287723541, 0.02025429531931877, -0.022179532796144485, -0.06251346319913864, 0.1537337452173233, -0.1057635173201561, -0.19558246433734894, -0.14536350965499878, 0.05170954391360283, -0.07260403037071228, -0.014718042686581612, 0.016534123569726944, -0.054661430418491364, -0.07481734454631805, -0.13090389966964722, -0.033350586891174316, -0.028731470927596092, -0.030525824055075645, 0.03964010626077652, 0.03405812010169029, 0.07000309973955154, -0.10970857739448547, 0.006073724012821913, 0.010137070901691914, -0.03208528459072113, -0.0278884619474411, 0.02103404328227043, 0.11382395774126053, 0.06818132847547531, -0.029007967561483383, 0.032836657017469406, -0.008505848236382008, 0.2289448082447052, -0.09512598812580109, 0.018151473253965378, 0.11813525855541229, 0.008578253909945488, 0.04187691956758499, 0.1588982194662094, 0.01146392896771431, -0.07714557647705078, 0.013018409721553326, 0.03633876517415047, -0.014605446718633175, -0.2224821150302887, 
-0.06201014667749405, -0.03963141143321991, -0.04243461787700653, 0.10304906964302063, 0.050842463970184326, -0.0020947244483977556, 0.07317215204238892, -0.018288323655724525, 0.026094380766153336, 0.0066780517809093, 0.08646583557128906, 0.07830971479415894, 0.04709815979003906, 0.06491255015134811, -0.03258395567536354, -0.027395475655794144, 0.07227791100740433, 0.05437347665429115, 0.2173788994550705, -0.04660332575440407, 0.11540400236845016, 0.003960903733968735, 0.15808743238449097, -0.040117211639881134, 0.03902444615960121, 0.02134634740650654, 0.012085030786693096, 0.007144920527935028, -0.0794794112443924, -0.034794483333826065, 0.049476273357868195, 0.0034045185893774033, 0.009342801757156849, -0.06886294484138489, 0.04437262564897537, 0.0065700821578502655, 0.19427227973937988, 0.07529056072235107, -0.2798524796962738, -0.07907145470380783, 0.0349779911339283, -0.001017472823150456, -0.07009965926408768, -0.0005893563502468169, 0.1382504254579544, -0.12221207469701767, 0.07646206766366959, -0.0687103420495987, 0.09494604170322418, -0.031015008687973022, -0.019458558410406113, 0.04591212421655655, 0.071019746363163, 0.01621103100478649, 0.11752105504274368, -0.15017452836036682, 0.2408585101366043, 0.03201482817530632, 0.1238820031285286, -0.08692797273397446, 0.03236529231071472, -0.003320289310067892, 0.10660858452320099, 0.13838382065296173, 0.023104263469576836, -0.08429808169603348, -0.1612148880958557, -0.1398981213569641, 0.031070496886968613, 0.07376951724290848, -0.0484234094619751, 0.07367599755525589, -0.022267289459705353, -0.00995033048093319, 0.016138749197125435, -0.02467828057706356, -0.14890821278095245, -0.16103990375995636, 0.045901067554950714, 0.020944740623235703, -0.06122846528887749, -0.07266107201576233, -0.09857217967510223, -0.0013198073720559478, 0.1785334199666977, 0.08555516600608826, -0.04088406264781952, -0.1394883692264557, 0.015450695529580116, 0.14534275233745575, -0.08411183208227158, -0.007357920054346323, 0.013898775912821293, 0.14339514076709747, 0.009126619435846806, -0.045811764895915985, 0.040256693959236145, -0.07037781924009323, -0.12174364179372787, -0.02633814699947834, 0.1475892961025238, 0.024230647832155228, 0.04060828685760498, 0.03578634187579155, 0.03694850951433182, 0.008608746342360973, -0.07021573185920715, 0.004015032667666674, 0.018222380429506302, 0.10890437662601471, 0.014818781055510044, -0.03918079659342766, 0.027230072766542435, -0.07800329476594925, -0.031041568145155907, 0.1278776377439499, 0.22612275183200836, -0.06663701683282852, 0.06737902015447617, 0.07700391113758087, -0.07254204899072647, -0.18551455438137054, 0.03504718095064163, 0.08472847938537598, 0.036082204431295395, 0.062203437089920044, -0.13089226186275482, 0.059957824647426605, 0.0758613646030426, -0.031727708876132965, 0.07314879447221756, -0.2489434778690338, -0.12738275527954102, 0.06672091037034988, 0.1307549774646759, 0.021490976214408875, -0.09526126086711884, -0.0507475808262825, -0.04061239957809448, -0.167191281914711, 0.09713265299797058, -0.06726077198982239, 0.08837666362524033, 0.012121432460844517, 0.0910109207034111, 0.038430046290159225, -0.04994175583124161, 0.17972950637340546, 0.0007737026899121702, 0.02931118756532669, -0.08290279656648636, 0.03991934657096863, 0.07636214047670364, -0.09301766008138657, 0.09820986539125443, -0.0539378821849823, 0.05367095395922661, -0.1489625722169876, -0.03177928552031517, -0.04581078514456749, 0.08587341010570526, -0.05496294051408768, -0.059029024094343185, -0.03347046673297882, 
0.07799745351076126, 0.09025325626134872, -0.03250224143266678, 0.11972310394048691, 0.034854717552661896, 0.042716775089502335, 0.13599273562431335, 0.10094383358955383, -0.0019057993777096272, -0.047306668013334274, 0.019419297575950623, -0.03398527577519417, 0.053382303565740585, -0.06079782545566559, 0.02602141909301281, 0.1207166463136673, 0.005779755301773548, 0.10856906324625015, -0.018817687407135963, -0.07360411435365677, -0.017573190852999687, 0.04348480701446533, -0.12147368490695953, -0.12599582970142365, -0.04549139365553856, -0.007664310280233622, -0.11723600327968597, -0.0027900978457182646, 0.14220884442329407, -0.0749531164765358, -0.005761284846812487, -0.030942965298891068, 0.03558345139026642, -0.026245765388011932, 0.17046622931957245, 0.04508417099714279, 0.0543241985142231, -0.055243730545043945, 0.11056306213140488, 0.0835859477519989, -0.10145080834627151, 0.07420161366462708, 0.04136461764574051, -0.09529094398021698, -0.04516858980059624, 0.05892631784081459, 0.1240898072719574, 0.009109387174248695, -0.0658584013581276, -0.08847374469041824, -0.05874820426106453, 0.015959033742547035, 0.04934319108724594, 0.04240212216973305, -0.0312705934047699, -0.006462098099291325, -0.005495072808116674, -0.1565982550382614, 0.1298677623271942, 0.049578383564949036, 0.05828767642378807, -0.14404842257499695, 0.03321071341633797, -0.01231937762349844, 0.04245806857943535, -0.012516394257545471, 0.03740251436829567, -0.04794502258300781, -0.03259227052330971, -0.10491560399532318, 0.00152000249363482, -0.03692305088043213, 0.006968227680772543, -0.04247629642486572, -0.0696297287940979, -0.062286362051963806, 0.0525413379073143, -0.05566137656569481, -0.05045535787940025, -0.022118793800473213, 0.03662858158349991, -0.11789019405841827, -0.047658175230026245, 0.03401262313127518, -0.09261437505483627, 0.055006399750709534, 0.043398790061473846, 0.0208565853536129, 0.021437520161271095, -0.03151683136820793, 0.01858413778245449, -0.0022600165102630854, 0.060087013989686966, 0.07270169258117676, -0.12611764669418335, -0.019144007936120033, 0.011225233785808086, 0.03590914234519005, 0.019294966012239456, 0.09392931312322617, -0.10767479240894318, -0.06269872933626175, -0.05203815922141075, -0.08256729692220688, -0.044537708163261414, 0.07951278239488602, 0.08833733946084976, 0.013054843991994858, 0.16392002999782562, -0.06913390755653381, 0.061858974397182465, -0.15562184154987335, -0.026793936267495155, 0.01250989269465208, -0.042140811681747437, -0.05171841382980347, -0.01705990359187126, 0.08328337222337723, -0.06798192858695984, 0.10156077146530151, 0.008881747722625732, 0.10583170503377914, 0.04477240890264511, -0.07739405333995819, 0.04251594468951225, 0.003158856648951769, 0.14776156842708588, 0.045806922018527985, -0.009667574428021908, 0.06536009907722473, -0.030114445835351944, 0.05203324928879738, 0.054023973643779755, 0.11889679729938507, 0.1487913876771927, 0.03932509571313858, 0.07561080157756805, 0.07197627425193787, -0.06798000633716583, -0.13553908467292786, 0.019188720732927322, -0.03182503208518028, 0.10609956830739975, -0.022967005148530006, 0.12383109331130981, 0.09000401198863983, -0.19652192294597626, 0.03929801657795906, -0.08261328935623169, -0.12731827795505524, -0.09157364070415497, -0.14785350859165192, -0.10099044442176819, -0.06731967628002167, 0.02975302003324032, -0.11809303611516953, 0.019237911328673363, 0.06365446001291275, 0.004901999142020941, -0.010395682416856289, 0.17257839441299438, -0.059283651411533356, 0.023691143840551376, 
0.06456208974123001, 0.020325692370533943, 0.0048711891286075115, -0.025482062250375748, -0.03270093351602554, 0.02493264712393284, 0.016969336196780205, 0.06297817081212997, -0.031831566244363785, 0.005914940964430571, 0.02949478104710579, 0.03197047486901283, -0.08044377714395523, 0.003821771126240492, 0.01167997345328331, 0.0656813383102417, 0.06736592948436737, 0.054595332592725754, 0.012265332974493504, -0.032470881938934326, 0.2500772774219513, -0.047503598034381866, -0.07698950171470642, -0.1287834495306015, 0.08565540611743927, 0.03739509359002113, -0.004028010182082653, 0.06842847168445587, -0.11696721613407135, 0.002963562263175845, 0.1472458392381668, 0.16101136803627014, -0.05691475793719292, -0.02531123347580433, -0.011324615217745304, -0.0033983478788286448, -0.049580350518226624, 0.08522091060876846, 0.07676587998867035, 0.012267309240996838, -0.06349793821573257, -0.03200465440750122, -0.027094239369034767, -0.022294990718364716, -0.09335823357105255, 0.0872107595205307, 0.006669535767287016, -0.00630700308829546, -0.03578038886189461, 0.08647577464580536, 0.028746487572789192, -0.16545206308364868, 0.011590204201638699, -0.19061394035816193, -0.20498333871364594, -0.01615445874631405, 0.13000647723674774, -0.026054298505187035, 0.05904286727309227, -0.007497571408748627, 0.015495818108320236, 0.08651180565357208, 0.004047557711601257, -0.04309548810124397, -0.08065836131572723, 0.08024222403764725, -0.054394353181123734, 0.2603135108947754, 0.016588125377893448, 0.07705757021903992, 0.09799052029848099, -0.015551640652120113, -0.1369262933731079, 0.04883546009659767, 0.10801060497760773, -0.021200429648160934, 0.04675542563199997, 0.1600484549999237, -0.07021467387676239, 0.0919518992304802, 0.055205799639225006, -0.10244676470756531, -0.03443122282624245, -0.009424101561307907, 0.009360644035041332, -0.05560813471674919, 0.012585217133164406, -0.09035708010196686, 0.1703495979309082, 0.17917479574680328, -0.043504659086465836, -0.016113489866256714, -0.07362125813961029, 0.05217653512954712, 0.05523716285824776, 0.06478413939476013, -0.0034121088683605194, -0.17906589806079865, -0.035326264798641205, 0.010820037685334682, 0.048245206475257874, -0.23155346512794495, -0.08769059181213379, 0.031466688960790634, -0.05716019868850708, -0.05074140429496765, 0.10389638692140579, 0.04994365945458412, 0.0031391505617648363, -0.04847162961959839, -0.06542012840509415, -0.04109596461057663, 0.10657136887311935, -0.13692468404769897, -0.06547123938798904 ]
null
null
null
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**

This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
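The card defers training details to the course, so as a reminder of what a REINFORCE agent optimizes, here is a small, self-contained PyTorch sketch of the discounted-return and policy-gradient loss computation. It is a generic illustration of the standard REINFORCE formulation, not the author's implementation and not specific to Pixelcopter.

```python
# Generic REINFORCE loss sketch (not the author's code): given the log-probs of
# the actions taken in one episode and the rewards received, compute discounted
# returns and the policy-gradient loss whose minimization raises expected return.
import torch


def reinforce_loss(log_probs, rewards, gamma=0.99):
    returns = []
    g = 0.0
    for r in reversed(rewards):          # discounted return G_t, computed backwards
        g = r + gamma * g
        returns.append(g)
    returns = torch.tensor(list(reversed(returns)))
    returns = (returns - returns.mean()) / (returns.std() + 1e-8)  # normalize as a simple baseline
    return -(log_probs * returns).sum()  # maximize sum_t log pi(a_t|s_t) * G_t


# Toy usage with made-up numbers:
loss = reinforce_loss(torch.log(torch.tensor([0.4, 0.6, 0.5])), [1.0, 0.0, 5.0])
print(loss)
```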
{"tags": ["Pixelcopter-PLE-v0", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class"], "model-index": [{"name": "Reinnforce-Pixelcopter-PLE-v0", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "Pixelcopter-PLE-v0", "type": "Pixelcopter-PLE-v0"}, "metrics": [{"type": "mean_reward", "value": "55.80 +/- 41.16", "name": "mean_reward", "verified": false}]}]}]}
reinforcement-learning
juan9/Reinnforce-Pixelcopter-PLE-v0
[ "Pixelcopter-PLE-v0", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class", "model-index", "region:us" ]
2024-02-13T19:06:20+00:00
[]
[]
TAGS #Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us
# Reinforce Agent playing Pixelcopter-PLE-v0 This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 . To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL
[ "# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL" ]
[ "TAGS\n#Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us \n", "# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL" ]
[ 41, 58 ]
[ "passage: TAGS\n#Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us \n# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL" ]
[ 0.0073175891302526, -0.2259262204170227, -0.0017347558168694377, 0.05054566636681557, 0.0658537745475769, -0.055378563702106476, 0.1412602812051773, 0.05916554853320122, -0.04990595206618309, 0.059261854737997055, 0.14166708290576935, 0.03996060788631439, 0.022112762555480003, 0.1513713151216507, 0.09764605015516281, -0.2469022423028946, 0.07438477873802185, 0.01641594059765339, 0.008152224123477936, 0.09583204984664917, 0.060265738517045975, -0.1405058205127716, 0.037032704800367355, -0.01332044042646885, -0.13650871813297272, 0.0010478810872882605, -0.021802188828587532, -0.03625129908323288, 0.15681709349155426, 0.006844013463705778, 0.09602472931146622, -0.001560068572871387, 0.06475798785686493, -0.12438877671957016, 0.05466329678893089, 0.06455880403518677, -0.06293967366218567, 0.058029334992170334, -0.057374246418476105, 0.11959903687238693, 0.04641333222389221, -0.01578129455447197, 0.054811324924230576, 0.010941818356513977, -0.14131468534469604, -0.006710252724587917, 0.007013716734945774, 0.15098218619823456, 0.1339312642812729, 0.01409265398979187, -0.0014771400019526482, 0.1363491266965866, -0.16774429380893707, 0.045684073120355606, 0.061802688986063004, -0.2633039951324463, -0.04168876260519028, 0.12259352207183838, 0.08951573073863983, 0.06848238408565521, -0.060910262167453766, 0.07636868953704834, 0.049813780933618546, 0.013985024765133858, 0.023094501346349716, -0.042509064078330994, -0.040479615330696106, 0.02289252169430256, -0.0921095609664917, -0.05999262258410454, 0.11517233401536942, -0.006806366611272097, 0.03735918551683426, -0.12476086616516113, -0.015330453403294086, -0.07314357161521912, -0.05917041376233101, -0.082573801279068, 0.07563583552837372, 0.030191516503691673, -0.048283837735652924, -0.08895846456289291, -0.056533291935920715, -0.11489585787057877, -0.023082571104168892, -0.07226225733757019, 0.005096882116049528, -0.03157244250178337, -0.035645097494125366, 0.09446526318788528, -0.0021088174544274807, -0.015028090216219425, -0.03452150896191597, -0.05930153280496597, -0.04213470220565796, -0.02359505370259285, -0.03510070592164993, -0.059062156826257706, 0.054655663669109344, 0.0680202916264534, 0.04938843473792076, 0.09133565425872803, -0.0467856265604496, 0.1667373925447464, -0.03256719931960106, 0.08078566938638687, -0.011897698976099491, 0.2012830525636673, 0.11370102316141129, 0.12129533290863037, 0.06716908514499664, -0.05294690653681755, -0.16726544499397278, 0.039163749665021896, 0.12641896307468414, 0.07664673775434494, -0.032492902129888535, 0.018162984400987625, -0.12440363317728043, 0.05439428985118866, -0.14826108515262604, -0.06745084375143051, 0.024251462891697884, 0.01822635903954506, -0.060682263225317, 0.03656952083110809, -0.0028792342636734247, 0.003339326474815607, 0.004654870834201574, -0.16432709991931915, -0.05568019300699234, 0.028964387252926826, -0.15712425112724304, -0.06656725704669952, 0.06277995556592941, -0.10113482922315598, -0.012132617644965649, -0.16982388496398926, -0.16305199265480042, -0.03628521412611008, 0.017857929691672325, -0.040613796561956406, -0.056917786598205566, -0.14010562002658844, -0.019415250048041344, -0.045320261269807816, -0.004312154371291399, 0.044072363525629044, 0.0020940210670232773, 0.04635847359895706, 0.0066573889926075935, 0.09289347380399704, 0.010714372619986534, -0.0014722738415002823, -0.04595406726002693, 0.0909833237528801, -0.30731555819511414, 0.07525643706321716, -0.08645553886890411, 0.05539081245660782, -0.057316381484270096, -0.0926317572593689, -0.007509906310588121, 
0.06277763843536377, 0.060464419424533844, 0.20788121223449707, -0.2800109386444092, -0.07025618106126785, 0.13655538856983185, -0.09533236175775528, -0.13146020472049713, 0.0513952374458313, -0.050213608890771866, 0.07593657076358795, 0.027370907366275787, 0.140700101852417, -0.028026295825839043, -0.15554022789001465, 0.06281048059463501, 0.04586128890514374, -0.11356306821107864, 0.019295670092105865, 0.03597676753997803, 0.06723599135875702, 0.05744141340255737, -0.036986757069826126, -0.04105675220489502, 0.08096802979707718, -0.07076814025640488, -0.037564266473054886, 0.04588831216096878, -0.0579565204679966, 0.1630958467721939, 0.033971156924963, 0.09856503456830978, -0.04149768501520157, -0.07435470074415207, -0.005698562134057283, 0.038746561855077744, -0.08962973952293396, 0.025353478267788887, -0.18320298194885254, 0.2423991560935974, -0.02621818706393242, 0.027546977624297142, -0.16845986247062683, -0.0588528998196125, 0.011087946593761444, 0.21568740904331207, 0.030399197712540627, 0.12989304959774017, 0.07485637813806534, -0.01250512059777975, 0.014156299643218517, -0.06183977797627449, -0.1972363442182541, -0.03247830644249916, 0.008314179256558418, -0.058311350643634796, -0.04934588819742203, -0.0900716632604599, 0.10427892208099365, -0.19334633648395538, -0.005319371819496155, 0.08282599598169327, 0.023504555225372314, 0.03946567326784134, 0.0035407328978180885, -0.03634254261851311, 0.055148303508758545, 0.02030518464744091, -0.08980578929185867, 0.14668866991996765, 0.0035520538222044706, -0.03514726087450981, -0.03927676007151604, -0.03267495706677437, 0.05703731253743172, 0.08045367896556854, -0.18214593827724457, -0.0733821839094162, -0.0838410034775734, -0.02458474040031433, 0.050523869693279266, 0.036679428070783615, 0.02738112211227417, 0.44813573360443115, 0.057562243193387985, 0.09003535658121109, -0.08811535686254501, 0.039806611835956573, 0.012785476632416248, -0.031281858682632446, 0.013625281862914562, 0.04725322127342224, 0.11279468983411789, 0.028284218162298203, 0.01669839769601822, 0.03680038824677467, 0.01938779093325138, 0.08824212104082108, -0.10939645022153854, -0.003965397831052542, 0.002614045049995184, 0.038018375635147095, 0.03672022372484207, 0.07190682739019394, 0.015936892479658127, -0.09583546966314316, -0.030848123133182526, -0.11166880279779434, 0.015594755299389362, -0.20979784429073334, -0.025905707851052284, -0.029619399458169937, 0.0003502996696624905, 0.09109684824943542, 0.04222718998789787, -0.04444896802306175, 0.035467714071273804, 0.03947039321064949, -0.0861397460103035, 0.0594942644238472, -0.014317752793431282, -0.07008631527423859, 0.13023322820663452, -0.1002996563911438, -0.3153233230113983, -0.08797995746135712, 0.05698639526963234, 0.05295826122164726, 0.06816939264535904, -0.05876303091645241, -0.09240786731243134, 0.03294730558991432, -0.06836386770009995, -0.0017794050509110093, 0.0037346978206187487, -0.051060982048511505, 0.07253886014223099, 0.08541567623615265, -0.014505518600344658, -0.08911184966564178, -0.006620637606829405, -0.041561197489500046, -0.124965138733387, 0.044060997664928436, -0.03760828450322151, 0.00007921225915197283, 0.18620672821998596, 0.03724536672234535, 0.06256633251905441, -0.06291008740663528, 0.07596296072006226, -0.09150096774101257, 0.0004740063741337508, 0.18428465723991394, -0.015377625823020935, -0.004100616089999676, -0.03996327146887779, -0.0259257685393095, -0.10829219967126846, 0.053985193371772766, -0.07330703735351562, -0.07349077612161636, -0.0023273853585124016, 
-0.07770214974880219, -0.0351552739739418, 0.0012160884216427803, 0.07817990332841873, 0.029699061065912247, -0.09635239094495773, 0.04920589178800583, 0.1298678070306778, 0.0931883230805397, 0.03626195341348648, 0.023981640115380287, 0.13739009201526642, -0.11230582743883133, 0.019063033163547516, -0.05148853361606598, -0.1041760966181755, -0.042787205427885056, -0.0714287981390953, 0.07368279993534088, 0.06034531816840172, -0.09970010071992874, 0.05144011229276657, 0.041872985661029816, 0.0883496031165123, 0.1373600959777832, -0.04213863983750343, -0.11244629323482513, -0.041393622756004333, -0.022004956379532814, -0.1777329444885254, 0.0341336652636528, 0.22155584394931793, 0.0073304991237819195, -0.10497386753559113, 0.07876885682344437, -0.005956185050308704, 0.11527370661497116, 0.031222699210047722, -0.278682678937912, 0.016931315883994102, 0.00203216471709311, 0.042359162122011185, -0.047676295042037964, 0.10937416553497314, 0.11747439950704575, -0.14421136677265167, -0.06650938838720322, -0.03273930773139, 0.044137366116046906, -0.15618287026882172, 0.036923591047525406, -0.12602220475673676, 0.06240779533982277, 0.050940994173288345, 0.05090156942605972, -0.2197665423154831, 0.06881614029407501, -0.0274215005338192, 0.06763827055692673, -0.062248338013887405, -0.01823522336781025, 0.04473711550235748, 0.025079863145947456, 0.14955177903175354, -0.014347962103784084, 0.14454017579555511, -0.09031219780445099, -0.11753576993942261, 0.0027052261866629124, 0.08532248437404633, 0.013173088431358337, 0.013580933213233948, 0.0026939227245748043, 0.041669201105833054, -0.02811569906771183, 0.17063532769680023, -0.08147624880075455, -0.022407781332731247, -0.06592555344104767, -0.018158966675400734, 0.2039334923028946, -0.12064731866121292, -0.10121093690395355, -0.11619500070810318, 0.08663272857666016, -0.04296411573886871, 0.08175522089004517, -0.020344657823443413, 0.049704354256391525, -0.02509051002562046, 0.007178863976150751, 0.09594997018575668, 0.01950966566801071, 0.08983828872442245, -0.09791163355112076, -0.019585272297263145, 0.13838915526866913, -0.037155888974666595, -0.036971647292375565, -0.019425252452492714, 0.11054370552301407, -0.0358734093606472, 0.08033111691474915, 0.03929615020751953, 0.03664831817150116, 0.03428546339273453, -0.039165496826171875, 0.10309428721666336, 0.10041618347167969, -0.06291446089744568, 0.03864621743559837, -0.07954532653093338, 0.26597461104393005, 0.040773067623376846, 0.07301845401525497, 0.28390514850616455, 0.19391325116157532, -0.03036464750766754, 0.10683353990316391, -0.017607249319553375, -0.024403288960456848, -0.2950931787490845, 0.0006976581644266844, 0.027765681967139244, 0.11812873929738998, 0.01744898222386837, -0.20587195456027985, -0.1211688369512558, -0.03560304269194603, -0.007791717536747456, 0.0310499370098114, -0.2441052496433258, -0.06442268192768097, 0.06107868626713753, 0.13779635727405548, 0.15878525376319885, -0.05917542055249214, -0.007856467738747597, 0.029358724132180214, 0.07593556493520737, 0.017292039468884468, -0.11598441749811172, 0.11550791561603546, 0.025637371465563774, -0.05708931386470795, 0.0267958827316761, -0.044003549963235855, 0.04214555397629738, -0.17736166715621948, 0.10933554917573929, -0.05924695357680321, -0.08421005308628082, 0.07140472531318665, -0.02217724733054638, -0.048552993685007095, 0.0789642184972763, 0.020652711391448975, -0.13173207640647888, 0.038154006004333496, 0.005618774797767401, 0.04346654564142227, -0.004941361024975777, -0.019811764359474182, -0.029163256287574768, 
0.07706235349178314, -0.03806605935096741, 0.09605937451124191, 0.19590972363948822, -0.0573095865547657, 0.03974950686097145, 0.085201695561409, 0.09593135863542557, -0.05523005872964859, -0.0809539332985878, -0.03812742978334427, -0.005277194548398256, 0.0674438327550888, -0.08598461747169495, -0.019085103645920753, 0.07938229292631149, 0.015313901007175446, 0.14910826086997986, 0.14389736950397491, -0.08835655450820923, 0.11321785300970078, 0.10694554448127747, -0.11366690695285797, -0.08583837002515793, -0.02963297814130783, 0.0009990704711526632, 0.04910186678171158, -0.048617590218782425, 0.05932905897498131, -0.1035301461815834, 0.012819357216358185, 0.03532040864229202, 0.0038119733799248934, -0.09975302964448929, 0.009764863178133965, 0.08645275235176086, 0.06119582802057266, -0.0567571222782135, 0.09250631928443909, -0.0019178141374140978, -0.10868195444345474, 0.07241881638765335, 0.009918469935655594, -0.021528873592615128, -0.06352251768112183, 0.03211374953389168, 0.2370220273733139, 0.13945111632347107, -0.04336636886000633, -0.12396618723869324, -0.15508891642093658, 0.037849195301532745, 0.024356422945857048, 0.051251959055662155, 0.0062240250408649445, -0.06906022876501083, 0.01234503649175167, -0.04392383247613907, 0.005266309250146151, -0.05930564925074577, -0.047703344374895096, -0.12081446498632431, 0.1154373437166214, 0.053290288895368576, 0.11705748736858368, -0.0842847004532814, -0.07057584822177887, -0.1921386867761612, 0.09190598875284195, 0.041707299649715424, -0.05532265454530716, 0.06002674251794815, -0.030134430155158043, 0.017344338819384575, 0.11256659775972366, -0.051967836916446686, 0.008543911390006542, -0.09269233793020248, 0.03236149623990059, 0.03133073076605797, 0.04903566092252731, -0.004612727556377649, -0.017903391271829605, 0.04399999976158142, -0.05730267986655235, 0.07619527727365494, -0.07757602632045746, -0.033709146082401276, 0.0645759105682373, -0.16051416099071503, -0.054324716329574585, 0.08708633482456207, 0.013749903067946434, 0.02590017393231392, -0.05825240537524223, 0.019142305478453636, -0.05566488951444626, -0.04483235627412796, 0.01169554702937603, -0.05552767962217331, -0.011517677456140518, 0.05293213203549385, -0.05287189036607742, -0.040493328124284744, -0.06794002652168274, 0.061874233186244965, -0.07247710227966309, 0.09816460311412811, 0.031187955290079117, -0.10892423242330551, 0.07648903876543045, -0.037552736699581146, -0.0049397205002605915, -0.009439278393983841, 0.039307788014411926, 0.15598824620246887, -0.1606634259223938, 0.05345672369003296, -0.0484454482793808, 0.13272921741008759, 0.046888746321201324, -0.04458791762590408, -0.020207170397043228, 0.02469455823302269, -0.05549024045467377, 0.06932897865772247, 0.15877580642700195, 0.09880131483078003, 0.02571805939078331, 0.008134597912430763, 0.10187267512083054, 0.1060529574751854, 0.08136752992868423, 0.08394161611795425, -0.03428563475608826, -0.11287897825241089, 0.14338994026184082, 0.09748584777116776, 0.024613093584775925, 0.21077860891819, 0.17944025993347168, 0.03125298395752907, 0.03018142655491829, -0.06512103229761124, 0.17325744032859802, 0.061261482536792755, -0.08229418843984604, 0.014424329623579979, 0.03221147879958153, -0.049809664487838745, -0.047004032880067825, -0.09757380187511444, -0.029556652531027794, -0.24085633456707, 0.10851483792066574, -0.057250600308179855, -0.09750643372535706, 0.022772664204239845, 0.02990041859447956, -0.018839845433831215, 0.11280566453933716, -0.07735858112573624, 0.012980576604604721, 0.18577688932418823, 
-0.03825045004487038, -0.022322099655866623, -0.1633504331111908, -0.11154003441333771, -0.014046176336705685, -0.11750495433807373, 0.025494296103715897, 0.06305963546037674, 0.01117965579032898, 0.04399528726935387, 0.028923438861966133, -0.020834028720855713, 0.019218796864151955, -0.05903913825750351, -0.042673509567976, -0.01891910657286644, 0.02202831581234932, -0.09593231230974197, -0.03627033904194832, 0.12151803076267242, -0.03246605768799782, -0.08207374066114426, -0.006544890813529491, 0.07848484069108963, -0.042620159685611725, 0.09450104832649231, -0.07687012106180191, -0.03479038178920746, -0.06794454902410507, 0.268902063369751, 0.09388194978237152, -0.20183001458644867, 0.03341769427061081, -0.030470456928014755, 0.026735708117485046, -0.09215684235095978, 0.16250114142894745, 0.0899243950843811, 0.049168527126312256, -0.12686687707901, -0.003401300171390176, -0.09992645680904388, -0.0028723697178065777, -0.12552696466445923, -0.14725084602832794, 0.12093491852283478, -0.003848524997010827, -0.06547791510820389, 0.02844911813735962, -0.15909899771213531, 0.06585367769002914, 0.0978507474064827, -0.1514272391796112, -0.038227714598178864, -0.06086801365017891, 0.06072385236620903, 0.026465637609362602, 0.13005392253398895, -0.05080926790833473, 0.012067130766808987, -0.0656723901629448, -0.011309894733130932, -0.0000654291216051206, -0.017478201538324356, 0.001532604917883873, -0.09828947484493256, 0.05038110539317131, -0.0835796371102333, 0.12184429168701172, 0.05709611251950264, 0.005326167680323124, 0.008464806713163853, 0.0648408755660057, -0.02414623089134693, -0.10202058404684067, -0.01877439208328724, 0.033475372940301895, 0.03998998552560806, 0.010373802855610847, 0.034506846219301224, 0.0006507808575406671, 0.07714920490980148, -0.011413984932005405, -0.027285432443022728, -0.058209117501974106, 0.03936338797211647, -0.10441672056913376, 0.10461361706256866, 0.0013552121818065643, -0.02240127883851528, -0.010913821868598461, -0.05532446503639221, 0.045815300196409225, 0.04572062939405441, 0.029743505641818047, -0.05261747166514397, -0.09262793511152267, -0.021781492978334427, 0.023900283500552177, -0.11539579927921295, -0.18497975170612335, -0.0664035826921463, -0.15038692951202393, -0.01633414439857006, -0.0620744526386261, 0.08902198076248169, 0.13558129966259003, 0.030392181128263474, -0.04822919890284538, -0.12171997129917145, 0.025026977062225342, 0.13544774055480957, -0.03851630911231041, -0.07532322406768799 ]
null
null
diffusers
# banner

<Gallery />

## Trigger words

You should use `car` to trigger the image generation.

You should use `banner` to trigger the image generation.

You should use `game` to trigger the image generation.

## Download model

Weights for this model are available in Safetensors format.

[Download](/haliliboselcuk/banner/tree/main) them in the Files & versions tab.
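To make the trigger-word instructions concrete, here is a hedged sketch of loading this LoRA on top of the SDXL base model with diffusers. The repo id, base model, and trigger words come from this record; the dtype and device choices are illustrative assumptions, and this is not an official snippet from the card.

```python
# Minimal sketch (not from the card): load the SDXL base model, attach the
# "banner" LoRA weights, and prompt with the trigger words car/banner/game.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,  # assumed; use float32 on CPU
).to("cuda")  # assumes a CUDA device is available

pipe.load_lora_weights("haliliboselcuk/banner")

image = pipe("a 3d ultra graphic car, banner, game").images[0]
image.save("banner_car.png")
```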
{"tags": ["text-to-image", "stable-diffusion", "lora", "diffusers", "template:sd-lora"], "widget": [{"text": "a 3d ultra graphic car", "parameters": {"negative_prompt": "2d, bad"}, "output": {"url": "images/drift-max-pro-car-racing-game.png"}}, {"text": "2 people runing on the train way, 3d game style", "output": {"url": "images/subway-surfers.png"}}, {"text": "cars game, ultra graphics", "parameters": {"negative_prompt": "2d"}, "output": {"url": "images/traffic-racer.png"}}], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "car, banner, game"}
text-to-image
haliliboselcuk/banner
[ "diffusers", "text-to-image", "stable-diffusion", "lora", "template:sd-lora", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "region:us" ]
2024-02-13T19:07:27+00:00
[]
[]
TAGS #diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #region-us
# banner <Gallery /> ## Trigger words You should use 'car' to trigger the image generation. You should use 'banner' to trigger the image generation. You should use 'game' to trigger the image generation. ## Download model Weights for this model are available in Safetensors format. Download them in the Files & versions tab.
[ "# banner\n\n<Gallery />", "## Trigger words\n\nYou should use 'car' to trigger the image generation.\n\nYou should use 'banner' to trigger the image generation.\n\nYou should use 'game' to trigger the image generation.", "## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab." ]
[ "TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #region-us \n", "# banner\n\n<Gallery />", "## Trigger words\n\nYou should use 'car' to trigger the image generation.\n\nYou should use 'banner' to trigger the image generation.\n\nYou should use 'game' to trigger the image generation.", "## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab." ]
[ 56, 7, 41, 28 ]
[ "passage: TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #region-us \n# banner\n\n<Gallery />## Trigger words\n\nYou should use 'car' to trigger the image generation.\n\nYou should use 'banner' to trigger the image generation.\n\nYou should use 'game' to trigger the image generation.## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab." ]
[ -0.13751430809497833, -0.037988074123859406, -0.00013539222709368914, 0.03952335566282272, 0.1267416775226593, 0.05533215031027794, 0.13843891024589539, 0.04483330249786377, 0.14634805917739868, 0.08362183719873428, 0.1101895347237587, 0.07110042124986649, 0.04575970396399498, 0.2582109272480011, -0.03196896240115166, -0.20963013172149658, 0.05772121995687485, 0.011149588041007519, 0.043564315885305405, 0.043847888708114624, -0.0033744105603545904, -0.08517563343048096, 0.14155076444149017, -0.05318314582109451, -0.08501922339200974, 0.01760050840675831, 0.06190798431634903, 0.026740992441773415, 0.012920957989990711, 0.022229421883821487, -0.008703241124749184, 0.11895522475242615, 0.08258369565010071, -0.0985414981842041, 0.05351319909095764, 0.03495236113667488, -0.05502428486943245, 0.007126608397811651, 0.0478929728269577, -0.008786718361079693, 0.17019887268543243, -0.03607014939188957, -0.053631410002708435, 0.029769113287329674, -0.0419556088745594, -0.11397262662649155, 0.011699597351253033, -0.08563360571861267, 0.09723204374313354, -0.033795297145843506, -0.003411522600799799, 0.014532258734107018, 0.00944411102682352, 0.0671359971165657, 0.17731893062591553, -0.20699582993984222, -0.06074211001396179, 0.3400167226791382, 0.111444391310215, 0.20024316012859344, -0.0677543431520462, 0.14182355999946594, 0.09632542729377747, -0.04750987887382507, -0.005040965508669615, -0.021129373461008072, 0.13361012935638428, -0.06781800836324692, -0.03623499348759651, -0.016057288274168968, 0.2632180452346802, 0.017436347901821136, -0.01078188605606556, -0.10248885303735733, -0.08606104552745819, 0.09828732162714005, -0.08803100883960724, 0.0013827820075675845, 0.028426969423890114, 0.04722127318382263, -0.015728045254945755, -0.12385258823633194, -0.05685121938586235, -0.14675584435462952, -0.03017224743962288, 0.12634800374507904, -0.01860753260552883, 0.06582682579755783, -0.004674750845879316, 0.14603017270565033, -0.14450210332870483, -0.15030179917812347, -0.031276725232601166, -0.10711196810007095, -0.008577033877372742, 0.05690930038690567, 0.017422398552298546, -0.11139117926359177, 0.07495389133691788, 0.00881332065910101, 0.033024732023477554, 0.025303661823272705, -0.08166561275720596, 0.10239312052726746, 0.00315054296515882, 0.02645111456513405, -0.0007712776423431933, -0.10496920347213745, 0.0685553029179573, 0.07025675475597382, 0.09714782238006592, -0.05908585339784622, -0.15994606912136078, -0.025745803490281105, -0.08508563041687012, 0.029280351474881172, 0.020951226353645325, 0.04871995002031326, -0.06291325390338898, -0.017169702798128128, 0.1690281331539154, 0.003406336298212409, -0.032945599406957626, -0.005183821078389883, -0.011015634052455425, 0.1408582478761673, 0.06580539047718048, -0.014853849075734615, 0.10907825082540512, -0.03761173412203789, -0.05272993445396423, -0.048667170107364655, -0.11314230412244797, -0.035413216799497604, -0.007278414908796549, -0.19107487797737122, -0.0014843258541077375, -0.08841691166162491, -0.32267481088638306, 0.004023601301014423, 0.05904867872595787, -0.03924989700317383, 0.0028473439160734415, -0.06355983018875122, 0.00959993340075016, 0.002369730034843087, 0.016438299790024757, -0.006553688086569309, -0.05869758874177933, 0.114008329808712, -0.021597672253847122, 0.1736443191766739, -0.12051396816968918, 0.004654570948332548, -0.09214048087596893, 0.07879338413476944, -0.31624966859817505, 0.12162461131811142, -0.06021440774202347, 0.04111526161432266, 0.018206913024187088, 0.03186048939824104, -0.1155080497264862, 
0.02560572512447834, -0.017253268510103226, 0.20995667576789856, -0.24471205472946167, -0.07573696970939636, 0.07445388287305832, -0.21226371824741364, -0.11038937419652939, 0.023880917578935623, -0.021427901461720467, 0.12186583876609802, 0.11228416860103607, 0.15570957958698273, 0.04835947975516319, -0.20735584199428558, 0.03737449273467064, 0.09088592976331711, -0.17470206320285797, -0.06059849262237549, 0.06713958084583282, 0.08588995784521103, -0.10920975357294083, 0.09410380572080612, -0.16216729581356049, 0.1259135603904724, -0.06143062561750412, 0.02026325836777687, -0.03253848850727081, -0.10514193773269653, 0.028678206726908684, 0.08132809400558472, 0.006127789616584778, -0.011465407907962799, -0.01344279758632183, 0.017493095248937607, 0.08540491759777069, -0.06774181127548218, -0.005086121149361134, 0.0048513454385101795, 0.1985270082950592, -0.22087526321411133, 0.041007302701473236, -0.05131908878684044, -0.1522998958826065, 0.03186127915978432, 0.20794066786766052, 0.02873648889362812, 0.03402073308825493, 0.10988488048315048, 0.08174275606870651, -0.07374034821987152, -0.06085817888379097, 0.10569705814123154, -0.05333966761827469, -0.035254210233688354, -0.11964987218379974, -0.03534426540136337, -0.08043813705444336, 0.07605396211147308, -0.19882410764694214, 0.028799593448638916, 0.03394026309251785, 0.0400690883398056, 0.06213962286710739, -0.03763726353645325, 0.00020558408868964761, -0.1079161986708641, -0.076626256108284, -0.05005883052945137, 0.0613497719168663, 0.014545434154570103, -0.03608198091387749, 0.13020049035549164, -0.1052059605717659, 0.12257946282625198, 0.16973544657230377, 0.031041905283927917, 0.046878110617399216, -0.14101064205169678, 0.03614602982997894, 0.004927974194288254, -0.027458645403385162, -0.03502481430768967, -0.11571023613214493, 0.013287150301039219, 0.060680195689201355, -0.09175316989421844, 0.08375196158885956, 0.06764644384384155, -0.012608123011887074, -0.04849999025464058, 0.03331153839826584, 0.1518837958574295, 0.04547068476676941, 0.10172566771507263, 0.10303130000829697, -0.007151142228394747, 0.20194794237613678, 0.003612831002101302, -0.11514168977737427, 0.06231353431940079, 0.03683627396821976, 0.011134655214846134, 0.1630573868751526, 0.027688724920153618, -0.01609942317008972, 0.01895422860980034, -0.08668303489685059, -0.0007756894337944686, -0.08872739970684052, -0.05394255742430687, 0.010656184516847134, -0.06262059509754181, 0.09848187863826752, 0.10219941288232803, -0.057203564792871475, 0.08056126534938812, -0.07277100533246994, -0.036453139036893845, 0.03520246595144272, -0.027432048693299294, -0.03375699743628502, 0.08364710211753845, -0.004417378921061754, -0.1490887850522995, -0.12048108875751495, -0.023116404190659523, -0.12099715322256088, 0.0820741057395935, 0.02464209496974945, -0.07384283095598221, -0.017608433961868286, -0.05428178608417511, 0.05079289525747299, 0.1010749340057373, -0.030057748779654503, -0.07408661395311356, -0.008217945694923401, -0.0653604045510292, -0.07075825333595276, -0.02423325553536415, -0.07584602385759354, -0.08002790808677673, 0.0683583915233612, -0.12444230914115906, 0.16542071104049683, 0.13796471059322357, 0.0030188041273504496, 0.06610674411058426, -0.0023410930298268795, 0.09084463119506836, -0.08557907491922379, 0.10419021546840668, 0.18461337685585022, 0.0771414116024971, 0.05593002215027809, 0.12127428501844406, -0.014762228354811668, -0.089545838534832, 0.05308030918240547, -0.03403972461819649, -0.14414681494235992, -0.026609322056174278, -0.0839385837316513, 
-0.08460851013660431, 0.026150589808821678, 0.04015200957655907, 0.027164582163095474, 0.06400323659181595, 0.18919967114925385, -0.006452622823417187, -0.034930694848299026, 0.0743010938167572, 0.06886778026819229, -0.04248575121164322, -0.032833077013492584, 0.08204000443220139, -0.05720854550600052, -0.012062753550708294, 0.14443792402744293, 0.00849931687116623, 0.18249273300170898, -0.01195247657597065, 0.03347013518214226, 0.016687944531440735, 0.0519951693713665, 0.10787003487348557, 0.13432686030864716, -0.027163619175553322, -0.04075503349304199, -0.0491766519844532, -0.09481502324342728, 0.006363269407302141, 0.09009464085102081, -0.0019448775565251708, 0.015520512126386166, -0.020155299454927444, 0.06460700184106827, 0.02685631439089775, 0.007147193420678377, 0.04826277494430542, -0.3383787274360657, 0.06997336447238922, 0.09874208271503448, 0.1463163048028946, -0.10091672837734222, 0.04539298638701439, 0.1143658459186554, -0.04819820076227188, 0.03547172620892525, -0.023841900750994682, 0.07137620449066162, 0.029002318158745766, -0.05890101194381714, -0.03990969806909561, 0.13592529296875, -0.05058758333325386, 0.02537386119365692, -0.07409457117319107, 0.03630053997039795, -0.01835022307932377, -0.019286761060357094, -0.04545825719833374, -0.040981072932481766, 0.10742692649364471, 0.18068194389343262, 0.15201450884342194, -0.020093882456421852, 0.024552669376134872, -0.08973469585180283, -0.09220436215400696, 0.01136363111436367, 0.006496616639196873, -0.09906033426523209, 0.017817262560129166, 0.022730886936187744, -0.0063244798220694065, 0.0012871769722551107, 0.04423833638429642, -0.13740985095500946, -0.08384000509977341, -0.08281403034925461, 0.0714593455195427, 0.08944565057754517, -0.03019545041024685, -0.07044414430856705, -0.07935445755720139, 0.11906032264232635, 0.15352396667003632, -0.11870225518941879, -0.09477080404758453, -0.05659129470586777, 0.03806445747613907, -0.010211060754954815, 0.033501654863357544, -0.0727585107088089, 0.16753582656383514, -0.11786308884620667, -0.1461223065853119, 0.060257408767938614, -0.05042540654540062, -0.06494058668613434, -0.03376270830631256, 0.10269884765148163, 0.0005758422194048762, -0.03486740216612816, 0.005433378741145134, 0.027146706357598305, 0.04606654867529869, -0.06740549206733704, 0.039341140538454056, 0.031146036460995674, 0.006526663433760405, 0.02725484035909176, 0.015759620815515518, -0.0402449406683445, 0.021958421915769577, 0.06852112710475922, 0.012397516518831253, 0.24907632172107697, -0.09000688791275024, 0.0619637593626976, 0.13012981414794922, -0.0010642936686053872, -0.253439724445343, 0.016461946070194244, -0.05602147802710533, -0.02692204713821411, 0.07247978448867798, -0.09090254455804825, 0.15717513859272003, 0.06271091848611832, -0.0772923156619072, 0.27758878469467163, -0.23420195281505585, -0.09260407835245132, 0.048559363931417465, 0.15066540241241455, 0.28916287422180176, -0.2185361534357071, -0.03591000288724899, -0.0895272046327591, -0.10035819560289383, 0.05949877202510834, -0.09720133990049362, 0.0490066260099411, -0.02826259844005108, -0.08441562950611115, 0.016893655061721802, -0.049085140228271484, 0.15382853150367737, -0.04783770814538002, 0.11671093851327896, -0.05861543491482735, -0.015770521014928818, 0.12658940255641937, -0.07699130475521088, 0.09211742132902145, -0.16192206740379333, 0.015326671302318573, -0.08870409429073334, -0.036207810044288635, -0.012107037007808685, 0.0793374553322792, 0.031781505793333054, -0.06456413120031357, -0.044300176203250885, -0.009072663262486458, 
-0.05114638805389404, 0.040979474782943726, 0.05564391613006592, -0.07405270636081696, -0.010123652406036854, 0.1442434936761856, -0.011914413422346115, -0.05067150294780731, -0.054261185228824615, -0.04658234119415283, -0.055679239332675934, 0.11809621006250381, -0.11416345089673996, -0.027253082022070885, 0.07263991981744766, 0.011916765943169594, 0.07158292829990387, 0.06549476832151413, -0.002680753357708454, 0.11381188035011292, 0.10522837936878204, -0.08789651840925217, -0.033789027482271194, -0.06693200021982193, -0.08540482074022293, 0.06821152567863464, -0.002352886600419879, 0.031208273023366928, -0.036098118871450424, 0.06743282079696655, -0.021271537989377975, 0.016222164034843445, -0.01763828843832016, 0.008498024195432663, 0.06283135712146759, -0.0187454242259264, -0.128358393907547, 0.10759396851062775, -0.036598704755306244, -0.010770492255687714, -0.09776172041893005, 0.05327008292078972, -0.09725494682788849, -0.02753044292330742, -0.03535810112953186, 0.17319531738758087, -0.03477707505226135, -0.029587050899863243, -0.02837405912578106, -0.09092101454734802, -0.007614336907863617, 0.06734029948711395, 0.11158128827810287, -0.03685740381479263, 0.02768528275191784, 0.020918656140565872, -0.06579360365867615, 0.0917314887046814, 0.061484482139348984, 0.08101653307676315, -0.1859944462776184, -0.14027726650238037, 0.05400989204645157, 0.009719098918139935, -0.14416059851646423, -0.0628206729888916, -0.08294197171926498, 0.032330743968486786, -0.056438613682985306, 0.08428945392370224, -0.10441966354846954, -0.021580765023827553, -0.06143694370985031, -0.05665775388479233, -0.04351075738668442, -0.016494186595082283, -0.06222984939813614, 0.030884483829140663, 0.03257390484213829, 0.033235859125852585, -0.07894120365381241, -0.06355814635753632, 0.02249239571392536, -0.08949387818574905, 0.03319172561168671, 0.021332748234272003, -0.022058237344026566, -0.010468733496963978, -0.2547385096549988, -0.03393596410751343, 0.1111757680773735, 0.02250750921666622, -0.03761494159698486, 0.10274975746870041, 0.0479828380048275, 0.028454938903450966, 0.005908182356506586, -0.04277310147881508, -0.06651685386896133, -0.08354015648365021, 0.1622931957244873, -0.0521480031311512, -0.03411991521716118, -0.004675895441323519, 0.015105102211236954, 0.11773326247930527, 0.09125297516584396, 0.11289478838443756, -0.0721384584903717, -0.009024236351251602, -0.1655595600605011, 0.018376652151346207, 0.02314486727118492, -0.09309153258800507, -0.1052454486489296, -0.05210272595286369, 0.01627853699028492, 0.022642284631729126, 0.17674002051353455, 0.05430018529295921, -0.07928076386451721, -0.0508919358253479, 0.13857698440551758, 0.1918909102678299, -0.011454552412033081, 0.24122796952724457, 0.10287760198116302, 0.10762827843427658, -0.018802696838974953, 0.14118993282318115, 0.1227698102593422, -0.10076542943716049, 0.043633367866277695, 0.013226273469626904, -0.06992669403553009, 0.11035033315420151, 0.004281526431441307, 0.0683918446302414, -0.029047247022390366, 0.0653216540813446, -0.085821732878685, -0.016840828582644463, -0.0197124145925045, 0.07109947502613068, 0.19137059152126312, -0.06657319515943527, -0.03153498098254204, 0.1054995208978653, 0.010392632335424423, -0.12274056673049927, -0.24314557015895844, -0.07526033371686935, -0.27548763155937195, 0.06603097915649414, -0.05232001468539238, 0.02268810383975506, 0.14674925804138184, 0.0060959467664361, 0.02727540396153927, 0.1261119693517685, -0.042805951088666916, -0.022157523781061172, 0.04457103833556175, -0.052422985434532166, 
-0.06557382643222809, -0.013611197471618652, -0.06165798008441925, 0.09066023677587509, -0.052087582647800446, -0.025979261845350266, 0.058460429310798645, 0.013715232722461224, 0.05363558232784271, 0.017424555495381355, -0.07992898672819138, -0.04653682932257652, 0.04966633394360542, -0.04786321520805359, 0.11356958001852036, 0.01462529692798853, -0.0006900448934175074, -0.018685143440961838, 0.05594160407781601, -0.01073016319423914, -0.03874267265200615, -0.021480411291122437, 0.11337174475193024, -0.12197371572256088, 0.05797800049185753, -0.049986544996500015, -0.1187078207731247, 0.0007833117269910872, 0.24817663431167603, 0.14228373765945435, -0.07385901361703873, -0.019615530967712402, -0.07041053473949432, 0.0010941595537588, -0.029824696481227875, 0.09035401046276093, -0.03977850079536438, 0.18947827816009521, -0.10643360018730164, -0.016863655298948288, -0.09245480597019196, -0.06327728927135468, -0.007977252826094627, -0.11423125863075256, -0.0012502201134338975, -0.03528792783617973, -0.07654837518930435, 0.09966454654932022, -0.14015062153339386, 0.04749009758234024, 0.0784938856959343, 0.006775679532438517, 0.07053020596504211, -0.09813365340232849, 0.012552364729344845, 0.01966240629553795, -0.02907644584774971, -0.13488706946372986, 0.03195798024535179, -0.08458733558654785, -0.0405912771821022, -0.17426559329032898, -0.07610853016376495, 0.006014248821884394, -0.09566004574298859, 0.19241134822368622, -0.07799673080444336, 0.015369197353720665, -0.03768207132816315, -0.034359101206064224, -0.0655234307050705, 0.07072663307189941, -0.03979762643575668, -0.11775797605514526, -0.04917595535516739, 0.04032770171761513, -0.03382863849401474, 0.06239049509167671, -0.011191071011126041, -0.09996447712182999, 0.046043600887060165, 0.08544474095106125, -0.07187516987323761, -0.08655398339033127, 0.017412599176168442, -0.10002566128969193, 0.08299978077411652, -0.011767511256039143, 0.05575186759233475, -0.010716882534325123, -0.03478808328509331, 0.09059686958789825, 0.08005482703447342, 0.02856225147843361, 0.06379029154777527, -0.06508015841245651, -0.05469067022204399, 0.0953826978802681, -0.04134625196456909, -0.20122385025024414, -0.01373737957328558, -0.20728176832199097, -0.004853721708059311, -0.03614332899451256, 0.03747277706861496, 0.17680107057094574, 0.010357585735619068, -0.006490274798125029, -0.21918442845344543, 0.052681293338537216, 0.11166387051343918, -0.11596429347991943, -0.04816431179642677 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.8.2
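The quick-start code for this entry is left as [More Information Needed]; since the record's metadata lists openai-community/gpt2-xl as the base model and KapitalK/GPT-XL as the adapter repository, a minimal loading sketch (assuming a causal-LM adapter and the standard PEFT API — this is not code from the original authors) might look like:

```python
# Hedged sketch: load the PEFT adapter from this record on top of its stated
# base model. The adapter's task type is not documented, so causal LM is assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "openai-community/gpt2-xl"   # base_model from the record metadata
adapter_id = "KapitalK/GPT-XL"         # repository id of this record

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```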
{"library_name": "peft", "base_model": "openai-community/gpt2-xl"}
null
KapitalK/GPT-XL
[ "peft", "arxiv:1910.09700", "base_model:openai-community/gpt2-xl", "region:us" ]
2024-02-13T19:07:30+00:00
[ "1910.09700" ]
[]
TAGS #peft #arxiv-1910.09700 #base_model-openai-community/gpt2-xl #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.8.2
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ "TAGS\n#peft #arxiv-1910.09700 #base_model-openai-community/gpt2-xl #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ 34, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #arxiv-1910.09700 #base_model-openai-community/gpt2-xl #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2" ]
[ -0.10944072157144547, 0.1947823464870453, -0.003353343578055501, 0.03594445064663887, 0.09093703329563141, 0.023151764646172523, 0.05293167382478714, 0.12669996917247772, -0.030834853649139404, 0.10282713919878006, 0.06382612884044647, 0.10728568583726883, 0.10327816009521484, 0.2011939287185669, 0.007335085887461901, -0.19108857214450836, 0.031972840428352356, -0.09058409184217453, -0.015165152959525585, 0.12072603404521942, 0.14970490336418152, -0.09543144702911377, 0.07926523685455322, -0.011417927220463753, -0.019745932891964912, -0.04231948405504227, -0.07989784330129623, -0.030933713540434837, 0.03846520930528641, 0.04920923709869385, 0.050888147205114365, -0.0003864439786411822, 0.08180375397205353, -0.26250001788139343, 0.016710516065359116, 0.043245796114206314, -0.010520601645112038, 0.0811990424990654, 0.10111361742019653, -0.03772459551692009, 0.12633462250232697, -0.03158261254429817, 0.14542149007320404, 0.07804311811923981, -0.09352491050958633, -0.20273341238498688, -0.07368503510951996, 0.07469208538532257, 0.1675443798303604, 0.08164195716381073, -0.0438992902636528, 0.14158234000205994, -0.10815612971782684, 0.014837509021162987, 0.04284926876425743, -0.062474727630615234, -0.07367338240146637, 0.06164265796542168, 0.10417385399341583, 0.047217462211847305, -0.139267235994339, -0.03478344902396202, 0.01862514019012451, 0.03664027899503708, 0.07339321821928024, 0.019428474828600883, 0.14080913364887238, 0.031787242740392685, -0.15022636950016022, -0.04239985719323158, 0.1439979076385498, 0.03573906794190407, -0.04020191356539726, -0.2244696319103241, 0.011940413154661655, -0.08507007360458374, -0.024923410266637802, -0.04768023267388344, 0.038705311715602875, -0.00033921547583304346, 0.0856306180357933, -0.029342766851186752, -0.09008584916591644, -0.020190251991152763, 0.09163089841604233, 0.0534796342253685, 0.02895314432680607, -0.0238115843385458, 0.0029095930512994528, 0.12793204188346863, 0.05927468091249466, -0.12400132417678833, -0.06922591477632523, -0.06908059865236282, -0.04979671910405159, -0.053895242512226105, 0.030116090551018715, 0.03634670004248619, 0.05951434373855591, 0.24458588659763336, -0.021928803995251656, 0.055131129920482635, 0.06332056224346161, 0.019924823194742203, 0.05170498043298721, 0.09420084953308105, -0.06185983493924141, -0.13816048204898834, -0.024021735414862633, 0.09061964601278305, -0.015159261412918568, -0.020146740600466728, -0.04602694511413574, 0.03410838916897774, 0.045287396758794785, 0.10082076489925385, 0.09724103659391403, 0.004124699626117945, -0.07557135820388794, -0.05655656382441521, 0.20923402905464172, -0.14592255651950836, 0.03894606977701187, 0.016127996146678925, -0.02757573500275612, -0.056563105434179306, 0.007880115881562233, 0.016954511404037476, -0.02888290025293827, 0.0888478234410286, -0.06581390649080276, -0.03544197604060173, -0.12100638449192047, -0.015422397293150425, 0.038722436875104904, 0.007003475911915302, -0.021259084343910217, -0.02491041086614132, -0.06812500208616257, -0.09506494551897049, 0.10750292241573334, -0.07226914167404175, -0.06937359273433685, -0.032288551330566406, -0.0934503898024559, 0.01816636137664318, 0.028837935999035835, 0.12439020723104477, -0.025908956304192543, 0.04054786637425423, -0.02296469919383526, 0.059265799820423126, 0.08038046211004257, 0.037636008113622665, -0.06761174649000168, 0.05705371871590614, -0.18351249396800995, 0.0966126099228859, -0.08195896446704865, 0.02235630713403225, -0.1536521464586258, -0.014127005822956562, 0.010008358396589756, 
0.019234206527471542, 0.03264787793159485, 0.14939363300800323, -0.19307735562324524, -0.02197428233921528, 0.16256912052631378, -0.09758635610342026, -0.12122322618961334, 0.037983085960149765, -0.058950066566467285, 0.16108039021492004, 0.01939433440566063, -0.015340207144618034, 0.09153001010417938, -0.16100166738033295, -0.0274574663490057, -0.025949712842702866, -0.010340082459151745, 0.10397094488143921, 0.08717357367277145, -0.07603730261325836, 0.03241610527038574, 0.017674976959824562, -0.04265597090125084, -0.03085826151072979, -0.05373816564679146, -0.11254118382930756, 0.0010315550025552511, -0.09072443097829819, 0.024999039247632027, -0.009354005567729473, -0.07028668373823166, -0.012106634676456451, -0.16019484400749207, -0.020248617976903915, 0.08557764440774918, 0.015559723600745201, -0.02128426358103752, -0.09486330300569534, 0.03364481404423714, -0.028271986171603203, -0.029301419854164124, -0.15483154356479645, -0.03294621780514717, 0.021821243688464165, -0.14737772941589355, 0.014267655089497566, -0.10566584020853043, 0.06404106318950653, 0.007953759282827377, -0.06966651976108551, -0.028292391449213028, -0.016889620572328568, 0.008181779645383358, -0.04799443855881691, -0.2412346452474594, -0.020019667223095894, -0.05199592188000679, 0.15045960247516632, -0.2138085663318634, 0.03961145132780075, 0.05266125872731209, 0.12054955214262009, -0.0009035554830916226, -0.06467462331056595, 0.029471253976225853, -0.07511275261640549, -0.017351627349853516, -0.07159969955682755, -0.004248881246894598, 0.005328451748937368, -0.0450906939804554, 0.019305219873785973, -0.1075306385755539, -0.049178171902894974, 0.10417618602514267, 0.05527631193399429, -0.17346957325935364, -0.023097822442650795, -0.046116504818201065, -0.07124318182468414, -0.09169773012399673, -0.05415806546807289, 0.10515284538269043, 0.041585735976696014, 0.03269373998045921, -0.07475687563419342, -0.07640749216079712, 0.012526088394224644, -0.020289583131670952, -0.020940471440553665, 0.1124240979552269, 0.08039423078298569, -0.11441559344530106, 0.10038081556558609, 0.05642333999276161, 0.022568801417946815, 0.08533167839050293, -0.023286325857043266, -0.10948912054300308, -0.0353396050632, 0.04703282564878464, 0.006245602387934923, 0.16222289204597473, -0.09755446016788483, 0.0529886819422245, 0.0440160296857357, -0.024740222841501236, 0.0573631189763546, -0.10083404928445816, 0.007252480369061232, 0.00619475869461894, -0.013228358700871468, 0.013129032216966152, -0.018381411209702492, 0.010771160945296288, 0.08230727165937424, 0.05341639742255211, 0.04012325778603554, 0.04041486978530884, -0.03360074386000633, -0.13050709664821625, 0.1816704273223877, -0.0924571081995964, -0.2249785214662552, -0.14634695649147034, 0.04399142786860466, 0.0514918752014637, -0.017738396301865578, 0.023651374503970146, -0.05044744536280632, -0.09845245629549026, -0.0740722268819809, -0.004893820267170668, 0.023804306983947754, -0.06170307472348213, -0.07565140724182129, 0.056919462978839874, 0.040460724383592606, -0.11576100438833237, 0.03636781498789787, 0.06001606956124306, -0.021949635818600655, 0.007642357610166073, 0.05665047839283943, 0.08089082688093185, 0.17461374402046204, -0.008344740606844425, 0.0017759462352842093, 0.055072810500860214, 0.2798279821872711, -0.16532540321350098, 0.1020711287856102, 0.10985559970140457, -0.05893383547663689, 0.07532299309968948, 0.18618538975715637, 0.03361507132649422, -0.10542897135019302, 0.034598495811223984, 0.03260239213705063, -0.026565296575427055, -0.27217236161231995, 
-0.04892345145344734, -0.015015819109976292, -0.09997636824846268, 0.08054386079311371, 0.08565572649240494, 0.10213218629360199, 0.04432207718491554, -0.06219084933400154, -0.08246864378452301, 0.032585080713033676, 0.0978967547416687, -0.029215795919299126, 0.009400546550750732, 0.08119280636310577, -0.026892056688666344, 0.012422733940184116, 0.09824589639902115, -0.015579886734485626, 0.18118871748447418, 0.04385717958211899, 0.10755256563425064, 0.08679575473070145, 0.09309112280607224, -0.008465249091386795, 0.016574915498495102, 0.018525421619415283, 0.02110176347196102, 0.016441771760582924, -0.07994204759597778, 0.03717470541596413, 0.10806397348642349, 0.05075093358755112, 0.018337735906243324, 0.00901766773313284, -0.05652062967419624, 0.04567372053861618, 0.18429163098335266, 0.008376488462090492, -0.19362112879753113, -0.07377733290195465, 0.05328003689646721, -0.07502224296331406, -0.14255426824092865, -0.02121184952557087, 0.024546971544623375, -0.17657439410686493, 0.014729551039636135, -0.04045349359512329, 0.10020656138658524, -0.0807483121752739, -0.045424725860357285, 0.10272405296564102, 0.06895950436592102, -0.024891171604394913, 0.0643819272518158, -0.20409251749515533, 0.13192440569400787, 0.0227514635771513, 0.07352744042873383, -0.09912186861038208, 0.09887838363647461, 0.0029074251651763916, -0.017915459349751472, 0.16446749866008759, 0.005446172319352627, -0.06250042468309402, -0.05079827457666397, -0.09286834299564362, -0.013283716514706612, 0.0965781882405281, -0.1273345798254013, 0.06374328583478928, -0.012035369873046875, -0.025750044733285904, 0.006634507793933153, -0.07210461795330048, -0.12643256783485413, -0.17625276744365692, 0.06302531808614731, -0.10946117341518402, 0.030561324208974838, -0.08902372419834137, -0.06466980278491974, 0.0009326168219558895, 0.18720470368862152, -0.17282986640930176, -0.09261804074048996, -0.13980360329151154, -0.08808182179927826, 0.16285085678100586, -0.03744572401046753, 0.08460323512554169, 0.008431597612798214, 0.16339614987373352, 0.019917229190468788, 0.0019714361988008022, 0.10055560618638992, -0.0866347923874855, -0.19160644710063934, -0.05623016878962517, 0.15441858768463135, 0.14883501827716827, 0.042322319000959396, -0.013986372388899326, 0.02072916366159916, -0.0529087595641613, -0.11497963964939117, 0.029175426810979843, 0.13886870443820953, 0.08626461029052734, -0.0076826224103569984, -0.027459600940346718, -0.09045881032943726, -0.06142232194542885, -0.06472029536962509, -0.00025014206767082214, 0.18964825570583344, -0.0694744735956192, 0.16145937144756317, 0.11166119575500488, -0.05875924602150917, -0.20107071101665497, 0.05941571667790413, 0.05859732627868652, 0.01204205397516489, 0.028586890548467636, -0.20422744750976562, 0.084892138838768, 0.0009188737021759152, -0.07320011407136917, 0.1614963263273239, -0.1689157783985138, -0.14540784060955048, 0.0966864824295044, 0.03669729083776474, -0.22733815014362335, -0.13237477838993073, -0.09820576757192612, -0.014729002490639687, -0.12527956068515778, 0.07288645952939987, 0.004153280518949032, 0.018022364005446434, 0.02970731444656849, 0.022243158891797066, 0.02734936960041523, -0.05155503749847412, 0.20668193697929382, -0.019465556368231773, 0.014945040456950665, -0.055863332003355026, -0.09477385133504868, 0.03634431213140488, -0.049176037311553955, 0.09747040271759033, 0.0074060410261154175, 0.025111159309744835, -0.13588523864746094, -0.04813672974705696, -0.0667164996266365, 0.030183831229805946, -0.09707280248403549, -0.09178482741117477, 
-0.04958734288811684, 0.10132525861263275, 0.10213971138000488, -0.0331345833837986, -0.005954638589173555, -0.08328456431627274, 0.05818207934498787, 0.1943139284849167, 0.19051027297973633, 0.07337861508131027, -0.06983349472284317, 0.014397216960787773, -0.03079381212592125, 0.04250641539692879, -0.22024917602539062, 0.04183183237910271, 0.05049645155668259, 0.025407833978533745, 0.09134689718484879, -0.009934632107615471, -0.15035557746887207, -0.07274655252695084, 0.07270115613937378, -0.0421663336455822, -0.1551537960767746, -0.026273222640156746, 0.04429435729980469, -0.20711728930473328, -0.049630165100097656, 0.005106727126985788, -0.016107412055134773, -0.040502842515707016, 0.022228330373764038, 0.07807353883981705, -0.01810431107878685, 0.1108367070555687, 0.08823920786380768, 0.09147890657186508, -0.10283440351486206, 0.08024819195270538, 0.07136257737874985, -0.05360680818557739, 0.027252690866589546, 0.09608721733093262, -0.04667741060256958, -0.035749197006225586, 0.09687162935733795, 0.087377168238163, 0.02700074203312397, -0.050636082887649536, 0.012657051905989647, -0.05128291994333267, 0.07055194675922394, 0.11265672743320465, 0.03614848107099533, 0.0023890577722340822, 0.060735441744327545, 0.0406060628592968, -0.09906987100839615, 0.10251971334218979, 0.06059487536549568, 0.021049203351140022, -0.040392808616161346, -0.030351931229233742, -0.005514017306268215, -0.008865377865731716, -0.017069676890969276, -0.00994140561670065, -0.08593963086605072, -0.008958933874964714, -0.10302065312862396, 0.03490179404616356, -0.07524871826171875, 0.01219728123396635, 0.0254184752702713, -0.050375957041978836, 0.003508947556838393, 0.008333287201821804, -0.08077675849199295, -0.05043422430753708, -0.011203004978597164, 0.08692436665296555, -0.12053600698709488, 0.03298840671777725, 0.08099669963121414, -0.10860563814640045, 0.06997398287057877, 0.003978778142482042, 0.007252615410834551, 0.010887968353927135, -0.16440065205097198, 0.05931297689676285, -0.03038373962044716, -0.01342853158712387, 0.016210881993174553, -0.21930204331874847, -0.010714126750826836, -0.044988542795181274, -0.04716929420828819, 0.012694490142166615, -0.03562631085515022, -0.12426263839006424, 0.1023775190114975, -0.003163744928315282, -0.07257839292287827, -0.023368624970316887, 0.04373403266072273, 0.10180068016052246, -0.02075527422130108, 0.1245015487074852, -0.02032524347305298, 0.07318782806396484, -0.17039193212985992, -0.0019751046784222126, -0.008241270668804646, 0.046401407569646835, -0.013422461226582527, -0.02167847566306591, 0.06217268481850624, -0.020010512322187424, 0.19820813834667206, -0.02723773382604122, 0.05852743610739708, 0.050803933292627335, 0.016617335379123688, 0.011346948333084583, 0.08531802147626877, 0.06558694690465927, -0.009960758499801159, -0.0018669069977477193, 0.048180289566516876, -0.004269620403647423, -0.04668188840150833, -0.1513918936252594, 0.07056503742933273, 0.15318799018859863, 0.049019504338502884, 0.01789281703531742, 0.02847662940621376, -0.12184341996908188, -0.06982408463954926, 0.13682611286640167, -0.007598268799483776, -0.03346158564090729, -0.07823546230792999, 0.18409036099910736, 0.11790676414966583, -0.19921985268592834, 0.08636754006147385, -0.0570930615067482, -0.06071317195892334, -0.1269688457250595, -0.1647690236568451, -0.06356672197580338, -0.045259296894073486, -0.01546055544167757, -0.06265394389629364, 0.057892780750989914, 0.054394353181123734, 0.007095261476933956, -0.01922091841697693, 0.10057384520769119, 0.0049527776427567005, 
-0.021939856931567192, 0.04453388601541519, 0.05382814258337021, 0.02358531951904297, -0.10744942724704742, 0.008430673740804195, -0.0023357088211923838, 0.023482296615839005, 0.06695903837680817, 0.012437023222446442, -0.05434801056981087, 0.005425474606454372, -0.012609249912202358, -0.11060609668493271, 0.04106578603386879, -0.024203045293688774, -0.030465755611658096, 0.14146585762500763, 0.025953080505132675, 0.012512882240116596, -0.020294874906539917, 0.2408498376607895, -0.07351089268922806, -0.07818080484867096, -0.15932218730449677, 0.04089493677020073, -0.06955591589212418, 0.023746788501739502, 0.04229412227869034, -0.11467044055461884, 0.02186744473874569, 0.16599634289741516, 0.13237737119197845, -0.0069916509091854095, 0.010751532390713692, 0.057402126491069794, -0.00035850206040777266, -0.02811710350215435, 0.010377262718975544, 0.04239997640252113, 0.13025915622711182, -0.07624518871307373, 0.06391678005456924, -0.009917879477143288, -0.07555784285068512, -0.003581081749871373, 0.10921265184879303, 0.003309264313429594, 0.0018212915165349841, -0.07530386000871658, 0.13890133798122406, -0.09287035465240479, -0.22887270152568817, 0.06033565476536751, -0.06398708373308182, -0.1541450321674347, -0.043421197682619095, 0.004746082238852978, -0.01299720536917448, 0.018890777602791786, 0.07757552713155746, -0.045782990753650665, 0.169488325715065, 0.0454796627163887, -0.06252717971801758, -0.0849515050649643, 0.0670170709490776, -0.11704422533512115, 0.2850325107574463, 0.01855860836803913, 0.0605301633477211, 0.1048886626958847, -0.015525137074291706, -0.1387251913547516, 0.00742175430059433, 0.10480140149593353, -0.07298865169286728, 0.060339659452438354, 0.18025119602680206, -0.0048653376288712025, 0.12129996716976166, 0.0554901547729969, -0.05191062018275261, 0.03883669152855873, -0.09393538534641266, -0.04719913750886917, -0.11929651349782944, 0.07830383628606796, -0.08233071863651276, 0.16280044615268707, 0.13703136146068573, -0.06441561877727509, -0.009987147524952888, -0.02244110219180584, 0.08321807533502579, 0.0017402702942490578, 0.10338716953992844, 0.0012666404945775867, -0.191202774643898, 0.04180362448096275, 0.019871221855282784, 0.1064046248793602, -0.20043298602104187, -0.06670982390642166, 0.05787203833460808, -0.027026891708374023, -0.06707723438739777, 0.11497782170772552, 0.04113665595650673, 0.037256281822919846, -0.041028644889593124, -0.038949720561504364, 0.00012006583710899577, 0.14592067897319794, -0.11384900659322739, -0.010830949060618877 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 2020-Q2-90p-filtered This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-2019-90m](https://huggingface.co/cardiffnlp/twitter-roberta-base-2019-90m) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 3.5439 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 4.1e-07 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 2400000 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:-------:|:---------------:| | No log | 0.17 | 8000 | 4.0640 | | 4.2654 | 0.34 | 16000 | 3.9414 | | 4.2654 | 0.51 | 24000 | 3.8956 | | 4.0459 | 0.67 | 32000 | 3.8527 | | 4.0459 | 0.84 | 40000 | 3.8232 | | 3.9781 | 1.01 | 48000 | 3.7806 | | 3.9781 | 1.18 | 56000 | 3.7861 | | 3.9323 | 1.35 | 64000 | 3.7930 | | 3.9323 | 1.52 | 72000 | 3.7814 | | 3.9224 | 1.68 | 80000 | 3.7815 | | 3.9224 | 1.85 | 88000 | 3.7403 | | 3.8924 | 2.02 | 96000 | 3.7468 | | 3.8924 | 2.19 | 104000 | 3.7400 | | 3.879 | 2.36 | 112000 | 3.7283 | | 3.879 | 2.53 | 120000 | 3.7381 | | 3.8806 | 2.69 | 128000 | 3.7073 | | 3.8806 | 2.86 | 136000 | 3.7083 | | 3.8659 | 3.03 | 144000 | 3.6992 | | 3.8659 | 3.2 | 152000 | 3.6956 | | 3.8634 | 3.37 | 160000 | 3.6745 | | 3.8634 | 3.54 | 168000 | 3.7017 | | 3.8632 | 3.71 | 176000 | 3.6960 | | 3.8632 | 3.87 | 184000 | 3.7202 | | 3.8416 | 4.04 | 192000 | 3.7109 | | 3.8416 | 4.21 | 200000 | 3.6942 | | 3.8368 | 4.38 | 208000 | 3.6944 | | 3.8368 | 4.55 | 216000 | 3.6751 | | 3.8359 | 4.72 | 224000 | 3.6815 | | 3.8359 | 4.88 | 232000 | 3.6915 | | 3.8411 | 5.05 | 240000 | 3.6796 | | 3.8411 | 5.22 | 248000 | 3.6847 | | 3.8359 | 5.39 | 256000 | 3.6988 | | 3.8359 | 5.56 | 264000 | 3.6799 | | 3.8268 | 5.73 | 272000 | 3.6810 | | 3.8268 | 5.89 | 280000 | 3.6639 | | 3.8172 | 6.06 | 288000 | 3.6663 | | 3.8172 | 6.23 | 296000 | 3.6838 | | 3.8263 | 6.4 | 304000 | 3.6756 | | 3.8263 | 6.57 | 312000 | 3.6507 | | 3.8215 | 6.74 | 320000 | 3.6409 | | 3.8215 | 6.91 | 328000 | 3.6790 | | 3.8189 | 7.07 | 336000 | 3.6679 | | 3.8189 | 7.24 | 344000 | 3.6443 | | 3.8155 | 7.41 | 352000 | 3.6588 | | 3.8155 | 7.58 | 360000 | 3.6448 | | 3.8075 | 7.75 | 368000 | 3.6520 | | 3.8075 | 7.92 | 376000 | 3.6541 | | 3.8064 | 8.08 | 384000 | 3.6569 | | 3.8064 | 8.25 | 392000 | 3.6586 | | 3.8092 | 8.42 | 400000 | 3.6701 | | 3.8092 | 8.59 | 408000 | 3.6544 | | 3.8032 | 8.76 | 416000 | 3.6668 | | 3.8032 | 8.93 | 424000 | 3.6631 | | 3.8062 | 9.09 | 432000 | 3.6481 | | 3.8062 | 9.26 | 440000 | 3.6392 | | 3.7987 | 9.43 | 448000 | 3.6482 | | 3.7987 | 9.6 | 456000 | 3.6357 | | 3.7954 | 9.77 | 464000 | 3.6333 | | 3.7954 | 9.94 | 472000 | 3.6653 | | 3.7938 | 10.11 | 480000 | 3.6267 | | 3.7938 | 10.27 | 488000 | 3.6490 | | 3.7901 | 10.44 | 496000 | 3.6417 | | 3.7901 | 10.61 | 504000 | 3.6263 | | 3.7935 | 10.78 | 512000 | 3.6523 | | 3.7935 | 10.95 | 520000 | 3.6444 | | 3.7951 | 11.12 | 528000 | 3.6226 | | 3.7951 | 11.28 | 536000 | 3.6347 | | 3.7861 | 11.45 | 544000 | 3.6372 | | 3.7861 | 11.62 | 552000 | 3.6163 | | 3.7846 | 11.79 | 560000 | 3.6299 | | 
3.7846 | 11.96 | 568000 | 3.6330 | | 3.7778 | 12.13 | 576000 | 3.6371 | | 3.7778 | 12.29 | 584000 | 3.6343 | | 3.777 | 12.46 | 592000 | 3.6242 | | 3.777 | 12.63 | 600000 | 3.6119 | | 3.778 | 12.8 | 608000 | 3.6167 | | 3.778 | 12.97 | 616000 | 3.6191 | | 3.7795 | 13.14 | 624000 | 3.6225 | | 3.7795 | 13.3 | 632000 | 3.6056 | | 3.7766 | 13.47 | 640000 | 3.6135 | | 3.7766 | 13.64 | 648000 | 3.6169 | | 3.7729 | 13.81 | 656000 | 3.6035 | | 3.7729 | 13.98 | 664000 | 3.6109 | | 3.7846 | 14.15 | 672000 | 3.6180 | | 3.7846 | 14.32 | 680000 | 3.6171 | | 3.7726 | 14.48 | 688000 | 3.6182 | | 3.7726 | 14.65 | 696000 | 3.6086 | | 3.7717 | 14.82 | 704000 | 3.5852 | | 3.7717 | 14.99 | 712000 | 3.5883 | | 3.7713 | 15.16 | 720000 | 3.6056 | | 3.7713 | 15.33 | 728000 | 3.6004 | | 3.7745 | 15.49 | 736000 | 3.6059 | | 3.7745 | 15.66 | 744000 | 3.6156 | | 3.7557 | 15.83 | 752000 | 3.6029 | | 3.7557 | 16.0 | 760000 | 3.6099 | | 3.7628 | 16.17 | 768000 | 3.6016 | | 3.7628 | 16.34 | 776000 | 3.6008 | | 3.7717 | 16.5 | 784000 | 3.5972 | | 3.7717 | 16.67 | 792000 | 3.5838 | | 3.7616 | 16.84 | 800000 | 3.5868 | | 3.7616 | 17.01 | 808000 | 3.5834 | | 3.7608 | 17.18 | 816000 | 3.6066 | | 3.7608 | 17.35 | 824000 | 3.5911 | | 3.7625 | 17.52 | 832000 | 3.5997 | | 3.7625 | 17.68 | 840000 | 3.5855 | | 3.7634 | 17.85 | 848000 | 3.5861 | | 3.7634 | 18.02 | 856000 | 3.6021 | | 3.75 | 18.19 | 864000 | 3.5966 | | 3.75 | 18.36 | 872000 | 3.5761 | | 3.7492 | 18.53 | 880000 | 3.5757 | | 3.7492 | 18.69 | 888000 | 3.6123 | | 3.7522 | 18.86 | 896000 | 3.5841 | | 3.7522 | 19.03 | 904000 | 3.5831 | | 3.7482 | 19.2 | 912000 | 3.5860 | | 3.7482 | 19.37 | 920000 | 3.5804 | | 3.75 | 19.54 | 928000 | 3.5730 | | 3.75 | 19.7 | 936000 | 3.5955 | | 3.755 | 19.87 | 944000 | 3.5868 | | 3.755 | 20.04 | 952000 | 3.5992 | | 3.7549 | 20.21 | 960000 | 3.5657 | | 3.7549 | 20.38 | 968000 | 3.5780 | | 3.743 | 20.55 | 976000 | 3.5828 | | 3.743 | 20.72 | 984000 | 3.5676 | | 3.75 | 20.88 | 992000 | 3.5724 | | 3.75 | 21.05 | 1000000 | 3.5850 | | 3.7483 | 21.22 | 1008000 | 3.5873 | | 3.7483 | 21.39 | 1016000 | 3.5799 | | 3.7523 | 21.56 | 1024000 | 3.5974 | | 3.7523 | 21.73 | 1032000 | 3.5790 | | 3.7458 | 21.89 | 1040000 | 3.5884 | | 3.7458 | 22.06 | 1048000 | 3.5904 | | 3.7498 | 22.23 | 1056000 | 3.5851 | | 3.7498 | 22.4 | 1064000 | 3.5776 | | 3.7496 | 22.57 | 1072000 | 3.5685 | | 3.7496 | 22.74 | 1080000 | 3.5731 | | 3.7395 | 22.9 | 1088000 | 3.5858 | | 3.7395 | 23.07 | 1096000 | 3.5931 | | 3.7466 | 23.24 | 1104000 | 3.5614 | | 3.7466 | 23.41 | 1112000 | 3.5456 | | 3.7503 | 23.58 | 1120000 | 3.5895 | | 3.7503 | 23.75 | 1128000 | 3.5608 | | 3.7484 | 23.92 | 1136000 | 3.5696 | | 3.7484 | 24.08 | 1144000 | 3.5653 | | 3.7435 | 24.25 | 1152000 | 3.5721 | | 3.7435 | 24.42 | 1160000 | 3.5510 | | 3.7348 | 24.59 | 1168000 | 3.5631 | | 3.7348 | 24.76 | 1176000 | 3.5727 | | 3.7341 | 24.93 | 1184000 | 3.5835 | | 3.7341 | 25.09 | 1192000 | 3.5766 | | 3.7435 | 25.26 | 1200000 | 3.5606 | | 3.7435 | 25.43 | 1208000 | 3.5497 | | 3.732 | 25.6 | 1216000 | 3.5433 | | 3.732 | 25.77 | 1224000 | 3.5420 | | 3.7343 | 25.94 | 1232000 | 3.5987 | | 3.7343 | 26.1 | 1240000 | 3.5956 | | 3.7336 | 26.27 | 1248000 | 3.5673 | | 3.7336 | 26.44 | 1256000 | 3.5643 | | 3.7444 | 26.61 | 1264000 | 3.5848 | | 3.7444 | 26.78 | 1272000 | 3.5693 | | 3.7395 | 26.95 | 1280000 | 3.5745 | | 3.7395 | 27.12 | 1288000 | 3.5758 | | 3.7389 | 27.28 | 1296000 | 3.5685 | | 3.7389 | 27.45 | 1304000 | 3.5712 | | 3.7416 | 27.62 | 1312000 | 3.5693 | | 3.7416 | 27.79 | 1320000 | 3.5740 | | 3.7305 | 27.96 | 1328000 | 
3.5803 | | 3.7305 | 28.13 | 1336000 | 3.5682 | | 3.7268 | 28.29 | 1344000 | 3.5928 | | 3.7268 | 28.46 | 1352000 | 3.5608 | | 3.7363 | 28.63 | 1360000 | 3.5587 | | 3.7363 | 28.8 | 1368000 | 3.5603 | | 3.7325 | 28.97 | 1376000 | 3.5711 | | 3.7325 | 29.14 | 1384000 | 3.5828 | | 3.7337 | 29.3 | 1392000 | 3.5790 | | 3.7337 | 29.47 | 1400000 | 3.5795 | | 3.7367 | 29.64 | 1408000 | 3.5528 | | 3.7367 | 29.81 | 1416000 | 3.5766 | | 3.7313 | 29.98 | 1424000 | 3.5610 | | 3.7313 | 30.15 | 1432000 | 3.5834 | | 3.7277 | 30.32 | 1440000 | 3.5546 | | 3.7277 | 30.48 | 1448000 | 3.5534 | | 3.7296 | 30.65 | 1456000 | 3.5646 | | 3.7296 | 30.82 | 1464000 | 3.5436 | | 3.7411 | 30.99 | 1472000 | 3.5778 | | 3.7411 | 31.16 | 1480000 | 3.5541 | | 3.7233 | 31.33 | 1488000 | 3.5720 | | 3.7233 | 31.49 | 1496000 | 3.5567 | | 3.7291 | 31.66 | 1504000 | 3.5477 | | 3.7291 | 31.83 | 1512000 | 3.5557 | | 3.7265 | 32.0 | 1520000 | 3.5643 | | 3.7265 | 32.17 | 1528000 | 3.5739 | | 3.7352 | 32.34 | 1536000 | 3.5628 | | 3.7352 | 32.5 | 1544000 | 3.5542 | | 3.7353 | 32.67 | 1552000 | 3.5496 | | 3.7353 | 32.84 | 1560000 | 3.5737 | | 3.7243 | 33.01 | 1568000 | 3.5788 | | 3.7243 | 33.18 | 1576000 | 3.5631 | | 3.7192 | 33.35 | 1584000 | 3.5438 | | 3.7192 | 33.52 | 1592000 | 3.5554 | | 3.7266 | 33.68 | 1600000 | 3.5748 | | 3.7266 | 33.85 | 1608000 | 3.5620 | | 3.73 | 34.02 | 1616000 | 3.5464 | | 3.73 | 34.19 | 1624000 | 3.5670 | | 3.7264 | 34.36 | 1632000 | 3.5626 | | 3.7264 | 34.53 | 1640000 | 3.5640 | | 3.7317 | 34.69 | 1648000 | 3.5650 | | 3.7317 | 34.86 | 1656000 | 3.5458 | | 3.7332 | 35.03 | 1664000 | 3.5567 | | 3.7332 | 35.2 | 1672000 | 3.5610 | | 3.7248 | 35.37 | 1680000 | 3.5650 | | 3.7248 | 35.54 | 1688000 | 3.5580 | | 3.7232 | 35.7 | 1696000 | 3.5829 | | 3.7232 | 35.87 | 1704000 | 3.5532 | | 3.729 | 36.04 | 1712000 | 3.5723 | | 3.729 | 36.21 | 1720000 | 3.5454 | | 3.7273 | 36.38 | 1728000 | 3.5623 | | 3.7273 | 36.55 | 1736000 | 3.5462 | | 3.7261 | 36.72 | 1744000 | 3.5743 | | 3.7261 | 36.88 | 1752000 | 3.5638 | | 3.7208 | 37.05 | 1760000 | 3.5519 | | 3.7208 | 37.22 | 1768000 | 3.5584 | | 3.7183 | 37.39 | 1776000 | 3.5308 | | 3.7183 | 37.56 | 1784000 | 3.5549 | | 3.7193 | 37.73 | 1792000 | 3.5409 | | 3.7193 | 37.89 | 1800000 | 3.5396 | | 3.7271 | 38.06 | 1808000 | 3.5536 | | 3.7271 | 38.23 | 1816000 | 3.5452 | | 3.7284 | 38.4 | 1824000 | 3.5582 | | 3.7284 | 38.57 | 1832000 | 3.5668 | | 3.714 | 38.74 | 1840000 | 3.5673 | | 3.714 | 38.9 | 1848000 | 3.5477 | | 3.7105 | 39.07 | 1856000 | 3.5662 | | 3.7105 | 39.24 | 1864000 | 3.5498 | | 3.7189 | 39.41 | 1872000 | 3.5493 | | 3.7189 | 39.58 | 1880000 | 3.5676 | | 3.7203 | 39.75 | 1888000 | 3.5640 | | 3.7203 | 39.91 | 1896000 | 3.5747 | | 3.7271 | 40.08 | 1904000 | 3.5592 | | 3.7271 | 40.25 | 1912000 | 3.5515 | | 3.7237 | 40.42 | 1920000 | 3.5704 | | 3.7237 | 40.59 | 1928000 | 3.5642 | | 3.723 | 40.76 | 1936000 | 3.5300 | | 3.723 | 40.93 | 1944000 | 3.5482 | | 3.7224 | 41.09 | 1952000 | 3.5586 | | 3.7224 | 41.26 | 1960000 | 3.5463 | | 3.715 | 41.43 | 1968000 | 3.5323 | | 3.715 | 41.6 | 1976000 | 3.5426 | | 3.7209 | 41.77 | 1984000 | 3.5513 | | 3.7209 | 41.94 | 1992000 | 3.5614 | | 3.7183 | 42.1 | 2000000 | 3.5678 | | 3.7183 | 42.27 | 2008000 | 3.5304 | | 3.7161 | 42.44 | 2016000 | 3.5631 | | 3.7161 | 42.61 | 2024000 | 3.5589 | | 3.7215 | 42.78 | 2032000 | 3.5639 | | 3.7215 | 42.95 | 2040000 | 3.5376 | | 3.7205 | 43.11 | 2048000 | 3.5478 | | 3.7205 | 43.28 | 2056000 | 3.5511 | | 3.7178 | 43.45 | 2064000 | 3.5285 | | 3.7178 | 43.62 | 2072000 | 3.5428 | | 3.7232 | 43.79 | 2080000 | 
3.5347 | | 3.7232 | 43.96 | 2088000 | 3.5501 | | 3.7167 | 44.13 | 2096000 | 3.5422 | | 3.7167 | 44.29 | 2104000 | 3.5487 | | 3.7253 | 44.46 | 2112000 | 3.5540 | | 3.7253 | 44.63 | 2120000 | 3.5432 | | 3.7139 | 44.8 | 2128000 | 3.5502 | | 3.7139 | 44.97 | 2136000 | 3.5450 | | 3.7194 | 45.14 | 2144000 | 3.5564 | | 3.7194 | 45.3 | 2152000 | 3.5441 | | 3.7167 | 45.47 | 2160000 | 3.5549 | | 3.7167 | 45.64 | 2168000 | 3.5429 | | 3.7202 | 45.81 | 2176000 | 3.5613 | | 3.7202 | 45.98 | 2184000 | 3.5469 | | 3.7193 | 46.15 | 2192000 | 3.5467 | | 3.7193 | 46.31 | 2200000 | 3.5493 | | 3.717 | 46.48 | 2208000 | 3.5652 | | 3.717 | 46.65 | 2216000 | 3.5669 | | 3.7164 | 46.82 | 2224000 | 3.5755 | | 3.7164 | 46.99 | 2232000 | 3.5580 | | 3.715 | 47.16 | 2240000 | 3.5403 | | 3.715 | 47.33 | 2248000 | 3.5521 | | 3.7091 | 47.49 | 2256000 | 3.5604 | | 3.7091 | 47.66 | 2264000 | 3.5401 | | 3.7199 | 47.83 | 2272000 | 3.5408 | | 3.7199 | 48.0 | 2280000 | 3.5509 | | 3.7238 | 48.17 | 2288000 | 3.5348 | | 3.7238 | 48.34 | 2296000 | 3.5530 | | 3.7193 | 48.5 | 2304000 | 3.5447 | | 3.7193 | 48.67 | 2312000 | 3.5453 | | 3.7195 | 48.84 | 2320000 | 3.5487 | | 3.7195 | 49.01 | 2328000 | 3.5357 | | 3.7187 | 49.18 | 2336000 | 3.5404 | | 3.7187 | 49.35 | 2344000 | 3.5247 | | 3.7157 | 49.51 | 2352000 | 3.5557 | | 3.7157 | 49.68 | 2360000 | 3.5532 | | 3.7144 | 49.85 | 2368000 | 3.5453 | | 3.7144 | 50.02 | 2376000 | 3.5421 | | 3.715 | 50.19 | 2384000 | 3.5183 | | 3.715 | 50.36 | 2392000 | 3.5473 | | 3.7208 | 50.53 | 2400000 | 3.5386 | ### Framework versions - Transformers 4.35.0.dev0 - Pytorch 2.0.1+cu117 - Datasets 2.14.5 - Tokenizers 0.14.0
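The hyperparameters listed in the card above map fairly directly onto the transformers Trainer API. The following is only an illustrative sketch: the training and evaluation datasets are placeholders (the card calls the data an "unknown dataset"), and the 8000-step evaluation cadence is inferred from the results table rather than stated.

```python
# Hedged reconstruction of the training setup described in the card above.
# Dataset objects are placeholders; eval_steps is inferred, not stated.
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base_id = "cardiffnlp/twitter-roberta-base-2019-90m"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForMaskedLM.from_pretrained(base_id)

args = TrainingArguments(
    output_dir="2020-Q2-90p-filtered",
    learning_rate=4.1e-7,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=2_400_000,
    evaluation_strategy="steps",  # assumed: the results table reports a loss every 8000 steps
    eval_steps=8_000,
)

train_dataset = eval_dataset = None  # placeholders: the card does not document the data

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer),
)
# trainer.train()  # would require real datasets
```

The Adam betas and epsilon from the card pass through `adam_beta1`, `adam_beta2`, and `adam_epsilon`, which correspond to the Trainer's default AdamW optimizer settings.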
{"license": "mit", "tags": ["generated_from_trainer"], "base_model": "cardiffnlp/twitter-roberta-base-2019-90m", "model-index": [{"name": "2020-Q2-90p-filtered", "results": []}]}
fill-mask
DouglasPontes/2020-Q2-90p-filtered-random
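Given the fill-mask pipeline tag on this record, a short usage sketch (assuming the checkpoint is available under the id shown and uses RoBERTa's `<mask>` token) would be:

```python
# Hedged usage sketch for the fill-mask checkpoint in this record.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="DouglasPontes/2020-Q2-90p-filtered-random")
print(fill_mask("The weather in Q2 2020 was <mask>."))
```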
[ "transformers", "pytorch", "roberta", "fill-mask", "generated_from_trainer", "base_model:cardiffnlp/twitter-roberta-base-2019-90m", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T19:08:26+00:00
[]
[]
TAGS #transformers #pytorch #roberta #fill-mask #generated_from_trainer #base_model-cardiffnlp/twitter-roberta-base-2019-90m #license-mit #autotrain_compatible #endpoints_compatible #region-us
2020-Q2-90p-filtered ==================== This model is a fine-tuned version of cardiffnlp/twitter-roberta-base-2019-90m on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 3.5439 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 4.1e-07 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08 * lr\_scheduler\_type: linear * training\_steps: 2400000 ### Training results ### Framework versions * Transformers 4.35.0.dev0 * Pytorch 2.0.1+cu117 * Datasets 2.14.5 * Tokenizers 0.14.0
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4.1e-07\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 2400000", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0.dev0\n* Pytorch 2.0.1+cu117\n* Datasets 2.14.5\n* Tokenizers 0.14.0" ]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #generated_from_trainer #base_model-cardiffnlp/twitter-roberta-base-2019-90m #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4.1e-07\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 2400000", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0.dev0\n* Pytorch 2.0.1+cu117\n* Datasets 2.14.5\n* Tokenizers 0.14.0" ]
[ 68, 99, 4, 36 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #generated_from_trainer #base_model-cardiffnlp/twitter-roberta-base-2019-90m #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4.1e-07\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 2400000### Training results### Framework versions\n\n\n* Transformers 4.35.0.dev0\n* Pytorch 2.0.1+cu117\n* Datasets 2.14.5\n* Tokenizers 0.14.0" ]
[ -0.11636469513177872, 0.0374920628964901, -0.0023877108469605446, 0.12173619121313095, 0.14796839654445648, 0.02338266558945179, 0.11255091428756714, 0.12022665143013, -0.09211784601211548, 0.03922338783740997, 0.1639150083065033, 0.14658480882644653, 0.0033937087282538414, 0.17548437416553497, -0.02247615158557892, -0.26358968019485474, -0.01664688065648079, 0.035683341324329376, -0.05968266725540161, 0.14450877904891968, 0.11273184418678284, -0.13672815263271332, 0.0918685793876648, 0.010280945338308811, -0.21466630697250366, 0.0038081014063209295, 0.018149210140109062, -0.05080239102244377, 0.1505923867225647, 0.004567424766719341, 0.12552379071712494, 0.004073940683156252, 0.0838128924369812, -0.1429458111524582, 0.019442562013864517, 0.04116683825850487, -0.0029734019190073013, 0.08862502872943878, 0.0004030554264318198, -0.015087228268384933, 0.10711544752120972, -0.0814312994480133, 0.061319515109062195, 0.019665943458676338, -0.15593968331813812, -0.2219081073999405, -0.09022660553455353, 0.050376053899526596, 0.08121716976165771, 0.09755876660346985, -0.00973249040544033, 0.15057028830051422, -0.0881449356675148, 0.08784180879592896, 0.2319309264421463, -0.31431788206100464, -0.07873780280351639, 0.02697363682091236, 0.04352298378944397, 0.034689318388700485, -0.09762581437826157, -0.01737610436975956, 0.048357389867305756, 0.06513139605522156, 0.14362633228302002, -0.024981018155813217, -0.03159512206912041, 0.004705184604972601, -0.1219104528427124, -0.0506431981921196, 0.11263933777809143, 0.040749065577983856, -0.05255407094955444, -0.039711225777864456, -0.04044175520539284, -0.15751759707927704, -0.050459787249565125, -0.002335791476070881, 0.033565983176231384, -0.03593877702951431, -0.12841320037841797, -0.0053458139300346375, -0.10073855519294739, -0.06596270203590393, -0.0735229030251503, 0.1745215356349945, 0.025760894641280174, 0.015349007211625576, -0.02485809102654457, 0.11582809686660767, -0.05828782916069031, -0.14290516078472137, 0.010535375215113163, 0.02620396576821804, 0.015447981655597687, -0.047801289707422256, -0.07097576558589935, -0.05202902853488922, -0.0037263210397213697, 0.1584612876176834, -0.03790978342294693, 0.02293865568935871, 0.07018900662660599, 0.05196629464626312, -0.09062322229146957, 0.17084267735481262, -0.058701369911432266, -0.032304517924785614, 0.029544593766331673, 0.055084001272916794, 0.021180301904678345, -0.0009259246289730072, -0.12563183903694153, -0.006908819545060396, 0.059654101729393005, -0.0008845959673635662, -0.051965758204460144, 0.0789761170744896, -0.060391053557395935, -0.0066838390193879604, 0.046774156391620636, -0.0753282830119133, 0.017761239781975746, -0.028540654107928276, -0.08167579025030136, -0.06312546879053116, 0.0301993228495121, 0.021033875644207, 0.01082602795213461, 0.12210622429847717, -0.10136006772518158, 0.019379982724785805, -0.09066049009561539, -0.10690348595380783, 0.0029348854441195726, -0.10736742615699768, 0.027361463755369186, -0.10937206447124481, -0.20380103588104248, -0.006655615288764238, 0.054365698248147964, -0.029563115909695625, -0.06804480403661728, -0.045585110783576965, -0.07483857125043869, 0.01252047624439001, -0.0056431968696415424, 0.12448962777853012, -0.057544611394405365, 0.12433075904846191, 0.05509607866406441, 0.08825674653053284, -0.07439932227134705, 0.044877585023641586, -0.11925310641527176, 0.016419224441051483, -0.22594474256038666, 0.03434821590781212, -0.026216691359877586, 0.09802015870809555, -0.07936249673366547, -0.11456464976072311, -0.008654279634356499, 
-0.004868588410317898, 0.09702347218990326, 0.09477343410253525, -0.16114741563796997, -0.07817427814006805, 0.16643677651882172, -0.05310633406043053, -0.11078514158725739, 0.1373467594385147, -0.07689767330884933, 0.08091967552900314, 0.06504141539335251, 0.14499609172344208, 0.05635026469826698, -0.12637250125408173, 0.039180390536785126, -0.03698894381523132, 0.013756299391388893, -0.057034458965063095, 0.061869505792856216, 0.0025312495417892933, 0.03044968843460083, 0.028503447771072388, -0.03767721727490425, 0.059171661734580994, -0.10643116384744644, -0.0896122008562088, -0.03163863718509674, -0.10929498076438904, 0.08192756026983261, 0.05683461204171181, 0.07456602901220322, -0.13767988979816437, -0.10324884951114655, 0.003254322102293372, 0.07088818401098251, -0.031025925651192665, 0.017141573131084442, -0.08619705587625504, 0.10093064606189728, -0.04435229301452637, -0.033522896468639374, -0.1587762087583542, 0.001631816034205258, 0.001737403217703104, 0.0577983520925045, 0.03398424759507179, -0.026760801672935486, 0.08218883723020554, 0.05879199132323265, -0.057375114411115646, -0.007077888585627079, -0.05794805660843849, -0.010350391268730164, -0.12779764831066132, -0.17537999153137207, -0.04812363162636757, -0.03089975193142891, 0.1273757964372635, -0.18899214267730713, 0.039461929351091385, -0.04491544887423515, 0.09583074599504471, 0.005133313592523336, -0.018031045794487, -0.05125604569911957, 0.0808260515332222, -0.017515700310468674, -0.047184091061353683, 0.0717918649315834, 0.0173241775482893, -0.08020085841417313, -0.023717334493994713, -0.08999843895435333, 0.17535297572612762, 0.139180988073349, -0.11728759109973907, -0.10518643260002136, 0.025046173483133316, -0.06156634911894798, -0.020441671833395958, -0.038855019956827164, 0.027499869465827942, 0.18188625574111938, -0.010554071515798569, 0.14603640139102936, -0.07854558527469635, -0.04703349247574806, 0.04076415300369263, -0.04761236906051636, 0.034957367926836014, 0.09937797486782074, 0.08881406486034393, -0.09273497760295868, 0.13256414234638214, 0.13492648303508759, -0.07676944881677628, 0.1321885734796524, -0.012806870974600315, -0.05649830773472786, -0.04333826154470444, -0.036332935094833374, -0.005074921529740095, 0.13493695855140686, -0.11170928180217743, -0.014940210618078709, 0.01450180634856224, -0.003345699980854988, 0.007029504980891943, -0.2144838571548462, -0.06668828427791595, 0.04255779832601547, -0.0348735935986042, -0.07911410182714462, 0.014676098711788654, 0.007047128863632679, 0.10229170322418213, 0.03381732851266861, -0.07429873198270798, 0.0476100780069828, 0.014549537561833858, -0.06845558434724808, 0.19828978180885315, -0.07529694586992264, -0.1295512318611145, -0.1114891842007637, -0.08988355100154877, -0.0069991182535886765, 0.024816177785396576, 0.06006162241101265, -0.0781957283616066, -0.01500639971345663, -0.05475758761167526, -0.0021008122712373734, -0.0034842644818127155, 0.027900902554392815, 0.0049751014448702335, -0.011644446291029453, 0.059682272374629974, -0.08582266420125961, -0.01982131041586399, -0.055097825825214386, -0.047233860939741135, 0.057934727519750595, 0.0165674090385437, 0.11828717589378357, 0.11750439554452896, -0.04461250454187393, 0.015190690755844116, -0.042692895978689194, 0.26415297389030457, -0.08372768759727478, -0.022110167890787125, 0.13176052272319794, 0.009884105063974857, 0.0656794086098671, 0.12050881236791611, 0.06388964504003525, -0.09267722815275192, 0.0031207986176013947, 0.003345428965985775, -0.051012661308050156, -0.17409420013427734, 
-0.04238920658826828, -0.054321181029081345, -0.025163274258375168, 0.10662292689085007, 0.016036605462431908, 0.05577115714550018, 0.0792318657040596, 0.03635884076356888, 0.06711331009864807, -0.057047512382268906, 0.07675088942050934, 0.09177298098802567, 0.05252106860280037, 0.12977610528469086, -0.04039955884218216, -0.06674373894929886, 0.02489824779331684, -0.016124164685606956, 0.2027513086795807, 0.002165405545383692, 0.09112972021102905, 0.062108904123306274, 0.20455053448677063, 0.006615753751248121, 0.07485414296388626, -0.009003439918160439, -0.060017429292201996, -0.01396053284406662, -0.03418601304292679, -0.04107256606221199, 0.015065377578139305, -0.025263935327529907, 0.05800960212945938, -0.13120637834072113, -0.01839127019047737, 0.04278950393199921, 0.28033116459846497, 0.042344603687524796, -0.3657715618610382, -0.12404582649469376, -0.02066134661436081, -0.033675745129585266, -0.017681455239653587, 0.007399213965982199, 0.09694540500640869, -0.10000387579202652, 0.02641502022743225, -0.054574254900217056, 0.08694847673177719, 0.016804805025458336, 0.04780493676662445, 0.0697101354598999, 0.09725406765937805, -0.01216763537377119, 0.06312695890665054, -0.2940155267715454, 0.31647616624832153, -0.0009943152545019984, 0.09724273532629013, -0.06511767208576202, -0.024363387376070023, 0.02416185289621353, 0.054977014660835266, 0.09279108792543411, -0.006570268422365189, -0.025975694879889488, -0.21089719235897064, -0.03879842162132263, 0.016347307711839676, 0.08493711054325104, -0.036546893417835236, 0.11168045550584793, -0.009268410503864288, 0.0028257095254957676, 0.05501193553209305, 0.015079417265951633, -0.045221760869026184, -0.06933906674385071, -0.018515978008508682, 0.017475184053182602, -0.03990117833018303, -0.06684622913599014, -0.10587241500616074, -0.09092167019844055, 0.13543905317783356, 0.007037903647869825, -0.036778565496206284, -0.10973585397005081, 0.0800745040178299, 0.07146449387073517, -0.0825556069612503, 0.06106291711330414, 0.00971591379493475, 0.060079336166381836, -0.005540413316339254, -0.0428687147796154, 0.11713667958974838, -0.07013958692550659, -0.1631205976009369, -0.0789978876709938, 0.12581829726696014, 0.049969837069511414, 0.06756478548049927, 0.011181419715285301, 0.03562912717461586, -0.05609413608908653, -0.07024767994880676, 0.054499123245477676, -0.07575123012065887, 0.08078634738922119, -0.000013797905012324918, -0.011060550808906555, 0.036700911819934845, -0.06474920362234116, -0.014054334722459316, 0.16157488524913788, 0.2744290232658386, -0.11685330420732498, 0.020597700029611588, 0.027737583965063095, -0.03814752399921417, -0.18047486245632172, 0.030613187700510025, 0.05447310954332352, 0.03505166992545128, 0.03385462984442711, -0.15659597516059875, 0.07936949282884598, 0.08849282562732697, -0.014271462336182594, 0.10761528462171555, -0.2845722436904907, -0.12780754268169403, 0.09720969200134277, 0.12488920241594315, 0.18597500026226044, -0.12011361867189407, -0.012863416224718094, -0.02416853979229927, -0.16028177738189697, 0.07374859601259232, -0.050365664064884186, 0.12463956326246262, -0.03242744132876396, 0.14493845403194427, 0.006038660649210215, -0.06000911816954613, 0.10654549300670624, 0.002918300684541464, 0.09684165567159653, -0.07049502432346344, -0.044366296380758286, 0.07230136543512344, -0.03163786977529526, 0.007222177926450968, -0.06879955530166626, 0.0318802073597908, -0.09698976576328278, -0.005893247202038765, -0.10099371522665024, 0.04818352684378624, -0.03529456630349159, -0.05497775599360466, 
-0.04066504165530205, 0.041874807327985764, 0.034345678985118866, -0.011534261517226696, 0.08779705315828323, 0.0036127613857388496, 0.18238535523414612, 0.06422969698905945, 0.0681498795747757, -0.060590941458940506, -0.04806721210479736, 0.0032605554442852736, -0.04131939262151718, 0.05792964994907379, -0.1571376919746399, 0.004298022016882896, 0.12349225580692291, 0.046012453734874725, 0.1142355278134346, 0.07534758001565933, -0.03525993227958679, 0.032922305166721344, 0.08120465278625488, -0.1691618710756302, -0.04766615852713585, 0.005346063058823347, -0.06258609890937805, -0.09700419008731842, 0.03620229288935661, 0.08775260299444199, -0.08411271125078201, -0.027146248146891594, -0.018699556589126587, -0.006708715111017227, -0.08311883360147476, 0.187168151140213, 0.06968231499195099, 0.04853147640824318, -0.09357310086488724, 0.040729764848947525, 0.03684013709425926, -0.04389893263578415, 0.006277688313275576, 0.062138937413692474, -0.08489940315485, -0.03288326412439346, 0.02690998837351799, 0.13731057941913605, -0.05791594460606575, -0.01256565935909748, -0.15227775275707245, -0.10659994184970856, 0.07519809901714325, 0.20885814726352692, 0.09618151187896729, -0.006214609369635582, -0.041639294475317, 0.02033732831478119, -0.12453042715787888, 0.06925833970308304, 0.05967395380139351, 0.0688871294260025, -0.1083441898226738, 0.19055825471878052, -0.013527859933674335, 0.05939541012048721, -0.02676212787628174, 0.026311282068490982, -0.09038709104061127, 0.023527588695287704, -0.08699867874383926, -0.05154218524694443, -0.030577749013900757, -0.01549763698130846, -0.021115189418196678, -0.06857579946517944, -0.05877121910452843, 0.009192686527967453, -0.11612674593925476, -0.02119511365890503, 0.0523981899023056, 0.021219074726104736, -0.10407398641109467, -0.036050450056791306, 0.04487835243344307, -0.052383508533239365, 0.07380423694849014, 0.0707024484872818, 0.01897057332098484, 0.03374842181801796, -0.14057950675487518, -0.003543603466823697, 0.030744867399334908, -0.016701864078640938, 0.07003219425678253, -0.07729381322860718, -0.007840093225240707, -0.015596565790474415, 0.0641297921538353, 0.02688620239496231, 0.0767514780163765, -0.13717129826545715, 0.026588764041662216, 0.03386104851961136, -0.0729464516043663, -0.06725138425827026, 0.011749588884413242, 0.07562889903783798, 0.013136153109371662, 0.18795308470726013, -0.09292003512382507, 0.04969516023993492, -0.2099715620279312, -0.011526528745889664, -0.014954622834920883, -0.10270190238952637, -0.1176641583442688, -0.0524163618683815, 0.06836669147014618, -0.050556622445583344, 0.13502374291419983, 0.01635655015707016, 0.01748725026845932, 0.0346592515707016, -0.013550053350627422, 0.010771659202873707, 0.004667272791266441, 0.19480732083320618, 0.021919209510087967, -0.0547507144510746, 0.05741841346025467, 0.06297776848077774, 0.07832437753677368, 0.08861824870109558, 0.17625251412391663, 0.15642885863780975, 0.08251985162496567, 0.0851081982254982, 0.05402268096804619, -0.00923155713826418, -0.14841926097869873, 0.00021421666315291077, -0.015184752643108368, 0.07057009637355804, -0.026930656284093857, 0.19406788051128387, 0.11383974552154541, -0.16837719082832336, 0.04628028720617294, -0.05136144161224365, -0.06954070180654526, -0.0899176374077797, -0.07708035409450531, -0.07212530076503754, -0.14124315977096558, 0.02501044236123562, -0.09593784809112549, 0.023166228085756302, 0.10661362111568451, -0.004391107242554426, -0.0319465808570385, 0.1493789702653885, 0.003187549766153097, 0.03528589382767677, 
0.07537847012281418, -0.0057254983112216, -0.010865243151783943, -0.07236230373382568, -0.06594471633434296, -0.016836507245898247, -0.030564645305275917, 0.03706028312444687, -0.06150184944272041, -0.06465770304203033, 0.02782965637743473, -0.014127294532954693, -0.11013343185186386, 0.01038474589586258, 0.04189646244049072, 0.07341721653938293, 0.030246274545788765, 0.0027928885538131, 0.03353730961680412, -0.016185306012630463, 0.21438531577587128, -0.06773596256971359, -0.07824812084436417, -0.107664555311203, 0.24079535901546478, 0.040728528052568436, -0.027505388483405113, 0.021918274462223053, -0.06023706495761871, 0.02594738081097603, 0.24914921820163727, 0.21692071855068207, -0.08771921694278717, 0.008779069408774376, -0.005206046625971794, -0.014669789932668209, -0.01751120388507843, 0.09362853318452835, 0.12091982364654541, 0.032388295978307724, -0.09212638437747955, -0.04667915031313896, -0.07887230813503265, -0.006397313438355923, -0.041562605649232864, 0.02823956124484539, 0.02667245641350746, 0.004400291014462709, -0.054121650755405426, 0.06044403463602066, -0.05386604741215706, -0.10161062330007553, 0.07609890401363373, -0.21821758151054382, -0.1488495022058487, -0.0006759882089681923, 0.04240753874182701, 0.033778116106987, 0.08799058198928833, -0.026354214176535606, 0.009021082893013954, 0.07696053385734558, -0.02314845286309719, -0.06022382900118828, -0.09946729242801666, 0.10753278434276581, -0.09524912387132645, 0.21753434836864471, -0.055901382118463516, 0.06798763573169708, 0.12801583111286163, 0.07098929584026337, -0.0702962726354599, 0.0616007000207901, 0.04365628957748413, -0.04051509499549866, 0.013136353343725204, 0.09869316220283508, -0.017540717497467995, 0.05255104601383209, 0.04625944793224335, -0.13708725571632385, 0.038751859217882156, -0.08015932142734528, -0.03945973142981529, -0.05480131134390831, -0.009387045167386532, -0.04417979717254639, 0.13154985010623932, 0.21742680668830872, -0.040093984454870224, -0.006217358633875847, -0.07215675711631775, 0.005226569715887308, 0.08751948922872543, 0.0068515995517373085, -0.09614334255456924, -0.22692422568798065, 0.0051860143430531025, 0.0874495729804039, -0.037651050835847855, -0.27700111269950867, -0.09169138967990875, -0.011394159868359566, -0.07691892236471176, -0.05618215724825859, 0.09220390766859055, 0.0835980474948883, 0.05755927041172981, -0.048898518085479736, -0.05679700896143913, -0.06490446627140045, 0.16352665424346924, -0.1446819007396698, -0.09425539523363113 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
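The card above leaves its "How to Get Started" section empty. As a hedged illustration only, a generic transformers loading sketch for the repository id recorded in this row (avemio-digital/unsloth_mistral_4bit_adapter_max200) might look like the following; the repo name suggests a 4-bit LoRA adapter, so if it holds only adapter weights it would instead need to be attached to its Mistral base model via peft rather than loaded directly.

```python
# Hedged sketch only: generic transformers loading for the repo recorded in this row.
# If the repository contains just a LoRA adapter (as the name suggests), replace the
# AutoModelForCausalLM call with peft.PeftModel.from_pretrained(base_model, repo_id).
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "avemio-digital/unsloth_mistral_4bit_adapter_max200"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```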
{"library_name": "transformers", "tags": []}
null
avemio-digital/unsloth_mistral_4bit_adapter_max200
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T19:10:34+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 31, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.06646376848220825, 0.2168014943599701, -0.00225935154594481, 0.023818302899599075, 0.1271018385887146, -0.001635765191167593, 0.04218708351254463, 0.13324736058712006, -0.020175931975245476, 0.11144465953111649, 0.046588581055402756, 0.09377603232860565, 0.09928803145885468, 0.18404334783554077, 0.04859916493296623, -0.2059975117444992, 0.007056170143187046, -0.09090408682823181, 0.014076028019189835, 0.1116579994559288, 0.13719257712364197, -0.10291384905576706, 0.08272874355316162, -0.04045208916068077, -0.02019004337489605, 0.00012576708104461432, -0.09259183704853058, -0.07032395154237747, 0.06885425746440887, 0.06264153122901917, 0.051234472543001175, 0.001456156256608665, 0.09140396863222122, -0.2864592671394348, 0.017265573143959045, 0.08406311273574829, 0.0027674848679453135, 0.06290827691555023, 0.07236549258232117, -0.07389893382787704, 0.11328595131635666, -0.08021481335163116, 0.13019037246704102, 0.08625296503305435, -0.062064990401268005, -0.23071379959583282, -0.07525765895843506, 0.0963398814201355, 0.12251301854848862, 0.06215599179267883, -0.022921854630112648, 0.15455181896686554, -0.06248689442873001, 0.012971068732440472, 0.1294165402650833, -0.11526761949062347, -0.05572471022605896, 0.061741601675748825, 0.11775490641593933, 0.10740239918231964, -0.14110268652439117, -0.0017287094378843904, 0.04900608956813812, 0.029121357947587967, 0.08589313924312592, 0.022661056369543076, 0.12003941088914871, 0.04652795568108559, -0.13695219159126282, -0.04037507623434067, 0.12011898308992386, 0.038862764835357666, -0.06446044892072678, -0.2168138176202774, -0.006778308190405369, -0.0601806715130806, -0.014732478186488152, -0.07019448280334473, 0.039128515869379044, -0.02470310963690281, 0.07317749410867691, -0.04465159401297569, -0.1063927412033081, -0.0421026237308979, 0.0892222449183464, 0.07748593389987946, 0.011527054943144321, -0.02519804798066616, 0.04627908393740654, 0.13455867767333984, 0.05402068421244621, -0.10399353504180908, -0.07017925381660461, -0.06942764669656754, -0.09420394152402878, -0.04035796597599983, 0.056760527193546295, 0.031942449510097504, 0.02665667235851288, 0.22703726589679718, 0.016653569415211678, 0.04155244305729866, 0.0224777739495039, 0.01032855175435543, 0.043662428855895996, 0.0955500528216362, -0.05303520709276199, -0.15660029649734497, -0.04072032496333122, 0.09077946096658707, -0.0027527001220732927, -0.036689214408397675, -0.03966725245118141, 0.03849169611930847, 0.06843466311693192, 0.13122352957725525, 0.07552056759595871, -0.017929591238498688, -0.04813180863857269, -0.030096933245658875, 0.23523783683776855, -0.1493375599384308, 0.04426715523004532, -0.02271856553852558, -0.01804111897945404, -0.03908449783921242, 0.03597262129187584, 0.022118929773569107, -0.000004518366949923802, 0.09706240892410278, -0.058981191366910934, -0.05378659814596176, -0.10168042778968811, -0.03272576630115509, 0.04088849574327469, -0.013975566253066063, -0.010589460842311382, -0.09025166928768158, -0.09490354359149933, -0.04766594246029854, 0.05537205561995506, -0.05123869329690933, -0.03770573064684868, 0.009465423412621021, -0.08151785284280777, -0.005444355774670839, -0.005417742300778627, 0.10699385404586792, -0.03222226724028587, 0.04445803165435791, -0.027600755915045738, 0.05225523188710213, 0.09919606149196625, 0.031576547771692276, -0.0773419588804245, 0.0561848059296608, -0.22559374570846558, 0.07503069192171097, -0.11481974273920059, 0.04335082694888115, -0.1704932004213333, -0.042439818382263184, 0.005444696638733149, 0.0139949731528759, 
0.013206101022660732, 0.12720820307731628, -0.19255615770816803, -0.01654396951198578, 0.13260798156261444, -0.09212633967399597, -0.118110790848732, 0.07884611934423447, -0.029701577499508858, 0.1624738723039627, 0.04682036489248276, -0.027025915682315826, 0.09224298596382141, -0.16434773802757263, -0.07092688232660294, -0.00949116237461567, -0.01727987825870514, 0.12109188735485077, 0.07512219995260239, -0.05991523340344429, 0.046571120619773865, 0.02832140028476715, -0.038078423589468, -0.04424772411584854, -0.050857074558734894, -0.10884185880422592, -0.01070026308298111, -0.08987759798765182, 0.04065500199794769, -0.01250192429870367, -0.07916021347045898, -0.029885273426771164, -0.18612512946128845, -0.0030564051121473312, 0.10038342326879501, 0.0035033065360039473, -0.005652366206049919, -0.08666291832923889, 0.026358824223279953, -0.03112892620265484, -0.008404186926782131, -0.16764774918556213, -0.04399421438574791, 0.046902090311050415, -0.16094985604286194, 0.020117372274398804, -0.06413903087377548, 0.06334125250577927, 0.03641495108604431, -0.05590536445379257, -0.0248766727745533, -0.01730942726135254, 0.011945613659918308, -0.05083848536014557, -0.18994836509227753, -0.056277405470609665, -0.037882111966609955, 0.149809330701828, -0.25956398248672485, 0.032966937869787216, 0.051140617579221725, 0.14649195969104767, 0.00406361510977149, -0.05115427449345589, 0.01429014839231968, -0.05360214412212372, -0.054652128368616104, -0.06746816635131836, -0.006135428790003061, -0.027576493099331856, -0.05147203803062439, 0.019243421033024788, -0.1755700707435608, -0.021410830318927765, 0.09424154460430145, 0.12876708805561066, -0.1486445665359497, -0.018640631809830666, -0.048725154250860214, -0.06339836865663528, -0.0715010017156601, -0.07038594037294388, 0.10712739825248718, 0.0513901449739933, 0.04796046018600464, -0.07435787469148636, -0.07092321664094925, 0.02726263552904129, 0.006906150374561548, -0.03382374346256256, 0.08727246522903442, 0.05199531093239784, -0.09209315478801727, 0.0756213590502739, 0.1092359870672226, 0.07177663594484329, 0.09363535046577454, 0.01574566215276718, -0.11756632477045059, -0.028492970392107964, 0.036266472190618515, 0.02740776725113392, 0.1465986967086792, -0.05952361226081848, 0.04016614332795143, 0.04494241625070572, -0.04170418903231621, 0.022319864481687546, -0.08787637203931808, 0.024075502529740334, 0.025203049182891846, -0.0034381982404738665, 0.06284574419260025, -0.02525499276816845, -0.0050758360885083675, 0.07016654312610626, 0.047779910266399384, 0.04621000960469246, 0.009655474685132504, -0.01720241829752922, -0.1047825813293457, 0.16950392723083496, -0.0951867327094078, -0.269941508769989, -0.17632324993610382, 0.026197833940386772, 0.04035249724984169, -0.022378476336598396, 0.031619444489479065, -0.07056326419115067, -0.10630585998296738, -0.1060405746102333, -0.002429972169920802, 0.01714223250746727, -0.06364088505506516, -0.0741225928068161, 0.07348573952913284, 0.04382912442088127, -0.14902326464653015, 0.038552410900592804, 0.055694397538900375, -0.057955220341682434, -0.0233661737293005, 0.09118817001581192, 0.12397737801074982, 0.14583967626094818, -0.021366750821471214, -0.028626007959246635, 0.029004426673054695, 0.19620531797409058, -0.13469526171684265, 0.10371150821447372, 0.13814030587673187, -0.04545360431075096, 0.08360563963651657, 0.1560150384902954, 0.029186224564909935, -0.08317049592733383, 0.05044832453131676, 0.04082648828625679, -0.043159641325473785, -0.2666129767894745, -0.0534592866897583, 
0.012832709588110447, -0.06255637854337692, 0.09786593168973923, 0.10183793306350708, 0.11542957276105881, 0.034910861402750015, -0.07166364789009094, -0.043925940990448, -0.0058974819257855415, 0.11737963557243347, -0.05490213260054588, -0.012639665976166725, 0.07686592638492584, -0.05086168646812439, 0.005355054512619972, 0.10266812145709991, 0.02973790094256401, 0.17442677915096283, 0.020399179309606552, 0.11231429129838943, 0.06195578724145889, 0.08633565157651901, 0.0007386076031252742, 0.02951662428677082, 0.05147615820169449, 0.017203815281391144, -0.002300140680745244, -0.10421168059110641, -0.006156572140753269, 0.1449710875749588, 0.028103826567530632, 0.029669636860489845, -0.0018948549404740334, -0.005003341939300299, 0.05121048167347908, 0.1746254414319992, -0.011592294089496136, -0.22072425484657288, -0.0845772922039032, 0.06936841458082199, -0.06218599155545235, -0.12968985736370087, -0.026130788028240204, 0.045467354357242584, -0.17519839107990265, 0.026703642681241035, -0.027433741837739944, 0.0919293761253357, -0.09345759451389313, -0.02221956104040146, 0.03687324374914169, 0.084866963326931, -0.014529162086546421, 0.08703910559415817, -0.14498743414878845, 0.11886418610811234, 0.02978132851421833, 0.09024628251791, -0.11081171780824661, 0.07909037172794342, -0.007550720125436783, 0.009180475026369095, 0.19379350543022156, -0.011335089802742004, -0.03514958545565605, -0.08774717897176743, -0.11210042238235474, -0.013537433929741383, 0.12687496840953827, -0.1243172138929367, 0.08773399889469147, -0.015198243781924248, -0.044079482555389404, 0.00937260314822197, -0.12100647389888763, -0.17273177206516266, -0.19628387689590454, 0.05585884302854538, -0.09575839340686798, 0.025643249973654747, -0.11914430558681488, -0.07089093327522278, -0.02952558360993862, 0.241120383143425, -0.1745356321334839, -0.06510113179683685, -0.1468164622783661, -0.046294767409563065, 0.1662203073501587, -0.04437198117375374, 0.0718095526099205, -0.0208172257989645, 0.20345525443553925, 0.005988610442727804, -0.004939318168908358, 0.06724198162555695, -0.08892562240362167, -0.16873881220817566, -0.06771010160446167, 0.1510489284992218, 0.11680185794830322, 0.04907919466495514, -0.002248800592496991, 0.0011772146681323647, -0.016943959519267082, -0.1137804463505745, -0.0033210667315870523, 0.16037839651107788, 0.03878779336810112, 0.025986969470977783, -0.05243593826889992, -0.08797456324100494, -0.06899320334196091, -0.06853509694337845, 0.06221301481127739, 0.19590823352336884, -0.10376439243555069, 0.1700313836336136, 0.147536963224411, -0.07305635511875153, -0.23175598680973053, 0.035342130810022354, 0.04983805492520332, 0.0014306638622656465, 0.04886869341135025, -0.18252557516098022, 0.10521943867206573, 0.019543392583727837, -0.05505957826972008, 0.13485197722911835, -0.1557481735944748, -0.1552847921848297, 0.0722852572798729, 0.03904085233807564, -0.22423844039440155, -0.1354004591703415, -0.09622503817081451, -0.05825018882751465, -0.14065024256706238, 0.06054598465561867, -0.002136280992999673, 0.015948504209518433, 0.03500790148973465, -0.0015643214574083686, 0.027123261243104935, -0.058935679495334625, 0.18609118461608887, -0.004065449349582195, 0.020676052197813988, -0.060264769941568375, -0.0478842556476593, 0.09839435666799545, -0.06130504235625267, 0.12208222597837448, 0.004057085141539574, 0.01594383642077446, -0.10362856835126877, -0.048314861953258514, -0.04328322783112526, 0.05154227837920189, -0.07548051327466965, -0.10070807486772537, -0.043625857681035995, 0.08841723203659058, 
0.07005169242620468, -0.03383097052574158, 0.00549331633374095, -0.07189501076936722, 0.10019614547491074, 0.17795267701148987, 0.17573626339435577, 0.009926567785441875, -0.07241068035364151, 0.01677953451871872, -0.04142116755247116, 0.044231921434402466, -0.2513144314289093, 0.03756171092391014, 0.06098250672221184, 0.029438555240631104, 0.09217222779989243, -0.020435843616724014, -0.1820858269929886, -0.04050002992153168, 0.08094815909862518, -0.05452597141265869, -0.22617179155349731, -0.019085140898823738, 0.0954197570681572, -0.2020406424999237, -0.007372708059847355, 0.03995226323604584, -0.048725228756666183, -0.023169852793216705, 0.00010950004070764408, 0.06317184865474701, 0.002471912419423461, 0.09773622453212738, 0.0735151618719101, 0.09715340286493301, -0.08337292820215225, 0.10562895983457565, 0.10150538384914398, -0.09572599828243256, 0.03605884686112404, 0.06754924356937408, -0.05300498008728027, -0.043293699622154236, 0.03665391728281975, 0.033023297786712646, 0.005234600510448217, -0.060321882367134094, 0.013913018628954887, -0.036497246474027634, 0.044923391193151474, 0.08326134830713272, 0.03754979372024536, -0.013354414142668247, 0.06462216377258301, 0.03401726484298706, -0.10898099094629288, 0.10366570204496384, 0.01731540448963642, 0.04105307161808014, -0.08384523540735245, -0.019968897104263306, 0.035425446927547455, 0.030576206743717194, -0.01765924133360386, -0.02306121215224266, -0.02860277332365513, -0.01614218018949032, -0.14299540221691132, -0.023106401786208153, -0.07243485748767853, 0.006181265693157911, 0.014656842686235905, -0.031884219497442245, -0.011233693920075893, 0.02475680410861969, -0.06979699432849884, -0.07426341623067856, -0.006949664559215307, 0.09833318740129471, -0.15115703642368317, 0.008848577737808228, 0.06907843053340912, -0.11088496446609497, 0.08190931379795074, -0.008411259390413761, 0.016245156526565552, 0.022527478635311127, -0.15448406338691711, 0.05601610988378525, 0.0008648968650959432, 0.01916889287531376, 0.025886621326208115, -0.16471809148788452, 0.004104440100491047, -0.04661374166607857, -0.02149827405810356, -0.00004464812809601426, -0.02647159807384014, -0.12325995415449142, 0.06858719140291214, -0.015622655861079693, -0.035931166261434555, -0.02701525390148163, 0.0539589487016201, 0.07888586074113846, -0.027474910020828247, 0.10445091128349304, -0.008690856397151947, 0.04941811040043831, -0.16801609098911285, -0.02470702864229679, -0.04982255399227142, 0.019377702847123146, 0.009884213097393513, -0.007693959400057793, 0.04183054715394974, -0.00976533442735672, 0.21883612871170044, -0.05075952783226967, 0.1607085019350052, 0.05847611650824547, -0.017352959141135216, -0.0007513365126214921, 0.06180921941995621, 0.05997028574347496, 0.04658793285489082, 0.009480604901909828, 0.023740366101264954, -0.022450892254710197, -0.006695089396089315, -0.15932634472846985, 0.01890849508345127, 0.14999441802501678, 0.06301083415746689, 0.024745315313339233, 0.05866100639104843, -0.12775006890296936, -0.12135478109121323, 0.09311001747846603, -0.026755332946777344, 0.00928465835750103, -0.08245618641376495, 0.1358020007610321, 0.14980104565620422, -0.14000412821769714, 0.05256148427724838, -0.06134212389588356, -0.05217423290014267, -0.10388828068971634, -0.12032219022512436, -0.05887215584516525, -0.053666237741708755, 0.002330566756427288, -0.03760887682437897, 0.054546963423490524, 0.03344334661960602, -0.009351172484457493, -0.00022941511997487396, 0.13597318530082703, -0.019751882180571556, -0.0028988157864660025, 
0.048313532024621964, 0.03693558648228645, 0.02373051457107067, -0.05275435373187065, 0.02940409444272518, 0.02539868652820587, 0.032232340425252914, 0.06546790152788162, 0.033412106335163116, -0.047448933124542236, 0.03804153576493263, -0.0025254099164158106, -0.11207924783229828, 0.019641218706965446, -0.00460948096588254, -0.0742158442735672, 0.1268945336341858, 0.0407399944961071, 0.010224059224128723, -0.03741471841931343, 0.24361543357372284, -0.06653323769569397, -0.06378097087144852, -0.13251738250255585, 0.10491154342889786, -0.0027236645109951496, 0.06476365029811859, 0.023412218317389488, -0.1284150779247284, 0.005243356805294752, 0.13858191668987274, 0.12181595712900162, 0.0045748427510261536, 0.009228081442415714, 0.0518609918653965, 0.0025186820421367884, -0.06998204439878464, 0.054019294679164886, 0.06992026418447495, 0.12919506430625916, -0.07847554981708527, 0.07680778950452805, 0.0006860480643808842, -0.08370215445756912, -0.02947772853076458, 0.11312682181596756, -0.0409729965031147, 0.03491825982928276, -0.047444481402635574, 0.10916327685117722, -0.05787910893559456, -0.29412412643432617, 0.02350960113108158, -0.09588567912578583, -0.15202060341835022, -0.018367812037467957, 0.05944539234042168, -0.02624768204987049, 0.018029648810625076, 0.06971040368080139, -0.06011629104614258, 0.20098382234573364, 0.0335683599114418, -0.07864278554916382, -0.0664360448718071, 0.04837050288915634, -0.06564252078533173, 0.2949807047843933, 0.008418165147304535, 0.02863333560526371, 0.10770907253026962, -0.03253700211644173, -0.18271861970424652, 0.010723991319537163, 0.1133992001414299, -0.08056149631738663, 0.08200647681951523, 0.19000613689422607, -0.012578671798110008, 0.1209007054567337, 0.05294662341475487, -0.047376248985528946, 0.04217283055186272, -0.03389401361346245, -0.051268599927425385, -0.10752558708190918, 0.058453381061553955, -0.05909625440835953, 0.15447644889354706, 0.10152646154165268, -0.05671518296003342, -0.004550917539745569, -0.05555408447980881, 0.04875178262591362, 0.01804669201374054, 0.12263146042823792, 0.02951994352042675, -0.1865430772304535, 0.032826557755470276, -0.01144319772720337, 0.10186848044395447, -0.25588861107826233, -0.08421015739440918, 0.08833149075508118, -0.011924264021217823, -0.05105875805020332, 0.10560628771781921, 0.057650718837976456, 0.04243382066488266, -0.043439045548439026, -0.10480839014053345, -0.02186836116015911, 0.14663739502429962, -0.1469624787569046, -0.025013303384184837 ]
null
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # fine-tuning-Phi2-with-webglm-qa-with-lora This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.2084 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - gradient_accumulation_steps: 5 - total_train_batch_size: 10 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 20 - training_steps: 200 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 0.2 | 10 | 7.3955 | | 7.0321 | 0.4 | 20 | 2.6772 | | 7.0321 | 0.6 | 30 | 0.5779 | | 0.751 | 0.8 | 40 | 0.5021 | | 0.751 | 1.0 | 50 | 0.4501 | | 0.3719 | 1.2 | 60 | 0.4067 | | 0.3719 | 1.39 | 70 | 0.3655 | | 0.3398 | 1.59 | 80 | 0.3302 | | 0.3398 | 1.79 | 90 | 0.3029 | | 0.2285 | 1.99 | 100 | 0.2831 | | 0.2285 | 2.19 | 110 | 0.2666 | | 0.2156 | 2.39 | 120 | 0.2549 | | 0.2156 | 2.59 | 130 | 0.2435 | | 0.2049 | 2.79 | 140 | 0.2324 | | 0.2049 | 2.99 | 150 | 0.2246 | | 0.177 | 3.19 | 160 | 0.2197 | | 0.177 | 3.39 | 170 | 0.2149 | | 0.1745 | 3.59 | 180 | 0.2112 | | 0.1745 | 3.78 | 190 | 0.2091 | | 0.1742 | 3.98 | 200 | 0.2084 | ### Framework versions - PEFT 0.7.1 - Transformers 4.36.2 - Pytorch 2.0.0 - Datasets 2.15.0 - Tokenizers 0.15.0
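The card documents a LoRA adapter trained on top of microsoft/phi-2 but leaves its "How to Get Started" section empty. A minimal sketch, assuming the adapter was saved in standard PEFT format under the repository id recorded in this row (Gunslinger3D/fine-tuning-Phi2-with-webglm-qa-with-lora), could look like this; dtype and generation settings are illustrative assumptions, not values from the card.

```python
# Minimal sketch (not from the card): load the phi-2 base model and attach this LoRA
# adapter with PEFT, then run a short generation to check the adapter is applied.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "microsoft/phi-2"  # base model named in the card
adapter_id = "Gunslinger3D/fine-tuning-Phi2-with-webglm-qa-with-lora"  # repo id from this record

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, adapter_id)  # wraps the base with the LoRA weights
model.eval()

prompt = "Question: What does LoRA fine-tuning change in a model?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The listed hyperparameters (effective batch size 2 × 5 = 10, linear schedule with 20 warmup steps, 200 training steps, native AMP) map directly onto transformers TrainingArguments if the run needs to be reproduced.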
{"license": "mit", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "microsoft/phi-2", "model-index": [{"name": "fine-tuning-Phi2-with-webglm-qa-with-lora", "results": []}]}
null
Gunslinger3D/fine-tuning-Phi2-with-webglm-qa-with-lora
[ "peft", "safetensors", "generated_from_trainer", "base_model:microsoft/phi-2", "license:mit", "region:us" ]
2024-02-13T19:13:09+00:00
[]
[]
TAGS #peft #safetensors #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us
fine-tuning-Phi2-with-webglm-qa-with-lora ========================================= This model is a fine-tuned version of microsoft/phi-2 on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.2084 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 5e-05 * train\_batch\_size: 2 * eval\_batch\_size: 2 * seed: 42 * gradient\_accumulation\_steps: 5 * total\_train\_batch\_size: 10 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 20 * training\_steps: 200 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * PEFT 0.7.1 * Transformers 4.36.2 * Pytorch 2.0.0 * Datasets 2.15.0 * Tokenizers 0.15.0
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 5\n* total\\_train\\_batch\\_size: 10\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 20\n* training\\_steps: 200\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.0.0\n* Datasets 2.15.0\n* Tokenizers 0.15.0" ]
[ "TAGS\n#peft #safetensors #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 5\n* total\\_train\\_batch\\_size: 10\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 20\n* training\\_steps: 200\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.0.0\n* Datasets 2.15.0\n* Tokenizers 0.15.0" ]
[ 35, 158, 4, 36 ]
[ "passage: TAGS\n#peft #safetensors #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 5\n* total\\_train\\_batch\\_size: 10\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 20\n* training\\_steps: 200\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.7.1\n* Transformers 4.36.2\n* Pytorch 2.0.0\n* Datasets 2.15.0\n* Tokenizers 0.15.0" ]
[ -0.1231389045715332, 0.09117334336042404, -0.0029303012415766716, 0.0840098038315773, 0.11769097298383713, 0.010705992579460144, 0.09065578877925873, 0.15191759169101715, -0.09116426855325699, 0.08451981097459793, 0.10868307948112488, 0.10667676478624344, 0.06005077809095383, 0.21604792773723602, -0.03520265966653824, -0.2739909589290619, 0.031351279467344284, -0.023104781284928322, -0.046697232872247696, 0.12419623881578445, 0.09520524740219116, -0.12092027813196182, 0.0567474365234375, -0.021865248680114746, -0.13447308540344238, -0.006472267210483551, -0.006629473064094782, -0.03434270620346069, 0.10809661448001862, -0.00033930278732441366, 0.09664319455623627, 0.03509072959423065, 0.0995534285902977, -0.23499299585819244, 0.006358013022691011, 0.06979210674762726, 0.02840615250170231, 0.07571766525506973, 0.10564251989126205, -0.022121533751487732, 0.12140960246324539, -0.10977863520383835, 0.06507828831672668, 0.02811763808131218, -0.15134064853191376, -0.3343483507633209, -0.11835943907499313, 0.04875439405441284, 0.1125277429819107, 0.06983682513237, -0.022233910858631134, 0.1243852823972702, -0.06387393176555634, 0.08547891676425934, 0.283424973487854, -0.27707579731941223, -0.0658724382519722, 0.012907469645142555, 0.03606215864419937, 0.07271894812583923, -0.11177496612071991, -0.04141843318939209, 0.03937594220042229, 0.03531525656580925, 0.1289510577917099, 0.0049804882146418095, -0.049113404005765915, -0.0032736214343458414, -0.15172851085662842, -0.05844634026288986, 0.10866153985261917, 0.04079949110746384, -0.04149310290813446, -0.08716163039207458, -0.04289967566728592, -0.17748737335205078, -0.05382324755191803, 0.0072253248654305935, 0.04551910236477852, -0.044642046093940735, -0.04475770518183708, 0.0007084955577738583, -0.07092593610286713, -0.08419519662857056, 0.019044872373342514, 0.1316177397966385, 0.07006433606147766, 0.0019087122054770589, -0.00046788365580141544, 0.11323744058609009, -0.01688443496823311, -0.14287187159061432, -0.014345290139317513, -0.0001420769258402288, -0.034559525549411774, -0.033556126058101654, -0.03843671828508377, 0.02378019690513611, 0.01911802403628826, 0.1524861454963684, -0.14506445825099945, 0.07126972824335098, 0.004561139736324549, 0.021649323403835297, -0.08726512640714645, 0.11552084237337112, -0.058603737503290176, 0.010799482464790344, -0.02064649946987629, 0.09672629088163376, 0.019975146278738976, -0.00043313857167959213, -0.057056110352277756, 0.04472021386027336, 0.11262452602386475, 0.06567864120006561, -0.03578021749854088, 0.030635621398687363, -0.06642543524503708, -0.008439544588327408, 0.05445041134953499, -0.12179192900657654, 0.05355434864759445, 0.01650313474237919, -0.08674868196249008, -0.07755927741527557, -0.001247924636118114, 0.0031939365435391665, -0.020079918205738068, 0.10753198713064194, -0.06730165332555771, 0.03006078116595745, -0.09016820788383484, -0.11836607754230499, 0.023421337828040123, -0.07581168413162231, 0.0034519110340625048, -0.06063282489776611, -0.16628970205783844, -0.03972798213362694, 0.043159667402505875, -0.07013111561536789, -0.03591546788811684, -0.049384407699108124, -0.08968223631381989, 0.012973814271390438, -0.03145549073815346, 0.13984137773513794, -0.07986610382795334, 0.11319342255592346, 0.015899324789643288, 0.05803439021110535, -0.0015419155824929476, 0.042136576026678085, -0.09453276544809341, 0.04366181418299675, -0.2165376842021942, 0.04925893247127533, -0.08455508947372437, 0.05012897029519081, -0.12506920099258423, -0.11036871373653412, -0.0019600235391408205, 
-0.016830815002322197, 0.11189614981412888, 0.1363573670387268, -0.19016601145267487, -0.04642277956008911, 0.22203348577022552, -0.1122760996222496, -0.11457334458827972, 0.09223685413599014, -0.02964063547551632, 0.009248379617929459, 0.046664588153362274, 0.2353198379278183, 0.027127796784043312, -0.1345677673816681, 0.01933194510638714, -0.059025540947914124, 0.089266836643219, 0.007559692021459341, 0.07586321979761124, -0.03370359167456627, 0.029224596917629242, 0.017581108957529068, -0.04588761925697327, 0.03933335840702057, -0.1104295626282692, -0.0780545175075531, -0.04193675145506859, -0.08014114201068878, 0.01657685451209545, 0.06287894397974014, 0.028990894556045532, -0.1218915730714798, -0.09260419011116028, 0.0439242459833622, 0.09583066403865814, -0.0658072978258133, 0.034807298332452774, -0.06298473477363586, 0.08932927250862122, -0.005054790992289782, -0.01783589832484722, -0.18521186709403992, -0.07329478114843369, 0.037558939307928085, -0.04587714001536369, -0.013704965822398663, -0.07836052030324936, 0.07721671462059021, 0.07972399890422821, -0.07079875469207764, -0.05963178351521492, -0.049095518887043, 0.0005159327993169427, -0.10632498562335968, -0.2435934692621231, -0.060398004949092865, -0.037002112716436386, 0.10646738857030869, -0.22585657238960266, 0.02773180603981018, 0.008932552300393581, 0.12463986873626709, 0.030705226585268974, -0.051167700439691544, -0.00013384527119342238, 0.07018648833036423, -0.014604742638766766, -0.08143313974142075, 0.053843896836042404, 0.002892988733947277, -0.07230763882398605, 0.00980612076818943, -0.11799602955579758, 0.11422968655824661, 0.10852835327386856, 0.020461557433009148, -0.09967242926359177, -0.05363663285970688, -0.08152668178081512, -0.03999217972159386, -0.05159405991435051, 0.046532537788152695, 0.09224413335323334, 0.01278571505099535, 0.11556289345026016, -0.10018541663885117, -0.044643837958574295, 0.03578227385878563, -0.029216313734650612, 0.007534598931670189, 0.13857655227184296, 0.035643257200717926, -0.07069525122642517, 0.12829329073429108, 0.1380934864282608, -0.010410451330244541, 0.10309259593486786, -0.07449515163898468, -0.10654311627149582, -0.026812417432665825, 0.04423123225569725, 0.015435227192938328, 0.15281037986278534, -0.03887291997671127, 0.020810604095458984, 0.021387435495853424, 0.03818877041339874, 0.011612876318395138, -0.20633165538311005, -0.02557876706123352, 0.013341025449335575, -0.08108135312795639, -0.04216247797012329, -0.02425294928252697, 0.020779013633728027, 0.11306100338697433, -0.004279141314327717, -0.04216058924794197, -0.009757883846759796, -0.005042115692049265, -0.08192082494497299, 0.20351773500442505, -0.10368817299604416, -0.11670447140932083, -0.06453099101781845, -0.0302998349070549, -0.019821327179670334, -0.016392653807997704, 0.0677693784236908, -0.07881394773721695, -0.03214235603809357, -0.11285538226366043, -0.015573065727949142, 0.0028844382613897324, 0.013517649844288826, -0.03788736090064049, 0.0055185058154165745, 0.09494633227586746, -0.09002723544836044, -0.0013101688819006085, -0.03531051427125931, -0.03774392604827881, 0.053978800773620605, 0.03707847371697426, 0.1015024408698082, 0.12412264198064804, 0.00915750116109848, 0.03246620297431946, -0.04765365272760391, 0.21610935032367706, -0.0695657953619957, -0.02562635764479637, 0.10044503957033157, 0.007622519973665476, 0.0717848613858223, 0.15512578189373016, 0.04279311001300812, -0.12253449857234955, 0.009297901764512062, 0.04415084794163704, -0.02211366780102253, -0.227694571018219, 
-0.05710204690694809, -0.03169256076216698, -0.018086610361933708, 0.11603356152772903, 0.04290858283638954, -0.025228828191757202, 0.01837780699133873, -0.015894250944256783, -0.04007013887166977, 0.013055196963250637, 0.07666691392660141, 0.018030622974038124, 0.03531167283654213, 0.1124548614025116, -0.030467886477708817, -0.008598050102591515, 0.04773714020848274, -0.011755518615245819, 0.2673741281032562, -0.028129322454333305, 0.09298768639564514, 0.06548389047384262, 0.20936504006385803, -0.0019390074303373694, 0.06325452774763107, 0.015893200412392616, -0.01746109127998352, 0.004386975895613432, -0.06386636197566986, -0.009910923428833485, 0.036806367337703705, -0.037227142602205276, 0.027498139068484306, -0.14644549787044525, -0.0303167924284935, 0.05439980328083038, 0.31853824853897095, 0.07838556915521622, -0.3253489136695862, -0.06996963918209076, 0.006278686691075563, -0.022045517340302467, -0.05094093084335327, 0.0030522942543029785, 0.10085410624742508, -0.08112461119890213, 0.08129394054412842, -0.0637902095913887, 0.08810043334960938, -0.020144682377576828, 0.013939066790044308, 0.06801239401102066, 0.0761926993727684, -0.010830431245267391, 0.05417485907673836, -0.27697721123695374, 0.3263319432735443, 0.010949144139885902, 0.07538819313049316, -0.038846295326948166, -0.0033374426420778036, 0.012698478065431118, 0.020014379173517227, 0.10969472676515579, -0.0032947103027254343, -0.151864692568779, -0.20873543620109558, -0.10908690094947815, 0.01667330041527748, 0.13434724509716034, -0.023102838546037674, 0.11404304206371307, -0.013826670125126839, 0.006779996212571859, 0.03919416293501854, -0.08003846555948257, -0.11631850898265839, -0.055211812257766724, -0.00047789665404707193, -0.02013896219432354, 0.014758531004190445, -0.09963677823543549, -0.09227585047483444, -0.0496232695877552, 0.14578327536582947, -0.03380307927727699, -0.05107646435499191, -0.14011076092720032, 0.08881668746471405, 0.13016453385353088, -0.0722445547580719, 0.04458775743842125, 0.016807477921247482, 0.09060198813676834, 0.03352811187505722, -0.039484165608882904, 0.13052423298358917, -0.061138592660427094, -0.2116720974445343, -0.04809845611453056, 0.13803844153881073, 0.05248650535941124, 0.048901528120040894, -0.017417246475815773, 0.04560280591249466, 0.006039006635546684, -0.08566910028457642, 0.050890326499938965, 0.020546484738588333, 0.07477681338787079, 0.044913507997989655, -0.06803515553474426, 0.03857973963022232, -0.06980689615011215, -0.045422084629535675, 0.10489174723625183, 0.33048757910728455, -0.10330871492624283, 0.05989101529121399, 0.04597686231136322, -0.06300121545791626, -0.18611806631088257, 0.02362787537276745, 0.08241690695285797, -0.006699561607092619, 0.06285698711872101, -0.1807490736246109, 0.03703175485134125, 0.1207331046462059, -0.034571368247270584, 0.09176060557365417, -0.31583184003829956, -0.13340798020362854, 0.09210334718227386, 0.1384596824645996, -0.008807954378426075, -0.17833606898784637, -0.0556151457130909, 0.012197867967188358, -0.06615708768367767, 0.07271148264408112, -0.09505928307771683, 0.10011134296655655, -0.02466362714767456, 0.030451174825429916, 0.021851638332009315, -0.05663750693202019, 0.14534792304039001, -0.01770041510462761, 0.10773523896932602, -0.041053902357816696, 0.052437763661146164, 0.039904847741127014, -0.07021164149045944, 0.044331371784210205, -0.041984863579273224, 0.04397385194897652, -0.09795188903808594, -0.0073839514516294, -0.0959269255399704, 0.030826589092612267, -0.04406921565532684, -0.04029901325702667, 
-0.037747785449028015, 0.0656767413020134, 0.06679727137088776, -0.012979677878320217, 0.1362059861421585, -0.015943318605422974, 0.18528138101100922, 0.12283387780189514, 0.04802746698260307, -0.041048865765333176, -0.04658721759915352, 0.0087025947868824, -0.022650115191936493, 0.03343372792005539, -0.14625941216945648, 0.03215629607439041, 0.13749855756759644, 0.02889442816376686, 0.11978606134653091, 0.05149536579847336, -0.07089804857969284, -0.0017652853857725859, 0.07179912179708481, -0.1467178761959076, -0.13041704893112183, 0.02199205383658409, 0.02143901214003563, -0.12317443639039993, 0.029318898916244507, 0.0844716802239418, -0.05677979812026024, -0.021936403587460518, -0.012998117133975029, 0.05787654593586922, -0.028284290805459023, 0.22172150015830994, 0.03627811372280121, 0.07110081613063812, -0.11286832392215729, 0.11097698658704758, 0.055168017745018005, -0.11340431869029999, 0.03423672914505005, 0.10047142952680588, -0.08071645349264145, -0.021983029320836067, 0.09091398119926453, 0.11733484268188477, -0.0013700539711862803, -0.0568135641515255, -0.12025722861289978, -0.13134609162807465, 0.08530604839324951, 0.1258699893951416, 0.06012364476919174, 0.021221579983830452, 0.03214908391237259, 0.010310879908502102, -0.11071088165044785, 0.101844422519207, 0.07949522882699966, 0.07258910685777664, -0.12442135810852051, 0.13404817879199982, 0.007874883711338043, 0.025306036695837975, -0.012189924716949463, 0.03954363614320755, -0.12768645584583282, 0.01752457022666931, -0.10018415749073029, -0.009711428545415401, -0.048773445188999176, -0.00771418446674943, -0.01103401929140091, -0.06233993545174599, -0.04477570578455925, 0.023444809019565582, -0.11911621689796448, -0.04784396290779114, -0.007067551836371422, 0.05007875710725784, -0.1432185173034668, -0.04132891818881035, 0.017229003831744194, -0.08871838450431824, 0.07627878338098526, 0.04799073562026024, 0.024959197267889977, 0.045853618532419205, -0.09963808953762054, 0.014926612377166748, 0.049155451357364655, -0.022934868931770325, 0.04586600139737129, -0.14989393949508667, -0.02709811180830002, -0.005809340626001358, 0.01836874894797802, 0.021209223195910454, 0.05522668734192848, -0.1416366696357727, -0.008857116103172302, -0.02043905109167099, -0.05527307465672493, -0.03206142410635948, 0.035978902131319046, 0.06789398193359375, 0.026198536157608032, 0.14939475059509277, -0.09516938030719757, 0.03393169492483139, -0.22972173988819122, -0.016788050532341003, -0.027887137606739998, -0.08595473319292068, -0.08401933312416077, -0.011296768672764301, 0.09565810114145279, -0.05454609543085098, 0.14044517278671265, -0.007348678074777126, 0.06010333076119423, 0.04506337642669678, -0.06816312670707703, 0.01729828678071499, 0.04914187639951706, 0.1785445511341095, 0.007498307153582573, -0.041149090975522995, 0.08868982642889023, 0.0336194708943367, 0.05205125734210014, 0.13023827970027924, 0.22361420094966888, 0.1795535832643509, 0.06597739458084106, 0.0641496554017067, 0.040417253971099854, -0.10785364359617233, -0.12640708684921265, 0.08046580106019974, -0.0031019144225865602, 0.10732360929250717, -0.026254035532474518, 0.20103813707828522, 0.11145395785570145, -0.21527716517448425, 0.06147448718547821, -0.03985980525612831, -0.0858173668384552, -0.11848927289247513, -0.0523579902946949, -0.08507589250802994, -0.1696438044309616, -0.009182880632579327, -0.11616822332143784, 0.04761228710412979, 0.08616003394126892, 0.013026205822825432, 0.028214655816555023, 0.13733546435832977, 0.06621129810810089, 0.026662904769182205, 
0.05991966277360916, 0.023955365642905235, -0.014197353273630142, -0.03924005478620529, -0.09404388815164566, 0.03264915198087692, -0.0314546599984169, 0.03609929233789444, -0.027636662125587463, -0.06931731849908829, 0.057835325598716736, -0.011924676597118378, -0.10013605654239655, 0.024722017347812653, 0.018079141154885292, 0.05853962525725365, 0.07535051554441452, 0.036751940846443176, -0.007267958018928766, -0.02568761631846428, 0.26020583510398865, -0.07792025804519653, -0.06434702128171921, -0.0947125032544136, 0.31050899624824524, 0.037644609808921814, -0.019285619258880615, 0.030611392110586166, -0.08729203045368195, 0.007065409328788519, 0.15544725954532623, 0.16185054183006287, -0.03567669540643692, -0.003989541903138161, -0.024860478937625885, -0.014994197525084019, -0.029316870495676994, 0.10795250535011292, 0.12371812015771866, 0.0295134037733078, -0.09059453010559082, -0.024010613560676575, -0.05631566420197487, -0.027838533744215965, -0.074967160820961, 0.06267301738262177, 0.029623571783304214, 0.0033200872130692005, -0.04595300555229187, 0.10213658213615417, -0.057852163910865784, -0.08244878053665161, 0.05090227350592613, -0.18403726816177368, -0.17826184630393982, -0.02235143631696701, 0.051973696798086166, 0.017352748662233353, 0.047551706433296204, -0.01602400466799736, 0.0011677979491651058, 0.08361735194921494, -0.021162742748856544, -0.039352815598249435, -0.1524062305688858, 0.077025406062603, -0.12254098057746887, 0.24535921216011047, -0.02507200464606285, 0.024037525057792664, 0.11876175552606583, 0.025755420327186584, -0.13023239374160767, 0.06753383576869965, 0.06370409578084946, -0.08535730093717575, 0.02108968421816826, 0.14476211369037628, -0.04218187555670738, 0.07657668739557266, 0.04291249439120293, -0.11186633259057999, 0.011254939250648022, -0.05407656356692314, -0.05902021378278732, -0.03902494162321091, -0.014667167328298092, -0.041276756674051285, 0.12422897666692734, 0.2061019241809845, -0.05784847214818001, 0.031004300341010094, -0.07073105871677399, 0.023293470963835716, 0.04490531235933304, 0.08001045882701874, -0.007790453732013702, -0.26317375898361206, 0.03698813170194626, 0.10411026328802109, 0.0054547665640711784, -0.2155689001083374, -0.08139314502477646, 0.030088260769844055, -0.05614849925041199, -0.08818886429071426, 0.12916700541973114, 0.06966982781887054, 0.049895770847797394, -0.0567227341234684, -0.11444714665412903, -0.04624485969543457, 0.1873634159564972, -0.13764984905719757, -0.06738624721765518 ]
null
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # phi2-lora-distilabel-intel-orca-dpo-pairs This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.4547 - Rewards/chosen: -0.0932 - Rewards/rejected: -1.3103 - Rewards/accuracies: 0.8386 - Rewards/margins: 1.2171 - Logps/rejected: -222.2418 - Logps/chosen: -199.7473 - Logits/rejected: 0.5130 - Logits/chosen: 0.3441 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 250 - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen | |:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:| | 0.5579 | 0.78 | 250 | 0.4547 | -0.0932 | -1.3103 | 0.8386 | 1.2171 | -222.2418 | -199.7473 | 0.5130 | 0.3441 | ### Framework versions - PEFT 0.8.2 - Transformers 4.37.2 - Pytorch 2.2.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.2
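As a hedged sketch of how the hyperparameters listed in this card could be reproduced, they map onto transformers TrainingArguments as shown below; the DPO-specific wiring (policy and reference models, preference dataset, beta, LoRA peft_config) is not recorded in the card, so it is only noted in comments rather than filled in.

```python
# Hedged sketch: the hyperparameters from the card expressed as TrainingArguments,
# as they would typically be handed to trl's DPOTrainer. Values marked "assumed"
# are not stated in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="phi2-lora-distilabel-intel-orca-dpo-pairs",  # assumed output path
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=16,   # effective train batch size: 2 * 16 = 32
    num_train_epochs=1,
    lr_scheduler_type="linear",
    warmup_steps=250,
    seed=42,
    optim="adamw_torch",              # Adam with betas=(0.9, 0.999), eps=1e-8 (library defaults)
)
# A trl DPOTrainer run would additionally need the microsoft/phi-2 policy model, a frozen
# reference model, the preference dataset, a beta value, and the LoRA peft_config; none of
# those are documented in this card, and the DPOTrainer signature varies across trl versions.
```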
{"license": "mit", "library_name": "peft", "tags": ["trl", "dpo", "generated_from_trainer"], "base_model": "microsoft/phi-2", "model-index": [{"name": "phi2-lora-distilabel-intel-orca-dpo-pairs", "results": []}]}
null
southmost/phi2-lora-distilabel-intel-orca-dpo-pairs
[ "peft", "safetensors", "trl", "dpo", "generated_from_trainer", "base_model:microsoft/phi-2", "license:mit", "region:us" ]
2024-02-13T19:15:26+00:00
[]
[]
TAGS #peft #safetensors #trl #dpo #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us
phi2-lora-distilabel-intel-orca-dpo-pairs ========================================= This model is a fine-tuned version of microsoft/phi-2 on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.4547 * Rewards/chosen: -0.0932 * Rewards/rejected: -1.3103 * Rewards/accuracies: 0.8386 * Rewards/margins: 1.2171 * Logps/rejected: -222.2418 * Logps/chosen: -199.7473 * Logits/rejected: 0.5130 * Logits/chosen: 0.3441 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 1e-05 * train\_batch\_size: 2 * eval\_batch\_size: 2 * seed: 42 * gradient\_accumulation\_steps: 16 * total\_train\_batch\_size: 32 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 250 * num\_epochs: 1 ### Training results ### Framework versions * PEFT 0.8.2 * Transformers 4.37.2 * Pytorch 2.2.0+cu121 * Datasets 2.17.0 * Tokenizers 0.15.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 250\n* num\\_epochs: 1", "### Training results", "### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ "TAGS\n#peft #safetensors #trl #dpo #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 250\n* num\\_epochs: 1", "### Training results", "### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ 41, 144, 4, 39 ]
[ "passage: TAGS\n#peft #safetensors #trl #dpo #generated_from_trainer #base_model-microsoft/phi-2 #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 250\n* num\\_epochs: 1### Training results### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ -0.13301244378089905, 0.10428035259246826, -0.0025865351781249046, 0.09877348691225052, 0.13190957903862, -0.0011064889840781689, 0.13528026640415192, 0.11933097243309021, -0.13562671840190887, 0.09061423689126968, 0.12715865671634674, 0.10927078872919083, 0.03740455210208893, 0.21377058327198029, -0.05751601606607437, -0.26654213666915894, 0.026405908167362213, -0.02031133323907852, -0.0676097497344017, 0.118306003510952, 0.07391711324453354, -0.12459035962820053, 0.07191409915685654, -0.019502252340316772, -0.17243488132953644, -0.009718620218336582, -0.0017093307105824351, -0.03267495706677437, 0.11498408764600754, 0.012788698077201843, 0.129292830824852, 0.03329382836818695, 0.11327141523361206, -0.2030152678489685, 0.006126567255705595, 0.07987339049577713, 0.015511024743318558, 0.08104211091995239, 0.07560024410486221, -0.00032698563882149756, 0.10971253365278244, -0.09047983586788177, 0.07577928155660629, 0.01568508706986904, -0.15387886762619019, -0.26340025663375854, -0.13133664429187775, 0.045876458287239075, 0.10518177598714828, 0.07321491092443466, -0.01286158338189125, 0.17568857967853546, -0.0683470219373703, 0.09081866592168808, 0.2901917099952698, -0.2872755229473114, -0.0738784670829773, 0.0326400063931942, 0.02399553544819355, 0.07264700531959534, -0.09968780726194382, -0.04184408485889435, 0.059197358787059784, 0.025266116484999657, 0.10534445196390152, 0.003274634713307023, -0.02156425639986992, 0.004260238725692034, -0.15626466274261475, -0.029814599081873894, 0.0846036747097969, 0.034961968660354614, -0.048982661217451096, -0.04771668463945389, -0.0741000697016716, -0.21990825235843658, -0.042394354939460754, -0.013274776749312878, 0.05526811257004738, -0.052353404462337494, -0.0738983154296875, 0.032448820769786835, -0.06378792226314545, -0.08908843249082565, -0.017471857368946075, 0.18149881064891815, 0.05529338866472244, 0.012211133725941181, -0.021221274510025978, 0.11362047493457794, -0.03371294587850571, -0.15270139276981354, -0.011209228076040745, 0.0077865393832325935, -0.02679433673620224, -0.05591942369937897, -0.0400029718875885, 0.0037185612600296736, 0.01570199802517891, 0.17513692378997803, -0.1385994255542755, 0.06190972775220871, 0.010044067166745663, 0.01733117364346981, -0.08399923890829086, 0.13609422743320465, -0.07821045070886612, 0.022334644570946693, 0.003734984202310443, 0.09534300863742828, 0.049480754882097244, -0.005923817399889231, -0.07084712386131287, 0.05146234109997749, 0.09258883446455002, 0.046020373702049255, -0.05083322897553444, 0.02538658306002617, -0.048670217394828796, 0.004567291587591171, 0.0794740840792656, -0.10954123735427856, 0.046750400215387344, 0.017414895817637444, -0.07536453008651733, -0.044950082898139954, 0.0008940369007177651, 0.012821323238313198, 0.014321543276309967, 0.15105940401554108, -0.08733923733234406, 0.031185787171125412, -0.0885649248957634, -0.1307409256696701, 0.01596161350607872, -0.0815136507153511, 0.007426081225275993, -0.07719994336366653, -0.15578950941562653, -0.03320701792836189, 0.03937084600329399, -0.06138962507247925, -0.02296251617372036, -0.044860661029815674, -0.10572289675474167, 0.0130459014326334, -0.015318203717470169, 0.08969547599554062, -0.07902797311544418, 0.11149609088897705, 0.038097091019153595, 0.06920328736305237, -0.029718246310949326, 0.02187534235417843, -0.0844942256808281, 0.04427742585539818, -0.26560625433921814, 0.02988562174141407, -0.07481042295694351, 0.06709654629230499, -0.08848359435796738, -0.10412634164094925, 0.004016245249658823, 
-0.018702886998653412, 0.1180785521864891, 0.14438405632972717, -0.2100832462310791, -0.052602071315050125, 0.201902836561203, -0.11748094111680984, -0.10591111332178116, 0.10344187170267105, -0.03739900887012482, 0.000005271642294246703, 0.050881437957286835, 0.18911482393741608, 0.03736206889152527, -0.17012007534503937, 0.0021680172067135572, -0.06322132796049118, 0.08142207562923431, -0.007473150733858347, 0.0627860501408577, -0.023007608950138092, 0.051620613783597946, 0.006722179241478443, -0.031957849860191345, 0.033737462013959885, -0.1124720424413681, -0.07132744789123535, -0.03425246849656105, -0.08827590197324753, 0.021454844623804092, 0.05499705299735069, 0.03632167726755142, -0.12631462514400482, -0.09198406338691711, 0.07175116240978241, 0.09609391540288925, -0.050509046763181686, 0.025606794282794, -0.06361164897680283, 0.08724522590637207, -0.03703828155994415, -0.030264150351285934, -0.17629936337471008, -0.06927569210529327, 0.028782818466424942, 0.0022485286463052034, -0.020468993112444878, -0.05743638798594475, 0.07917923480272293, 0.09930119663476944, -0.07028291374444962, -0.030416669324040413, -0.05939177796244621, 0.003809052985161543, -0.11163344979286194, -0.23915325105190277, -0.032771509140729904, -0.0386512465775013, 0.09659252315759659, -0.21015048027038574, 0.034783247858285904, 0.04032447934150696, 0.12069568037986755, 0.04957224428653717, -0.05438927933573723, -0.017761310562491417, 0.08239766955375671, -0.01577378809452057, -0.08884472399950027, 0.02784310281276703, 0.014948241412639618, -0.04380880296230316, -0.04325638711452484, -0.15016427636146545, 0.17835848033428192, 0.13078053295612335, 0.051404982805252075, -0.1276397854089737, -0.02816903404891491, -0.07486286759376526, -0.029337026178836823, -0.0736336037516594, 0.04415298253297806, 0.0961548238992691, 0.017903489992022514, 0.11862380057573318, -0.10022944211959839, -0.04715678468346596, 0.04357120767235756, -0.024573462083935738, 0.025425255298614502, 0.1163535937666893, 0.07895511388778687, -0.07749312371015549, 0.1424514204263687, 0.10175996273756027, -0.05129098892211914, 0.11040842533111572, -0.0758267194032669, -0.08407807350158691, -0.040257640182971954, 0.030891532078385353, 0.028239311650395393, 0.18051040172576904, -0.009543891996145248, 0.019327351823449135, -0.0013057939941063523, 0.030996529385447502, -0.0010466049425303936, -0.214629128575325, -0.040226131677627563, 0.027675366029143333, -0.0612187460064888, -0.06284774839878082, -0.03620883822441101, 0.009917981922626495, 0.11491978913545609, 0.0018994456622749567, -0.059635259211063385, -0.008569149300456047, 0.01292479783296585, -0.07999616116285324, 0.20397424697875977, -0.09799592941999435, -0.0678764060139656, -0.0753638967871666, -0.01365796197205782, -0.03300020471215248, -0.0015243294183164835, 0.05850003659725189, -0.09742098301649094, -0.03054211661219597, -0.10468343645334244, -0.01590120419859886, 0.037703461945056915, 0.030462918803095818, -0.0002474401262588799, -0.0020477313082665205, 0.0772222951054573, -0.10599339008331299, -0.002806944539770484, -0.050170864909887314, -0.006693174131214619, 0.055575452744960785, 0.046932172030210495, 0.11037208884954453, 0.13646870851516724, -0.003595448564738035, 0.02658981643617153, -0.03640518710017204, 0.2405754029750824, -0.07954137027263641, -0.026372890919446945, 0.0786234587430954, -0.003322346368804574, 0.06924881786108017, 0.15857142210006714, 0.0608353391289711, -0.13484308123588562, 0.022041795775294304, 0.04146333038806915, -0.03781891241669655, -0.20983801782131195, 
-0.03163394331932068, -0.023461395874619484, -0.006879962980747223, 0.08351723104715347, 0.034378957003355026, -0.030793029814958572, 0.02603747881948948, -0.0019214238272979856, 0.009334656409919262, -0.01022444199770689, 0.07894273847341537, 0.024419540539383888, 0.04997459799051285, 0.11538877338171005, -0.0444418303668499, -0.03391684219241142, 0.042828962206840515, -0.02825484611093998, 0.23495575785636902, -0.0185860525816679, 0.07324066013097763, 0.040738459676504135, 0.16814912855625153, -0.0016888590762391686, 0.08594818413257599, 0.021404292434453964, -0.03923972696065903, 0.011925551109015942, -0.06959948688745499, -0.01508500799536705, 0.032823994755744934, -0.0737890675663948, 0.04662404581904411, -0.12721242010593414, 0.022651536390185356, 0.056633926928043365, 0.3058015704154968, 0.06722631305456161, -0.33906853199005127, -0.08424244076013565, -0.008816557005047798, -0.007174460217356682, -0.018512871116399765, 0.013085815124213696, 0.1287234127521515, -0.05398496985435486, 0.0768345519900322, -0.06371414661407471, 0.07621787488460541, -0.02860400080680847, 0.02319018915295601, 0.06732512265443802, 0.0924120768904686, -0.03485683724284172, 0.019543960690498352, -0.25931209325790405, 0.30732202529907227, 0.0278463251888752, 0.09226322919130325, -0.022198444232344627, -0.00030222535133361816, 0.010806931182742119, 0.04630518704652786, 0.0675647184252739, -0.011885736137628555, -0.10200619697570801, -0.21040195226669312, -0.11145059019327164, 0.019274083897471428, 0.13215035200119019, -0.013468566350638866, 0.12035419791936874, -0.0005751597345806658, -0.003022987861186266, 0.053988248109817505, -0.05603932961821556, -0.09929458051919937, -0.04934653639793396, -0.005322969052940607, -0.03391381353139877, -0.01639922522008419, -0.08857560902833939, -0.10849031060934067, -0.08405926078557968, 0.1280132234096527, -0.011168956756591797, -0.05160902440547943, -0.1344904750585556, 0.06851223856210709, 0.09842774271965027, -0.0679168775677681, 0.03758366033434868, 0.027253182604908943, 0.0786488875746727, 0.017116082832217216, -0.0324840247631073, 0.14616069197654724, -0.0580245666205883, -0.2174779623746872, -0.07002807408571243, 0.10914488881826401, 0.06228956952691078, 0.04565771669149399, -0.026529034599661827, 0.04364901781082153, 0.02516380324959755, -0.09705650806427002, 0.03761765733361244, 0.008857323788106441, 0.0703965499997139, 0.013522799126803875, -0.024192724376916885, 0.06951803714036942, -0.06197464466094971, -0.033648647367954254, 0.09673704952001572, 0.35012251138687134, -0.09706957638263702, 0.02562493272125721, 0.028070438653230667, -0.03928348049521446, -0.17631196975708008, 0.05339159071445465, 0.07005424797534943, -0.004515890497714281, 0.03315490856766701, -0.1596372425556183, 0.048970434814691544, 0.1379701793193817, -0.031796276569366455, 0.11131352931261063, -0.3103131949901581, -0.1396145522594452, 0.09332893043756485, 0.14022675156593323, 0.03816791623830795, -0.1850343644618988, -0.03342637047171593, -0.004202970769256353, -0.10285504162311554, 0.08716578781604767, -0.12123659998178482, 0.08267367631196976, -0.012314954772591591, 0.041937205940485, 0.01704074628651142, -0.057026103138923645, 0.13875994086265564, -0.009130541235208511, 0.12769578397274017, -0.04403907433152199, 0.02379646897315979, 0.03056160733103752, -0.07385782897472382, 0.03053104318678379, -0.042005203664302826, 0.0479716993868351, -0.059073660522699356, -0.01129702664911747, -0.07219233363866806, 0.010391087271273136, -0.05428752303123474, -0.05708298459649086, -0.05027331784367561, 
0.03290260210633278, 0.04077288508415222, -0.03372451663017273, 0.1315237134695053, 0.015090998262166977, 0.18261665105819702, 0.1267813742160797, 0.04108297824859619, -0.06192382052540779, -0.032320570200681686, 0.03347695991396904, -0.021956119686365128, 0.049530163407325745, -0.15397045016288757, 0.023192699998617172, 0.1370493620634079, 0.046028025448322296, 0.10301511734724045, 0.06358537822961807, -0.07058390974998474, 0.0025784482713788748, 0.06036938354372978, -0.1623447984457016, -0.10694722831249237, 0.029370661824941635, 0.0061475192196667194, -0.11108285933732986, 0.059285201132297516, 0.10037430375814438, -0.07114586979150772, -0.00940951332449913, -0.0033708636183291674, 0.03818270191550255, -0.021705539897084236, 0.2304144650697708, 0.06129094958305359, 0.06795910000801086, -0.10732872784137726, 0.08557915687561035, 0.04031577333807945, -0.08366125822067261, 0.03326184302568436, 0.08364392071962357, -0.07866135984659195, -0.0038302186876535416, 0.09316256642341614, 0.1495448350906372, -0.022794190794229507, -0.03497306630015373, -0.15466921031475067, -0.12632226943969727, 0.07598429173231125, 0.19694936275482178, 0.07261111587285995, 0.025921855121850967, 0.016800012439489365, 0.021591445431113243, -0.13519282639026642, 0.09789212048053741, 0.05648607015609741, 0.0917263776063919, -0.14255717396736145, 0.13836489617824554, -0.013650456443428993, -0.0008949315524660051, -0.015306800603866577, 0.05396550893783569, -0.15643103420734406, 0.006816536653786898, -0.1029813215136528, -0.020395731553435326, -0.04475076124072075, 0.001531172194518149, -0.010472948662936687, -0.05239925906062126, -0.05131940543651581, 0.0173341054469347, -0.11181120574474335, -0.028911348432302475, 0.008329328149557114, 0.014559702947735786, -0.14533773064613342, -0.01893223449587822, 0.019172461703419685, -0.0936352089047432, 0.07245443761348724, 0.0504867248237133, 0.03859272971749306, 0.03760148584842682, -0.09706133604049683, 0.02515491470694542, 0.06700652837753296, -0.02876804582774639, 0.07525316625833511, -0.1307838261127472, -0.025703148916363716, -0.020896248519420624, 0.0379503034055233, 0.027186688035726547, 0.09418710321187973, -0.13502372801303864, 0.011005794629454613, -0.014301406219601631, -0.0540948249399662, -0.03182158246636391, 0.016517475247383118, 0.10591311007738113, 0.005564848892390728, 0.15258373320102692, -0.09467747807502747, 0.02819596417248249, -0.22074654698371887, -0.029230279847979546, -0.008251452818512917, -0.0798548012971878, -0.07835232466459274, -0.000007526317403971916, 0.0919509008526802, -0.048865076154470444, 0.1173391193151474, -0.0021995771676301956, 0.054637931287288666, 0.050754278898239136, -0.07774507999420166, 0.004578635096549988, 0.049586113542318344, 0.1615617275238037, 0.02445218525826931, -0.042175114154815674, 0.058460984379053116, 0.04026114568114281, 0.09711607545614243, 0.07196434587240219, 0.2392273247241974, 0.13721276819705963, 0.014463449828326702, 0.07613526284694672, 0.04424228146672249, -0.1117166131734848, -0.12791211903095245, 0.04772258922457695, -0.043899234384298325, 0.09013694524765015, -0.02181103453040123, 0.16635072231292725, 0.1155138611793518, -0.17977562546730042, 0.026689710095524788, -0.049458473920822144, -0.08452804386615753, -0.13351425528526306, 0.007243778556585312, -0.07896209508180618, -0.1788424849510193, 0.005425706505775452, -0.11861836910247803, 0.026357227936387062, 0.10426168143749237, 0.012371261604130268, 0.03137355297803879, 0.15965332090854645, 0.05292107164859772, 0.047457899898290634, 0.04038966819643974, 
0.03244059905409813, -0.03674279525876045, -0.039289433509111404, -0.08975248783826828, 0.029630063101649284, -0.043775398284196854, 0.0374007448554039, -0.032032888382673264, -0.05856987461447716, 0.06180260330438614, 0.010469816625118256, -0.09454428404569626, 0.02821766398847103, 0.028118962422013283, 0.05703674629330635, 0.059846945106983185, 0.026415091007947922, 0.012068334966897964, -0.02047172375023365, 0.21944007277488708, -0.08277392387390137, -0.05962388962507248, -0.1145329549908638, 0.30495575070381165, 0.03297378122806549, 0.009462431073188782, 0.0209918525069952, -0.1044982448220253, -0.0022026284132152796, 0.14085426926612854, 0.16244731843471527, -0.04908021166920662, 0.00024969966034404933, -0.026244809851050377, -0.016100626438856125, -0.04309837520122528, 0.09630971401929855, 0.13459157943725586, 0.042933039367198944, -0.07668289542198181, -0.03668438270688057, -0.05621877312660217, -0.03603356331586838, -0.04523485153913498, 0.038643550127744675, 0.03226112574338913, 0.014415167272090912, -0.058263737708330154, 0.08747576922178268, -0.041629157960414886, -0.11094103753566742, 0.07559232413768768, -0.20328480005264282, -0.1710149347782135, -0.004378665704280138, 0.05446331948041916, -0.010822992771863937, 0.0641075074672699, -0.019333066418766975, -0.015976404771208763, 0.09811028838157654, -0.026514016091823578, -0.05821932479739189, -0.14173388481140137, 0.06327058374881744, -0.11454983800649643, 0.2013222873210907, -0.03586361184716225, 0.039761606603860855, 0.12285371869802475, 0.03957854583859444, -0.11813797801733017, 0.0705331414937973, 0.0635465756058693, -0.12302154302597046, -0.01870044134557247, 0.1097399964928627, -0.04968314245343208, 0.08085561543703079, 0.04717539623379707, -0.12405838072299957, 0.002605479909107089, -0.053638964891433716, -0.04511817917227745, -0.055531103163957596, -0.024450253695249557, -0.05288626253604889, 0.12122126668691635, 0.2014322727918625, -0.04995015636086464, 0.05506681650876999, -0.046547550708055496, 0.035866037011146545, 0.05417831242084503, 0.04414669796824455, -0.01773003302514553, -0.2837870419025421, 0.042023468762636185, 0.10085251927375793, -0.012727707624435425, -0.23073141276836395, -0.07520025223493576, 0.03267582133412361, -0.05168355628848076, -0.08696987479925156, 0.09624270349740982, 0.055861685425043106, 0.06224466487765312, -0.06716195493936539, -0.10290374606847763, -0.06959021091461182, 0.17739760875701904, -0.12924791872501373, -0.08850162476301193 ]
null
null
transformers
# UNA-SOLAR-10.7B-Instruct-v1.0 ExLlamaV2 8.0 bpw quants of https://huggingface.co/fblgit/UNA-SOLAR-10.7B-Instruct-v1.0
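The card gives no loading instructions. The sketch below follows the pattern of the ExLlamaV2 example scripts and is an assumption rather than documented usage for this repo: the local directory path, sampler values, and the `### User:` / `### Assistant:` prompt framing (SOLAR-Instruct style) are placeholders, and class names can differ between ExLlamaV2 releases.

```python
# Sketch following the ExLlamaV2 example scripts; paths, sampler values and the prompt
# template are assumptions.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/UNA-SOLAR-10.7B-Instruct-v1.0-8bpw-EXL2"  # local copy of this repo
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split the 8.0 bpw weights across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
generator.warmup()

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

prompt = "### User:\nSummarize what this model is.\n\n### Assistant:\n"
print(generator.generate_simple(prompt, settings, 200))
```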
{"language": ["en"], "license": "cc-by-nc-nd-4.0", "library_name": "transformers", "tags": ["alignment-handbook", "generated_from_trainer", "UNA", "single-turn"], "base_model": "upstage/SOLAR-10.7B-Instruct-v1.0", "model-index": [{"name": "UNA-SOLAR-10.7B-Instruct-v1.0", "results": []}]}
text-generation
altomek/UNA-SOLAR-10.7B-Instruct-v1.0-8bpw-EXL2
[ "transformers", "safetensors", "llama", "text-generation", "alignment-handbook", "generated_from_trainer", "UNA", "single-turn", "conversational", "en", "base_model:upstage/SOLAR-10.7B-Instruct-v1.0", "license:cc-by-nc-nd-4.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T19:17:18+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #llama #text-generation #alignment-handbook #generated_from_trainer #UNA #single-turn #conversational #en #base_model-upstage/SOLAR-10.7B-Instruct-v1.0 #license-cc-by-nc-nd-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# UNA-SOLAR-10.7B-Instruct-v1.0 ExLlamaV2 8.0 bpw quants of URL
[ "# UNA-SOLAR-10.7B-Instruct-v1.0\n\nExLlamaV2 8.0 bpw quants of URL" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #alignment-handbook #generated_from_trainer #UNA #single-turn #conversational #en #base_model-upstage/SOLAR-10.7B-Instruct-v1.0 #license-cc-by-nc-nd-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# UNA-SOLAR-10.7B-Instruct-v1.0\n\nExLlamaV2 8.0 bpw quants of URL" ]
[ 107, 28 ]
[ "passage: TAGS\n#transformers #safetensors #llama #text-generation #alignment-handbook #generated_from_trainer #UNA #single-turn #conversational #en #base_model-upstage/SOLAR-10.7B-Instruct-v1.0 #license-cc-by-nc-nd-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# UNA-SOLAR-10.7B-Instruct-v1.0\n\nExLlamaV2 8.0 bpw quants of URL" ]
[ -0.04471826180815697, 0.01090039499104023, -0.0032351852860301733, 0.021019013598561287, 0.12457150220870972, -0.015906190499663353, 0.1867033988237381, 0.0273927953094244, 0.012672368437051773, 0.056392326951026917, 0.0841432511806488, 0.18316400051116943, 0.04831337928771973, 0.07837315648794174, -0.03084956668317318, -0.15889792144298553, 0.07003342360258102, 0.011231247335672379, 0.07456417381763458, 0.04657737910747528, 0.06366203725337982, -0.05153210088610649, 0.07956402748823166, -0.07437527179718018, -0.135841503739357, 0.024116696789860725, -0.0036760345101356506, -0.09874764084815979, 0.14035159349441528, 0.07504326850175858, 0.08784953504800797, 0.10709958523511887, 0.02613135799765587, -0.17245447635650635, 0.01888042315840721, -0.008323850110173225, -0.11252980679273605, 0.08614477515220642, 0.0003855809336528182, 0.060248397290706635, 0.10957371443510056, 0.03720325231552124, 0.0058601126074790955, 0.06974716484546661, -0.11606643348932266, -0.026842191815376282, -0.033246323466300964, 0.017826709896326065, 0.08897402882575989, 0.08712182193994522, 0.004780414514243603, 0.12502922117710114, 0.028404224663972855, 0.06173946335911751, 0.043791234493255615, -0.3393886983394623, -0.0006774518406018615, 0.13437412679195404, -0.020903006196022034, 0.054940365254879, -0.05930047854781151, 0.06136038154363632, 0.10426343977451324, -0.033359505236148834, 0.046295035630464554, -0.019492849707603455, -0.13968954980373383, 0.030276844277977943, -0.11987953633069992, 0.010677042417228222, 0.3297768533229828, 0.07684878259897232, -0.06904793530702591, 0.02003614231944084, -0.09534371644258499, 0.03125615790486336, -0.029732735827565193, 0.003106754971668124, -0.010705647058784962, 0.05435975268483162, -0.05398484319448471, -0.009496977552771568, -0.13894106447696686, -0.0522160679101944, -0.1319064050912857, 0.2298484593629837, 0.004262786358594894, 0.0008089698967523873, -0.004587067756801844, 0.1159222424030304, 0.09030638635158539, -0.11403509229421616, -0.0014883485855534673, -0.03941717743873596, 0.05631715804338455, 0.009169314056634903, -0.03474544361233711, -0.010982486419379711, 0.04053647071123123, 0.20310762524604797, 0.12755194306373596, -0.013481805101037025, -0.0564199760556221, 0.07059342414140701, -0.050773538649082184, 0.07808993011713028, 0.019919902086257935, -0.015932755544781685, 0.07528992742300034, 0.047994017601013184, 0.19592195749282837, -0.03241363540291786, -0.14193706214427948, -0.03218155354261398, -0.0028027473017573357, 0.06506124883890152, 0.04301761835813522, 0.05601195991039276, -0.07158882170915604, 0.01583854854106903, 0.15050536394119263, -0.07282453030347824, -0.024787534028291702, 0.0016062374925240874, 0.09250055998563766, 0.035351984202861786, 0.11073783785104752, 0.0009461494628340006, 0.05291541665792465, -0.07759648561477661, -0.08063492923974991, -0.04732030630111694, -0.057879213243722916, -0.028642933815717697, 0.04166238009929657, 0.08223175257444382, 0.050459425896406174, -0.20784887671470642, -0.10296894609928131, 0.03149043396115303, -0.013632884249091148, -0.08931928128004074, -0.08744224160909653, -0.04049893468618393, -0.038416292518377304, 0.007976310327649117, -0.038805343210697174, -0.050340328365564346, -0.027880989015102386, 0.042761411517858505, 0.10996783524751663, 0.12879353761672974, -0.13589881360530853, 0.02203899249434471, -0.07574156671762466, 0.03582439944148064, -0.07322169840335846, -0.01612258329987526, -0.0658469945192337, 0.12967224419116974, -0.057156868278980255, 0.006396257784217596, -0.07141764461994171, 
0.05122924968600273, 0.10003345459699631, 0.16846820712089539, -0.12915082275867462, -0.042770881205797195, 0.11605431884527206, -0.18382079899311066, -0.20336724817752838, 0.07675618678331375, 0.056581418961286545, 0.10275903344154358, 0.006754124537110329, 0.0507659986615181, -0.058656562119722366, -0.0543302483856678, -0.034953195601701736, 0.05054282024502754, 0.013695835135877132, -0.11055445671081543, 0.03222555294632912, 0.030161282047629356, -0.19128155708312988, 0.03057677485048771, 0.11630398780107498, 0.10338568687438965, -0.043069954961538315, -0.09811810404062271, -0.07947571575641632, -0.026444289833307266, -0.08993931859731674, 0.009255633689463139, -0.012438923120498657, -0.03318331018090248, -0.04444313049316406, 0.01596749573945999, 0.11183270066976547, -0.08883316814899445, 0.0469735749065876, -0.06870631128549576, 0.07919247448444366, -0.10029103606939316, 0.02273360639810562, -0.02753647416830063, -0.00850601028650999, 0.0013846856309100986, 0.042251307517290115, -0.013090096414089203, -0.018109431490302086, 0.010342433117330074, 0.02735140174627304, -0.06994316726922989, -0.05032268539071083, 0.008324339054524899, 0.020389951765537262, -0.04092574119567871, -0.17131508886814117, 0.01562048215419054, -0.030323289334774017, 0.12764865159988403, -0.1966717541217804, 0.058986980468034744, 0.06402435898780823, 0.07274450361728668, -0.02987469546496868, 0.03602074459195137, 0.03194399178028107, 0.07067606598138809, -0.0897594690322876, -0.002144327387213707, 0.06927106529474258, 0.04541638493537903, -0.15808238089084625, 0.06960197538137436, -0.13686245679855347, 0.0815494954586029, 0.1532658338546753, -0.20340941846370697, -0.026317013427615166, 0.029977716505527496, -0.01248639915138483, -0.018291525542736053, 0.024929095059633255, -0.053273461759090424, -0.12208963185548782, -0.05580056458711624, 0.09561526030302048, -0.010794349946081638, 0.029650406911969185, 0.016202080994844437, -0.02279697172343731, -0.04819222167134285, 0.11882946640253067, 0.1737784445285797, -0.09723000228404999, 0.11233396083116531, 0.21002250909805298, -0.016029248014092445, 0.07151658087968826, -0.02396610751748085, -0.05452929809689522, 0.00183383678086102, 0.03984188660979271, 0.045001134276390076, 0.09961812943220139, -0.07561516761779785, 0.029692426323890686, 0.03839185833930969, -0.03445115685462952, 0.04653735086321831, -0.15772131085395813, -0.07058537751436234, -0.002653339644894004, 0.009374882094562054, -0.008240018971264362, 0.022689232602715492, -0.0847458466887474, 0.06635910272598267, -0.027657385915517807, -0.0793752521276474, 0.05336242541670799, -0.0016575895715504885, -0.10291285812854767, 0.22227205336093903, -0.040002159774303436, -0.17283135652542114, -0.24446311593055725, 0.01674746721982956, -0.03580602630972862, 0.036933381110429764, 0.06344390660524368, -0.04276837408542633, -0.0789329782128334, -0.0946798101067543, 0.08905128389596939, 0.0015838767867535353, 0.054954856634140015, 0.03167090564966202, 0.030314994975924492, 0.021129954606294632, -0.1115856021642685, 0.01825156807899475, 0.010151183232665062, -0.14154215157032013, 0.09415747970342636, -0.07566317915916443, 0.1630983203649521, 0.09841941297054291, 0.053521811962127686, -0.0669080913066864, 0.016657734289765358, 0.16303953528404236, -0.07649677246809006, 0.039131563156843185, 0.20661303400993347, 0.04220066964626312, 0.026310574263334274, 0.1493072509765625, 0.06265256553888321, -0.10494598746299744, 0.04863208532333374, -0.07883092015981674, -0.06763159483671188, -0.1886591613292694, 
-0.08929748088121414, -0.07621652632951736, 0.06676032394170761, 0.019887279719114304, 0.06782867759466171, 0.08929935097694397, 0.17158570885658264, -0.0446331761777401, 0.009434397332370281, -0.05662567541003227, 0.13119186460971832, 0.2527540922164917, 0.017692806199193, 0.16250410676002502, -0.1271875649690628, -0.03963305428624153, 0.05257466062903404, 0.07750275731086731, 0.06676293909549713, 0.09904471784830093, 0.03328933194279671, 0.05383432283997536, 0.09993015974760056, 0.06666569411754608, 0.11023470014333725, 0.016237283125519753, -0.07501073181629181, -0.021515345200896263, -0.0659392699599266, -0.029848437756299973, 0.05962331220507622, -0.17923155426979065, 0.03303718566894531, -0.047743797302246094, -0.09558463096618652, 0.048507966101169586, 0.04508541524410248, 0.11764971166849136, -0.22527723014354706, -0.11061276495456696, 0.09665096551179886, 0.020280050113797188, -0.059971313923597336, 0.04276801273226738, 0.067823126912117, -0.004127162974327803, 0.08034110814332962, -0.015361744910478592, 0.0663081482052803, -0.01995653472840786, 0.034162163734436035, -0.08506406098604202, 0.06014325097203255, -0.020245052874088287, 0.08670210093259811, -0.2713994085788727, 0.2165299504995346, 0.02579723857343197, 0.04341093450784683, 0.04434046521782875, 0.03399262949824333, -0.03499779477715492, 0.1925823986530304, 0.0492633692920208, -0.024051060900092125, -0.09840419143438339, -0.11980035156011581, -0.02750452049076557, 0.08735665678977966, 0.008536656387150288, 0.0018155272118747234, 0.05885659530758858, -0.004864697344601154, 0.0261048786342144, 0.06146378070116043, 0.08914155513048172, -0.14301447570323944, -0.1567452996969223, 0.051614172756671906, 0.12300709635019302, 0.11918014287948608, -0.04341335594654083, -0.03983146697282791, -0.14180870354175568, 0.07487639784812927, -0.17149893939495087, -0.014448915608227253, -0.06840778142213821, -0.0547386072576046, 0.04890488460659981, -0.06445338577032089, 0.0789426788687706, -0.03547538444399834, -0.016125647351145744, -0.0008731488487683237, -0.13877196609973907, 0.09135140478610992, -0.1243070662021637, -0.09938714653253555, 0.006483884993940592, 0.08041398227214813, -0.057058122009038925, -0.028321364894509315, 0.0009515549172647297, 0.020826902240514755, -0.13575316965579987, -0.14225855469703674, 0.09457886219024658, -0.04784887656569481, 0.016401071101427078, -0.029912980273365974, -0.1420210897922516, -0.07125774025917053, 0.0042061652056872845, -0.13694746792316437, 0.0820021778345108, 0.3070721924304962, -0.050579797476530075, 0.02790556289255619, 0.18867109715938568, -0.03952087461948395, -0.2377469390630722, -0.10700243711471558, -0.16732768714427948, 0.011625677347183228, 0.054922595620155334, -0.22410349547863007, 0.07435096055269241, 0.059894293546676636, -0.046782441437244415, 0.004425476770848036, -0.13095618784427643, -0.10365985333919525, 0.13494496047496796, 0.07900257408618927, 0.2020418792963028, -0.16020183265209198, -0.07093467563390732, -0.0991402342915535, -0.24320083856582642, 0.11533045768737793, -0.051710713654756546, 0.08825743943452835, -0.03654690086841583, 0.06495912373065948, -0.0001847475505201146, -0.008129146881401539, 0.19755275547504425, -0.05250263959169388, 0.0038053097669035196, -0.07192225754261017, 0.006472594570368528, 0.059780560433864594, 0.04101608321070671, 0.018585706129670143, -0.09398671239614487, 0.044824086129665375, -0.07100127637386322, -0.014868141151964664, -0.008667485788464546, 0.04081287235021591, -0.04483739286661148, -0.07116605341434479, -0.017026908695697784, 
-0.03977983444929123, 0.025240225717425346, -0.008390059694647789, 0.2513057589530945, -0.03624333068728447, 0.18114109337329865, 0.1594884991645813, 0.0966312363743782, -0.02564629167318344, 0.14784450829029083, -0.0416521318256855, -0.09433137625455856, 0.0791483148932457, -0.15259307622909546, 0.04025772213935852, 0.01813826709985733, -0.03203151002526283, 0.0003839889832306653, 0.11266960203647614, 0.009171320125460625, 0.03621453046798706, 0.13889232277870178, -0.1427440196275711, -0.11485117673873901, -0.029716886579990387, 0.10113906115293503, -0.011614866554737091, 0.22839432954788208, 0.12273028492927551, -0.0537043996155262, -0.005891313776373863, 0.03281167522072792, 0.013395339250564575, -0.023065153509378433, 0.010474937967956066, 0.07516150921583176, 0.05704038590192795, -0.06229157745838165, 0.08925868570804596, 0.03293608874082565, -0.03188146650791168, -0.024839036166667938, 0.004519738722592592, -0.07406322658061981, -0.09350644052028656, -0.05976016819477081, -0.033938948065042496, -0.24552151560783386, -0.07924743741750717, -0.10897163301706314, -0.12221533805131912, 0.016977878287434578, 0.09027501940727234, 0.06148037686944008, 0.018772313371300697, 0.0276956669986248, 0.029178403317928314, -0.02725716307759285, 0.013428930193185806, -0.11601080000400543, 0.09440477192401886, -0.12046501040458679, -0.010222581215202808, -0.04729796573519707, 0.04482690989971161, -0.06717969477176666, -0.014179492369294167, -0.15885721147060394, 0.0031275677029043436, -0.05935359373688698, -0.07086579501628876, -0.10137765109539032, -0.020205356180667877, 0.010211615823209286, -0.07706527411937714, -0.08125578612089157, 0.02674584835767746, -0.045539502054452896, 0.014970861375331879, -0.01433477271348238, 0.06668567657470703, -0.044741190969944, 0.027990248054265976, 0.04848731681704521, -0.10546775907278061, 0.05658230185508728, 0.014551891945302486, 0.008608970791101456, 0.014274846762418747, -0.22809718549251556, 0.04859103634953499, 0.049979012459516525, 0.014426388777792454, -0.0379500649869442, 0.01820574514567852, -0.04202460125088692, 0.04398764669895172, -0.039180152118206024, -0.011304304003715515, 0.1347471922636032, -0.08860025554895401, 0.006166993174701929, -0.06930138915777206, -0.07204408198595047, -0.06358560919761658, 0.027785375714302063, -0.012848593294620514, -0.011838700622320175, 0.07414078712463379, -0.1183101013302803, -0.058241136372089386, -0.1354934573173523, 0.037485916167497635, -0.003661372233182192, -0.11262281239032745, -0.0814412385225296, -0.0829969197511673, 0.026166455820202827, 0.03471212089061737, 0.15795265138149261, -0.13622386753559113, 0.011723761446774006, 0.02962976135313511, 0.05261413753032684, 0.07419317215681076, 0.00649625901132822, 0.35523107647895813, 0.12578438222408295, 0.05506507307291031, -0.05056566745042801, 0.022701621055603027, 0.032693665474653244, 0.03836703300476074, -0.03526494279503822, 0.04034924879670143, -0.18039865791797638, 0.1250283122062683, 0.038730520755052567, -0.05264586955308914, -0.010205160826444626, -0.00622857641428709, -0.08572080731391907, -0.0092450687661767, -0.045706041157245636, 0.13488967716693878, 0.16345004737377167, 0.012545540928840637, -0.002866840222850442, -0.008872239850461483, -0.005891292821615934, -0.1021285355091095, -0.10892369598150253, -0.09413038939237595, -0.21076545119285583, 0.02251897193491459, -0.07154746353626251, -0.041282977908849716, 0.09888540953397751, -0.019367849454283714, -0.00045637175207957625, 0.07126019895076752, 0.15073873102664948, -0.006156661547720432, 
-0.028991658240556717, -0.015671461820602417, -0.0019398831063881516, -0.09598436951637268, -0.06038131192326546, 0.030307823792099953, 0.03763123229146004, -0.026544056832790375, 0.03883112594485283, 0.05902474746108055, 0.09272196888923645, -0.02818595990538597, -0.08686619251966476, -0.0025372887030243874, 0.0040343222208321095, 0.006541967857629061, 0.04344020038843155, -0.02816111408174038, 0.017207499593496323, 0.03854604810476303, 0.2513260245323181, -0.0793704017996788, -0.048907820135354996, -0.03914659470319748, 0.18169458210468292, -0.07351426035165787, 0.0957103818655014, -0.01189684122800827, -0.11250628530979156, -0.014057775028049946, 0.24841393530368805, 0.08449054509401321, -0.00910827238112688, 0.03427286446094513, 0.0373307429254055, 0.00047511086449958384, 0.022185036912560463, 0.07922367006540298, 0.097428098320961, 0.10557380318641663, -0.06587297469377518, -0.05970318242907524, -0.03427721560001373, 0.004600297659635544, -0.07415668666362762, 0.03389960899949074, -0.07038730382919312, -0.062301065772771835, -0.0449904166162014, 0.05872856825590134, -0.017637060955166817, 0.0021429567132145166, 0.028986437246203423, -0.14002612233161926, -0.05548384413123131, -0.11090449243783951, 0.16867102682590485, -0.03420022502541542, -0.01264270395040512, -0.08460167795419693, 0.006223724689334631, 0.048377230763435364, 0.0470966137945652, -0.0562405027449131, -0.0028318578843027353, 0.021770674735307693, -0.04770161211490631, 0.07311270385980606, -0.03163972869515419, 0.04583505913615227, 0.09853129833936691, 0.03753528743982315, -0.057881273329257965, 0.1672987937927246, 0.047245629131793976, -0.12010142207145691, 0.071739062666893, 0.03802360221743584, -0.009102833457291126, 0.10206815600395203, 0.014635634608566761, -0.07762708514928818, 0.034007176756858826, 0.08754566311836243, -0.08950605988502502, 0.02080526202917099, -0.028048114851117134, -0.08372890949249268, 0.11776796728372574, 0.004495932254940271, -0.08575589209794998, -0.049217429012060165, -0.04584573581814766, 0.04736502468585968, 0.025142375379800797, -0.007003589067608118, 0.032537367194890976, -0.09875143319368362, -0.013971148058772087, 0.004429294727742672, 0.05308765172958374, -0.19051064550876617, -0.031482405960559845, -0.030192386358976364, 0.0006595922750420868, -0.16279847919940948, 0.04347725957632065, 0.18230637907981873, -0.011907094158232212, -0.062391869723796844, -0.23566792905330658, 0.008278320543467999, 0.10358340293169022, -0.07598693668842316, -0.09514419734477997 ]
null
null
diffusers
### My-Pet-Dog Dreambooth model trained by srsrs following the "Build your own Gen AI model" session by NxtWave. Project Submission Code: U20CC053 Sample pictures of this concept: ![0](https://huggingface.co/srsrs/my-pet-dog/resolve/main/sample_images/vit_(1).jpg)
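The card shows only a sample image; a short Diffusers sketch for generating with this DreamBooth checkpoint is given below. The pipeline call is standard `StableDiffusionPipeline` usage, but the trigger phrase in the prompt is an assumption, since the card does not state the instance token.

```python
# Standard Diffusers inference; the trigger phrase "my pet dog" is an assumption,
# since the card does not state the DreamBooth instance token.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("srsrs/my-pet-dog", torch_dtype=torch.float16)
pipe = pipe.to("cuda")

image = pipe("a photo of my pet dog sitting on a beach", num_inference_steps=30).images[0]
image.save("my_pet_dog_sample.png")
```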
{"license": "creativeml-openrail-m", "tags": ["NxtWave-GenAI-Webinar", "text-to-image", "stable-diffusion"]}
text-to-image
srsrs/my-pet-dog
[ "diffusers", "safetensors", "NxtWave-GenAI-Webinar", "text-to-image", "stable-diffusion", "license:creativeml-openrail-m", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us" ]
2024-02-13T19:17:52+00:00
[]
[]
TAGS #diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
### My-Pet-Dog Dreambooth model trained by srsrs following the "Build your own Gen AI model" session by NxtWave. Project Submission Code: U20CC053 Sample pictures of this concept: !0.jpg)
[ "### My-Pet-Dog Dreambooth model trained by srsrs following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: U20CC053\n\nSample pictures of this concept:\n\n !0.jpg)" ]
[ "TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n", "### My-Pet-Dog Dreambooth model trained by srsrs following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: U20CC053\n\nSample pictures of this concept:\n\n !0.jpg)" ]
[ 73, 58 ]
[ "passage: TAGS\n#diffusers #safetensors #NxtWave-GenAI-Webinar #text-to-image #stable-diffusion #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n### My-Pet-Dog Dreambooth model trained by srsrs following the \"Build your own Gen AI model\" session by NxtWave.\n\nProject Submission Code: U20CC053\n\nSample pictures of this concept:\n\n !0.jpg)" ]
[ -0.09152603149414062, 0.11667817831039429, -0.0015675926115363836, -0.01782039739191532, 0.10443507134914398, -0.011875038035213947, 0.20468513667583466, -0.006274884101003408, 0.00007445988740073517, 0.030742254108190536, 0.11998769640922546, 0.08835230767726898, 0.011105290614068508, 0.14891159534454346, -0.004389310255646706, -0.12415154278278351, 0.060521360486745834, 0.06101261451840401, -0.008009969256818295, 0.03975958749651909, 0.04881605878472328, -0.07659055292606354, 0.11778612434864044, -0.014650627970695496, -0.17315934598445892, -0.0014444160042330623, -0.05282146856188774, -0.011470307596027851, 0.049281518906354904, 0.009247605688869953, 0.05165790021419525, 0.10534454137086868, 0.057840634137392044, -0.018262295052409172, 0.04072899743914604, 0.01263571996241808, -0.06319166719913483, 0.05549928918480873, 0.06555600464344025, 0.03990258648991585, 0.10616786032915115, 0.04242055118083954, -0.0608086958527565, 0.03350701555609703, -0.05448948219418526, -0.01605929248034954, 0.04485902935266495, 0.1482909470796585, 0.12151070684194565, 0.10377135127782822, -0.005254296585917473, 0.08769911527633667, 0.06233548745512962, 0.11282581835985184, 0.12520481646060944, -0.2503584325313568, -0.09263067692518234, 0.20250122249126434, 0.01488637924194336, 0.02039671689271927, -0.03887636214494705, 0.08594962954521179, 0.09852907806634903, -0.023020651191473007, 0.00939151644706726, -0.05225531756877899, 0.049285486340522766, -0.09709975868463516, -0.12659375369548798, 0.0001421909691998735, 0.20538756251335144, 0.056749388575553894, -0.05604056641459465, -0.02543177269399166, -0.0947931632399559, -0.004632844589650631, -0.07870921492576599, -0.00024117165594361722, -0.05021439865231514, 0.035291194915771484, -0.04726662486791611, -0.04148196801543236, -0.1345927119255066, -0.0817648395895958, -0.04073004424571991, 0.15758217871189117, 0.03176749497652054, 0.06832471489906311, -0.12063179165124893, 0.10131201148033142, 0.011705395765602589, -0.10277879983186722, 0.05023084208369255, -0.06744161993265152, 0.01263076439499855, 0.06956928968429565, 0.02417393960058689, -0.10680568218231201, 0.10820704698562622, -0.007734284736216068, 0.038814857602119446, -0.04323471337556839, 0.04742129519581795, 0.08350080996751785, -0.006591761484742165, -0.06507512181997299, -0.09966979175806046, -0.06437516957521439, -0.00035091041354462504, -0.0591256208717823, -0.0026399653870612383, -0.008159294724464417, -0.0953950360417366, 0.0025877230800688267, -0.06618822365999222, 0.020210523158311844, 0.05294115096330643, 0.08256617933511734, 0.002635186305269599, -0.018462784588336945, 0.1934812366962433, 0.059422656893730164, -0.038127556443214417, -0.015103572979569435, 0.03986712172627449, 0.06554843485355377, 0.04768916964530945, -0.0061966911889612675, -0.007721826899796724, 0.040420226752758026, -0.06860462576150894, -0.04893196001648903, -0.04020492359995842, -0.0382566899061203, -0.010307404212653637, -0.16099640727043152, 0.05651014298200607, -0.19628295302391052, -0.10118888318538666, 0.0383470319211483, 0.042632702738046646, -0.020392630249261856, -0.05707848072052002, -0.04419320449233055, -0.1042335256934166, 0.015637675300240517, -0.00023965239233803004, 0.0019005389185622334, -0.03316594287753105, 0.034391552209854126, -0.006785610690712929, 0.08955763280391693, -0.23928111791610718, 0.01232944056391716, -0.06664705276489258, -0.001579879317432642, 0.02255547046661377, 0.008809264749288559, -0.02387624979019165, 0.13053788244724274, -0.004144470207393169, -0.0016970435390248895, 
-0.062088217586278915, -0.0017340275226160884, 0.023992929607629776, 0.15261317789554596, -0.10068152099847794, 0.011329344473779202, 0.17220236361026764, -0.14406602084636688, -0.1561482846736908, 0.11605214327573776, 0.06419558823108673, 0.10399823635816574, 0.07771220803260803, 0.11036320775747299, 0.11071159690618515, -0.21922867000102997, -0.014979800209403038, 0.035414110869169235, -0.1174226924777031, -0.18104758858680725, 0.005079406313598156, 0.10333799570798874, -0.08748306334018707, 0.021088354289531708, -0.1066332459449768, 0.1185542494058609, -0.0850360095500946, -0.02504604682326317, -0.04388725385069847, -0.12535476684570312, -0.009338309057056904, 0.023848172277212143, 0.05054866150021553, -0.005196436308324337, 0.0203066598623991, -0.1305040717124939, 0.03382575884461403, -0.025697220116853714, -0.02733227238059044, -0.08766359090805054, 0.06954814493656158, -0.055894166231155396, 0.033078715205192566, -0.039303749799728394, -0.03202396258711815, 0.022853132337331772, 0.14980745315551758, 0.017828483134508133, 0.1916096955537796, 0.05571971461176872, 0.045770157128572464, 0.0017816860927268863, -0.09193635731935501, 0.08724766969680786, 0.011664008721709251, -0.06734725087881088, -0.15535573661327362, 0.13069893419742584, -0.06378281861543655, -0.05736776441335678, -0.13665421307086945, 0.06232094392180443, 0.026838896796107292, 0.14195996522903442, 0.035034384578466415, -0.0004639760300051421, 0.02434362657368183, -0.014508401975035667, -0.05464489012956619, 0.003080048132687807, 0.08175654709339142, 0.053566500544548035, -0.07652736455202103, 0.1717894822359085, -0.09304913878440857, 0.14430847764015198, 0.11093342304229736, -0.033660296350717545, -0.026471946388483047, 0.0526285320520401, -0.05979251116514206, 0.013321966864168644, -0.0033107372000813484, -0.024227790534496307, -0.005950750317424536, -0.051593318581581116, 0.11163908243179321, -0.04825786501169205, -0.004760283045470715, 0.05577186495065689, -0.037225548177957535, -0.005540680140256882, 0.09930434823036194, 0.020167704671621323, -0.12454165518283844, 0.1376505345106125, 0.10041123628616333, -0.0019791019149124622, 0.17734895646572113, 0.016492318361997604, -0.02417522482573986, -0.023988202214241028, 0.08869896084070206, 0.028950924053788185, 0.19842685759067535, -0.11375350505113602, 0.024927467107772827, 0.023414205759763718, 0.0022420205641537905, 0.03755408152937889, -0.13360214233398438, -0.04293731972575188, -0.03058895468711853, -0.03665781393647194, 0.09757959097623825, 0.08793599158525467, -0.11187994480133057, 0.09871995449066162, -0.08728733658790588, -0.09328634291887283, 0.03820768743753433, -0.0032729373779147863, -0.07295683026313782, 0.08551791310310364, -0.05463157966732979, -0.20843437314033508, -0.13901953399181366, -0.06088878586888313, -0.05432392284274101, 0.0077741206623613834, 0.07893389463424683, -0.07239392399787903, -0.023333467543125153, -0.05309474468231201, -0.05647088587284088, -0.07226454466581345, 0.07396125048398972, 0.06929536163806915, 0.04357074946165085, 0.008465300314128399, -0.01671152561903, 0.02013501152396202, -0.04589821398258209, 0.018514359369874, 0.09677771478891373, 0.031177571043372154, 0.18389037251472473, 0.058638233691453934, -0.01209358312189579, -0.03689390420913696, 0.0046702101826667786, 0.24081401526927948, -0.027217702940106392, 0.05888516083359718, 0.12586063146591187, 0.027141934260725975, 0.06616847217082977, 0.18018770217895508, 0.03669091686606407, -0.08518455177545547, 0.054481763392686844, -0.06148238852620125, -0.11956045031547546, 
-0.11772456020116806, -0.06553451716899872, -0.06586650758981705, 0.13672798871994019, 0.011457928456366062, 0.0874294713139534, 0.05778709799051285, 0.18313343822956085, -0.007790182251483202, 0.00023344048531726003, -0.059091128408908844, 0.10034424811601639, -0.008956004865467548, -0.03289905562996864, 0.04251479357481003, -0.11053536832332611, -0.059474654495716095, 0.08615274727344513, 0.041846249252557755, 0.14577265083789825, 0.036429207772016525, -0.016245616599917412, 0.09620781242847443, 0.13049356639385223, 0.13138458132743835, 0.11600935459136963, -0.041547417640686035, -0.07675772905349731, -0.0021123590413480997, -0.07179653644561768, 0.08702948689460754, 0.06573708355426788, -0.08026675134897232, -0.031743522733449936, 0.047819800674915314, 0.07124075293540955, -0.015610533766448498, 0.15010470151901245, 0.07716930657625198, -0.25062546133995056, 0.0048093730583786964, -0.015126291662454605, 0.05012696608901024, -0.04609573632478714, 0.005911487620323896, 0.23452560603618622, 0.002287322888150811, 0.06825320422649384, -0.019129352644085884, 0.07805724442005157, 0.03998623788356781, 0.01583136059343815, -0.055317871272563934, 0.021619366481900215, -0.0008825394907034934, 0.03626388683915138, -0.20604975521564484, 0.17828872799873352, -0.026079263538122177, 0.05959688872098923, 0.015135735273361206, -0.0258820541203022, -0.035413824021816254, 0.19378894567489624, 0.16483208537101746, 0.015898022800683975, -0.060074079781770706, -0.04402417689561844, -0.10670077055692673, 0.023315276950597763, 0.03443434089422226, 0.011492010205984116, 0.0280300285667181, 0.056391358375549316, -0.02093314379453659, -0.012217016890645027, 0.027713630348443985, -0.1889684796333313, -0.11971358209848404, -0.03336619958281517, 0.23502925038337708, 0.09777282923460007, -0.016044357791543007, 0.04575009271502495, -0.06505919247865677, 0.07429569214582443, -0.22802172601222992, -0.08061865717172623, -0.08334659039974213, -0.05794467777013779, -0.05761599540710449, -0.046585001051425934, -0.0061051915399730206, -0.09079516679048538, 0.056866295635700226, -0.03877562656998634, -0.13876619935035706, -0.004457033704966307, -0.2002241164445877, -0.12033788859844208, -0.10491546988487244, 0.014417300000786781, 0.053474921733140945, 0.014866044744849205, 0.007686353754252195, -0.05958590283989906, -0.04734911769628525, -0.09528172016143799, 0.02218269370496273, 0.0795285552740097, -0.03585822507739067, -0.04437457397580147, -0.10143726319074631, -0.14531730115413666, -0.03682069480419159, -0.059845004230737686, 0.08227477967739105, 0.23277351260185242, -0.07628846168518066, 0.062188487499952316, 0.21905460953712463, -0.04934847727417946, -0.25917235016822815, -0.11311928182840347, -0.025876328349113464, -0.021152731031179428, -0.013002770021557808, -0.11223463714122772, 0.14609552919864655, 0.023195799440145493, -0.04367676004767418, 0.1454920619726181, -0.2225586622953415, -0.0622548833489418, 0.04749021679162979, 0.1548304259777069, 0.3081791400909424, -0.17995324730873108, -0.03940080478787422, -0.006598612293601036, -0.14317858219146729, 0.16816729307174683, -0.004089589696377516, 0.05596613883972168, -0.04347426816821098, 0.007760819513350725, -0.025202253833413124, -0.022197075188159943, 0.08987061679363251, -0.02781941555440426, 0.039081986993551254, -0.06182604283094406, 0.09724298119544983, 0.143108531832695, -0.0031492803245782852, 0.03162587061524391, -0.10779449343681335, 0.03501950576901436, -0.10164260864257812, 0.0031613248866051435, -0.01687624678015709, -0.01482054777443409, 
-0.026933902874588966, -0.11920049041509628, -0.07165069878101349, -0.021214768290519714, 0.026326285675168037, 0.006309219170361757, -0.016629347577691078, -0.009418896399438381, 0.01749834045767784, 0.14538244903087616, 0.012893891893327236, -0.05302798002958298, -0.021792592480778694, -0.0863872691988945, -0.060193274170160294, 0.12537343800067902, -0.021989913657307625, -0.03205399215221405, 0.09923931956291199, -0.007317423354834318, 0.023537639528512955, 0.034473832696676254, -0.02935287542641163, 0.04307084158062935, 0.13293571770191193, -0.15704311430454254, -0.161338210105896, -0.015329400077462196, 0.15972404181957245, 0.06348542124032974, 0.12027190625667572, 0.12342068552970886, -0.09343515336513519, 0.04368797689676285, -0.041898950934410095, 0.0004016560560557991, -0.02171298675239086, 0.05438802391290665, -0.040858738124370575, 0.04250933974981308, -0.06870099902153015, 0.021547727286815643, -0.044214580208063126, -0.05966239422559738, -0.031047584488987923, 0.0046490151435136795, -0.12546807527542114, -0.07119999080896378, 0.054890889674425125, 0.12418310344219208, -0.13549645245075226, -0.1084461361169815, -0.05718943104147911, -0.07984492182731628, 0.05015861988067627, 0.029066642746329308, 0.018427424132823944, 0.004453142639249563, 0.0578392930328846, 0.002040076768025756, -0.05597301945090294, 0.01973530277609825, -0.044680219143629074, 0.10983972251415253, -0.2510836124420166, -0.05996795371174812, -0.009232058189809322, 0.013194574043154716, -0.07986744493246078, -0.00014378727064467967, -0.1059996485710144, 0.01603449136018753, -0.001659751171246171, 0.08621194213628769, -0.1195075735449791, -0.08496436476707458, -0.01057684700936079, -0.01614810898900032, -0.03315860033035278, 0.016276760026812553, -0.040220558643341064, 0.047610364854335785, 0.0405576191842556, 0.013417030684649944, -0.005061632487922907, 0.012184703722596169, -0.03965781629085541, -0.024109959602355957, 0.06528468430042267, 0.00030218687606975436, -0.10307660698890686, -0.028703512623906136, -0.259921669960022, 0.014167683199048042, 0.11673039197921753, 0.009065159596502781, -0.00230129761621356, 0.13283056020736694, 0.00038364683859981596, 0.033275704830884933, 0.03281305357813835, -0.007586476393043995, 0.059365421533584595, -0.10310802608728409, -0.0013436103472486138, -0.06074925884604454, 0.040503568947315216, -0.07856310158967972, -0.035260915756225586, 0.10544591397047043, 0.06083231419324875, 0.15240159630775452, -0.12511180341243744, 0.02702835574746132, -0.013215480372309685, 0.02556619979441166, 0.0856262594461441, -0.06296119093894958, 0.006092351395636797, -0.0799882784485817, -0.022463619709014893, 0.025338273495435715, 0.09842096269130707, -0.0459626130759716, -0.23245437443256378, 0.0005011697649024427, -0.08936875313520432, -0.01766492798924446, -0.03343459591269493, 0.26329073309898376, 0.016055403277277946, 0.003604956204071641, -0.15125301480293274, 0.07937262952327728, 0.08873941004276276, 0.05336675047874451, 0.03407770395278931, 0.07096026837825775, -0.005426439456641674, 0.08740103244781494, 0.06675242632627487, 0.037445615977048874, -0.10054358094930649, 0.014654836617410183, -0.16989123821258545, 0.1335858702659607, -0.044579289853572845, 0.11589325964450836, 0.1902172714471817, -0.01654309220612049, -0.02879820205271244, 0.08866946399211884, 0.007214994169771671, -0.04613794758915901, -0.18040601909160614, -0.041403740644454956, -0.1735881119966507, 0.013067533262073994, -0.03322823345661163, -0.04374280571937561, -0.009309304878115654, 0.05470392853021622, 
-0.04969743266701698, 0.058889277279376984, 0.07036643475294113, -0.0006915273261256516, 0.06360860913991928, -0.006290961056947708, -0.04613927751779556, 0.03519869223237038, -0.0038275064434856176, 0.03612753003835678, 0.019452592357993126, -0.00979992188513279, 0.06083226203918457, -0.0018279296346008778, 0.05688278004527092, 0.029278215020895004, -0.025033453479409218, -0.016487639397382736, 0.005817955359816551, 0.012998556718230247, 0.125589519739151, 0.046166591346263885, -0.031856875866651535, 0.0061846221797168255, 0.09917516261339188, -0.025743987411260605, -0.03534140810370445, -0.08946388959884644, 0.04288100451231003, -0.13158120214939117, 0.07230374217033386, -0.017315909266471863, -0.04073546826839447, -0.07895997911691666, 0.27449074387550354, 0.14285314083099365, -0.04738624021410942, 0.015915103256702423, -0.08026998490095139, 0.013215325772762299, -0.06976916640996933, 0.0963076502084732, 0.05755295976996422, 0.29235854744911194, -0.024260664358735085, -0.03212795779109001, -0.1112867221236229, -0.03304195776581764, -0.10585411638021469, -0.12381479889154434, 0.0077606202103197575, -0.045973245054483414, -0.08545161783695221, 0.08041328191757202, -0.20332226157188416, -0.07015526294708252, 0.11025042086839676, 0.026802033185958862, 0.00875278189778328, -0.03945312649011612, 0.07948379218578339, 0.03503986820578575, 0.026234248653054237, -0.10527849197387695, 0.0693928524851799, 0.026948783546686172, -0.05277375504374504, -0.06734728813171387, 0.07826430350542068, -0.032224517315626144, -0.1539006233215332, 0.13802793622016907, -0.019792187958955765, 0.0012729428708553314, 0.06534066796302795, -0.052740562707185745, -0.15951095521450043, 0.09340246021747589, -0.06691261380910873, -0.09516840428113937, -0.012465642765164375, 0.09210696071386337, 0.0021271950099617243, -0.013407229445874691, 0.009281907230615616, -0.05212414637207985, -0.037225138396024704, 0.0613914355635643, 0.04267430678009987, -0.0937306359410286, 0.08979526162147522, -0.05216594785451889, 0.0892777219414711, -0.006253660190850496, -0.029973022639751434, -0.023310860618948936, -0.01003598514944315, 0.03612828254699707, 0.0016317175468429923, -0.07983160018920898, 0.0594065897166729, -0.15352857112884521, -0.038104165345430374, 0.08223041146993637, 0.05358264967799187, -0.20415830612182617, -0.011059493757784367, -0.16859588027000427, 0.017060140147805214, -0.06582973152399063, 0.014333201572299004, 0.22641046345233917, 0.005251770839095116, -0.005223835818469524, -0.1505524218082428, -0.04397658258676529, 0.05286959558725357, 0.011379175819456577, -0.14302599430084229 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.8.2
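The card's "How to Get Started with the Model" section is left as [More Information Needed]. A minimal sketch of what that snippet could look like, assuming the adapter published under this record's id (KapitalK/GPT-XL2) is a standard PEFT adapter for the declared base model openai-community/gpt2-xl; the prompt is a placeholder:

```python
# Hypothetical loading sketch: assumes KapitalK/GPT-XL2 contains PEFT adapter
# weights trained on top of openai-community/gpt2-xl, as the card metadata declares.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("openai-community/gpt2-xl")
tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2-xl")

# Attach the adapter on top of the frozen base model.
model = PeftModel.from_pretrained(base, "KapitalK/GPT-XL2")

inputs = tokenizer("The quick brown fox", return_tensors="pt")  # placeholder prompt
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```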
{"library_name": "peft", "base_model": "openai-community/gpt2-xl"}
null
KapitalK/GPT-XL2
[ "peft", "arxiv:1910.09700", "base_model:openai-community/gpt2-xl", "region:us" ]
2024-02-13T19:18:15+00:00
[ "1910.09700" ]
[]
TAGS #peft #arxiv-1910.09700 #base_model-openai-community/gpt2-xl #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.8.2
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ "TAGS\n#peft #arxiv-1910.09700 #base_model-openai-community/gpt2-xl #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ 34, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #arxiv-1910.09700 #base_model-openai-community/gpt2-xl #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2" ]
[ -0.10944072157144547, 0.1947823464870453, -0.003353343578055501, 0.03594445064663887, 0.09093703329563141, 0.023151764646172523, 0.05293167382478714, 0.12669996917247772, -0.030834853649139404, 0.10282713919878006, 0.06382612884044647, 0.10728568583726883, 0.10327816009521484, 0.2011939287185669, 0.007335085887461901, -0.19108857214450836, 0.031972840428352356, -0.09058409184217453, -0.015165152959525585, 0.12072603404521942, 0.14970490336418152, -0.09543144702911377, 0.07926523685455322, -0.011417927220463753, -0.019745932891964912, -0.04231948405504227, -0.07989784330129623, -0.030933713540434837, 0.03846520930528641, 0.04920923709869385, 0.050888147205114365, -0.0003864439786411822, 0.08180375397205353, -0.26250001788139343, 0.016710516065359116, 0.043245796114206314, -0.010520601645112038, 0.0811990424990654, 0.10111361742019653, -0.03772459551692009, 0.12633462250232697, -0.03158261254429817, 0.14542149007320404, 0.07804311811923981, -0.09352491050958633, -0.20273341238498688, -0.07368503510951996, 0.07469208538532257, 0.1675443798303604, 0.08164195716381073, -0.0438992902636528, 0.14158234000205994, -0.10815612971782684, 0.014837509021162987, 0.04284926876425743, -0.062474727630615234, -0.07367338240146637, 0.06164265796542168, 0.10417385399341583, 0.047217462211847305, -0.139267235994339, -0.03478344902396202, 0.01862514019012451, 0.03664027899503708, 0.07339321821928024, 0.019428474828600883, 0.14080913364887238, 0.031787242740392685, -0.15022636950016022, -0.04239985719323158, 0.1439979076385498, 0.03573906794190407, -0.04020191356539726, -0.2244696319103241, 0.011940413154661655, -0.08507007360458374, -0.024923410266637802, -0.04768023267388344, 0.038705311715602875, -0.00033921547583304346, 0.0856306180357933, -0.029342766851186752, -0.09008584916591644, -0.020190251991152763, 0.09163089841604233, 0.0534796342253685, 0.02895314432680607, -0.0238115843385458, 0.0029095930512994528, 0.12793204188346863, 0.05927468091249466, -0.12400132417678833, -0.06922591477632523, -0.06908059865236282, -0.04979671910405159, -0.053895242512226105, 0.030116090551018715, 0.03634670004248619, 0.05951434373855591, 0.24458588659763336, -0.021928803995251656, 0.055131129920482635, 0.06332056224346161, 0.019924823194742203, 0.05170498043298721, 0.09420084953308105, -0.06185983493924141, -0.13816048204898834, -0.024021735414862633, 0.09061964601278305, -0.015159261412918568, -0.020146740600466728, -0.04602694511413574, 0.03410838916897774, 0.045287396758794785, 0.10082076489925385, 0.09724103659391403, 0.004124699626117945, -0.07557135820388794, -0.05655656382441521, 0.20923402905464172, -0.14592255651950836, 0.03894606977701187, 0.016127996146678925, -0.02757573500275612, -0.056563105434179306, 0.007880115881562233, 0.016954511404037476, -0.02888290025293827, 0.0888478234410286, -0.06581390649080276, -0.03544197604060173, -0.12100638449192047, -0.015422397293150425, 0.038722436875104904, 0.007003475911915302, -0.021259084343910217, -0.02491041086614132, -0.06812500208616257, -0.09506494551897049, 0.10750292241573334, -0.07226914167404175, -0.06937359273433685, -0.032288551330566406, -0.0934503898024559, 0.01816636137664318, 0.028837935999035835, 0.12439020723104477, -0.025908956304192543, 0.04054786637425423, -0.02296469919383526, 0.059265799820423126, 0.08038046211004257, 0.037636008113622665, -0.06761174649000168, 0.05705371871590614, -0.18351249396800995, 0.0966126099228859, -0.08195896446704865, 0.02235630713403225, -0.1536521464586258, -0.014127005822956562, 0.010008358396589756, 
0.019234206527471542, 0.03264787793159485, 0.14939363300800323, -0.19307735562324524, -0.02197428233921528, 0.16256912052631378, -0.09758635610342026, -0.12122322618961334, 0.037983085960149765, -0.058950066566467285, 0.16108039021492004, 0.01939433440566063, -0.015340207144618034, 0.09153001010417938, -0.16100166738033295, -0.0274574663490057, -0.025949712842702866, -0.010340082459151745, 0.10397094488143921, 0.08717357367277145, -0.07603730261325836, 0.03241610527038574, 0.017674976959824562, -0.04265597090125084, -0.03085826151072979, -0.05373816564679146, -0.11254118382930756, 0.0010315550025552511, -0.09072443097829819, 0.024999039247632027, -0.009354005567729473, -0.07028668373823166, -0.012106634676456451, -0.16019484400749207, -0.020248617976903915, 0.08557764440774918, 0.015559723600745201, -0.02128426358103752, -0.09486330300569534, 0.03364481404423714, -0.028271986171603203, -0.029301419854164124, -0.15483154356479645, -0.03294621780514717, 0.021821243688464165, -0.14737772941589355, 0.014267655089497566, -0.10566584020853043, 0.06404106318950653, 0.007953759282827377, -0.06966651976108551, -0.028292391449213028, -0.016889620572328568, 0.008181779645383358, -0.04799443855881691, -0.2412346452474594, -0.020019667223095894, -0.05199592188000679, 0.15045960247516632, -0.2138085663318634, 0.03961145132780075, 0.05266125872731209, 0.12054955214262009, -0.0009035554830916226, -0.06467462331056595, 0.029471253976225853, -0.07511275261640549, -0.017351627349853516, -0.07159969955682755, -0.004248881246894598, 0.005328451748937368, -0.0450906939804554, 0.019305219873785973, -0.1075306385755539, -0.049178171902894974, 0.10417618602514267, 0.05527631193399429, -0.17346957325935364, -0.023097822442650795, -0.046116504818201065, -0.07124318182468414, -0.09169773012399673, -0.05415806546807289, 0.10515284538269043, 0.041585735976696014, 0.03269373998045921, -0.07475687563419342, -0.07640749216079712, 0.012526088394224644, -0.020289583131670952, -0.020940471440553665, 0.1124240979552269, 0.08039423078298569, -0.11441559344530106, 0.10038081556558609, 0.05642333999276161, 0.022568801417946815, 0.08533167839050293, -0.023286325857043266, -0.10948912054300308, -0.0353396050632, 0.04703282564878464, 0.006245602387934923, 0.16222289204597473, -0.09755446016788483, 0.0529886819422245, 0.0440160296857357, -0.024740222841501236, 0.0573631189763546, -0.10083404928445816, 0.007252480369061232, 0.00619475869461894, -0.013228358700871468, 0.013129032216966152, -0.018381411209702492, 0.010771160945296288, 0.08230727165937424, 0.05341639742255211, 0.04012325778603554, 0.04041486978530884, -0.03360074386000633, -0.13050709664821625, 0.1816704273223877, -0.0924571081995964, -0.2249785214662552, -0.14634695649147034, 0.04399142786860466, 0.0514918752014637, -0.017738396301865578, 0.023651374503970146, -0.05044744536280632, -0.09845245629549026, -0.0740722268819809, -0.004893820267170668, 0.023804306983947754, -0.06170307472348213, -0.07565140724182129, 0.056919462978839874, 0.040460724383592606, -0.11576100438833237, 0.03636781498789787, 0.06001606956124306, -0.021949635818600655, 0.007642357610166073, 0.05665047839283943, 0.08089082688093185, 0.17461374402046204, -0.008344740606844425, 0.0017759462352842093, 0.055072810500860214, 0.2798279821872711, -0.16532540321350098, 0.1020711287856102, 0.10985559970140457, -0.05893383547663689, 0.07532299309968948, 0.18618538975715637, 0.03361507132649422, -0.10542897135019302, 0.034598495811223984, 0.03260239213705063, -0.026565296575427055, -0.27217236161231995, 
-0.04892345145344734, -0.015015819109976292, -0.09997636824846268, 0.08054386079311371, 0.08565572649240494, 0.10213218629360199, 0.04432207718491554, -0.06219084933400154, -0.08246864378452301, 0.032585080713033676, 0.0978967547416687, -0.029215795919299126, 0.009400546550750732, 0.08119280636310577, -0.026892056688666344, 0.012422733940184116, 0.09824589639902115, -0.015579886734485626, 0.18118871748447418, 0.04385717958211899, 0.10755256563425064, 0.08679575473070145, 0.09309112280607224, -0.008465249091386795, 0.016574915498495102, 0.018525421619415283, 0.02110176347196102, 0.016441771760582924, -0.07994204759597778, 0.03717470541596413, 0.10806397348642349, 0.05075093358755112, 0.018337735906243324, 0.00901766773313284, -0.05652062967419624, 0.04567372053861618, 0.18429163098335266, 0.008376488462090492, -0.19362112879753113, -0.07377733290195465, 0.05328003689646721, -0.07502224296331406, -0.14255426824092865, -0.02121184952557087, 0.024546971544623375, -0.17657439410686493, 0.014729551039636135, -0.04045349359512329, 0.10020656138658524, -0.0807483121752739, -0.045424725860357285, 0.10272405296564102, 0.06895950436592102, -0.024891171604394913, 0.0643819272518158, -0.20409251749515533, 0.13192440569400787, 0.0227514635771513, 0.07352744042873383, -0.09912186861038208, 0.09887838363647461, 0.0029074251651763916, -0.017915459349751472, 0.16446749866008759, 0.005446172319352627, -0.06250042468309402, -0.05079827457666397, -0.09286834299564362, -0.013283716514706612, 0.0965781882405281, -0.1273345798254013, 0.06374328583478928, -0.012035369873046875, -0.025750044733285904, 0.006634507793933153, -0.07210461795330048, -0.12643256783485413, -0.17625276744365692, 0.06302531808614731, -0.10946117341518402, 0.030561324208974838, -0.08902372419834137, -0.06466980278491974, 0.0009326168219558895, 0.18720470368862152, -0.17282986640930176, -0.09261804074048996, -0.13980360329151154, -0.08808182179927826, 0.16285085678100586, -0.03744572401046753, 0.08460323512554169, 0.008431597612798214, 0.16339614987373352, 0.019917229190468788, 0.0019714361988008022, 0.10055560618638992, -0.0866347923874855, -0.19160644710063934, -0.05623016878962517, 0.15441858768463135, 0.14883501827716827, 0.042322319000959396, -0.013986372388899326, 0.02072916366159916, -0.0529087595641613, -0.11497963964939117, 0.029175426810979843, 0.13886870443820953, 0.08626461029052734, -0.0076826224103569984, -0.027459600940346718, -0.09045881032943726, -0.06142232194542885, -0.06472029536962509, -0.00025014206767082214, 0.18964825570583344, -0.0694744735956192, 0.16145937144756317, 0.11166119575500488, -0.05875924602150917, -0.20107071101665497, 0.05941571667790413, 0.05859732627868652, 0.01204205397516489, 0.028586890548467636, -0.20422744750976562, 0.084892138838768, 0.0009188737021759152, -0.07320011407136917, 0.1614963263273239, -0.1689157783985138, -0.14540784060955048, 0.0966864824295044, 0.03669729083776474, -0.22733815014362335, -0.13237477838993073, -0.09820576757192612, -0.014729002490639687, -0.12527956068515778, 0.07288645952939987, 0.004153280518949032, 0.018022364005446434, 0.02970731444656849, 0.022243158891797066, 0.02734936960041523, -0.05155503749847412, 0.20668193697929382, -0.019465556368231773, 0.014945040456950665, -0.055863332003355026, -0.09477385133504868, 0.03634431213140488, -0.049176037311553955, 0.09747040271759033, 0.0074060410261154175, 0.025111159309744835, -0.13588523864746094, -0.04813672974705696, -0.0667164996266365, 0.030183831229805946, -0.09707280248403549, -0.09178482741117477, 
-0.04958734288811684, 0.10132525861263275, 0.10213971138000488, -0.0331345833837986, -0.005954638589173555, -0.08328456431627274, 0.05818207934498787, 0.1943139284849167, 0.19051027297973633, 0.07337861508131027, -0.06983349472284317, 0.014397216960787773, -0.03079381212592125, 0.04250641539692879, -0.22024917602539062, 0.04183183237910271, 0.05049645155668259, 0.025407833978533745, 0.09134689718484879, -0.009934632107615471, -0.15035557746887207, -0.07274655252695084, 0.07270115613937378, -0.0421663336455822, -0.1551537960767746, -0.026273222640156746, 0.04429435729980469, -0.20711728930473328, -0.049630165100097656, 0.005106727126985788, -0.016107412055134773, -0.040502842515707016, 0.022228330373764038, 0.07807353883981705, -0.01810431107878685, 0.1108367070555687, 0.08823920786380768, 0.09147890657186508, -0.10283440351486206, 0.08024819195270538, 0.07136257737874985, -0.05360680818557739, 0.027252690866589546, 0.09608721733093262, -0.04667741060256958, -0.035749197006225586, 0.09687162935733795, 0.087377168238163, 0.02700074203312397, -0.050636082887649536, 0.012657051905989647, -0.05128291994333267, 0.07055194675922394, 0.11265672743320465, 0.03614848107099533, 0.0023890577722340822, 0.060735441744327545, 0.0406060628592968, -0.09906987100839615, 0.10251971334218979, 0.06059487536549568, 0.021049203351140022, -0.040392808616161346, -0.030351931229233742, -0.005514017306268215, -0.008865377865731716, -0.017069676890969276, -0.00994140561670065, -0.08593963086605072, -0.008958933874964714, -0.10302065312862396, 0.03490179404616356, -0.07524871826171875, 0.01219728123396635, 0.0254184752702713, -0.050375957041978836, 0.003508947556838393, 0.008333287201821804, -0.08077675849199295, -0.05043422430753708, -0.011203004978597164, 0.08692436665296555, -0.12053600698709488, 0.03298840671777725, 0.08099669963121414, -0.10860563814640045, 0.06997398287057877, 0.003978778142482042, 0.007252615410834551, 0.010887968353927135, -0.16440065205097198, 0.05931297689676285, -0.03038373962044716, -0.01342853158712387, 0.016210881993174553, -0.21930204331874847, -0.010714126750826836, -0.044988542795181274, -0.04716929420828819, 0.012694490142166615, -0.03562631085515022, -0.12426263839006424, 0.1023775190114975, -0.003163744928315282, -0.07257839292287827, -0.023368624970316887, 0.04373403266072273, 0.10180068016052246, -0.02075527422130108, 0.1245015487074852, -0.02032524347305298, 0.07318782806396484, -0.17039193212985992, -0.0019751046784222126, -0.008241270668804646, 0.046401407569646835, -0.013422461226582527, -0.02167847566306591, 0.06217268481850624, -0.020010512322187424, 0.19820813834667206, -0.02723773382604122, 0.05852743610739708, 0.050803933292627335, 0.016617335379123688, 0.011346948333084583, 0.08531802147626877, 0.06558694690465927, -0.009960758499801159, -0.0018669069977477193, 0.048180289566516876, -0.004269620403647423, -0.04668188840150833, -0.1513918936252594, 0.07056503742933273, 0.15318799018859863, 0.049019504338502884, 0.01789281703531742, 0.02847662940621376, -0.12184341996908188, -0.06982408463954926, 0.13682611286640167, -0.007598268799483776, -0.03346158564090729, -0.07823546230792999, 0.18409036099910736, 0.11790676414966583, -0.19921985268592834, 0.08636754006147385, -0.0570930615067482, -0.06071317195892334, -0.1269688457250595, -0.1647690236568451, -0.06356672197580338, -0.045259296894073486, -0.01546055544167757, -0.06265394389629364, 0.057892780750989914, 0.054394353181123734, 0.007095261476933956, -0.01922091841697693, 0.10057384520769119, 0.0049527776427567005, 
-0.021939856931567192, 0.04453388601541519, 0.05382814258337021, 0.02358531951904297, -0.10744942724704742, 0.008430673740804195, -0.0023357088211923838, 0.023482296615839005, 0.06695903837680817, 0.012437023222446442, -0.05434801056981087, 0.005425474606454372, -0.012609249912202358, -0.11060609668493271, 0.04106578603386879, -0.024203045293688774, -0.030465755611658096, 0.14146585762500763, 0.025953080505132675, 0.012512882240116596, -0.020294874906539917, 0.2408498376607895, -0.07351089268922806, -0.07818080484867096, -0.15932218730449677, 0.04089493677020073, -0.06955591589212418, 0.023746788501739502, 0.04229412227869034, -0.11467044055461884, 0.02186744473874569, 0.16599634289741516, 0.13237737119197845, -0.0069916509091854095, 0.010751532390713692, 0.057402126491069794, -0.00035850206040777266, -0.02811710350215435, 0.010377262718975544, 0.04239997640252113, 0.13025915622711182, -0.07624518871307373, 0.06391678005456924, -0.009917879477143288, -0.07555784285068512, -0.003581081749871373, 0.10921265184879303, 0.003309264313429594, 0.0018212915165349841, -0.07530386000871658, 0.13890133798122406, -0.09287035465240479, -0.22887270152568817, 0.06033565476536751, -0.06398708373308182, -0.1541450321674347, -0.043421197682619095, 0.004746082238852978, -0.01299720536917448, 0.018890777602791786, 0.07757552713155746, -0.045782990753650665, 0.169488325715065, 0.0454796627163887, -0.06252717971801758, -0.0849515050649643, 0.0670170709490776, -0.11704422533512115, 0.2850325107574463, 0.01855860836803913, 0.0605301633477211, 0.1048886626958847, -0.015525137074291706, -0.1387251913547516, 0.00742175430059433, 0.10480140149593353, -0.07298865169286728, 0.060339659452438354, 0.18025119602680206, -0.0048653376288712025, 0.12129996716976166, 0.0554901547729969, -0.05191062018275261, 0.03883669152855873, -0.09393538534641266, -0.04719913750886917, -0.11929651349782944, 0.07830383628606796, -0.08233071863651276, 0.16280044615268707, 0.13703136146068573, -0.06441561877727509, -0.009987147524952888, -0.02244110219180584, 0.08321807533502579, 0.0017402702942490578, 0.10338716953992844, 0.0012666404945775867, -0.191202774643898, 0.04180362448096275, 0.019871221855282784, 0.1064046248793602, -0.20043298602104187, -0.06670982390642166, 0.05787203833460808, -0.027026891708374023, -0.06707723438739777, 0.11497782170772552, 0.04113665595650673, 0.037256281822919846, -0.041028644889593124, -0.038949720561504364, 0.00012006583710899577, 0.14592067897319794, -0.11384900659322739, -0.010830949060618877 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 2020-Q2-75p-filtered-random This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-2019-90m](https://huggingface.co/cardiffnlp/twitter-roberta-base-2019-90m) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.9264 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 4.1e-07 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 2400000 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:-------:|:---------------:| | No log | 0.07 | 8000 | 2.2326 | | 2.407 | 0.13 | 16000 | 2.1532 | | 2.407 | 0.2 | 24000 | 2.1195 | | 2.23 | 0.27 | 32000 | 2.0944 | | 2.23 | 0.34 | 40000 | 2.0626 | | 2.1798 | 0.4 | 48000 | 2.0517 | | 2.1798 | 0.47 | 56000 | 2.0355 | | 2.1621 | 0.54 | 64000 | 2.0469 | | 2.1621 | 0.61 | 72000 | 2.0306 | | 2.1419 | 0.67 | 80000 | 2.0182 | | 2.1419 | 0.74 | 88000 | 2.0107 | | 2.1264 | 0.81 | 96000 | 2.0096 | | 2.1264 | 0.88 | 104000 | 2.0104 | | 2.1203 | 0.94 | 112000 | 2.0037 | | 2.1203 | 1.01 | 120000 | 2.0078 | | 2.1116 | 1.08 | 128000 | 1.9965 | | 2.1116 | 1.15 | 136000 | 2.0025 | | 2.1041 | 1.21 | 144000 | 1.9929 | | 2.1041 | 1.28 | 152000 | 1.9870 | | 2.1058 | 1.35 | 160000 | 1.9895 | | 2.1058 | 1.41 | 168000 | 1.9795 | | 2.1064 | 1.48 | 176000 | 1.9893 | | 2.1064 | 1.55 | 184000 | 1.9877 | | 2.098 | 1.62 | 192000 | 1.9920 | | 2.098 | 1.68 | 200000 | 1.9801 | | 2.0916 | 1.75 | 208000 | 1.9778 | | 2.0916 | 1.82 | 216000 | 1.9745 | | 2.0951 | 1.89 | 224000 | 1.9831 | | 2.0951 | 1.95 | 232000 | 1.9749 | | 2.092 | 2.02 | 240000 | 1.9754 | | 2.092 | 2.09 | 248000 | 1.9794 | | 2.0968 | 2.16 | 256000 | 1.9675 | | 2.0968 | 2.22 | 264000 | 1.9710 | | 2.0942 | 2.29 | 272000 | 1.9712 | | 2.0942 | 2.36 | 280000 | 1.9662 | | 2.0929 | 2.43 | 288000 | 1.9672 | | 2.0929 | 2.49 | 296000 | 1.9830 | | 2.092 | 2.56 | 304000 | 1.9804 | | 2.092 | 2.63 | 312000 | 1.9661 | | 2.0886 | 2.69 | 320000 | 1.9668 | | 2.0886 | 2.76 | 328000 | 1.9643 | | 2.0883 | 2.83 | 336000 | 1.9710 | | 2.0883 | 2.9 | 344000 | 1.9678 | | 2.0937 | 2.96 | 352000 | 1.9737 | | 2.0937 | 3.03 | 360000 | 1.9638 | | 2.0899 | 3.1 | 368000 | 1.9599 | | 2.0899 | 3.17 | 376000 | 1.9570 | | 2.0839 | 3.23 | 384000 | 1.9688 | | 2.0839 | 3.3 | 392000 | 1.9613 | | 2.0862 | 3.37 | 400000 | 1.9686 | | 2.0862 | 3.44 | 408000 | 1.9690 | | 2.0844 | 3.5 | 416000 | 1.9665 | | 2.0844 | 3.57 | 424000 | 1.9512 | | 2.0802 | 3.64 | 432000 | 1.9652 | | 2.0802 | 3.71 | 440000 | 1.9594 | | 2.0882 | 3.77 | 448000 | 1.9543 | | 2.0882 | 3.84 | 456000 | 1.9635 | | 2.0794 | 3.91 | 464000 | 1.9618 | | 2.0794 | 3.97 | 472000 | 1.9617 | | 2.0848 | 4.04 | 480000 | 1.9597 | | 2.0848 | 4.11 | 488000 | 1.9586 | | 2.0814 | 4.18 | 496000 | 1.9587 | | 2.0814 | 4.24 | 504000 | 1.9510 | | 2.0765 | 4.31 | 512000 | 1.9643 | | 2.0765 | 4.38 | 520000 | 1.9586 | | 2.0887 | 4.45 | 528000 | 1.9476 | | 2.0887 | 4.51 | 536000 | 1.9539 | | 2.0857 | 4.58 | 544000 | 1.9538 | | 2.0857 | 4.65 | 552000 | 1.9528 | | 2.0798 | 4.72 | 560000 | 1.9586 | | 2.0798 | 4.78 | 568000 
| 1.9660 | | 2.0752 | 4.85 | 576000 | 1.9639 | | 2.0752 | 4.92 | 584000 | 1.9505 | | 2.0771 | 4.99 | 592000 | 1.9551 | | 2.0771 | 5.05 | 600000 | 1.9495 | | 2.0772 | 5.12 | 608000 | 1.9536 | | 2.0772 | 5.19 | 616000 | 1.9567 | | 2.0836 | 5.25 | 624000 | 1.9534 | | 2.0836 | 5.32 | 632000 | 1.9663 | | 2.0851 | 5.39 | 640000 | 1.9535 | | 2.0851 | 5.46 | 648000 | 1.9554 | | 2.0842 | 5.52 | 656000 | 1.9539 | | 2.0842 | 5.59 | 664000 | 1.9589 | | 2.088 | 5.66 | 672000 | 1.9572 | | 2.088 | 5.73 | 680000 | 1.9603 | | 2.075 | 5.79 | 688000 | 1.9639 | | 2.075 | 5.86 | 696000 | 1.9537 | | 2.077 | 5.93 | 704000 | 1.9612 | | 2.077 | 6.0 | 712000 | 1.9571 | | 2.0692 | 6.06 | 720000 | 1.9545 | | 2.0692 | 6.13 | 728000 | 1.9494 | | 2.087 | 6.2 | 736000 | 1.9555 | | 2.087 | 6.27 | 744000 | 1.9566 | | 2.0783 | 6.33 | 752000 | 1.9447 | | 2.0783 | 6.4 | 760000 | 1.9518 | | 2.0771 | 6.47 | 768000 | 1.9429 | | 2.0771 | 6.53 | 776000 | 1.9603 | | 2.0794 | 6.6 | 784000 | 1.9503 | | 2.0794 | 6.67 | 792000 | 1.9572 | | 2.0777 | 6.74 | 800000 | 1.9607 | | 2.0777 | 6.8 | 808000 | 1.9525 | | 2.0725 | 6.87 | 816000 | 1.9495 | | 2.0725 | 6.94 | 824000 | 1.9517 | | 2.0863 | 7.01 | 832000 | 1.9523 | | 2.0863 | 7.07 | 840000 | 1.9441 | | 2.0735 | 7.14 | 848000 | 1.9430 | | 2.0735 | 7.21 | 856000 | 1.9517 | | 2.0808 | 7.28 | 864000 | 1.9442 | | 2.0808 | 7.34 | 872000 | 1.9414 | | 2.0756 | 7.41 | 880000 | 1.9439 | | 2.0756 | 7.48 | 888000 | 1.9428 | | 2.0799 | 7.55 | 896000 | 1.9472 | | 2.0799 | 7.61 | 904000 | 1.9426 | | 2.0717 | 7.68 | 912000 | 1.9461 | | 2.0717 | 7.75 | 920000 | 1.9583 | | 2.0799 | 7.81 | 928000 | 1.9433 | | 2.0799 | 7.88 | 936000 | 1.9442 | | 2.0704 | 7.95 | 944000 | 1.9396 | | 2.0704 | 8.02 | 952000 | 1.9409 | | 2.0785 | 8.08 | 960000 | 1.9520 | | 2.0785 | 8.15 | 968000 | 1.9409 | | 2.0761 | 8.22 | 976000 | 1.9469 | | 2.0761 | 8.29 | 984000 | 1.9372 | | 2.0739 | 8.35 | 992000 | 1.9385 | | 2.0739 | 8.42 | 1000000 | 1.9540 | | 2.0761 | 8.49 | 1008000 | 1.9488 | | 2.0761 | 8.56 | 1016000 | 1.9464 | | 2.0725 | 8.62 | 1024000 | 1.9466 | | 2.0725 | 8.69 | 1032000 | 1.9460 | | 2.0704 | 8.76 | 1040000 | 1.9449 | | 2.0704 | 8.83 | 1048000 | 1.9493 | | 2.0734 | 8.89 | 1056000 | 1.9463 | | 2.0734 | 8.96 | 1064000 | 1.9403 | | 2.0744 | 9.03 | 1072000 | 1.9467 | | 2.0744 | 9.09 | 1080000 | 1.9406 | | 2.0776 | 9.16 | 1088000 | 1.9492 | | 2.0776 | 9.23 | 1096000 | 1.9433 | | 2.068 | 9.3 | 1104000 | 1.9450 | | 2.068 | 9.36 | 1112000 | 1.9473 | | 2.0755 | 9.43 | 1120000 | 1.9459 | | 2.0755 | 9.5 | 1128000 | 1.9563 | | 2.0783 | 9.57 | 1136000 | 1.9369 | | 2.0783 | 9.63 | 1144000 | 1.9461 | | 2.0776 | 9.7 | 1152000 | 1.9494 | | 2.0776 | 9.77 | 1160000 | 1.9312 | | 2.0757 | 9.84 | 1168000 | 1.9452 | | 2.0757 | 9.9 | 1176000 | 1.9425 | | 2.0776 | 9.97 | 1184000 | 1.9536 | | 2.0776 | 10.04 | 1192000 | 1.9351 | | 2.0769 | 10.11 | 1200000 | 1.9301 | | 2.0769 | 10.17 | 1208000 | 1.9464 | | 2.071 | 10.24 | 1216000 | 1.9410 | | 2.071 | 10.31 | 1224000 | 1.9321 | | 2.0702 | 10.37 | 1232000 | 1.9406 | | 2.0702 | 10.44 | 1240000 | 1.9525 | | 2.0716 | 10.51 | 1248000 | 1.9418 | | 2.0716 | 10.58 | 1256000 | 1.9373 | | 2.0753 | 10.64 | 1264000 | 1.9363 | | 2.0753 | 10.71 | 1272000 | 1.9504 | | 2.0757 | 10.78 | 1280000 | 1.9376 | | 2.0757 | 10.85 | 1288000 | 1.9351 | | 2.0656 | 10.91 | 1296000 | 1.9445 | | 2.0656 | 10.98 | 1304000 | 1.9282 | | 2.0732 | 11.05 | 1312000 | 1.9437 | | 2.0732 | 11.12 | 1320000 | 1.9501 | | 2.0756 | 11.18 | 1328000 | 1.9379 | | 2.0756 | 11.25 | 1336000 | 1.9430 | | 2.0885 | 11.32 | 1344000 | 1.9392 | | 2.0885 
| 11.39 | 1352000 | 1.9344 | | 2.0758 | 11.45 | 1360000 | 1.9364 | | 2.0758 | 11.52 | 1368000 | 1.9404 | | 2.0693 | 11.59 | 1376000 | 1.9347 | | 2.0693 | 11.65 | 1384000 | 1.9438 | | 2.0675 | 11.72 | 1392000 | 1.9367 | | 2.0675 | 11.79 | 1400000 | 1.9438 | | 2.0731 | 11.86 | 1408000 | 1.9327 | | 2.0731 | 11.92 | 1416000 | 1.9341 | | 2.0774 | 11.99 | 1424000 | 1.9390 | | 2.0774 | 12.06 | 1432000 | 1.9457 | | 2.0738 | 12.13 | 1440000 | 1.9437 | | 2.0738 | 12.19 | 1448000 | 1.9353 | | 2.0667 | 12.26 | 1456000 | 1.9424 | | 2.0667 | 12.33 | 1464000 | 1.9435 | | 2.0674 | 12.4 | 1472000 | 1.9336 | | 2.0674 | 12.46 | 1480000 | 1.9461 | | 2.0704 | 12.53 | 1488000 | 1.9458 | | 2.0704 | 12.6 | 1496000 | 1.9397 | | 2.0691 | 12.67 | 1504000 | 1.9438 | | 2.0691 | 12.73 | 1512000 | 1.9325 | | 2.0727 | 12.8 | 1520000 | 1.9359 | | 2.0727 | 12.87 | 1528000 | 1.9427 | | 2.0715 | 12.93 | 1536000 | 1.9491 | | 2.0715 | 13.0 | 1544000 | 1.9351 | | 2.0692 | 13.07 | 1552000 | 1.9246 | | 2.0692 | 13.14 | 1560000 | 1.9457 | | 2.0711 | 13.2 | 1568000 | 1.9406 | | 2.0711 | 13.27 | 1576000 | 1.9458 | | 2.0735 | 13.34 | 1584000 | 1.9356 | | 2.0735 | 13.41 | 1592000 | 1.9443 | | 2.0707 | 13.47 | 1600000 | 1.9309 | | 2.0707 | 13.54 | 1608000 | 1.9367 | | 2.0776 | 13.61 | 1616000 | 1.9390 | | 2.0776 | 13.68 | 1624000 | 1.9391 | | 2.074 | 13.74 | 1632000 | 1.9459 | | 2.074 | 13.81 | 1640000 | 1.9316 | | 2.0681 | 13.88 | 1648000 | 1.9355 | | 2.0681 | 13.95 | 1656000 | 1.9428 | | 2.0671 | 14.01 | 1664000 | 1.9286 | | 2.0671 | 14.08 | 1672000 | 1.9374 | | 2.0672 | 14.15 | 1680000 | 1.9413 | | 2.0672 | 14.21 | 1688000 | 1.9372 | | 2.0675 | 14.28 | 1696000 | 1.9317 | | 2.0675 | 14.35 | 1704000 | 1.9432 | | 2.0665 | 14.42 | 1712000 | 1.9444 | | 2.0665 | 14.48 | 1720000 | 1.9393 | | 2.0645 | 14.55 | 1728000 | 1.9462 | | 2.0645 | 14.62 | 1736000 | 1.9374 | | 2.0712 | 14.69 | 1744000 | 1.9367 | | 2.0712 | 14.75 | 1752000 | 1.9407 | | 2.0689 | 14.82 | 1760000 | 1.9361 | | 2.0689 | 14.89 | 1768000 | 1.9395 | | 2.0657 | 14.96 | 1776000 | 1.9389 | | 2.0657 | 15.02 | 1784000 | 1.9396 | | 2.0781 | 15.09 | 1792000 | 1.9406 | | 2.0781 | 15.16 | 1800000 | 1.9366 | | 2.0631 | 15.23 | 1808000 | 1.9357 | | 2.0631 | 15.29 | 1816000 | 1.9456 | | 2.0738 | 15.36 | 1824000 | 1.9325 | | 2.0738 | 15.43 | 1832000 | 1.9377 | | 2.0726 | 15.49 | 1840000 | 1.9405 | | 2.0726 | 15.56 | 1848000 | 1.9333 | | 2.0699 | 15.63 | 1856000 | 1.9369 | | 2.0699 | 15.7 | 1864000 | 1.9418 | | 2.0764 | 15.76 | 1872000 | 1.9363 | | 2.0764 | 15.83 | 1880000 | 1.9375 | | 2.0779 | 15.9 | 1888000 | 1.9335 | | 2.0779 | 15.97 | 1896000 | 1.9455 | | 2.0693 | 16.03 | 1904000 | 1.9447 | | 2.0693 | 16.1 | 1912000 | 1.9349 | | 2.0716 | 16.17 | 1920000 | 1.9339 | | 2.0716 | 16.24 | 1928000 | 1.9426 | | 2.075 | 16.3 | 1936000 | 1.9439 | | 2.075 | 16.37 | 1944000 | 1.9334 | | 2.0751 | 16.44 | 1952000 | 1.9466 | | 2.0751 | 16.51 | 1960000 | 1.9397 | | 2.0734 | 16.57 | 1968000 | 1.9367 | | 2.0734 | 16.64 | 1976000 | 1.9349 | | 2.0685 | 16.71 | 1984000 | 1.9510 | | 2.0685 | 16.77 | 1992000 | 1.9428 | | 2.0706 | 16.84 | 2000000 | 1.9509 | | 2.0706 | 16.91 | 2008000 | 1.9403 | | 2.0716 | 16.98 | 2016000 | 1.9384 | | 2.0716 | 17.04 | 2024000 | 1.9355 | | 2.0741 | 17.11 | 2032000 | 1.9308 | | 2.0741 | 17.18 | 2040000 | 1.9395 | | 2.0714 | 17.25 | 2048000 | 1.9502 | | 2.0714 | 17.31 | 2056000 | 1.9337 | | 2.0696 | 17.38 | 2064000 | 1.9383 | | 2.0696 | 17.45 | 2072000 | 1.9451 | | 2.0729 | 17.52 | 2080000 | 1.9373 | | 2.0729 | 17.58 | 2088000 | 1.9366 | | 2.0716 | 17.65 | 2096000 | 1.9334 | | 
2.0716 | 17.72 | 2104000 | 1.9417 | | 2.074 | 17.79 | 2112000 | 1.9408 | | 2.074 | 17.85 | 2120000 | 1.9258 | | 2.0745 | 17.92 | 2128000 | 1.9385 | | 2.0745 | 17.99 | 2136000 | 1.9409 | | 2.074 | 18.05 | 2144000 | 1.9342 | | 2.074 | 18.12 | 2152000 | 1.9437 | | 2.0666 | 18.19 | 2160000 | 1.9406 | | 2.0666 | 18.26 | 2168000 | 1.9382 | | 2.0657 | 18.32 | 2176000 | 1.9398 | | 2.0657 | 18.39 | 2184000 | 1.9247 | | 2.0692 | 18.46 | 2192000 | 1.9377 | | 2.0692 | 18.53 | 2200000 | 1.9423 | | 2.0726 | 18.59 | 2208000 | 1.9395 | | 2.0726 | 18.66 | 2216000 | 1.9286 | | 2.0688 | 18.73 | 2224000 | 1.9357 | | 2.0688 | 18.8 | 2232000 | 1.9267 | | 2.0732 | 18.86 | 2240000 | 1.9293 | | 2.0732 | 18.93 | 2248000 | 1.9415 | | 2.0697 | 19.0 | 2256000 | 1.9456 | | 2.0697 | 19.07 | 2264000 | 1.9331 | | 2.0747 | 19.13 | 2272000 | 1.9439 | | 2.0747 | 19.2 | 2280000 | 1.9294 | | 2.072 | 19.27 | 2288000 | 1.9305 | | 2.072 | 19.33 | 2296000 | 1.9401 | | 2.0609 | 19.4 | 2304000 | 1.9362 | | 2.0609 | 19.47 | 2312000 | 1.9451 | | 2.073 | 19.54 | 2320000 | 1.9352 | | 2.073 | 19.6 | 2328000 | 1.9380 | | 2.0793 | 19.67 | 2336000 | 1.9392 | | 2.0793 | 19.74 | 2344000 | 1.9438 | | 2.0787 | 19.81 | 2352000 | 1.9403 | | 2.0787 | 19.87 | 2360000 | 1.9380 | | 2.0694 | 19.94 | 2368000 | 1.9275 | | 2.0694 | 20.01 | 2376000 | 1.9344 | | 2.0649 | 20.08 | 2384000 | 1.9443 | | 2.0649 | 20.14 | 2392000 | 1.9401 | | 2.0727 | 20.21 | 2400000 | 1.9447 | ### Framework versions - Transformers 4.35.0.dev0 - Pytorch 2.0.1+cu117 - Datasets 2.14.5 - Tokenizers 0.14.0
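As a reading aid, the hyperparameters reported in this card can be expressed as a transformers TrainingArguments sketch; the output directory name and the 8000-step evaluation cadence are inferred from the card and its results table, not stated configuration, and the dataset itself is unknown:

```python
# Sketch only: maps the card's reported hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="2020-Q2-75p-filtered-random",  # assumed name, taken from the model id
    learning_rate=4.1e-07,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    max_steps=2_400_000,
    evaluation_strategy="steps",
    eval_steps=8_000,  # matches the 8000-step cadence visible in the results table
)
```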
{"license": "mit", "tags": ["generated_from_trainer"], "base_model": "cardiffnlp/twitter-roberta-base-2019-90m", "model-index": [{"name": "2020-Q2-75p-filtered-random", "results": []}]}
fill-mask
DouglasPontes/2020-Q2-75p-filtered-random
[ "transformers", "pytorch", "roberta", "fill-mask", "generated_from_trainer", "base_model:cardiffnlp/twitter-roberta-base-2019-90m", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T19:18:58+00:00
[]
[]
TAGS #transformers #pytorch #roberta #fill-mask #generated_from_trainer #base_model-cardiffnlp/twitter-roberta-base-2019-90m #license-mit #autotrain_compatible #endpoints_compatible #region-us
2020-Q2-75p-filtered-random =========================== This model is a fine-tuned version of cardiffnlp/twitter-roberta-base-2019-90m on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 1.9264 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 4.1e-07 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08 * lr\_scheduler\_type: linear * training\_steps: 2400000 ### Training results ### Framework versions * Transformers 4.35.0.dev0 * Pytorch 2.0.1+cu117 * Datasets 2.14.5 * Tokenizers 0.14.0
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4.1e-07\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 2400000", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0.dev0\n* Pytorch 2.0.1+cu117\n* Datasets 2.14.5\n* Tokenizers 0.14.0" ]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #generated_from_trainer #base_model-cardiffnlp/twitter-roberta-base-2019-90m #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4.1e-07\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 2400000", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0.dev0\n* Pytorch 2.0.1+cu117\n* Datasets 2.14.5\n* Tokenizers 0.14.0" ]
[ 68, 99, 4, 36 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #generated_from_trainer #base_model-cardiffnlp/twitter-roberta-base-2019-90m #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4.1e-07\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 2400000### Training results### Framework versions\n\n\n* Transformers 4.35.0.dev0\n* Pytorch 2.0.1+cu117\n* Datasets 2.14.5\n* Tokenizers 0.14.0" ]
[ -0.11636469513177872, 0.0374920628964901, -0.0023877108469605446, 0.12173619121313095, 0.14796839654445648, 0.02338266558945179, 0.11255091428756714, 0.12022665143013, -0.09211784601211548, 0.03922338783740997, 0.1639150083065033, 0.14658480882644653, 0.0033937087282538414, 0.17548437416553497, -0.02247615158557892, -0.26358968019485474, -0.01664688065648079, 0.035683341324329376, -0.05968266725540161, 0.14450877904891968, 0.11273184418678284, -0.13672815263271332, 0.0918685793876648, 0.010280945338308811, -0.21466630697250366, 0.0038081014063209295, 0.018149210140109062, -0.05080239102244377, 0.1505923867225647, 0.004567424766719341, 0.12552379071712494, 0.004073940683156252, 0.0838128924369812, -0.1429458111524582, 0.019442562013864517, 0.04116683825850487, -0.0029734019190073013, 0.08862502872943878, 0.0004030554264318198, -0.015087228268384933, 0.10711544752120972, -0.0814312994480133, 0.061319515109062195, 0.019665943458676338, -0.15593968331813812, -0.2219081073999405, -0.09022660553455353, 0.050376053899526596, 0.08121716976165771, 0.09755876660346985, -0.00973249040544033, 0.15057028830051422, -0.0881449356675148, 0.08784180879592896, 0.2319309264421463, -0.31431788206100464, -0.07873780280351639, 0.02697363682091236, 0.04352298378944397, 0.034689318388700485, -0.09762581437826157, -0.01737610436975956, 0.048357389867305756, 0.06513139605522156, 0.14362633228302002, -0.024981018155813217, -0.03159512206912041, 0.004705184604972601, -0.1219104528427124, -0.0506431981921196, 0.11263933777809143, 0.040749065577983856, -0.05255407094955444, -0.039711225777864456, -0.04044175520539284, -0.15751759707927704, -0.050459787249565125, -0.002335791476070881, 0.033565983176231384, -0.03593877702951431, -0.12841320037841797, -0.0053458139300346375, -0.10073855519294739, -0.06596270203590393, -0.0735229030251503, 0.1745215356349945, 0.025760894641280174, 0.015349007211625576, -0.02485809102654457, 0.11582809686660767, -0.05828782916069031, -0.14290516078472137, 0.010535375215113163, 0.02620396576821804, 0.015447981655597687, -0.047801289707422256, -0.07097576558589935, -0.05202902853488922, -0.0037263210397213697, 0.1584612876176834, -0.03790978342294693, 0.02293865568935871, 0.07018900662660599, 0.05196629464626312, -0.09062322229146957, 0.17084267735481262, -0.058701369911432266, -0.032304517924785614, 0.029544593766331673, 0.055084001272916794, 0.021180301904678345, -0.0009259246289730072, -0.12563183903694153, -0.006908819545060396, 0.059654101729393005, -0.0008845959673635662, -0.051965758204460144, 0.0789761170744896, -0.060391053557395935, -0.0066838390193879604, 0.046774156391620636, -0.0753282830119133, 0.017761239781975746, -0.028540654107928276, -0.08167579025030136, -0.06312546879053116, 0.0301993228495121, 0.021033875644207, 0.01082602795213461, 0.12210622429847717, -0.10136006772518158, 0.019379982724785805, -0.09066049009561539, -0.10690348595380783, 0.0029348854441195726, -0.10736742615699768, 0.027361463755369186, -0.10937206447124481, -0.20380103588104248, -0.006655615288764238, 0.054365698248147964, -0.029563115909695625, -0.06804480403661728, -0.045585110783576965, -0.07483857125043869, 0.01252047624439001, -0.0056431968696415424, 0.12448962777853012, -0.057544611394405365, 0.12433075904846191, 0.05509607866406441, 0.08825674653053284, -0.07439932227134705, 0.044877585023641586, -0.11925310641527176, 0.016419224441051483, -0.22594474256038666, 0.03434821590781212, -0.026216691359877586, 0.09802015870809555, -0.07936249673366547, -0.11456464976072311, -0.008654279634356499, 
-0.004868588410317898, 0.09702347218990326, 0.09477343410253525, -0.16114741563796997, -0.07817427814006805, 0.16643677651882172, -0.05310633406043053, -0.11078514158725739, 0.1373467594385147, -0.07689767330884933, 0.08091967552900314, 0.06504141539335251, 0.14499609172344208, 0.05635026469826698, -0.12637250125408173, 0.039180390536785126, -0.03698894381523132, 0.013756299391388893, -0.057034458965063095, 0.061869505792856216, 0.0025312495417892933, 0.03044968843460083, 0.028503447771072388, -0.03767721727490425, 0.059171661734580994, -0.10643116384744644, -0.0896122008562088, -0.03163863718509674, -0.10929498076438904, 0.08192756026983261, 0.05683461204171181, 0.07456602901220322, -0.13767988979816437, -0.10324884951114655, 0.003254322102293372, 0.07088818401098251, -0.031025925651192665, 0.017141573131084442, -0.08619705587625504, 0.10093064606189728, -0.04435229301452637, -0.033522896468639374, -0.1587762087583542, 0.001631816034205258, 0.001737403217703104, 0.0577983520925045, 0.03398424759507179, -0.026760801672935486, 0.08218883723020554, 0.05879199132323265, -0.057375114411115646, -0.007077888585627079, -0.05794805660843849, -0.010350391268730164, -0.12779764831066132, -0.17537999153137207, -0.04812363162636757, -0.03089975193142891, 0.1273757964372635, -0.18899214267730713, 0.039461929351091385, -0.04491544887423515, 0.09583074599504471, 0.005133313592523336, -0.018031045794487, -0.05125604569911957, 0.0808260515332222, -0.017515700310468674, -0.047184091061353683, 0.0717918649315834, 0.0173241775482893, -0.08020085841417313, -0.023717334493994713, -0.08999843895435333, 0.17535297572612762, 0.139180988073349, -0.11728759109973907, -0.10518643260002136, 0.025046173483133316, -0.06156634911894798, -0.020441671833395958, -0.038855019956827164, 0.027499869465827942, 0.18188625574111938, -0.010554071515798569, 0.14603640139102936, -0.07854558527469635, -0.04703349247574806, 0.04076415300369263, -0.04761236906051636, 0.034957367926836014, 0.09937797486782074, 0.08881406486034393, -0.09273497760295868, 0.13256414234638214, 0.13492648303508759, -0.07676944881677628, 0.1321885734796524, -0.012806870974600315, -0.05649830773472786, -0.04333826154470444, -0.036332935094833374, -0.005074921529740095, 0.13493695855140686, -0.11170928180217743, -0.014940210618078709, 0.01450180634856224, -0.003345699980854988, 0.007029504980891943, -0.2144838571548462, -0.06668828427791595, 0.04255779832601547, -0.0348735935986042, -0.07911410182714462, 0.014676098711788654, 0.007047128863632679, 0.10229170322418213, 0.03381732851266861, -0.07429873198270798, 0.0476100780069828, 0.014549537561833858, -0.06845558434724808, 0.19828978180885315, -0.07529694586992264, -0.1295512318611145, -0.1114891842007637, -0.08988355100154877, -0.0069991182535886765, 0.024816177785396576, 0.06006162241101265, -0.0781957283616066, -0.01500639971345663, -0.05475758761167526, -0.0021008122712373734, -0.0034842644818127155, 0.027900902554392815, 0.0049751014448702335, -0.011644446291029453, 0.059682272374629974, -0.08582266420125961, -0.01982131041586399, -0.055097825825214386, -0.047233860939741135, 0.057934727519750595, 0.0165674090385437, 0.11828717589378357, 0.11750439554452896, -0.04461250454187393, 0.015190690755844116, -0.042692895978689194, 0.26415297389030457, -0.08372768759727478, -0.022110167890787125, 0.13176052272319794, 0.009884105063974857, 0.0656794086098671, 0.12050881236791611, 0.06388964504003525, -0.09267722815275192, 0.0031207986176013947, 0.003345428965985775, -0.051012661308050156, -0.17409420013427734, 
-0.04238920658826828, -0.054321181029081345, -0.025163274258375168, 0.10662292689085007, 0.016036605462431908, 0.05577115714550018, 0.0792318657040596, 0.03635884076356888, 0.06711331009864807, -0.057047512382268906, 0.07675088942050934, 0.09177298098802567, 0.05252106860280037, 0.12977610528469086, -0.04039955884218216, -0.06674373894929886, 0.02489824779331684, -0.016124164685606956, 0.2027513086795807, 0.002165405545383692, 0.09112972021102905, 0.062108904123306274, 0.20455053448677063, 0.006615753751248121, 0.07485414296388626, -0.009003439918160439, -0.060017429292201996, -0.01396053284406662, -0.03418601304292679, -0.04107256606221199, 0.015065377578139305, -0.025263935327529907, 0.05800960212945938, -0.13120637834072113, -0.01839127019047737, 0.04278950393199921, 0.28033116459846497, 0.042344603687524796, -0.3657715618610382, -0.12404582649469376, -0.02066134661436081, -0.033675745129585266, -0.017681455239653587, 0.007399213965982199, 0.09694540500640869, -0.10000387579202652, 0.02641502022743225, -0.054574254900217056, 0.08694847673177719, 0.016804805025458336, 0.04780493676662445, 0.0697101354598999, 0.09725406765937805, -0.01216763537377119, 0.06312695890665054, -0.2940155267715454, 0.31647616624832153, -0.0009943152545019984, 0.09724273532629013, -0.06511767208576202, -0.024363387376070023, 0.02416185289621353, 0.054977014660835266, 0.09279108792543411, -0.006570268422365189, -0.025975694879889488, -0.21089719235897064, -0.03879842162132263, 0.016347307711839676, 0.08493711054325104, -0.036546893417835236, 0.11168045550584793, -0.009268410503864288, 0.0028257095254957676, 0.05501193553209305, 0.015079417265951633, -0.045221760869026184, -0.06933906674385071, -0.018515978008508682, 0.017475184053182602, -0.03990117833018303, -0.06684622913599014, -0.10587241500616074, -0.09092167019844055, 0.13543905317783356, 0.007037903647869825, -0.036778565496206284, -0.10973585397005081, 0.0800745040178299, 0.07146449387073517, -0.0825556069612503, 0.06106291711330414, 0.00971591379493475, 0.060079336166381836, -0.005540413316339254, -0.0428687147796154, 0.11713667958974838, -0.07013958692550659, -0.1631205976009369, -0.0789978876709938, 0.12581829726696014, 0.049969837069511414, 0.06756478548049927, 0.011181419715285301, 0.03562912717461586, -0.05609413608908653, -0.07024767994880676, 0.054499123245477676, -0.07575123012065887, 0.08078634738922119, -0.000013797905012324918, -0.011060550808906555, 0.036700911819934845, -0.06474920362234116, -0.014054334722459316, 0.16157488524913788, 0.2744290232658386, -0.11685330420732498, 0.020597700029611588, 0.027737583965063095, -0.03814752399921417, -0.18047486245632172, 0.030613187700510025, 0.05447310954332352, 0.03505166992545128, 0.03385462984442711, -0.15659597516059875, 0.07936949282884598, 0.08849282562732697, -0.014271462336182594, 0.10761528462171555, -0.2845722436904907, -0.12780754268169403, 0.09720969200134277, 0.12488920241594315, 0.18597500026226044, -0.12011361867189407, -0.012863416224718094, -0.02416853979229927, -0.16028177738189697, 0.07374859601259232, -0.050365664064884186, 0.12463956326246262, -0.03242744132876396, 0.14493845403194427, 0.006038660649210215, -0.06000911816954613, 0.10654549300670624, 0.002918300684541464, 0.09684165567159653, -0.07049502432346344, -0.044366296380758286, 0.07230136543512344, -0.03163786977529526, 0.007222177926450968, -0.06879955530166626, 0.0318802073597908, -0.09698976576328278, -0.005893247202038765, -0.10099371522665024, 0.04818352684378624, -0.03529456630349159, -0.05497775599360466, 
-0.04066504165530205, 0.041874807327985764, 0.034345678985118866, -0.011534261517226696, 0.08779705315828323, 0.0036127613857388496, 0.18238535523414612, 0.06422969698905945, 0.0681498795747757, -0.060590941458940506, -0.04806721210479736, 0.0032605554442852736, -0.04131939262151718, 0.05792964994907379, -0.1571376919746399, 0.004298022016882896, 0.12349225580692291, 0.046012453734874725, 0.1142355278134346, 0.07534758001565933, -0.03525993227958679, 0.032922305166721344, 0.08120465278625488, -0.1691618710756302, -0.04766615852713585, 0.005346063058823347, -0.06258609890937805, -0.09700419008731842, 0.03620229288935661, 0.08775260299444199, -0.08411271125078201, -0.027146248146891594, -0.018699556589126587, -0.006708715111017227, -0.08311883360147476, 0.187168151140213, 0.06968231499195099, 0.04853147640824318, -0.09357310086488724, 0.040729764848947525, 0.03684013709425926, -0.04389893263578415, 0.006277688313275576, 0.062138937413692474, -0.08489940315485, -0.03288326412439346, 0.02690998837351799, 0.13731057941913605, -0.05791594460606575, -0.01256565935909748, -0.15227775275707245, -0.10659994184970856, 0.07519809901714325, 0.20885814726352692, 0.09618151187896729, -0.006214609369635582, -0.041639294475317, 0.02033732831478119, -0.12453042715787888, 0.06925833970308304, 0.05967395380139351, 0.0688871294260025, -0.1083441898226738, 0.19055825471878052, -0.013527859933674335, 0.05939541012048721, -0.02676212787628174, 0.026311282068490982, -0.09038709104061127, 0.023527588695287704, -0.08699867874383926, -0.05154218524694443, -0.030577749013900757, -0.01549763698130846, -0.021115189418196678, -0.06857579946517944, -0.05877121910452843, 0.009192686527967453, -0.11612674593925476, -0.02119511365890503, 0.0523981899023056, 0.021219074726104736, -0.10407398641109467, -0.036050450056791306, 0.04487835243344307, -0.052383508533239365, 0.07380423694849014, 0.0707024484872818, 0.01897057332098484, 0.03374842181801796, -0.14057950675487518, -0.003543603466823697, 0.030744867399334908, -0.016701864078640938, 0.07003219425678253, -0.07729381322860718, -0.007840093225240707, -0.015596565790474415, 0.0641297921538353, 0.02688620239496231, 0.0767514780163765, -0.13717129826545715, 0.026588764041662216, 0.03386104851961136, -0.0729464516043663, -0.06725138425827026, 0.011749588884413242, 0.07562889903783798, 0.013136153109371662, 0.18795308470726013, -0.09292003512382507, 0.04969516023993492, -0.2099715620279312, -0.011526528745889664, -0.014954622834920883, -0.10270190238952637, -0.1176641583442688, -0.0524163618683815, 0.06836669147014618, -0.050556622445583344, 0.13502374291419983, 0.01635655015707016, 0.01748725026845932, 0.0346592515707016, -0.013550053350627422, 0.010771659202873707, 0.004667272791266441, 0.19480732083320618, 0.021919209510087967, -0.0547507144510746, 0.05741841346025467, 0.06297776848077774, 0.07832437753677368, 0.08861824870109558, 0.17625251412391663, 0.15642885863780975, 0.08251985162496567, 0.0851081982254982, 0.05402268096804619, -0.00923155713826418, -0.14841926097869873, 0.00021421666315291077, -0.015184752643108368, 0.07057009637355804, -0.026930656284093857, 0.19406788051128387, 0.11383974552154541, -0.16837719082832336, 0.04628028720617294, -0.05136144161224365, -0.06954070180654526, -0.0899176374077797, -0.07708035409450531, -0.07212530076503754, -0.14124315977096558, 0.02501044236123562, -0.09593784809112549, 0.023166228085756302, 0.10661362111568451, -0.004391107242554426, -0.0319465808570385, 0.1493789702653885, 0.003187549766153097, 0.03528589382767677, 
0.07537847012281418, -0.0057254983112216, -0.010865243151783943, -0.07236230373382568, -0.06594471633434296, -0.016836507245898247, -0.030564645305275917, 0.03706028312444687, -0.06150184944272041, -0.06465770304203033, 0.02782965637743473, -0.014127294532954693, -0.11013343185186386, 0.01038474589586258, 0.04189646244049072, 0.07341721653938293, 0.030246274545788765, 0.0027928885538131, 0.03353730961680412, -0.016185306012630463, 0.21438531577587128, -0.06773596256971359, -0.07824812084436417, -0.107664555311203, 0.24079535901546478, 0.040728528052568436, -0.027505388483405113, 0.021918274462223053, -0.06023706495761871, 0.02594738081097603, 0.24914921820163727, 0.21692071855068207, -0.08771921694278717, 0.008779069408774376, -0.005206046625971794, -0.014669789932668209, -0.01751120388507843, 0.09362853318452835, 0.12091982364654541, 0.032388295978307724, -0.09212638437747955, -0.04667915031313896, -0.07887230813503265, -0.006397313438355923, -0.041562605649232864, 0.02823956124484539, 0.02667245641350746, 0.004400291014462709, -0.054121650755405426, 0.06044403463602066, -0.05386604741215706, -0.10161062330007553, 0.07609890401363373, -0.21821758151054382, -0.1488495022058487, -0.0006759882089681923, 0.04240753874182701, 0.033778116106987, 0.08799058198928833, -0.026354214176535606, 0.009021082893013954, 0.07696053385734558, -0.02314845286309719, -0.06022382900118828, -0.09946729242801666, 0.10753278434276581, -0.09524912387132645, 0.21753434836864471, -0.055901382118463516, 0.06798763573169708, 0.12801583111286163, 0.07098929584026337, -0.0702962726354599, 0.0616007000207901, 0.04365628957748413, -0.04051509499549866, 0.013136353343725204, 0.09869316220283508, -0.017540717497467995, 0.05255104601383209, 0.04625944793224335, -0.13708725571632385, 0.038751859217882156, -0.08015932142734528, -0.03945973142981529, -0.05480131134390831, -0.009387045167386532, -0.04417979717254639, 0.13154985010623932, 0.21742680668830872, -0.040093984454870224, -0.006217358633875847, -0.07215675711631775, 0.005226569715887308, 0.08751948922872543, 0.0068515995517373085, -0.09614334255456924, -0.22692422568798065, 0.0051860143430531025, 0.0874495729804039, -0.037651050835847855, -0.27700111269950867, -0.09169138967990875, -0.011394159868359566, -0.07691892236471176, -0.05618215724825859, 0.09220390766859055, 0.0835980474948883, 0.05755927041172981, -0.048898518085479736, -0.05679700896143913, -0.06490446627140045, 0.16352665424346924, -0.1446819007396698, -0.09425539523363113 ]
null
null
transformers
Dataset: [Rhenus spainDupli_BitronDupli_ElantasDupli2_ejot_bormioli_FEB2024](https://drive.google.com/drive/folders/1pC3Ww9aczRY9UuooMhJJLRkwCUR2Wn7J)

- SpainRhenus: 3 documents (Bitron, IDL and Fabregas)
- ItalyRhenus: 3 documents (Elantas, Ejot & Bormioli)

Model Training:

| Step | Training Loss | Validation Loss | Precision | Recall | F1 | Accuracy |
|------|---------------|-----------------|-----------|--------|----|----------|
| 500 | 0.950700 | 0.668345 | 0.249560 | 0.102508 | 0.145324 | 0.851723 |
| 1000 | 0.451900 | 0.436430 | 0.510847 | 0.352147 | 0.416905 | 0.906756 |
| 1500 | 0.287200 | 0.351042 | 0.583800 | 0.528461 | 0.554754 | 0.923715 |
| 2000 | 0.192200 | 0.327971 | 0.622079 | 0.603473 | 0.612635 | 0.930916 |
| 2500 | 0.153900 | 0.273926 | 0.702899 | 0.655089 | 0.678152 | 0.943662 |
| 3000 | 0.120500 | 0.270737 | 0.741608 | 0.687410 | 0.713481 | 0.946271 |
| 3500 | 0.095100 | 0.242222 | 0.727451 | 0.717800 | 0.722593 | 0.947630 |
| 4000 | 0.079500 | 0.252485 | 0.750789 | 0.745538 | 0.748154 | 0.949750 |
| 4500 | 0.066700 | 0.272716 | 0.752587 | 0.736614 | 0.744515 | 0.951245 |
| 5000 | 0.062400 | 0.260085 | 0.780070 | 0.757115 | 0.768421 | 0.954125 |
| 5500 | 0.055700 | 0.267790 | 0.795074 | 0.755186 | 0.774617 | 0.955539 |
| 6000 | 0.044800 | 0.272648 | 0.783251 | 0.767004 | 0.775043 | 0.954914 |
| 6500 | 0.043400 | 0.269749 | 0.801754 | 0.771587 | 0.786382 | 0.956653 |
| 7000 | 0.042400 | 0.268811 | 0.793078 | 0.779305 | 0.786131 | 0.956109 |
| 7500 | 0.039500 | 0.264429 | 0.794758 | 0.775205 | 0.784860 | 0.956544 |
| 8000 | 0.034000 | 0.267346 | 0.787359 | 0.775205 | 0.781235 | 0.955865 |
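The card stops at the training table, so the following is a minimal, assumption-laden inference sketch in Python with transformers: it presumes the checkpoint listed in this card's repo id ships its LayoutLMv3 processor config and that pytesseract is installed for the built-in OCR; the input file name is hypothetical.

```python
# Hedged sketch: token-classification inference with the fine-tuned LayoutLMv3 checkpoint.
import torch
from PIL import Image
from transformers import AutoModelForTokenClassification, AutoProcessor

repo_id = "sxandie/Rhenus_DupliSpain_DupliBit_DupliElnts2_EjtBorm_v21.65"

# apply_ocr=True makes the LayoutLMv3 image processor run Tesseract to extract
# words and boxes; pass your own words/boxes instead if OCR happens upstream.
processor = AutoProcessor.from_pretrained(repo_id, apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained(repo_id)

image = Image.open("invoice_page.png").convert("RGB")  # hypothetical document image
encoding = processor(image, return_tensors="pt")

with torch.no_grad():
    logits = model(**encoding).logits                  # shape: (1, seq_len, num_labels)

pred_ids = logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, pred in zip(tokens, pred_ids):
    print(token, model.config.id2label[pred])          # label names come from the checkpoint config
```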
{}
token-classification
sxandie/Rhenus_DupliSpain_DupliBit_DupliElnts2_EjtBorm_v21.65
[ "transformers", "pytorch", "layoutlmv3", "token-classification", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2024-02-13T19:21:38+00:00
[]
[]
TAGS #transformers #pytorch #layoutlmv3 #token-classification #autotrain_compatible #endpoints_compatible #has_space #region-us
Dataset: Rhenus spainDupli\_BitronDupli\_ElantasDupli2\_ejot\_bormioli\_FEB2024 SpainRhenus: 3 documents Bitron, IDL and Fabregas ItalyRhenus: 3 documents Elantas, Ejot & Bormioli Model Training:
[]
[ "TAGS\n#transformers #pytorch #layoutlmv3 #token-classification #autotrain_compatible #endpoints_compatible #has_space #region-us \n" ]
[ 45 ]
[ "passage: TAGS\n#transformers #pytorch #layoutlmv3 #token-classification #autotrain_compatible #endpoints_compatible #has_space #region-us \n" ]
[ -0.026217535138130188, 0.0350024588406086, -0.00670337351039052, 0.027072222903370857, 0.14151860773563385, 0.02824188582599163, 0.05301355943083763, 0.10596217960119247, -0.004615988582372665, 0.01658536121249199, 0.12593317031860352, 0.1932903528213501, -0.044866833835840225, 0.11713734269142151, -0.07379312068223953, -0.2699130177497864, 0.06998956948518753, 0.056505993008613586, -0.07891183346509933, 0.08968465030193329, 0.09112808853387833, -0.11423396319150925, 0.07696595042943954, -0.021724605932831764, -0.14679093658924103, 0.0464729443192482, -0.0013659866526722908, -0.11316036432981491, 0.10374444723129272, 0.027452776208519936, 0.18698738515377045, 0.03745978698134422, -0.0350612998008728, -0.10771362483501434, 0.027667824178934097, 0.04130326583981514, -0.07046934217214584, 0.07269278913736343, 0.06449330598115921, -0.05431167781352997, 0.016778534278273582, -0.021490350365638733, 0.01931827701628208, 0.020437652245163918, -0.127427875995636, -0.14035366475582123, -0.03239322826266289, 0.04766184836626053, 0.009757990948855877, 0.03250954672694206, 0.025585602968931198, 0.2161574512720108, -0.1543951779603958, 0.08930514752864838, 0.1550355851650238, -0.2921426296234131, -0.011538827791810036, 0.2141496241092682, 0.05413103476166725, -0.011169956997036934, -0.03599479794502258, 0.060134872794151306, 0.0327732190489769, 0.02088896557688713, 0.07383129745721817, -0.05880868062376976, -0.10827626287937164, 0.06140713766217232, -0.11556468904018402, -0.04686679318547249, 0.221230611205101, -0.04083637148141861, 0.09038747102022171, -0.025542570278048515, -0.10821716487407684, -0.06776424497365952, 0.006376858334988356, 0.014251935295760632, -0.012423178181052208, 0.026581428945064545, 0.007115350104868412, 0.005527074448764324, -0.12539470195770264, 0.03151228651404381, -0.21508854627609253, 0.19386431574821472, 0.0007508713169954717, 0.06624417752027512, -0.14516481757164001, 0.07547404617071152, -0.004924884997308254, -0.10301706939935684, 0.06265278905630112, -0.10291929543018341, -0.005930570885539055, -0.03590947389602661, -0.0673714205622673, 0.0262368842959404, 0.06922288984060287, 0.1063614934682846, 0.06542608886957169, 0.02270599454641342, 0.03159935772418976, 0.10194969177246094, 0.057663217186927795, 0.11286935955286026, -0.04994276165962219, -0.03668896108865738, 0.010117684490978718, -0.07741567492485046, 0.015683861449360847, -0.05941247567534447, -0.14681558310985565, -0.06233510747551918, 0.07275046408176422, 0.08407633006572723, 0.0528288334608078, 0.06707672774791718, -0.04594062268733978, -0.017210809513926506, 0.09324178099632263, -0.06906656175851822, 0.04298298805952072, -0.009484082460403442, 0.022528281435370445, 0.0868171751499176, 0.0023405191022902727, 0.0011702970368787646, -0.0022066570818424225, 0.09547831118106842, -0.07817955315113068, 0.004457850009202957, -0.058714911341667175, -0.11240264028310776, 0.04997067525982857, -0.1546163111925125, 0.03442179039120674, -0.1777021586894989, -0.05199471861124039, 0.020891083404421806, 0.04079736769199371, -0.005076034925878048, -0.04289134219288826, 0.06683509796857834, -0.03696032986044884, 0.035757869482040405, -0.06450066715478897, 0.007502530235797167, -0.061079200357198715, 0.048052914440631866, -0.028765743598341942, 0.09388814866542816, -0.1075233668088913, 0.06032705307006836, -0.0852767825126648, 0.03091370314359665, -0.10358598083257675, -0.016813335940241814, -0.06209569424390793, 0.1424708366394043, -0.014003323391079903, -0.07035768032073975, -0.07051335275173187, 0.03459852188825607, 
-0.03268611803650856, 0.08370784670114517, -0.1279350072145462, -0.0710844025015831, 0.08915263414382935, -0.09131908416748047, -0.11379636079072952, 0.06324567645788193, 0.007329116575419903, -0.026560785248875618, 0.006499595008790493, 0.15479756891727448, 0.08679163455963135, -0.05156519636511803, -0.00069562962744385, 0.13602472841739655, -0.10649234056472778, -0.12203729152679443, 0.033328231424093246, 0.023496082052588463, -0.06494620442390442, 0.03443503752350807, 0.053003665059804916, 0.0666537806391716, -0.057618290185928345, -0.05201864242553711, -0.06291906535625458, -0.021362489089369774, 0.11876068264245987, 0.052081264555454254, 0.11159620434045792, -0.05163324624300003, -0.011065348982810974, 0.06311475485563278, 0.03375823423266411, 0.04883668199181557, 0.02330232411623001, -0.04732424020767212, 0.15435993671417236, -0.093522809445858, -0.0070268139243125916, -0.19148214161396027, -0.1291397660970688, -0.02059871330857277, 0.06774909794330597, -0.030847938731312752, 0.22580762207508087, 0.06710577011108398, -0.04945243522524834, -0.004392198286950588, -0.02526395209133625, 0.128792867064476, 0.05640481784939766, -0.07766009122133255, -0.082404226064682, -0.030286241322755814, -0.08577947318553925, -0.05271495506167412, -0.046063244342803955, 0.0257300715893507, 0.08171632140874863, 0.1698233187198639, 0.004219277761876583, 0.08092767745256424, -0.0027282009832561016, 0.060893021523952484, -0.08268988132476807, -0.015670962631702423, 0.0885768011212349, -0.019489580765366554, -0.03082304075360298, 0.1495129019021988, -0.162500262260437, 0.34147176146507263, 0.20093943178653717, -0.25877514481544495, 0.004161737859249115, -0.0157513115555048, -0.03229796141386032, 0.025134731084108353, 0.005540153011679649, 0.007978098466992378, -0.02766094170510769, -0.022113678976893425, 0.1419159322977066, -0.01040372159332037, -0.040382277220487595, 0.006478512659668922, -0.06615158170461655, -0.07836631685495377, 0.05658028647303581, 0.07164642959833145, -0.15366633236408234, 0.200495645403862, 0.2970556318759918, -0.02711859531700611, 0.1185130700469017, 0.0008378209895454347, 0.024437902495265007, 0.01491202786564827, -0.06703603267669678, -0.0597672201693058, 0.03213847428560257, -0.16501453518867493, -0.054338518530130386, 0.09831307083368301, 0.04106820374727249, 0.06289476901292801, -0.13403604924678802, -0.0272274948656559, 0.03455449268221855, 0.06693893671035767, -0.018229342997074127, 0.12795400619506836, 0.0599190928041935, 0.08366282284259796, -0.002515563741326332, -0.08220494538545609, 0.08192186057567596, 0.011132832616567612, -0.026906711980700493, 0.12923741340637207, -0.13520053029060364, -0.28556036949157715, -0.1140674278140068, -0.14993321895599365, -0.04433026164770126, 0.037096936255693436, 0.06465069204568863, -0.0925711989402771, -0.05222322419285774, 0.02241784892976284, 0.0030133919790387154, -0.09179180860519409, 0.07371962815523148, -0.060251541435718536, 0.06463334709405899, -0.024772360920906067, -0.08177214115858078, -0.059387218207120895, -0.0331132672727108, -0.031525690108537674, 0.12651212513446808, -0.023545648902654648, 0.07330246269702911, 0.1680878847837448, -0.018288733437657356, 0.05100744217634201, -0.00900212675333023, 0.1745499223470688, -0.04859280213713646, 0.0006006861804053187, 0.20550008118152618, -0.004790117498487234, 0.07304424792528152, 0.18283645808696747, 0.04147649183869362, -0.01780381239950657, -0.002423712983727455, -0.025199929252266884, -0.12532806396484375, -0.13553601503372192, -0.15034236013889313, -0.1405337154865265, 
0.0017411672743037343, 0.05270742252469063, 0.07428105175495148, 0.09602998942136765, 0.08313944935798645, 0.03635291010141373, 0.013006317429244518, -0.09454197436571121, 0.06238824501633644, 0.23569753766059875, -0.02945006638765335, 0.1583619862794876, -0.06138022243976593, -0.12910369038581848, 0.07005210220813751, 0.0930020734667778, 0.13122910261154175, 0.07070688158273697, -0.05811241641640663, 0.039727699011564255, 0.15639929473400116, 0.1617501974105835, 0.09845630824565887, 0.029060950502753258, -0.04548487067222595, -0.012775213457643986, -0.010028843767940998, -0.010856008157134056, 0.04319445416331291, 0.17082925140857697, -0.13231882452964783, -0.03700743988156319, -0.1310342252254486, 0.09143315255641937, 0.06956299394369125, 0.08954980224370956, -0.2060479074716568, 0.03503057733178139, 0.09432850033044815, 0.0029298937879502773, -0.05925653874874115, 0.03749734163284302, 0.015410752035677433, -0.10934817045927048, 0.06217198818922043, -0.006325799506157637, 0.09503279626369476, -0.03263614699244499, 0.0666857659816742, -0.040865957736968994, -0.08074583858251572, 0.030273951590061188, 0.09129683673381805, -0.22815096378326416, 0.23397590219974518, -0.0005143997841514647, -0.11387085169553757, -0.07332216203212738, -0.0005452648038044572, 0.056292835623025894, 0.2010566145181656, 0.034850236028432846, 0.030283650383353233, -0.13304828107357025, -0.18963459134101868, -0.013229195959866047, -0.011360232718288898, 0.07569006085395813, -0.003852982074022293, -0.0055311680771410465, -0.03384017571806908, -0.010009619407355785, 0.0045242346823215485, 0.08585461974143982, 0.0036007524468004704, -0.13079184293746948, 0.07363464683294296, 0.07221504300832748, -0.019303355365991592, 0.0025049131363630295, -0.0753852128982544, -0.19358044862747192, 0.20182444155216217, -0.020626043900847435, -0.039734382182359695, -0.1253928691148758, -0.08666118234395981, 0.10754097998142242, -0.06025875732302666, 0.06986456364393234, -0.09002895653247833, 0.04382606968283653, -0.05030876770615578, -0.17111903429031372, 0.1319304257631302, -0.12174028158187866, -0.02941996045410633, -0.05851098150014877, 0.10283015668392181, -0.14434459805488586, 0.04072243347764015, 0.008361591957509518, 0.06637407839298248, -0.12453741580247879, -0.09185497462749481, 0.02432897686958313, 0.0048921722918748856, 0.05631541088223457, 0.05485067144036293, -0.042497046291828156, -0.03277414292097092, 0.07152123004198074, 0.04066447168588638, 0.25638920068740845, 0.1880134642124176, -0.11218178272247314, 0.10795228183269501, 0.0643594041466713, -0.02226700820028782, -0.341745525598526, -0.05694353207945824, -0.13148058950901031, -0.016821203753352165, 0.028404273092746735, -0.09511025995016098, 0.10443772375583649, 0.010213585570454597, -0.06450606137514114, 0.05385679751634598, -0.1654060333967209, -0.07718155533075333, 0.1806754320859909, -0.03903840482234955, 0.3293173015117645, -0.09402964264154434, -0.05080857872962952, -0.017676249146461487, -0.12268482893705368, 0.09319700300693512, -0.04612760990858078, 0.08426490426063538, -0.037091925740242004, 0.03703925758600235, 0.04205253720283508, -0.07207474857568741, 0.14305227994918823, -0.003237425582483411, 0.05164334177970886, -0.11224823445081711, -0.12099632620811462, 0.07892951369285583, -0.06834113597869873, 0.005218821112066507, 0.00017310268594883382, 0.025992566719651222, -0.14856594800949097, 0.01515132188796997, -0.08819056302309036, 0.10028208792209625, 0.027960238978266716, -0.05299297720193863, -0.0372898206114769, -0.0011235402198508382, 
0.0028497667517513037, -0.005875715054571629, 0.2722485363483429, 0.0005254537682048976, 0.15310490131378174, 0.1886884719133377, 0.03434703126549721, -0.15538254380226135, -0.045548006892204285, -0.022331183776259422, -0.0504622757434845, 0.09012597054243088, -0.0920729860663414, 0.0635446161031723, 0.10607200115919113, -0.04386894404888153, 0.0388558991253376, 0.11917567998170853, 0.04171154275536537, -0.03294893354177475, 0.1746012568473816, -0.17784282565116882, 0.019885394722223282, -0.019026732072234154, -0.0037834218237549067, 0.03377906233072281, 0.08871813863515854, 0.11325287073850632, 0.0314379557967186, -0.009370964020490646, 0.010315659455955029, -0.016654331237077713, -0.06239292398095131, 0.06452806293964386, 0.09182313829660416, 0.06300826370716095, -0.12308987975120544, 0.03216387704014778, 0.05675947293639183, -0.08925952017307281, -0.041292350739240646, 0.06258814036846161, -0.1143726110458374, -0.14281700551509857, -0.021997880190610886, 0.04191618040204048, -0.1000526025891304, -0.03156242147088051, -0.03265068680047989, -0.12389753013849258, 0.04978858307003975, 0.13963480293750763, 0.1325007528066635, 0.08278181403875351, -0.03744322434067726, -0.04210516810417175, -0.004949698690325022, -0.015856636688113213, -0.004894557408988476, 0.07155954837799072, -0.19797416031360626, 0.06061248853802681, -0.016810599714517593, 0.16686077415943146, -0.10030177980661392, -0.05124539136886597, -0.1473749876022339, 0.0138905243948102, -0.08079130202531815, -0.09398357570171356, -0.09777358174324036, -0.0418531596660614, 0.027719862759113312, -0.07372403889894485, -0.05566064640879631, -0.027000421658158302, -0.12965843081474304, 0.032201509922742844, -0.0070817130617797375, 0.06025144085288048, -0.057555217295885086, -0.04514193907380104, 0.07770728319883347, -0.03768689185380936, 0.08096204698085785, 0.04295668005943298, -0.05187562108039856, 0.044814106076955795, -0.024145498871803284, -0.1515660136938095, 0.1339455097913742, 0.03434858098626137, 0.10439180582761765, 0.026499716565012932, 0.012281595729291439, 0.04300089180469513, 0.04625862464308739, 0.050126779824495316, 0.041894566267728806, -0.10624668747186661, 0.032784268260002136, -0.06407579779624939, -0.1528337448835373, -0.0001891657302621752, -0.058936215937137604, 0.11023260653018951, 0.016413506120443344, 0.1257724016904831, 0.0077932123094797134, 0.05006461590528488, -0.07616747915744781, 0.01114988699555397, -0.056551866233348846, -0.18961061537265778, -0.017882784828543663, -0.039024826139211655, 0.021823842078447342, -0.013801098801195621, 0.29204466938972473, 0.09141461551189423, -0.02859027497470379, 0.041434526443481445, 0.11136382073163986, -0.002273151185363531, 0.014176161028444767, 0.14285850524902344, 0.09215790778398514, -0.031863026320934296, -0.04164145514369011, 0.08459056168794632, 0.02256973460316658, -0.05392938107252121, 0.09412097930908203, 0.06369296461343765, -0.006031056400388479, 0.07197029143571854, 0.03448187932372093, -0.038453083485364914, -0.16070541739463806, -0.14393910765647888, -0.05110718682408333, 0.10252409428358078, -0.037811409682035446, -0.024951107800006866, 0.12244632095098495, -0.03268973156809807, 0.04223176836967468, -0.043620266020298004, -0.008803891949355602, -0.16716568171977997, -0.09648323804140091, -0.080748550593853, -0.12598340213298798, -0.021228710189461708, -0.038255006074905396, -0.003953584935516119, 0.16550058126449585, 0.036126818507909775, -0.0029720275197178125, 0.06919416785240173, 0.009849702008068562, -0.009056666865944862, 0.0001592616317793727, 
-0.001025764155201614, 0.03265225514769554, -0.0519181527197361, -0.0028308311011642218, -0.1325213760137558, -0.02151523344218731, -0.049472030252218246, 0.0037988321855664253, -0.0575539693236351, 0.004923716653138399, -0.10794976353645325, -0.1120416596531868, -0.055023349821567535, 0.020620521157979965, -0.049651212990283966, 0.1143905520439148, -0.013899865560233593, 0.026366835460066795, 0.0032025333493947983, 0.1878129541873932, -0.10479830950498581, -0.06541265547275543, -0.04009893164038658, 0.2099129855632782, 0.03923157602548599, 0.08958882838487625, -0.03044535592198372, -0.002502979012206197, -0.11305852979421616, 0.23483054339885712, 0.32458624243736267, -0.0565672293305397, 0.07868807762861252, 0.03207504376769066, 0.009783938527107239, 0.06416034698486328, 0.1012740433216095, 0.08437353372573853, 0.1839424967765808, -0.09593776613473892, -0.03443457558751106, -0.05129731819033623, -0.01581636629998684, -0.10849292576313019, 0.03743660822510719, 0.04251522570848465, -0.05493767559528351, -0.0590197779238224, 0.039908550679683685, -0.14666621387004852, 0.1551007628440857, 0.06375879794359207, -0.22213850915431976, -0.08235547691583633, 0.00733402743935585, 0.17457152903079987, -0.028050431981682777, 0.08379825949668884, -0.05530235171318054, -0.07930783182382584, 0.022184548899531364, -0.011218644678592682, -0.18523161113262177, -0.04352652281522751, 0.10983356833457947, 0.020686596632003784, 0.021287748590111732, -0.04500110447406769, 0.03599127382040024, 0.09986057877540588, 0.059809356927871704, -0.06561975926160812, 0.032719943672418594, 0.01138780266046524, -0.09222860634326935, -0.034499283879995346, -0.009455270133912563, 0.0020061873365193605, -0.07023406773805618, 0.05498579889535904, -0.17469368875026703, 0.030603691935539246, -0.061631716787815094, 0.006555626634508371, -0.011964947916567326, -0.013226993381977081, -0.0123868677765131, 0.07312458753585815, 0.06464759260416031, -0.01043564360588789, -0.05102888494729996, -0.047477837651968, -0.03283305466175079, 0.028418704867362976, -0.1054435446858406, -0.15259280800819397, -0.06990573555231094, -0.03140702843666077, 0.05399483069777489, -0.008559263311326504, -0.0514385886490345, -0.06011001020669937, -0.05811115726828575, 0.024769973009824753, -0.10351072996854782, 0.06245842203497887, 0.08294501900672913, 0.017627611756324768, -0.014087275601923466, -0.028937363997101784, 0.011427477933466434, 0.03515661507844925, -0.1444893181324005, -0.07878106087446213 ]
null
null
null
## Exllama v2 Quantizations of Hermes-Instruct-7B-v0.2 Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.13">turboderp's ExLlamaV2 v0.0.13</a> for quantization. <b>The "main" branch only contains the measurement.json, download one of the other branches for the model (see below)</b> Each branch contains an individual bits-per-weight quantization, with the main one containing only the measurement.json for further conversions. Original model: https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-v0.2 | Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description | | ----- | ---- | ------- | ------ | ------ | ------ | ------------ | | [8_0](https://huggingface.co/bartowski/Hermes-Instruct-7B-v0.2-exl2/tree/8_0) | 8.0 | 8.0 | 8.4 GB | 9.8 GB | 11.8 GB | Maximum quality that ExLlamaV2 can produce, near unquantized performance. | | [6_5](https://huggingface.co/bartowski/Hermes-Instruct-7B-v0.2-exl2/tree/6_5) | 6.5 | 8.0 | 7.2 GB | 8.6 GB | 10.6 GB | Very similar to 8.0, good tradeoff of size vs performance, **recommended**. | | [5_0](https://huggingface.co/bartowski/Hermes-Instruct-7B-v0.2-exl2/tree/5_0) | 5.0 | 6.0 | 6.0 GB | 7.4 GB | 9.4 GB | Slightly lower quality vs 6.5, but usable on 8GB cards. | | [4_25](https://huggingface.co/bartowski/Hermes-Instruct-7B-v0.2-exl2/tree/4_25) | 4.25 | 6.0 | 5.3 GB | 6.7 GB | 8.7 GB | GPTQ equivalent bits per weight, slightly higher quality. | | [3_5](https://huggingface.co/bartowski/Hermes-Instruct-7B-v0.2-exl2/tree/3_5) | 3.5 | 6.0 | 4.7 GB | 6.1 GB | 8.1 GB | Lower quality, only use if you have to. | ## Download instructions With git: ```shell git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/Hermes-Instruct-7B-v0.2-exl2 Hermes-Instruct-7B-v0.2-exl2-6_5 ``` With huggingface hub (credit to TheBloke for instructions): ```shell pip3 install huggingface-hub ``` To download the `main` (only useful if you only care about measurement.json) branch to a folder called `Hermes-Instruct-7B-v0.2-exl2`: ```shell mkdir Hermes-Instruct-7B-v0.2-exl2 huggingface-cli download bartowski/Hermes-Instruct-7B-v0.2-exl2 --local-dir Hermes-Instruct-7B-v0.2-exl2 --local-dir-use-symlinks False ``` To download from a different branch, add the `--revision` parameter: Linux: ```shell mkdir Hermes-Instruct-7B-v0.2-exl2-6_5 huggingface-cli download bartowski/Hermes-Instruct-7B-v0.2-exl2 --revision 6_5 --local-dir Hermes-Instruct-7B-v0.2-exl2-6_5 --local-dir-use-symlinks False ``` Windows (which apparently doesn't like _ in folders sometimes?): ```shell mkdir Hermes-Instruct-7B-v0.2-exl2-6.5 huggingface-cli download bartowski/Hermes-Instruct-7B-v0.2-exl2 --revision 6_5 --local-dir Hermes-Instruct-7B-v0.2-exl2-6.5 --local-dir-use-symlinks False ``` Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
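For readers who prefer Python over the CLI calls above, the same branch download can be done with huggingface_hub's snapshot_download. A small sketch, assuming only that huggingface-hub is installed; the folder name mirrors the 6_5 example:

```python
# Sketch: download the 6_5 branch from Python instead of the huggingface-cli command above.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="bartowski/Hermes-Instruct-7B-v0.2-exl2",
    revision="6_5",                                 # pick the bits-per-weight branch you want
    local_dir="Hermes-Instruct-7B-v0.2-exl2-6_5",   # files are copied into this folder
)
print("Done: point your ExLlamaV2 loader at Hermes-Instruct-7B-v0.2-exl2-6_5")
```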
{"license": "apache-2.0", "datasets": ["lodrick-the-lafted/Hermes-40K"], "quantized_by": "bartowski", "pipeline_tag": "text-generation"}
text-generation
bartowski/Hermes-Instruct-7B-v0.2-exl2
[ "text-generation", "dataset:lodrick-the-lafted/Hermes-40K", "license:apache-2.0", "region:us" ]
2024-02-13T19:22:56+00:00
[]
[]
TAGS #text-generation #dataset-lodrick-the-lafted/Hermes-40K #license-apache-2.0 #region-us
Exllama v2 Quantizations of Hermes-Instruct-7B-v0.2 --------------------------------------------------- Using <a href="URL ExLlamaV2 v0.0.13 for quantization. **The "main" branch only contains the URL, download one of the other branches for the model (see below)** Each branch contains an individual bits per weight, with the main one containing only the URL for further conversions. Original model: URL Download instructions --------------------- With git: With huggingface hub (credit to TheBloke for instructions): To download the 'main' (only useful if you only care about URL) branch to a folder called 'Hermes-Instruct-7B-v0.2-exl2': To download from a different branch, add the '--revision' parameter: Linux: Windows (which apparently doesn't like \_ in folders sometimes?): Want to support my work? Visit my ko-fi page here: URL
[]
[ "TAGS\n#text-generation #dataset-lodrick-the-lafted/Hermes-40K #license-apache-2.0 #region-us \n" ]
[ 35 ]
[ "passage: TAGS\n#text-generation #dataset-lodrick-the-lafted/Hermes-40K #license-apache-2.0 #region-us \n" ]
[ -0.05022028088569641, 0.1300308108329773, -0.0058738007210195065, 0.06049049273133278, -0.028137922286987305, 0.03962182253599167, 0.2272411286830902, 0.11990104615688324, 0.08424443006515503, -0.08075495064258575, 0.1707633137702942, 0.10080880671739578, 0.019459493458271027, 0.04687110707163811, -0.03605503588914871, -0.12572897970676422, 0.062337275594472885, -0.03041151538491249, -0.0804687887430191, 0.03454425558447838, 0.047580618411302567, 0.026234129443764687, 0.07725324481725693, -0.08445882797241211, -0.01778118498623371, 0.00019932557188440114, 0.030087614431977272, -0.06044742465019226, 0.059069596230983734, -0.019743336364626884, -0.00699939951300621, 0.036902714520692825, -0.006095200777053833, -0.1792708933353424, 0.01651904545724392, -0.02230595238506794, -0.09282451868057251, 0.06379810720682144, -0.003567651379853487, -0.004580206703394651, 0.13490505516529083, -0.010007770732045174, -0.10130148380994797, 0.017830992117524147, -0.11593271046876907, -0.15880157053470612, -0.1701018214225769, -0.0039214082062244415, 0.038633283227682114, 0.05641591176390648, 0.04581572487950325, 0.1094045639038086, -0.04153625667095184, 0.020154966041445732, 0.1229356974363327, -0.337850958108902, 0.015920566394925117, 0.14241255819797516, 0.025998063385486603, 0.12078537046909332, -0.00972888432443142, 0.08969248831272125, 0.10816072672605515, 0.004389228764921427, -0.05907517299056053, -0.07807914912700653, -0.13794904947280884, 0.10276300460100174, -0.009247913956642151, -0.03365599736571312, 0.4688619077205658, 0.04157409444451332, 0.0027886827010661364, 0.07943157106637955, -0.03429962322115898, 0.060461558401584625, -0.005251720547676086, 0.08312145620584488, 0.06269679963588715, 0.12454578280448914, 0.0797203779220581, -0.10456870496273041, -0.12392721325159073, -0.056923381984233856, -0.16823866963386536, -0.011796671897172928, -0.0017054799245670438, 0.12483326345682144, -0.17283964157104492, 0.022937582805752754, -0.054801490157842636, -0.1024615615606308, -0.004107634071260691, -0.07250503450632095, 0.15273962914943695, 0.0712309330701828, -0.04150576516985893, 0.1196691170334816, 0.15521007776260376, 0.18906232714653015, 0.047467608004808426, -0.019749915227293968, -0.07588071376085281, 0.14121881127357483, -0.0271370317786932, -0.09363658726215363, 0.052057888358831406, -0.09259819239377975, 0.11745823174715042, -0.07147470861673355, 0.133731871843338, -0.05000022426247597, -0.11188200116157532, -0.002600935520604253, -0.12184139341115952, 0.09327671676874161, 0.11931590735912323, -0.04108133539557457, -0.016060281544923782, 0.034159157425165176, 0.110459104180336, -0.04259082302451134, -0.023038897663354874, -0.02809109352529049, -0.04174159839749336, 0.04180591553449631, 0.032415684312582016, 0.03447439521551132, 0.007039025891572237, -0.03473763167858124, -0.0643821731209755, -0.023724444210529327, 0.020604319870471954, 0.022359708324074745, 0.12092958390712738, -0.02729281410574913, 0.0414845272898674, -0.09539517015218735, -0.2779145836830139, 0.012975761666893959, 0.07154933363199234, -0.009612653404474258, -0.047984395176172256, 0.0448061004281044, -0.03204704076051712, 0.029674934223294258, -0.07189302146434784, -0.007356130052357912, -0.1067899540066719, 0.08201069384813309, -0.132941335439682, 0.052437618374824524, -0.22879360616207123, 0.017128849402070045, -0.12350090593099594, 0.0032032339368015528, 0.056031301617622375, -0.0035239176359027624, -0.09106791019439697, 0.1476309597492218, -0.07000807672739029, 0.04004378989338875, -0.07696081697940826, 
0.016991250216960907, -0.03462996333837509, 0.18347960710525513, -0.2529519498348236, -0.011280989274382591, 0.09597194939851761, -0.08884996920824051, -0.20162871479988098, 0.031373247504234314, 0.019946927204728127, 0.05791180580854416, 0.10413045436143875, 0.31524690985679626, -0.02839059755206108, -0.0308211799710989, 0.010176446288824081, 0.17371416091918945, -0.0014469040324911475, -0.20338784158229828, 0.11081785708665848, -0.08171875029802322, -0.10280986875295639, 0.018333453685045242, -0.06911084800958633, 0.03734774515032768, 0.036912087351083755, -0.11947254091501236, -0.05499511957168579, -0.07178688049316406, -0.034638289362192154, -0.04567837342619896, 0.06139034032821655, -0.06632050126791, 0.03845646604895592, -0.08239483088254929, 0.08225800096988678, 0.08198300749063492, 0.07152687758207321, 0.025771046057343483, 0.03052600286900997, -0.0014218351570889354, 0.05573296546936035, -0.06854695826768875, -0.03089955449104309, -0.022889724001288414, -0.024058420211076736, 0.07672900706529617, 0.039673756808042526, 0.04507762938737869, -0.07345927506685257, -0.034772977232933044, 0.06136646121740341, 0.03717579320073128, 0.009832513518631458, -0.037782974541187286, -0.1629893034696579, 0.055867768824100494, -0.018502745777368546, 0.07426828145980835, -0.026072049513459206, 0.0013487667310982943, 0.011323348619043827, -0.02048909105360508, -0.01942131482064724, 0.07386396080255508, 0.01358376257121563, -0.07369060814380646, -0.018057292327284813, -0.007380674127489328, 0.06958118826150894, 0.0369592122733593, -0.12989111244678497, 0.12958399951457977, -0.008593522943556309, 0.12617412209510803, 0.21199463307857513, -0.02271629311144352, 0.10810039192438126, -0.04312960430979729, 0.02791697159409523, -0.0247946847230196, 0.029134321957826614, -0.027454644441604614, -0.07823073863983154, -0.002398278098553419, 0.0314287394285202, -0.042845841497182846, -0.005812854506075382, -0.056092798709869385, -0.05928010866045952, -0.04472282528877258, 0.0517585463821888, 0.11954967677593231, -0.05459865555167198, 0.1595275104045868, 0.39211639761924744, -0.0043248990550637245, 0.12232623994350433, -0.11452768743038177, -0.04975322633981705, -0.014965266920626163, -0.04643712565302849, -0.05426543951034546, 0.11404883861541748, -0.014880559407174587, 0.013194232247769833, 0.09325316548347473, 0.04278843477368355, 0.03497496619820595, -0.12143146991729736, -0.1154615730047226, -0.05073579028248787, -0.07378184050321579, -0.15943863987922668, 0.028020963072776794, -0.09150746464729309, 0.10135632008314133, -0.027677198871970177, -0.04284514859318733, 0.1102331131696701, -0.029287179931998253, -0.04435966908931732, 0.08977755159139633, -0.19186900556087494, -0.031203879043459892, -0.11403657495975494, -0.01297778356820345, -0.0928778126835823, -0.0020748102106153965, 0.09708486497402191, -0.07596238702535629, 0.0038903255481272936, -0.020098386332392693, -0.0051674204878509045, -0.000237872198340483, -0.027500461786985397, 0.0406600758433342, 0.033147793263196945, -0.015960820019245148, -0.1399136781692505, -0.03948957845568657, -0.009645409882068634, 0.003659999230876565, 0.0607026033103466, -0.08421341329813004, 0.09037917852401733, 0.05478217080235481, 0.08323568105697632, 0.023750290274620056, -0.00855997484177351, 0.14977139234542847, -0.03796188533306122, 0.018121585249900818, 0.1652527153491974, 0.055644161999225616, 0.02339169569313526, 0.1204315796494484, 0.0778905600309372, -0.09237343072891235, -0.00027831189800053835, -0.016048645600676537, -0.09149494767189026, -0.3125532567501068, 
-0.05681520327925682, -0.07477741688489914, 0.14431972801685333, 0.017401233315467834, 0.06382044404745102, 0.0668446496129036, 0.0465361550450325, -0.04479754716157913, 0.09472646564245224, 0.0313291922211647, 0.010644184425473213, 0.09097249805927277, -0.013501185923814774, 0.03013700433075428, -0.1483750194311142, 0.028377147391438484, 0.18601329624652863, 0.16341562569141388, 0.16284367442131042, 0.16944870352745056, 0.14299899339675903, 0.0960676521062851, 0.144578754901886, 0.0030875802040100098, 0.07715248316526413, 0.06697246432304382, 0.03191114217042923, -0.061915185302495956, -0.07354624569416046, 0.03015855699777603, 0.08635545521974564, -0.10556815564632416, -0.15014436841011047, 0.06095590442419052, -0.11849454790353775, 0.07508964836597443, 0.11831527948379517, 0.10993227362632751, -0.08696433901786804, 0.0668042004108429, 0.1559353470802307, 0.10846918076276779, 0.004389485809952021, 0.14816750586032867, -0.03847214952111244, -0.029126843437552452, 0.13036374747753143, 0.014284849166870117, 0.1199425682425499, 0.010553386062383652, 0.015819234773516655, -0.0711202621459961, -0.12466340512037277, 0.0356195829808712, 0.09330129623413086, -0.2440544217824936, 0.22565563023090363, 0.027110910043120384, -0.03904147818684578, -0.054491959512233734, -0.036841996014118195, 0.11271178722381592, 0.12265503406524658, 0.14941495656967163, 0.053420182317495346, -0.19093076884746552, 0.09736019372940063, -0.14440485835075378, 0.054042816162109375, -0.023783259093761444, -0.05712093412876129, -0.10277729481458664, -0.022134335711598396, 0.04512854292988777, 0.016415780410170555, 0.11314169317483902, -0.10544609278440475, -0.09624232351779938, 0.023956527933478355, 0.16972118616104126, -0.0627533569931984, -0.045922957360744476, 0.006814065854996443, 0.027191540226340294, 0.06652193516492844, -0.05019139125943184, -0.055610306560993195, -0.03742845728993416, -0.1281065046787262, 0.08095763623714447, -0.02890252135694027, -0.06589332967996597, 0.0005954614607617259, -0.06430406123399734, -0.08648551255464554, -0.2384289652109146, 0.08270753920078278, -0.08494937419891357, 0.037778306752443314, -0.011105584912002087, 0.1182408481836319, -0.022755054756999016, 0.042627446353435516, -0.027158193290233612, 0.03313971310853958, -0.08773075044155121, -0.10515133291482925, 0.0667642280459404, 0.009222636930644512, 0.010976326651871204, 0.018546592444181442, -0.08227024972438812, 0.027391139417886734, 0.012528805062174797, -0.09127736836671829, 0.15467484295368195, 0.2656487822532654, -0.06554193049669266, 0.17172855138778687, 0.2968668043613434, -0.12533511221408844, -0.27974218130111694, -0.11681248247623444, -0.2391577810049057, -0.08301616460084915, 0.05190887674689293, -0.2126731127500534, 0.10979481041431427, 0.15296535193920135, -0.10730394721031189, 0.21116198599338531, -0.2512314021587372, -0.06437390297651291, 0.17039574682712555, -0.07335291057825089, 0.41731077432632446, -0.1913350373506546, -0.1188426986336708, -0.1474580466747284, -0.15397906303405762, 0.1582244336605072, -0.2883164882659912, 0.05120307579636574, 0.0029752140399068594, 0.061348073184490204, -0.048458345234394073, 0.01634971983730793, 0.22682765126228333, 0.0955217182636261, 0.09594234079122543, -0.08291120827198029, 0.0009271762100979686, 0.1462903469800949, -0.04570082202553749, 0.06967393308877945, -0.22117522358894348, -0.01826545223593712, -0.06851916760206223, 0.056354399770498276, -0.050201352685689926, 0.08055035769939423, -0.02618667483329773, -0.057136792689561844, -0.06173088774085045, -0.01910160481929779, 
-0.03834781423211098, -0.02607947774231434, 0.2843465209007263, 0.08465935289859772, 0.014468357898294926, 0.017088549211621284, -0.015156974084675312, -0.06041530892252922, 0.02402161993086338, -0.04969518631696701, -0.06964300572872162, 0.07363975793123245, -0.28130003809928894, -0.024473128840327263, 0.07682201266288757, -0.024614466354250908, 0.040575139224529266, 0.021791299805045128, -0.05247065797448158, -0.0033891461789608, 0.10714586079120636, -0.1387365609407425, -0.015086779370903969, -0.013888695277273655, 0.006626434624195099, 0.06085246801376343, 0.016781311482191086, 0.14976799488067627, 0.005886842962354422, 0.00527945114299655, 0.012067265808582306, 0.08214760571718216, -0.1154615730047226, 0.03900657966732979, 0.0716625303030014, -0.042345378547906876, -0.15478450059890747, 0.24121372401714325, 0.026219788938760757, -0.00609540194272995, -0.0008765840902924538, 0.10000712424516678, -0.08219442516565323, -0.11511856317520142, -0.06600653380155563, 0.0747108981013298, -0.10294292122125626, -0.09121546149253845, -0.008336801081895828, -0.08790861070156097, -0.02777901105582714, -0.04287709295749664, 0.07513955980539322, 0.06978197395801544, 0.041519977152347565, -0.08306701481342316, 0.08759152889251709, -0.004277526400983334, -0.06473849713802338, 0.004240376874804497, -0.06080226972699165, -0.15981616079807281, -0.019346898421645164, 0.08143189549446106, -0.007880396209657192, 0.0070515647530555725, -0.05909617617726326, 0.03710172697901726, -0.14849114418029785, 0.008982613682746887, -0.07986462116241455, -0.007189253345131874, -0.004003168549388647, -0.04825645685195923, -0.06749297678470612, 0.0007561391685158014, -0.12763817608356476, -0.02273799292743206, -0.02540029212832451, 0.08747548609972, -0.13129477202892303, -0.0745331421494484, 0.07898474484682083, 0.023510826751589775, 0.08478929102420807, 0.1388077437877655, -0.015150473453104496, 0.1253388226032257, -0.10206163674592972, -0.038147300481796265, 0.043058574199676514, 0.046792857348918915, 0.007068632636219263, 0.019319334998726845, -0.0913940817117691, 0.056841325014829636, 0.007747959811240435, 0.0439276285469532, -0.03383743017911911, -0.10365936160087585, -0.1080157682299614, -0.053411662578582764, -0.0947149321436882, -0.0048847380094230175, -0.1004662737250328, 0.2117718905210495, 0.06304221600294113, 0.14020924270153046, 0.02854885160923004, -0.0025207458529621363, -0.04697191342711449, 0.042464014142751694, -0.015953749418258667, -0.15036754310131073, -0.14445805549621582, -0.040760934352874756, -0.025310898199677467, -0.04537194222211838, 0.25024718046188354, -0.0063335224986076355, -0.164669468998909, 0.03663221374154091, 0.10371780395507812, -0.03696621581912041, -0.019698161631822586, 0.23308829963207245, 0.04707788676023483, 0.003985104616731405, -0.11338436603546143, 0.03526565060019493, 0.022759366780519485, -0.009865612722933292, 0.08123962581157684, 0.0907595232129097, 0.16332878172397614, 0.07267366349697113, 0.06254701316356659, -0.05401550233364105, 0.019408460706472397, -0.04281985014677048, 0.10281055420637131, 0.07965106517076492, 0.04306172579526901, 0.079475998878479, 0.12456323951482773, -0.06696615368127823, 0.014281485229730606, -0.07106661051511765, -0.03233836963772774, -0.1615592986345291, -0.1434602439403534, -0.024902649223804474, -0.09000877290964127, -0.014239112846553326, -0.059751834720373154, 0.03857477754354477, 0.20036128163337708, 0.0404939278960228, -0.04383116960525513, -0.03846457228064537, -0.14162065088748932, -0.08317594975233078, 0.03898181393742561, 
-0.027608271688222885, -0.05874689295887947, -0.051589738577604294, -0.05606946721673012, 0.023452192544937134, -0.06320998072624207, -0.07286225259304047, 0.030729856342077255, 0.03909866511821747, 0.06110646203160286, -0.13843901455402374, -0.06830646097660065, -0.09392762929201126, 0.030285559594631195, 0.03484993055462837, 0.19222186505794525, 0.06985874474048615, -0.019782861694693565, 0.12049555033445358, 0.18117836117744446, -0.035102479159832, -0.07898157089948654, -0.046167515218257904, -0.022114530205726624, -0.05667056143283844, 0.010672648437321186, -0.03495953604578972, -0.029820460826158524, -0.023591484874486923, 0.201035737991333, 0.3410007655620575, -0.0873657837510109, -0.018703797832131386, -0.04852999746799469, 0.02809414453804493, 0.016871370375156403, 0.08114541321992874, 0.09708809107542038, 0.05505547672510147, -0.05195445194840431, 0.004336134064942598, -0.04901901260018349, 0.017108654603362083, -0.15767140686511993, 0.10929764062166214, -0.01849157176911831, -0.1291435807943344, 0.016414470970630646, 0.12150868028402328, -0.03677409887313843, -0.02022743783891201, 0.0059433612041175365, -0.10226215422153473, -0.013084456324577332, -0.007830663584172726, 0.10002399981021881, 0.07488193362951279, 0.027035614475607872, -0.08688777685165405, -0.03353860601782799, 0.07747404277324677, -0.019953535869717598, -0.28269243240356445, -0.1448231190443039, 0.09929762780666351, -0.05529346317052841, 0.1659270077943802, 0.014554751105606556, 0.06304200738668442, 0.021425940096378326, -0.0005766947870142758, -0.10646778345108032, 0.10286687314510345, 0.04415128007531166, 0.059588681906461716, -0.06058543920516968, -0.12217674404382706, -0.0927838385105133, -0.029912862926721573, 0.11153130978345871, 0.07706554234027863, -0.039380818605422974, 0.2130529135465622, -0.09668995440006256, -0.04562661424279213, -0.0008607484051026404, -0.12722599506378174, 0.08387868106365204, 0.007085700985044241, -0.03993557021021843, -0.04300571605563164, -0.02987714298069477, -0.027123713865876198, 0.06166303902864456, -0.22356048226356506, -0.01884233020246029, 0.11570541560649872, -0.04955219477415085, 0.08832788467407227, 0.06135256960988045, -0.11240588873624802, -0.009783880785107613, -0.10645005851984024, 0.012999430298805237, -0.10752586275339127, 0.026890920475125313, 0.11633571982383728, -0.06462427228689194, 0.0024280031211674213, -0.03510472550988197, 0.05378483980894089, -0.025567401200532913, -0.03430665656924248, -0.1142641231417656 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
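Since the card above is an unfilled template, the only usage hints come from the repo tags below (transformers, bert, text-classification). The following is a hedged, generic sketch under those assumptions, not documented usage; it presumes the repo includes a tokenizer, and the label names are whatever the checkpoint config ships.

```python
# Hedged sketch only: generic sequence-classification inference based on the repo tags.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "CatBarks/bertES_posWeighted2_model"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("Example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label.get(pred, pred))  # label meaning is not documented in the card
```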
{"library_name": "transformers", "tags": []}
text-classification
CatBarks/bertES_posWeighted2_model
[ "transformers", "safetensors", "bert", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T19:23:21+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 46, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.06817419826984406, 0.1699906885623932, -0.003845146857202053, 0.018365124240517616, 0.11478200554847717, 0.00763329304754734, 0.07986336201429367, 0.10738246887922287, -0.0269484706223011, 0.1267213374376297, 0.03862300142645836, 0.1017010435461998, 0.11044707149267197, 0.18616852164268494, 0.002953584771603346, -0.2117370218038559, 0.062315817922353745, -0.11355884373188019, 0.01421935111284256, 0.12174045294523239, 0.14285145699977875, -0.10472407191991806, 0.07340893894433975, -0.03533155843615532, -0.019184017553925514, -0.029508300125598907, -0.06138347089290619, -0.062117863446474075, 0.06899366527795792, 0.06911981105804443, 0.06776530295610428, 0.02535320073366165, 0.07980640977621078, -0.2927248775959015, 0.019224179908633232, 0.07704847306013107, 0.004596637096256018, 0.06310366839170456, 0.07900875061750412, -0.06604467332363129, 0.12630145251750946, -0.0469624362885952, 0.15577000379562378, 0.07483451068401337, -0.09700790792703629, -0.1833430528640747, -0.07868417352437973, 0.08138132095336914, 0.1542958915233612, 0.0575118213891983, -0.03566069155931473, 0.14360417425632477, -0.0863327905535698, 0.015191552229225636, 0.06608161330223083, -0.07603584229946136, -0.05265629291534424, 0.04255614057183266, 0.07708034664392471, 0.09375373274087906, -0.1291297972202301, -0.010211804881691933, 0.04229271039366722, 0.01873886212706566, 0.10347303748130798, 0.02310175821185112, 0.11163661628961563, 0.026270611211657524, -0.13941870629787445, -0.06378244608640671, 0.1267201453447342, 0.02999917045235634, -0.05697820335626602, -0.23340454697608948, -0.007031846325844526, -0.028088124468922615, -0.024382783100008965, -0.03983099386096001, 0.03844287618994713, -0.0294374767690897, 0.07875318825244904, 0.011917876079678535, -0.07096433639526367, -0.04893866181373596, 0.08819517493247986, 0.06123629957437515, 0.022971229627728462, -0.02526908740401268, 0.02413375861942768, 0.11652170121669769, 0.09283795207738876, -0.11929406225681305, -0.06425759196281433, -0.06432286649942398, -0.08888134360313416, -0.04847237840294838, 0.03574979677796364, 0.0754702165722847, 0.04938753694295883, 0.19765597581863403, 0.006366121117025614, 0.05646394565701485, 0.0260426327586174, 0.015338202007114887, 0.06355882436037064, 0.07606974244117737, -0.0483609177172184, -0.13532373309135437, -0.041331104934215546, 0.11784996092319489, 0.007102925330400467, -0.032494835555553436, -0.03608081117272377, 0.06173410639166832, 0.05820438638329506, 0.1192656010389328, 0.06626396626234055, 0.019241811707615852, -0.06749388575553894, -0.03806937485933304, 0.1874811202287674, -0.1540532261133194, 0.020778683945536613, 0.01720726117491722, -0.05474008247256279, -0.043989501893520355, 0.0171356238424778, 0.008756347931921482, -0.02707439661026001, 0.10765543580055237, -0.0681026354432106, -0.03794260695576668, -0.10775765031576157, -0.057500679045915604, 0.032596319913864136, -0.011795170605182648, -0.030085675418376923, -0.0443500280380249, -0.1081358790397644, -0.07622874528169632, 0.06656987965106964, -0.06241556629538536, -0.07165607810020447, -0.03565853461623192, -0.05456356331706047, 0.012712954543530941, 0.002376573858782649, 0.12743701040744781, -0.02916865609586239, 0.04608776792883873, -0.04567936435341835, 0.06814887374639511, 0.13260088860988617, 0.03273140639066696, -0.07753180712461472, 0.0658058449625969, -0.21566881239414215, 0.10687019675970078, -0.09710393846035004, 0.030530039221048355, -0.1602926403284073, -0.027380328625440598, 0.025517668575048447, 0.035233598202466965, -0.01142354216426611, 
0.1405038684606552, -0.18839864432811737, -0.036833859980106354, 0.17594264447689056, -0.13455410301685333, -0.09238629788160324, 0.06278568506240845, -0.057844966650009155, 0.12792403995990753, 0.05209182947874069, -0.027332304045557976, 0.059202857315540314, -0.13285812735557556, -0.024411480873823166, -0.0557100772857666, -0.0024997375439852476, 0.1512058526277542, 0.06197551265358925, -0.05537422001361847, 0.02062765136361122, 0.020016051828861237, -0.024297641590237617, -0.045233841985464096, -0.034582652151584625, -0.0977277010679245, 0.006374812684953213, -0.07783913612365723, 0.015467152930796146, -0.014978265389800072, -0.08572793006896973, -0.037934768944978714, -0.15898989140987396, -0.0011305080261081457, 0.09650373458862305, 0.007345336955040693, -0.029424650594592094, -0.09241348505020142, 0.005526319146156311, 0.014208783395588398, -0.01407501008361578, -0.15675009787082672, -0.05031281337141991, 0.03119790367782116, -0.16866113245487213, 0.033627450466156006, -0.04903757572174072, 0.03549545630812645, 0.04459671676158905, -0.04535774141550064, -0.02160848118364811, 0.0152364457026124, 0.017460787668824196, -0.02394135482609272, -0.24046528339385986, -0.016492176800966263, -0.049182213842868805, 0.17930001020431519, -0.24510087072849274, 0.04199686273932457, 0.062341514974832535, 0.12092601507902145, 0.005246761720627546, -0.047405339777469635, 0.03611646965146065, -0.04782456159591675, -0.04614211246371269, -0.06458985060453415, -0.004041698761284351, -0.03005247749388218, -0.04619463160634041, 0.04105473682284355, -0.19605930149555206, -0.029964644461870193, 0.11028317362070084, 0.07146124541759491, -0.1701718270778656, -0.07740049809217453, -0.03032514825463295, -0.06061795726418495, -0.09144899994134903, -0.04754206910729408, 0.10501570999622345, 0.0424359068274498, 0.054926108568906784, -0.07243066281080246, -0.047703035175800323, 0.012159520760178566, -0.008316845633089542, -0.035265736281871796, 0.0910128578543663, 0.09147894382476807, -0.1183665320277214, 0.1003284826874733, 0.06719938665628433, 0.061502620577812195, 0.10171586275100708, 0.005867301486432552, -0.09559345990419388, -0.012123096734285355, 0.023821083828806877, 0.014739413745701313, 0.13627171516418457, -0.08041682839393616, 0.03041158802807331, 0.043761420994997025, -0.03445654734969139, 0.011279189959168434, -0.10341424494981766, 0.02347799763083458, 0.03186830133199692, -0.007050554268062115, 0.025736309587955475, -0.054652560502290726, 0.013161799870431423, 0.1042112186551094, 0.03211836516857147, 0.0227707140147686, 0.015011876821517944, -0.03876445069909096, -0.12403564900159836, 0.17888623476028442, -0.09523385018110275, -0.25718894600868225, -0.12982366979122162, 0.0025806569028645754, 0.04723223298788071, -0.01322246715426445, 0.01721704937517643, -0.057064954191446304, -0.10620168596506119, -0.10562704503536224, 0.017637979239225388, 0.05363597348332405, -0.08985256403684616, -0.06360358744859695, 0.05353172495961189, 0.038684699684381485, -0.12286891043186188, 0.023170825093984604, 0.04556644707918167, -0.0685787945985794, 0.004107215907424688, 0.05788148567080498, 0.08483386784791946, 0.18220773339271545, 0.013182112947106361, -0.017085859552025795, 0.012520790100097656, 0.22458304464817047, -0.14599265158176422, 0.09336943179368973, 0.13670575618743896, -0.0603153258562088, 0.08385994285345078, 0.20927630364894867, 0.031639765948057175, -0.09247095137834549, 0.04077373072504997, 0.032938770949840546, -0.040111273527145386, -0.23512989282608032, -0.07784179598093033, 0.0005755177116952837, 
-0.07578593492507935, 0.10564399510622025, 0.09113350510597229, 0.11394096910953522, 0.05373004451394081, -0.10628228634595871, -0.06785868853330612, 0.04576247185468674, 0.11892180144786835, -0.020387137308716774, 0.0034232554025948048, 0.09533460438251495, -0.032669007778167725, 0.016892950981855392, 0.0903218612074852, 0.010076770558953285, 0.18146716058254242, 0.040793538093566895, 0.12895575165748596, 0.08216089755296707, 0.06404399126768112, 0.023877892643213272, 0.01690720207989216, 0.028041476383805275, 0.02853785827755928, -0.021422842517495155, -0.08959300816059113, -0.01811058260500431, 0.14208537340164185, 0.03174193948507309, 0.030387144535779953, 0.009561240673065186, -0.0344390794634819, 0.0656830444931984, 0.16341377794742584, 0.01373966783285141, -0.23032663762569427, -0.06265294551849365, 0.07538370788097382, -0.07251506298780441, -0.11472991853952408, -0.007447437848895788, 0.029569825157523155, -0.17949488759040833, 0.045079123228788376, -0.02245110087096691, 0.1028464064002037, -0.11004801839590073, -0.024476202204823494, 0.04228143393993378, 0.06811302900314331, -0.03619502857327461, 0.07936927676200867, -0.21071307361125946, 0.14414268732070923, 0.0071875168941915035, 0.0627245381474495, -0.10963346809148788, 0.08230046182870865, 0.02151823230087757, 0.009466269053518772, 0.16101586818695068, -0.0074920570477843285, -0.09318114817142487, -0.07651645690202713, -0.07556641101837158, -0.011319656856358051, 0.09559466689825058, -0.10184428840875626, 0.08486217260360718, -0.008358954451978207, -0.03313955292105675, -0.00388424564152956, -0.1140027567744255, -0.13622364401817322, -0.18601436913013458, 0.05523287504911423, -0.11181046068668365, 0.03691478446125984, -0.11166879534721375, -0.06252610683441162, -0.02911795862019062, 0.19807842373847961, -0.1904531568288803, -0.08140338957309723, -0.14539870619773865, -0.07204011082649231, 0.12212951481342316, -0.04274967685341835, 0.07663191109895706, 0.00015701932716183364, 0.2071707546710968, -0.004644640255719423, 0.0014644638868048787, 0.0856679305434227, -0.09557735919952393, -0.206184521317482, -0.09439684450626373, 0.13821037113666534, 0.12497473508119583, 0.04596934840083122, -0.0036321566440165043, 0.024304913356900215, -0.0027867835015058517, -0.10976199060678482, 0.02332260087132454, 0.1432444006204605, 0.08416087180376053, 0.03885705769062042, -0.02675866149365902, -0.14533737301826477, -0.1054752767086029, -0.05289754271507263, 0.019448768347501755, 0.17674845457077026, -0.07222644239664078, 0.1607094258069992, 0.15837931632995605, -0.06414622813463211, -0.20734171569347382, 0.032782182097435, 0.03679283335804939, -0.011663361452519894, 0.03244366869330406, -0.20815548300743103, 0.07330463081598282, 0.016213007271289825, -0.06075131520628929, 0.1363404095172882, -0.1705039143562317, -0.14891991019248962, 0.0919104814529419, 0.07189090549945831, -0.2193969339132309, -0.13394345343112946, -0.09907522052526474, -0.055755600333213806, -0.10410746932029724, 0.08695419132709503, 0.014253350906074047, 0.004559517838060856, 0.040003977715969086, 0.024713784456253052, 0.021094202995300293, -0.05303549766540527, 0.19554594159126282, -0.004308625590056181, 0.041122131049633026, -0.08143328875303268, -0.08729361742734909, 0.030160382390022278, -0.06146852299571037, 0.07429458200931549, -0.02577015943825245, 0.004456855356693268, -0.1102396696805954, -0.06384536623954773, -0.05289682373404503, 0.03639809414744377, -0.08915901929140091, -0.0958789587020874, -0.05767008289694786, 0.10389325767755508, 0.08919540792703629, 
-0.03324571251869202, -0.058615610003471375, -0.10058292001485825, 0.0726626068353653, 0.22699709236621857, 0.18807223439216614, 0.07284927368164062, -0.07015843689441681, 0.0006279588560573757, -0.022037893533706665, 0.05516184866428375, -0.20622296631336212, 0.04608523100614548, 0.042553652077913284, 0.028887338936328888, 0.13527612388134003, -0.02506665140390396, -0.1602775603532791, -0.04527048021554947, 0.06014934554696083, -0.06545355916023254, -0.1614707112312317, -0.0005388054414652288, 0.09576781094074249, -0.16179001331329346, -0.06273222714662552, 0.024773813784122467, -0.036137934774160385, -0.025756290182471275, 0.0013679420808330178, 0.08270203322172165, 0.027825508266687393, 0.11478793621063232, 0.06896458566188812, 0.11150709539651871, -0.10231363028287888, 0.08406093716621399, 0.09299708157777786, -0.10971303284168243, 0.03247435390949249, 0.07298728823661804, -0.0610542818903923, -0.03390142321586609, 0.023122351616621017, 0.08364028483629227, 0.026266440749168396, -0.0744837298989296, -0.0008558011031709611, -0.1099681630730629, 0.06663114577531815, 0.13796411454677582, 0.032853204756975174, 0.0030810926109552383, 0.04435998201370239, 0.025823330506682396, -0.09881676733493805, 0.11186433583498001, 0.03916766867041588, 0.03720828518271446, -0.04767070338129997, 0.004865953233093023, 0.041960928589105606, -0.01269921287894249, -0.016253290697932243, -0.039693526923656464, -0.06471271812915802, -0.010708925314247608, -0.15688052773475647, 0.031037067994475365, -0.07176970690488815, 0.009115522727370262, 0.018755896016955376, -0.033779606223106384, 0.0002807097043842077, 0.0073861307464540005, -0.07919271290302277, -0.03761441633105278, -0.006646361667662859, 0.10705258697271347, -0.15747743844985962, 0.008323745802044868, 0.08949586004018784, -0.12556882202625275, 0.07766758650541306, -0.007498627994209528, -0.010838181711733341, 0.01879316382110119, -0.14380721747875214, 0.06054820865392685, -0.008177737705409527, 0.006405212916433811, 0.023949483409523964, -0.20071232318878174, 0.005702852737158537, -0.04664513096213341, -0.053938448429107666, -0.00976315326988697, -0.04211960732936859, -0.11404810100793839, 0.10492629557847977, 0.0196357611566782, -0.0860515683889389, -0.018402770161628723, 0.05309472978115082, 0.10592338442802429, -0.057369641959667206, 0.1371336728334427, -0.02283608354628086, 0.05825338885188103, -0.17831756174564362, -0.016339747235178947, -0.017454219982028008, 0.012596609070897102, -0.03102201037108898, -0.008158523589372635, 0.05483707785606384, -0.015072896145284176, 0.22714339196681976, -0.021177595481276512, 0.030790245160460472, 0.06548503786325455, 0.0070373364724218845, -0.013032838702201843, 0.08790382742881775, 0.04639120027422905, 0.021969040855765343, 0.017426103353500366, 0.016819516196846962, -0.047575462609529495, -0.019116412848234177, -0.12834098935127258, 0.08396804332733154, 0.16439755260944366, 0.08264775574207306, -0.005125291179865599, 0.053218428045511246, -0.11920209228992462, -0.08098750561475754, 0.10049403458833694, -0.033211447298526764, -0.001258186181075871, -0.057700008153915405, 0.14298145473003387, 0.15607422590255737, -0.1750815361738205, 0.06616412103176117, -0.07047461718320847, -0.05687202885746956, -0.11070677638053894, -0.17143365740776062, -0.06694129854440689, -0.03149404004216194, -0.005430171266198158, -0.06143372505903244, 0.06926561146974564, 0.10244123637676239, 0.008475886657834053, 0.002354414900764823, 0.08415096998214722, -0.033749498426914215, -0.0007962242234498262, 0.04344722256064415, 
0.05283457785844803, 0.021373692899942398, -0.06691429764032364, 0.0076249162666499615, 0.004598149098455906, 0.038937900215387344, 0.05476561188697815, 0.0317605659365654, -0.014559607952833176, 0.011871086433529854, -0.013089693151414394, -0.09815122187137604, 0.03718226030468941, -0.029980625957250595, -0.0468674972653389, 0.14802806079387665, 0.01827765442430973, 0.0034919960889965296, -0.021031659096479416, 0.23128560185432434, -0.06903756409883499, -0.0798255056142807, -0.14009471237659454, 0.15071772038936615, -0.04670744761824608, 0.05065378174185753, 0.04940982535481453, -0.10087474435567856, 0.03407741338014603, 0.14691931009292603, 0.14527682960033417, -0.02467990294098854, 0.007901503704488277, 0.011187983676791191, 0.0055741616524755955, -0.025625228881835938, 0.05354921892285347, 0.04412171617150307, 0.12145667523145676, -0.06669453531503677, 0.09297986328601837, -0.007810541894286871, -0.0844663754105568, -0.02094031497836113, 0.1328510195016861, 0.0014671299140900373, 0.02338746376335621, -0.0805477648973465, 0.11851188540458679, -0.06559251248836517, -0.25864502787590027, 0.061333827674388885, -0.06666524708271027, -0.15384668111801147, -0.018917718902230263, 0.02399173192679882, 0.00401253392919898, 0.024401430040597916, 0.06268756836652756, -0.06360985338687897, 0.14903949201107025, 0.03688151761889458, -0.07834678888320923, -0.07808853685855865, 0.07696148753166199, -0.08397532254457474, 0.3018210828304291, 0.008228152059018612, 0.04951678216457367, 0.09650786966085434, -0.03327273949980736, -0.13361208140850067, 0.04569283500313759, 0.09728528559207916, -0.06408768892288208, 0.06690182536840439, 0.19748380780220032, -0.008177485316991806, 0.12026696652173996, 0.07469146698713303, -0.08128973841667175, 0.057554539293050766, -0.07613562047481537, -0.09007242321968079, -0.09192728251218796, 0.08888110518455505, -0.060599785298109055, 0.15479759871959686, 0.13393908739089966, -0.04440179467201233, -0.001819826546125114, -0.03071022778749466, 0.05197824910283089, -0.002023093169555068, 0.1104598417878151, 0.022785736247897148, -0.19388216733932495, 0.031831543892621994, -0.014316190034151077, 0.0986877828836441, -0.2479904145002365, -0.07837841659784317, 0.0403057225048542, -0.013808837160468102, -0.05274871736764908, 0.12204353511333466, 0.052187733352184296, 0.04937480762600899, -0.05449601635336876, -0.057812657207250595, -0.00025569170247763395, 0.16358551383018494, -0.1094348207116127, -0.00204258831217885 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
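The record above describes `CatBarks/bertES_posWeighted0.9_model`, which this row tags as a `bert` checkpoint for `text-classification` under the `transformers` library. Its "How to Get Started with the Model" section is still a placeholder, so the snippet below is only a minimal sketch of how such a checkpoint is typically loaded; the use of `AutoTokenizer`/`AutoModelForSequenceClassification` and the example sentence are assumptions based on the record's tags, not documentation from the original card.

```python
# Minimal sketch (assumptions noted above): load the checkpoint referenced by this
# record and run a single text-classification forward pass with Transformers.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "CatBarks/bertES_posWeighted0.9_model"  # id field of this record

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical input text; the card does not document the expected language or label set.
inputs = tokenizer("Example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(predicted_class, logits.softmax(dim=-1).tolist())
```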
{"library_name": "transformers", "tags": []}
text-classification
CatBarks/bertES_posWeighted0.9_model
[ "transformers", "safetensors", "bert", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T19:23:50+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 46, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.06817419826984406, 0.1699906885623932, -0.003845146857202053, 0.018365124240517616, 0.11478200554847717, 0.00763329304754734, 0.07986336201429367, 0.10738246887922287, -0.0269484706223011, 0.1267213374376297, 0.03862300142645836, 0.1017010435461998, 0.11044707149267197, 0.18616852164268494, 0.002953584771603346, -0.2117370218038559, 0.062315817922353745, -0.11355884373188019, 0.01421935111284256, 0.12174045294523239, 0.14285145699977875, -0.10472407191991806, 0.07340893894433975, -0.03533155843615532, -0.019184017553925514, -0.029508300125598907, -0.06138347089290619, -0.062117863446474075, 0.06899366527795792, 0.06911981105804443, 0.06776530295610428, 0.02535320073366165, 0.07980640977621078, -0.2927248775959015, 0.019224179908633232, 0.07704847306013107, 0.004596637096256018, 0.06310366839170456, 0.07900875061750412, -0.06604467332363129, 0.12630145251750946, -0.0469624362885952, 0.15577000379562378, 0.07483451068401337, -0.09700790792703629, -0.1833430528640747, -0.07868417352437973, 0.08138132095336914, 0.1542958915233612, 0.0575118213891983, -0.03566069155931473, 0.14360417425632477, -0.0863327905535698, 0.015191552229225636, 0.06608161330223083, -0.07603584229946136, -0.05265629291534424, 0.04255614057183266, 0.07708034664392471, 0.09375373274087906, -0.1291297972202301, -0.010211804881691933, 0.04229271039366722, 0.01873886212706566, 0.10347303748130798, 0.02310175821185112, 0.11163661628961563, 0.026270611211657524, -0.13941870629787445, -0.06378244608640671, 0.1267201453447342, 0.02999917045235634, -0.05697820335626602, -0.23340454697608948, -0.007031846325844526, -0.028088124468922615, -0.024382783100008965, -0.03983099386096001, 0.03844287618994713, -0.0294374767690897, 0.07875318825244904, 0.011917876079678535, -0.07096433639526367, -0.04893866181373596, 0.08819517493247986, 0.06123629957437515, 0.022971229627728462, -0.02526908740401268, 0.02413375861942768, 0.11652170121669769, 0.09283795207738876, -0.11929406225681305, -0.06425759196281433, -0.06432286649942398, -0.08888134360313416, -0.04847237840294838, 0.03574979677796364, 0.0754702165722847, 0.04938753694295883, 0.19765597581863403, 0.006366121117025614, 0.05646394565701485, 0.0260426327586174, 0.015338202007114887, 0.06355882436037064, 0.07606974244117737, -0.0483609177172184, -0.13532373309135437, -0.041331104934215546, 0.11784996092319489, 0.007102925330400467, -0.032494835555553436, -0.03608081117272377, 0.06173410639166832, 0.05820438638329506, 0.1192656010389328, 0.06626396626234055, 0.019241811707615852, -0.06749388575553894, -0.03806937485933304, 0.1874811202287674, -0.1540532261133194, 0.020778683945536613, 0.01720726117491722, -0.05474008247256279, -0.043989501893520355, 0.0171356238424778, 0.008756347931921482, -0.02707439661026001, 0.10765543580055237, -0.0681026354432106, -0.03794260695576668, -0.10775765031576157, -0.057500679045915604, 0.032596319913864136, -0.011795170605182648, -0.030085675418376923, -0.0443500280380249, -0.1081358790397644, -0.07622874528169632, 0.06656987965106964, -0.06241556629538536, -0.07165607810020447, -0.03565853461623192, -0.05456356331706047, 0.012712954543530941, 0.002376573858782649, 0.12743701040744781, -0.02916865609586239, 0.04608776792883873, -0.04567936435341835, 0.06814887374639511, 0.13260088860988617, 0.03273140639066696, -0.07753180712461472, 0.0658058449625969, -0.21566881239414215, 0.10687019675970078, -0.09710393846035004, 0.030530039221048355, -0.1602926403284073, -0.027380328625440598, 0.025517668575048447, 0.035233598202466965, -0.01142354216426611, 
0.1405038684606552, -0.18839864432811737, -0.036833859980106354, 0.17594264447689056, -0.13455410301685333, -0.09238629788160324, 0.06278568506240845, -0.057844966650009155, 0.12792403995990753, 0.05209182947874069, -0.027332304045557976, 0.059202857315540314, -0.13285812735557556, -0.024411480873823166, -0.0557100772857666, -0.0024997375439852476, 0.1512058526277542, 0.06197551265358925, -0.05537422001361847, 0.02062765136361122, 0.020016051828861237, -0.024297641590237617, -0.045233841985464096, -0.034582652151584625, -0.0977277010679245, 0.006374812684953213, -0.07783913612365723, 0.015467152930796146, -0.014978265389800072, -0.08572793006896973, -0.037934768944978714, -0.15898989140987396, -0.0011305080261081457, 0.09650373458862305, 0.007345336955040693, -0.029424650594592094, -0.09241348505020142, 0.005526319146156311, 0.014208783395588398, -0.01407501008361578, -0.15675009787082672, -0.05031281337141991, 0.03119790367782116, -0.16866113245487213, 0.033627450466156006, -0.04903757572174072, 0.03549545630812645, 0.04459671676158905, -0.04535774141550064, -0.02160848118364811, 0.0152364457026124, 0.017460787668824196, -0.02394135482609272, -0.24046528339385986, -0.016492176800966263, -0.049182213842868805, 0.17930001020431519, -0.24510087072849274, 0.04199686273932457, 0.062341514974832535, 0.12092601507902145, 0.005246761720627546, -0.047405339777469635, 0.03611646965146065, -0.04782456159591675, -0.04614211246371269, -0.06458985060453415, -0.004041698761284351, -0.03005247749388218, -0.04619463160634041, 0.04105473682284355, -0.19605930149555206, -0.029964644461870193, 0.11028317362070084, 0.07146124541759491, -0.1701718270778656, -0.07740049809217453, -0.03032514825463295, -0.06061795726418495, -0.09144899994134903, -0.04754206910729408, 0.10501570999622345, 0.0424359068274498, 0.054926108568906784, -0.07243066281080246, -0.047703035175800323, 0.012159520760178566, -0.008316845633089542, -0.035265736281871796, 0.0910128578543663, 0.09147894382476807, -0.1183665320277214, 0.1003284826874733, 0.06719938665628433, 0.061502620577812195, 0.10171586275100708, 0.005867301486432552, -0.09559345990419388, -0.012123096734285355, 0.023821083828806877, 0.014739413745701313, 0.13627171516418457, -0.08041682839393616, 0.03041158802807331, 0.043761420994997025, -0.03445654734969139, 0.011279189959168434, -0.10341424494981766, 0.02347799763083458, 0.03186830133199692, -0.007050554268062115, 0.025736309587955475, -0.054652560502290726, 0.013161799870431423, 0.1042112186551094, 0.03211836516857147, 0.0227707140147686, 0.015011876821517944, -0.03876445069909096, -0.12403564900159836, 0.17888623476028442, -0.09523385018110275, -0.25718894600868225, -0.12982366979122162, 0.0025806569028645754, 0.04723223298788071, -0.01322246715426445, 0.01721704937517643, -0.057064954191446304, -0.10620168596506119, -0.10562704503536224, 0.017637979239225388, 0.05363597348332405, -0.08985256403684616, -0.06360358744859695, 0.05353172495961189, 0.038684699684381485, -0.12286891043186188, 0.023170825093984604, 0.04556644707918167, -0.0685787945985794, 0.004107215907424688, 0.05788148567080498, 0.08483386784791946, 0.18220773339271545, 0.013182112947106361, -0.017085859552025795, 0.012520790100097656, 0.22458304464817047, -0.14599265158176422, 0.09336943179368973, 0.13670575618743896, -0.0603153258562088, 0.08385994285345078, 0.20927630364894867, 0.031639765948057175, -0.09247095137834549, 0.04077373072504997, 0.032938770949840546, -0.040111273527145386, -0.23512989282608032, -0.07784179598093033, 0.0005755177116952837, 
-0.07578593492507935, 0.10564399510622025, 0.09113350510597229, 0.11394096910953522, 0.05373004451394081, -0.10628228634595871, -0.06785868853330612, 0.04576247185468674, 0.11892180144786835, -0.020387137308716774, 0.0034232554025948048, 0.09533460438251495, -0.032669007778167725, 0.016892950981855392, 0.0903218612074852, 0.010076770558953285, 0.18146716058254242, 0.040793538093566895, 0.12895575165748596, 0.08216089755296707, 0.06404399126768112, 0.023877892643213272, 0.01690720207989216, 0.028041476383805275, 0.02853785827755928, -0.021422842517495155, -0.08959300816059113, -0.01811058260500431, 0.14208537340164185, 0.03174193948507309, 0.030387144535779953, 0.009561240673065186, -0.0344390794634819, 0.0656830444931984, 0.16341377794742584, 0.01373966783285141, -0.23032663762569427, -0.06265294551849365, 0.07538370788097382, -0.07251506298780441, -0.11472991853952408, -0.007447437848895788, 0.029569825157523155, -0.17949488759040833, 0.045079123228788376, -0.02245110087096691, 0.1028464064002037, -0.11004801839590073, -0.024476202204823494, 0.04228143393993378, 0.06811302900314331, -0.03619502857327461, 0.07936927676200867, -0.21071307361125946, 0.14414268732070923, 0.0071875168941915035, 0.0627245381474495, -0.10963346809148788, 0.08230046182870865, 0.02151823230087757, 0.009466269053518772, 0.16101586818695068, -0.0074920570477843285, -0.09318114817142487, -0.07651645690202713, -0.07556641101837158, -0.011319656856358051, 0.09559466689825058, -0.10184428840875626, 0.08486217260360718, -0.008358954451978207, -0.03313955292105675, -0.00388424564152956, -0.1140027567744255, -0.13622364401817322, -0.18601436913013458, 0.05523287504911423, -0.11181046068668365, 0.03691478446125984, -0.11166879534721375, -0.06252610683441162, -0.02911795862019062, 0.19807842373847961, -0.1904531568288803, -0.08140338957309723, -0.14539870619773865, -0.07204011082649231, 0.12212951481342316, -0.04274967685341835, 0.07663191109895706, 0.00015701932716183364, 0.2071707546710968, -0.004644640255719423, 0.0014644638868048787, 0.0856679305434227, -0.09557735919952393, -0.206184521317482, -0.09439684450626373, 0.13821037113666534, 0.12497473508119583, 0.04596934840083122, -0.0036321566440165043, 0.024304913356900215, -0.0027867835015058517, -0.10976199060678482, 0.02332260087132454, 0.1432444006204605, 0.08416087180376053, 0.03885705769062042, -0.02675866149365902, -0.14533737301826477, -0.1054752767086029, -0.05289754271507263, 0.019448768347501755, 0.17674845457077026, -0.07222644239664078, 0.1607094258069992, 0.15837931632995605, -0.06414622813463211, -0.20734171569347382, 0.032782182097435, 0.03679283335804939, -0.011663361452519894, 0.03244366869330406, -0.20815548300743103, 0.07330463081598282, 0.016213007271289825, -0.06075131520628929, 0.1363404095172882, -0.1705039143562317, -0.14891991019248962, 0.0919104814529419, 0.07189090549945831, -0.2193969339132309, -0.13394345343112946, -0.09907522052526474, -0.055755600333213806, -0.10410746932029724, 0.08695419132709503, 0.014253350906074047, 0.004559517838060856, 0.040003977715969086, 0.024713784456253052, 0.021094202995300293, -0.05303549766540527, 0.19554594159126282, -0.004308625590056181, 0.041122131049633026, -0.08143328875303268, -0.08729361742734909, 0.030160382390022278, -0.06146852299571037, 0.07429458200931549, -0.02577015943825245, 0.004456855356693268, -0.1102396696805954, -0.06384536623954773, -0.05289682373404503, 0.03639809414744377, -0.08915901929140091, -0.0958789587020874, -0.05767008289694786, 0.10389325767755508, 0.08919540792703629, 
-0.03324571251869202, -0.058615610003471375, -0.10058292001485825, 0.0726626068353653, 0.22699709236621857, 0.18807223439216614, 0.07284927368164062, -0.07015843689441681, 0.0006279588560573757, -0.022037893533706665, 0.05516184866428375, -0.20622296631336212, 0.04608523100614548, 0.042553652077913284, 0.028887338936328888, 0.13527612388134003, -0.02506665140390396, -0.1602775603532791, -0.04527048021554947, 0.06014934554696083, -0.06545355916023254, -0.1614707112312317, -0.0005388054414652288, 0.09576781094074249, -0.16179001331329346, -0.06273222714662552, 0.024773813784122467, -0.036137934774160385, -0.025756290182471275, 0.0013679420808330178, 0.08270203322172165, 0.027825508266687393, 0.11478793621063232, 0.06896458566188812, 0.11150709539651871, -0.10231363028287888, 0.08406093716621399, 0.09299708157777786, -0.10971303284168243, 0.03247435390949249, 0.07298728823661804, -0.0610542818903923, -0.03390142321586609, 0.023122351616621017, 0.08364028483629227, 0.026266440749168396, -0.0744837298989296, -0.0008558011031709611, -0.1099681630730629, 0.06663114577531815, 0.13796411454677582, 0.032853204756975174, 0.0030810926109552383, 0.04435998201370239, 0.025823330506682396, -0.09881676733493805, 0.11186433583498001, 0.03916766867041588, 0.03720828518271446, -0.04767070338129997, 0.004865953233093023, 0.041960928589105606, -0.01269921287894249, -0.016253290697932243, -0.039693526923656464, -0.06471271812915802, -0.010708925314247608, -0.15688052773475647, 0.031037067994475365, -0.07176970690488815, 0.009115522727370262, 0.018755896016955376, -0.033779606223106384, 0.0002807097043842077, 0.0073861307464540005, -0.07919271290302277, -0.03761441633105278, -0.006646361667662859, 0.10705258697271347, -0.15747743844985962, 0.008323745802044868, 0.08949586004018784, -0.12556882202625275, 0.07766758650541306, -0.007498627994209528, -0.010838181711733341, 0.01879316382110119, -0.14380721747875214, 0.06054820865392685, -0.008177737705409527, 0.006405212916433811, 0.023949483409523964, -0.20071232318878174, 0.005702852737158537, -0.04664513096213341, -0.053938448429107666, -0.00976315326988697, -0.04211960732936859, -0.11404810100793839, 0.10492629557847977, 0.0196357611566782, -0.0860515683889389, -0.018402770161628723, 0.05309472978115082, 0.10592338442802429, -0.057369641959667206, 0.1371336728334427, -0.02283608354628086, 0.05825338885188103, -0.17831756174564362, -0.016339747235178947, -0.017454219982028008, 0.012596609070897102, -0.03102201037108898, -0.008158523589372635, 0.05483707785606384, -0.015072896145284176, 0.22714339196681976, -0.021177595481276512, 0.030790245160460472, 0.06548503786325455, 0.0070373364724218845, -0.013032838702201843, 0.08790382742881775, 0.04639120027422905, 0.021969040855765343, 0.017426103353500366, 0.016819516196846962, -0.047575462609529495, -0.019116412848234177, -0.12834098935127258, 0.08396804332733154, 0.16439755260944366, 0.08264775574207306, -0.005125291179865599, 0.053218428045511246, -0.11920209228992462, -0.08098750561475754, 0.10049403458833694, -0.033211447298526764, -0.001258186181075871, -0.057700008153915405, 0.14298145473003387, 0.15607422590255737, -0.1750815361738205, 0.06616412103176117, -0.07047461718320847, -0.05687202885746956, -0.11070677638053894, -0.17143365740776062, -0.06694129854440689, -0.03149404004216194, -0.005430171266198158, -0.06143372505903244, 0.06926561146974564, 0.10244123637676239, 0.008475886657834053, 0.002354414900764823, 0.08415096998214722, -0.033749498426914215, -0.0007962242234498262, 0.04344722256064415, 
0.05283457785844803, 0.021373692899942398, -0.06691429764032364, 0.0076249162666499615, 0.004598149098455906, 0.038937900215387344, 0.05476561188697815, 0.0317605659365654, -0.014559607952833176, 0.011871086433529854, -0.013089693151414394, -0.09815122187137604, 0.03718226030468941, -0.029980625957250595, -0.0468674972653389, 0.14802806079387665, 0.01827765442430973, 0.0034919960889965296, -0.021031659096479416, 0.23128560185432434, -0.06903756409883499, -0.0798255056142807, -0.14009471237659454, 0.15071772038936615, -0.04670744761824608, 0.05065378174185753, 0.04940982535481453, -0.10087474435567856, 0.03407741338014603, 0.14691931009292603, 0.14527682960033417, -0.02467990294098854, 0.007901503704488277, 0.011187983676791191, 0.0055741616524755955, -0.025625228881835938, 0.05354921892285347, 0.04412171617150307, 0.12145667523145676, -0.06669453531503677, 0.09297986328601837, -0.007810541894286871, -0.0844663754105568, -0.02094031497836113, 0.1328510195016861, 0.0014671299140900373, 0.02338746376335621, -0.0805477648973465, 0.11851188540458679, -0.06559251248836517, -0.25864502787590027, 0.061333827674388885, -0.06666524708271027, -0.15384668111801147, -0.018917718902230263, 0.02399173192679882, 0.00401253392919898, 0.024401430040597916, 0.06268756836652756, -0.06360985338687897, 0.14903949201107025, 0.03688151761889458, -0.07834678888320923, -0.07808853685855865, 0.07696148753166199, -0.08397532254457474, 0.3018210828304291, 0.008228152059018612, 0.04951678216457367, 0.09650786966085434, -0.03327273949980736, -0.13361208140850067, 0.04569283500313759, 0.09728528559207916, -0.06408768892288208, 0.06690182536840439, 0.19748380780220032, -0.008177485316991806, 0.12026696652173996, 0.07469146698713303, -0.08128973841667175, 0.057554539293050766, -0.07613562047481537, -0.09007242321968079, -0.09192728251218796, 0.08888110518455505, -0.060599785298109055, 0.15479759871959686, 0.13393908739089966, -0.04440179467201233, -0.001819826546125114, -0.03071022778749466, 0.05197824910283089, -0.002023093169555068, 0.1104598417878151, 0.022785736247897148, -0.19388216733932495, 0.031831543892621994, -0.014316190034151077, 0.0986877828836441, -0.2479904145002365, -0.07837841659784317, 0.0403057225048542, -0.013808837160468102, -0.05274871736764908, 0.12204353511333466, 0.052187733352184296, 0.04937480762600899, -0.05449601635336876, -0.057812657207250595, -0.00025569170247763395, 0.16358551383018494, -0.1094348207116127, -0.00204258831217885 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
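This second record points to `CatBarks/bertES_posWeighted2_tokenizer`, which by its name and its sparser tag set (`transformers` only, no pipeline tag) appears to be a tokenizer repository rather than a full model. The card again provides no code, so the following is only a sketch under that assumption; the sample string is hypothetical.

```python
# Minimal sketch (assumption: the repository hosts tokenizer files loadable via AutoTokenizer).
from transformers import AutoTokenizer

tokenizer_id = "CatBarks/bertES_posWeighted2_tokenizer"  # id field of this record
tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)

# Hypothetical sample text; the card does not document the intended language or casing.
sample = "Texto de ejemplo para tokenizar."
tokens = tokenizer.tokenize(sample)          # subword tokens as strings
encoded = tokenizer(sample, return_tensors="pt")  # ids ready for a model forward pass
print(tokens)
print(encoded["input_ids"].shape)
```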
{"library_name": "transformers", "tags": []}
null
CatBarks/bertES_posWeighted2_tokenizer
[ "transformers", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T19:24:30+00:00
[ "1910.09700" ]
[]
TAGS #transformers #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.08389580249786377, 0.19830818474292755, -0.0013316317927092314, 0.02313883788883686, 0.11396584659814835, 0.01961737498641014, 0.053626976907253265, 0.14538456499576569, 0.0060051376931369305, 0.10656800121068954, 0.066679947078228, 0.09131570905447006, 0.09678101539611816, 0.20042605698108673, 0.04371999576687813, -0.17659740149974823, 0.010636410675942898, -0.06930278241634369, -0.010073255747556686, 0.11651819199323654, 0.141214057803154, -0.10151198506355286, 0.07627976685762405, -0.03319970890879631, -0.02870541252195835, -0.0070160143077373505, -0.07769215852022171, -0.05755697935819626, 0.07573003321886063, 0.054863471537828445, 0.04207949340343475, -0.0008347301045432687, 0.08447454124689102, -0.2674994468688965, 0.013753628358244896, 0.07452993094921112, 0.010659529827535152, 0.05990942195057869, 0.07833302766084671, -0.04036625102162361, 0.12881849706172943, -0.06320446729660034, 0.13035163283348083, 0.0906217098236084, -0.0681561604142189, -0.24378153681755066, -0.08239314705133438, 0.06505522131919861, 0.12533815205097198, 0.07694927603006363, -0.02823091857135296, 0.16422191262245178, -0.07247646898031235, 0.019290022552013397, 0.09481704235076904, -0.1151006743311882, -0.060644298791885376, 0.08318385481834412, 0.14101974666118622, 0.10340547561645508, -0.1255619376897812, -0.012289565056562424, 0.04275871813297272, 0.045979104936122894, 0.07389909774065018, 0.011339850723743439, 0.1143413558602333, 0.05629947781562805, -0.13526225090026855, -0.05700986459851265, 0.14547574520111084, 0.023872992023825645, -0.057064127177000046, -0.2138909548521042, -0.002902575535699725, -0.07730814069509506, -0.011685127392411232, -0.06846728920936584, 0.0291305985301733, -0.01194276288151741, 0.060226380825042725, -0.0496203787624836, -0.09797755628824234, -0.046314824372529984, 0.1015089675784111, 0.054820988327264786, 0.011354796588420868, -0.01489334274083376, 0.03576440364122391, 0.13432876765727997, 0.04213530570268631, -0.10012737661600113, -0.07065672427415848, -0.0701170489192009, -0.09620913118124008, -0.03947552293539047, 0.04272124543786049, 0.020167991518974304, 0.042202774435281754, 0.2283228635787964, 0.024096308276057243, 0.05459817871451378, 0.029667891561985016, 0.0026177873369306326, 0.03211980313062668, 0.1073630079627037, -0.041210614144802094, -0.188126802444458, -0.03292805701494217, 0.0931866466999054, -0.009821015410125256, -0.028658604249358177, -0.033444397151470184, 0.035014089196920395, 0.08379437029361725, 0.11821532249450684, 0.08875755965709686, -0.012828069739043713, -0.037612639367580414, -0.03493109717965126, 0.2115669697523117, -0.14141373336315155, 0.045799970626831055, -0.022097334265708923, -0.018195297569036484, -0.06905751675367355, 0.030103791505098343, 0.01831657998263836, -0.003142025787383318, 0.06966056674718857, -0.061253178864717484, -0.05794486775994301, -0.11518853157758713, -0.045523155480623245, 0.04711875319480896, -0.024105608463287354, -0.024469668045639992, -0.07765042781829834, -0.11219723522663116, -0.06417357176542282, 0.06612563133239746, -0.04156653955578804, -0.03974827378988266, 0.005308232270181179, -0.07131324708461761, 0.008387917652726173, 0.008993842639029026, 0.12122467905282974, -0.030063031241297722, 0.05833350867033005, -0.002476902212947607, 0.05916252359747887, 0.10643328726291656, 0.03227818012237549, -0.08492200076580048, 0.057466037571430206, -0.20633617043495178, 0.08371785283088684, -0.11420095711946487, 0.034276340156793594, -0.17048145830631256, -0.024183684960007668, 0.008447963744401932, 
0.023597201332449913, 0.023726604878902435, 0.1338067352771759, -0.2097422182559967, -0.016196569427847862, 0.14133213460445404, -0.09649793803691864, -0.12422871589660645, 0.07990546524524689, -0.03459475561976433, 0.1747698187828064, 0.038475677371025085, -0.019652999937534332, 0.09909367561340332, -0.15559963881969452, -0.05852397903800011, -0.026064254343509674, -0.008927824907004833, 0.08823978155851364, 0.07542291283607483, -0.05844951793551445, 0.02285866066813469, 0.02562655322253704, -0.04727208614349365, -0.0268824752420187, -0.05256075784564018, -0.10127434879541397, -0.023140445351600647, -0.09642518311738968, 0.026515161618590355, 0.000058677000197349116, -0.07310442626476288, -0.028560271486639977, -0.17347893118858337, -0.02563360333442688, 0.10103316605091095, 0.004820956848561764, -0.007559072691947222, -0.08540112525224686, 0.022149885073304176, -0.05362366884946823, -0.006164622958749533, -0.16996455192565918, -0.03558015450835228, 0.051895126700401306, -0.14917676150798798, 0.015460150316357613, -0.07327745854854584, 0.07047311216592789, 0.02098717913031578, -0.05859505757689476, -0.03108096309006214, 0.0007694467785768211, 0.004292082041501999, -0.06229274719953537, -0.1903683841228485, -0.058886781334877014, -0.041500482708215714, 0.15720732510089874, -0.24841000139713287, 0.0300158578902483, 0.03247617185115814, 0.13185922801494598, 0.007058668415993452, -0.06344027817249298, 0.02096918225288391, -0.04676475748419762, -0.050621338188648224, -0.06898977607488632, -0.009901339188218117, -0.014539826661348343, -0.031393732875585556, 0.012980648316442966, -0.14970256388187408, -0.060514215379953384, 0.09452559798955917, 0.11224991828203201, -0.14555825293064117, 0.00204002158716321, -0.0460561066865921, -0.07002599537372589, -0.07487804442644119, -0.0761631652712822, 0.07739497721195221, 0.044650159776210785, 0.049250341951847076, -0.06317461282014847, -0.06234706938266754, 0.023210179060697556, 0.005524294450879097, -0.019023682922124863, 0.0948529988527298, 0.074309803545475, -0.09122881293296814, 0.07973480224609375, 0.08461450785398483, 0.04414684325456619, 0.086973637342453, 0.005991141777485609, -0.11396963149309158, -0.03062884695827961, 0.037754856050014496, 0.024159027263522148, 0.15351562201976776, -0.08692087233066559, 0.030462130904197693, 0.052177220582962036, -0.03854219615459442, 0.03157065063714981, -0.0923321321606636, 0.025362705811858177, 0.021495236083865166, -0.006555700208991766, 0.05864228308200836, -0.018769768998026848, -0.01403577346354723, 0.06336429715156555, 0.05677810311317444, 0.044270504266023636, 0.02595379762351513, -0.02093072421848774, -0.1278371512889862, 0.16537296772003174, -0.09028079360723495, -0.2540280222892761, -0.17074446380138397, 0.015454737469553947, 0.03706491366028786, -0.021728800609707832, 0.039588842540979385, -0.06286025792360306, -0.10237989574670792, -0.09417891502380371, 0.0029635571409016848, 0.023925531655550003, -0.058347854763269424, -0.0817074254155159, 0.060779985040426254, 0.04047083482146263, -0.13689260184764862, 0.0349188968539238, 0.06170675903558731, -0.03042641654610634, 0.0018567070364952087, 0.07321398705244064, 0.12743599712848663, 0.14838241040706635, -0.006730219814926386, -0.012446845881640911, 0.035035960376262665, 0.229813352227211, -0.1490442156791687, 0.10630457103252411, 0.14053207635879517, -0.021705523133277893, 0.06635113060474396, 0.1461038440465927, 0.023231739178299904, -0.07546708732843399, 0.04147516191005707, 0.04027445614337921, -0.04228919371962547, -0.2589097023010254, 
-0.05694316700100899, -0.00946022942662239, -0.07043391466140747, 0.09718906134366989, 0.09238530695438385, 0.11972260475158691, 0.0337289460003376, -0.05568677559494972, -0.025771914049983025, -0.003401360474526882, 0.114128477871418, -0.027640055865049362, -0.004564122296869755, 0.07965842634439468, -0.05878787487745285, 0.011684526689350605, 0.09941446036100388, 0.019347423687577248, 0.17601320147514343, 0.02533329278230667, 0.10681075602769852, 0.06725578010082245, 0.09347675740718842, -0.0015635732561349869, 0.034774236381053925, 0.05337131395936012, 0.022044572979211807, 0.010453542694449425, -0.09408048540353775, -0.012431944720447063, 0.13713060319423676, 0.019816776737570763, 0.009031654335558414, 0.008926562033593655, -0.01010479498654604, 0.03131420537829399, 0.20501568913459778, 0.0009575071162544191, -0.22537250816822052, -0.09500737488269806, 0.059459153562784195, -0.06931101530790329, -0.143676295876503, -0.02094252221286297, 0.030270220711827278, -0.17292405664920807, 0.016790566965937614, -0.0316389761865139, 0.09112390875816345, -0.07145322859287262, -0.028050832450389862, 0.06891903281211853, 0.07569212466478348, -0.012108199298381805, 0.07973295450210571, -0.19069278240203857, 0.12254468351602554, 0.03037673607468605, 0.08605273067951202, -0.11708726733922958, 0.07849059253931046, -0.0019813794642686844, -0.014807495288550854, 0.17999744415283203, -0.014062200672924519, -0.0586031936109066, -0.08878950774669647, -0.08704045414924622, -0.011727320961654186, 0.10361312329769135, -0.09322915226221085, 0.09586969763040543, -0.02775636687874794, -0.03705112263560295, 0.012418309226632118, -0.10469507426023483, -0.1636953055858612, -0.18679304420948029, 0.06244563311338425, -0.07802703976631165, 0.012347841635346413, -0.11227322369813919, -0.06334327906370163, -0.01575082167983055, 0.23160123825073242, -0.16648635268211365, -0.07049825042486191, -0.1498587429523468, -0.03997112438082695, 0.17463743686676025, -0.042160745710134506, 0.06849376112222672, -0.021383514627814293, 0.1873992383480072, -0.008081548847258091, -0.013158116489648819, 0.06569221615791321, -0.09637628495693207, -0.16879262030124664, -0.05748843029141426, 0.14160962402820587, 0.10863390564918518, 0.05731578543782234, -0.0038195757661014795, 0.013171887956559658, -0.03383830562233925, -0.09896382689476013, 0.013824623078107834, 0.13817466795444489, 0.0034514935687184334, 0.00682973163202405, -0.03995988517999649, -0.07027145475149155, -0.05825701728463173, -0.07912654429674149, 0.057147104293107986, 0.187900573015213, -0.09512355923652649, 0.1602867990732193, 0.12431421875953674, -0.06468851119279861, -0.2306901067495346, 0.03996593505144119, 0.04701630026102066, 0.007666614837944508, 0.022401191294193268, -0.19138796627521515, 0.09788824617862701, 0.0009011493530124426, -0.06807263940572739, 0.14616990089416504, -0.16564498841762543, -0.1461436152458191, 0.08002161979675293, 0.025075770914554596, -0.22560662031173706, -0.14821304380893707, -0.1037549376487732, -0.03735695406794548, -0.13707835972309113, 0.048581719398498535, 0.02614329755306244, 0.019834673032164574, 0.025222565978765488, 0.005338077899068594, 0.029657263308763504, -0.07272187620401382, 0.1870686560869217, -0.020297454670071602, 0.0072362530045211315, -0.050640691071748734, -0.04617878794670105, 0.09227550774812698, -0.06150037795305252, 0.11741586774587631, 0.018679620698094368, 0.018796883523464203, -0.1431548148393631, -0.049209367483854294, -0.060803934931755066, 0.04456847906112671, -0.07284719496965408, -0.09393193572759628, 
-0.04137463867664337, 0.08888561278581619, 0.07211937010288239, -0.032792408019304276, -0.0027768779546022415, -0.07569456845521927, 0.09405932575464249, 0.184477761387825, 0.17357055842876434, 0.009977072477340698, -0.07020942866802216, 0.024555526673793793, -0.042279548943042755, 0.03349342197179794, -0.24652716517448425, 0.03456863760948181, 0.066053606569767, 0.03803660348057747, 0.08509242534637451, -0.016836483031511307, -0.1781480610370636, -0.04086102172732353, 0.08498652279376984, -0.06206206604838371, -0.19876568019390106, -0.02703288197517395, 0.08424776047468185, -0.20383712649345398, -0.032998621463775635, 0.041543323546648026, -0.03834589570760727, -0.02396267279982567, -0.002415500348433852, 0.06396626681089401, -0.008327016606926918, 0.12156640738248825, 0.06747189164161682, 0.10266115516424179, -0.09284433722496033, 0.08920657634735107, 0.10416955500841141, -0.09140542894601822, 0.03545991703867912, 0.10264154523611069, -0.05670900270342827, -0.04460543021559715, 0.033935222774744034, 0.05925208330154419, -0.028357384726405144, -0.06409841030836105, -0.000502707262057811, -0.0359574519097805, 0.04993389546871185, 0.08058220148086548, 0.036113787442445755, -0.01202210783958435, 0.06544706225395203, 0.028145326301455498, -0.11693570017814636, 0.10949387401342392, 0.04405685141682625, 0.04509059712290764, -0.07182393968105316, -0.012280966155230999, 0.015999672934412956, 0.032540347427129745, -0.019734015688300133, -0.014576527290046215, -0.03146412968635559, -0.007561005651950836, -0.1553635597229004, -0.02064543403685093, -0.06516171246767044, 0.006067827809602022, 0.022207623347640038, -0.03830232471227646, -0.012014663778245449, 0.01381110493093729, -0.07979435473680496, -0.07571027427911758, -0.01700955256819725, 0.08539021760225296, -0.1381402313709259, 0.006627439055591822, 0.07182712107896805, -0.10980239510536194, 0.07347989827394485, -0.0048679932951927185, 0.017079560086131096, 0.010923396795988083, -0.11654401570558548, 0.04386281594634056, -0.005810429807752371, 0.01551580335944891, 0.022556742653250694, -0.171111062169075, 0.011553828604519367, -0.038553636521101, -0.03114982508122921, 0.011926400475203991, -0.025060230866074562, -0.11875922232866287, 0.08676479011774063, -0.028097305446863174, -0.037512701004743576, -0.03292486071586609, 0.06296087801456451, 0.08736220002174377, -0.011740099638700485, 0.09667140990495682, -0.025766119360923767, 0.04818311333656311, -0.1756584197282791, -0.01910574547946453, -0.050167568027973175, 0.02537350542843342, -0.01759655587375164, -0.0070639788173139095, 0.055272240191698074, -0.004191063344478607, 0.20991376042366028, -0.03921036794781685, 0.1548677533864975, 0.05199402943253517, -0.009925156831741333, 0.010884369723498821, 0.05032730847597122, 0.06423956155776978, 0.031145188957452774, 0.00853167474269867, 0.04660189896821976, -0.004552975296974182, -0.020357951521873474, -0.13699717819690704, 0.02791593410074711, 0.16117429733276367, 0.061918217688798904, 0.0392887257039547, 0.03704594820737839, -0.1422400325536728, -0.09538721293210983, 0.10306388139724731, -0.0331864058971405, 0.014331420883536339, -0.08317886292934418, 0.17621558904647827, 0.12328410148620605, -0.1574767529964447, 0.0577850341796875, -0.07234696298837662, -0.05066767707467079, -0.1024852767586708, -0.11832084506750107, -0.06293155997991562, -0.06027044355869293, -0.004747506696730852, -0.042489297688007355, 0.05734556168317795, 0.026751231402158737, -0.003270963439717889, -0.006759525276720524, 0.12665949761867523, -0.0249644722789526, 
-0.004145825747400522, 0.04152364656329155, 0.0326087586581707, 0.019319625571370125, -0.05872373282909393, 0.017997145652770996, 0.018602589145302773, 0.022180357947945595, 0.06835069507360458, 0.0260987039655447, -0.059317342936992645, 0.044286735355854034, 0.00319746439345181, -0.11313364654779434, 0.018146557733416557, -0.00002245741598017048, -0.05020225793123245, 0.13557326793670654, 0.04076748713850975, 0.01548024732619524, -0.029270920902490616, 0.24342355132102966, -0.07199113070964813, -0.08681939542293549, -0.13965600728988647, 0.11511493474245071, -0.023563209921121597, 0.03755274787545204, 0.016542524099349976, -0.12659503519535065, 0.011511262506246567, 0.18531471490859985, 0.12824349105358124, 0.012459068559110165, -0.007656481582671404, 0.05736639350652695, -0.0007639875984750688, -0.05985576659440994, 0.05051197111606598, 0.0664999932050705, 0.16097788512706757, -0.09069112688302994, 0.0652846097946167, -0.008405503816902637, -0.0831485390663147, -0.027498632669448853, 0.11705785244703293, -0.022675158455967903, 0.02148384228348732, -0.03778035193681717, 0.11204422265291214, -0.052532415837049484, -0.2719486355781555, 0.02952493168413639, -0.09503202140331268, -0.13993041217327118, -0.02591860294342041, 0.041448429226875305, -0.03349510580301285, 0.01577647216618061, 0.06254769116640091, -0.045389387756586075, 0.18837277591228485, 0.025987716391682625, -0.08679025620222092, -0.07755549252033234, 0.05874146893620491, -0.08695939928293228, 0.2789687216281891, 0.003863075515255332, 0.04782010242342949, 0.12108923494815826, -0.03053574077785015, -0.18664880096912384, 0.014769754372537136, 0.11989909410476685, -0.09114406257867813, 0.07780203968286514, 0.18139931559562683, -0.005561648402363062, 0.12649618089199066, 0.04705416411161423, -0.03877115994691849, 0.03976387158036232, -0.02721380814909935, -0.03821522742509842, -0.12209630757570267, 0.05661242455244064, -0.0612691193819046, 0.15957388281822205, 0.1158948540687561, -0.05964287370443344, 0.001120698289014399, -0.06126941740512848, 0.06300627440214157, 0.014774397015571594, 0.12115653604269028, 0.018452486023306847, -0.2023056596517563, 0.05087360367178917, -0.03283824771642685, 0.08166342973709106, -0.254973828792572, -0.08186668157577515, 0.07622263580560684, -0.019022729247808456, -0.04275642707943916, 0.12311509251594543, 0.06101066991686821, 0.03676839917898178, -0.03853875398635864, -0.08537755906581879, -0.01412904355674982, 0.15376435220241547, -0.14123432338237762, -0.029574336484074593 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
null
CatBarks/bertES_posWeighted0.9_tokenizer
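The card in this record leaves its "How to Get Started with the Model" section as a placeholder. As a minimal, hedged sketch only: assuming the repository named in the id field above (`CatBarks/bertES_posWeighted0.9_tokenizer`) ships standard Hugging Face tokenizer files (the repo name suggests a tokenizer artifact; the card itself does not confirm this), it could be loaded with the transformers Auto classes roughly as follows.

```python
# Sketch only: assumes CatBarks/bertES_posWeighted0.9_tokenizer hosts standard
# Hugging Face tokenizer files; the auto-generated card above does not say so.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("CatBarks/bertES_posWeighted0.9_tokenizer")

# Tokenize a sample sentence (the "ES" in the repo name hints at Spanish text,
# but that is an inference, not something stated in the card).
print(tokenizer.tokenize("Hola, ¿cómo estás?"))
```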
[ "transformers", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T19:24:54+00:00
[ "1910.09700" ]
[]
TAGS #transformers #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.08389580249786377, 0.19830818474292755, -0.0013316317927092314, 0.02313883788883686, 0.11396584659814835, 0.01961737498641014, 0.053626976907253265, 0.14538456499576569, 0.0060051376931369305, 0.10656800121068954, 0.066679947078228, 0.09131570905447006, 0.09678101539611816, 0.20042605698108673, 0.04371999576687813, -0.17659740149974823, 0.010636410675942898, -0.06930278241634369, -0.010073255747556686, 0.11651819199323654, 0.141214057803154, -0.10151198506355286, 0.07627976685762405, -0.03319970890879631, -0.02870541252195835, -0.0070160143077373505, -0.07769215852022171, -0.05755697935819626, 0.07573003321886063, 0.054863471537828445, 0.04207949340343475, -0.0008347301045432687, 0.08447454124689102, -0.2674994468688965, 0.013753628358244896, 0.07452993094921112, 0.010659529827535152, 0.05990942195057869, 0.07833302766084671, -0.04036625102162361, 0.12881849706172943, -0.06320446729660034, 0.13035163283348083, 0.0906217098236084, -0.0681561604142189, -0.24378153681755066, -0.08239314705133438, 0.06505522131919861, 0.12533815205097198, 0.07694927603006363, -0.02823091857135296, 0.16422191262245178, -0.07247646898031235, 0.019290022552013397, 0.09481704235076904, -0.1151006743311882, -0.060644298791885376, 0.08318385481834412, 0.14101974666118622, 0.10340547561645508, -0.1255619376897812, -0.012289565056562424, 0.04275871813297272, 0.045979104936122894, 0.07389909774065018, 0.011339850723743439, 0.1143413558602333, 0.05629947781562805, -0.13526225090026855, -0.05700986459851265, 0.14547574520111084, 0.023872992023825645, -0.057064127177000046, -0.2138909548521042, -0.002902575535699725, -0.07730814069509506, -0.011685127392411232, -0.06846728920936584, 0.0291305985301733, -0.01194276288151741, 0.060226380825042725, -0.0496203787624836, -0.09797755628824234, -0.046314824372529984, 0.1015089675784111, 0.054820988327264786, 0.011354796588420868, -0.01489334274083376, 0.03576440364122391, 0.13432876765727997, 0.04213530570268631, -0.10012737661600113, -0.07065672427415848, -0.0701170489192009, -0.09620913118124008, -0.03947552293539047, 0.04272124543786049, 0.020167991518974304, 0.042202774435281754, 0.2283228635787964, 0.024096308276057243, 0.05459817871451378, 0.029667891561985016, 0.0026177873369306326, 0.03211980313062668, 0.1073630079627037, -0.041210614144802094, -0.188126802444458, -0.03292805701494217, 0.0931866466999054, -0.009821015410125256, -0.028658604249358177, -0.033444397151470184, 0.035014089196920395, 0.08379437029361725, 0.11821532249450684, 0.08875755965709686, -0.012828069739043713, -0.037612639367580414, -0.03493109717965126, 0.2115669697523117, -0.14141373336315155, 0.045799970626831055, -0.022097334265708923, -0.018195297569036484, -0.06905751675367355, 0.030103791505098343, 0.01831657998263836, -0.003142025787383318, 0.06966056674718857, -0.061253178864717484, -0.05794486775994301, -0.11518853157758713, -0.045523155480623245, 0.04711875319480896, -0.024105608463287354, -0.024469668045639992, -0.07765042781829834, -0.11219723522663116, -0.06417357176542282, 0.06612563133239746, -0.04156653955578804, -0.03974827378988266, 0.005308232270181179, -0.07131324708461761, 0.008387917652726173, 0.008993842639029026, 0.12122467905282974, -0.030063031241297722, 0.05833350867033005, -0.002476902212947607, 0.05916252359747887, 0.10643328726291656, 0.03227818012237549, -0.08492200076580048, 0.057466037571430206, -0.20633617043495178, 0.08371785283088684, -0.11420095711946487, 0.034276340156793594, -0.17048145830631256, -0.024183684960007668, 0.008447963744401932, 
0.023597201332449913, 0.023726604878902435, 0.1338067352771759, -0.2097422182559967, -0.016196569427847862, 0.14133213460445404, -0.09649793803691864, -0.12422871589660645, 0.07990546524524689, -0.03459475561976433, 0.1747698187828064, 0.038475677371025085, -0.019652999937534332, 0.09909367561340332, -0.15559963881969452, -0.05852397903800011, -0.026064254343509674, -0.008927824907004833, 0.08823978155851364, 0.07542291283607483, -0.05844951793551445, 0.02285866066813469, 0.02562655322253704, -0.04727208614349365, -0.0268824752420187, -0.05256075784564018, -0.10127434879541397, -0.023140445351600647, -0.09642518311738968, 0.026515161618590355, 0.000058677000197349116, -0.07310442626476288, -0.028560271486639977, -0.17347893118858337, -0.02563360333442688, 0.10103316605091095, 0.004820956848561764, -0.007559072691947222, -0.08540112525224686, 0.022149885073304176, -0.05362366884946823, -0.006164622958749533, -0.16996455192565918, -0.03558015450835228, 0.051895126700401306, -0.14917676150798798, 0.015460150316357613, -0.07327745854854584, 0.07047311216592789, 0.02098717913031578, -0.05859505757689476, -0.03108096309006214, 0.0007694467785768211, 0.004292082041501999, -0.06229274719953537, -0.1903683841228485, -0.058886781334877014, -0.041500482708215714, 0.15720732510089874, -0.24841000139713287, 0.0300158578902483, 0.03247617185115814, 0.13185922801494598, 0.007058668415993452, -0.06344027817249298, 0.02096918225288391, -0.04676475748419762, -0.050621338188648224, -0.06898977607488632, -0.009901339188218117, -0.014539826661348343, -0.031393732875585556, 0.012980648316442966, -0.14970256388187408, -0.060514215379953384, 0.09452559798955917, 0.11224991828203201, -0.14555825293064117, 0.00204002158716321, -0.0460561066865921, -0.07002599537372589, -0.07487804442644119, -0.0761631652712822, 0.07739497721195221, 0.044650159776210785, 0.049250341951847076, -0.06317461282014847, -0.06234706938266754, 0.023210179060697556, 0.005524294450879097, -0.019023682922124863, 0.0948529988527298, 0.074309803545475, -0.09122881293296814, 0.07973480224609375, 0.08461450785398483, 0.04414684325456619, 0.086973637342453, 0.005991141777485609, -0.11396963149309158, -0.03062884695827961, 0.037754856050014496, 0.024159027263522148, 0.15351562201976776, -0.08692087233066559, 0.030462130904197693, 0.052177220582962036, -0.03854219615459442, 0.03157065063714981, -0.0923321321606636, 0.025362705811858177, 0.021495236083865166, -0.006555700208991766, 0.05864228308200836, -0.018769768998026848, -0.01403577346354723, 0.06336429715156555, 0.05677810311317444, 0.044270504266023636, 0.02595379762351513, -0.02093072421848774, -0.1278371512889862, 0.16537296772003174, -0.09028079360723495, -0.2540280222892761, -0.17074446380138397, 0.015454737469553947, 0.03706491366028786, -0.021728800609707832, 0.039588842540979385, -0.06286025792360306, -0.10237989574670792, -0.09417891502380371, 0.0029635571409016848, 0.023925531655550003, -0.058347854763269424, -0.0817074254155159, 0.060779985040426254, 0.04047083482146263, -0.13689260184764862, 0.0349188968539238, 0.06170675903558731, -0.03042641654610634, 0.0018567070364952087, 0.07321398705244064, 0.12743599712848663, 0.14838241040706635, -0.006730219814926386, -0.012446845881640911, 0.035035960376262665, 0.229813352227211, -0.1490442156791687, 0.10630457103252411, 0.14053207635879517, -0.021705523133277893, 0.06635113060474396, 0.1461038440465927, 0.023231739178299904, -0.07546708732843399, 0.04147516191005707, 0.04027445614337921, -0.04228919371962547, -0.2589097023010254, 
-0.05694316700100899, -0.00946022942662239, -0.07043391466140747, 0.09718906134366989, 0.09238530695438385, 0.11972260475158691, 0.0337289460003376, -0.05568677559494972, -0.025771914049983025, -0.003401360474526882, 0.114128477871418, -0.027640055865049362, -0.004564122296869755, 0.07965842634439468, -0.05878787487745285, 0.011684526689350605, 0.09941446036100388, 0.019347423687577248, 0.17601320147514343, 0.02533329278230667, 0.10681075602769852, 0.06725578010082245, 0.09347675740718842, -0.0015635732561349869, 0.034774236381053925, 0.05337131395936012, 0.022044572979211807, 0.010453542694449425, -0.09408048540353775, -0.012431944720447063, 0.13713060319423676, 0.019816776737570763, 0.009031654335558414, 0.008926562033593655, -0.01010479498654604, 0.03131420537829399, 0.20501568913459778, 0.0009575071162544191, -0.22537250816822052, -0.09500737488269806, 0.059459153562784195, -0.06931101530790329, -0.143676295876503, -0.02094252221286297, 0.030270220711827278, -0.17292405664920807, 0.016790566965937614, -0.0316389761865139, 0.09112390875816345, -0.07145322859287262, -0.028050832450389862, 0.06891903281211853, 0.07569212466478348, -0.012108199298381805, 0.07973295450210571, -0.19069278240203857, 0.12254468351602554, 0.03037673607468605, 0.08605273067951202, -0.11708726733922958, 0.07849059253931046, -0.0019813794642686844, -0.014807495288550854, 0.17999744415283203, -0.014062200672924519, -0.0586031936109066, -0.08878950774669647, -0.08704045414924622, -0.011727320961654186, 0.10361312329769135, -0.09322915226221085, 0.09586969763040543, -0.02775636687874794, -0.03705112263560295, 0.012418309226632118, -0.10469507426023483, -0.1636953055858612, -0.18679304420948029, 0.06244563311338425, -0.07802703976631165, 0.012347841635346413, -0.11227322369813919, -0.06334327906370163, -0.01575082167983055, 0.23160123825073242, -0.16648635268211365, -0.07049825042486191, -0.1498587429523468, -0.03997112438082695, 0.17463743686676025, -0.042160745710134506, 0.06849376112222672, -0.021383514627814293, 0.1873992383480072, -0.008081548847258091, -0.013158116489648819, 0.06569221615791321, -0.09637628495693207, -0.16879262030124664, -0.05748843029141426, 0.14160962402820587, 0.10863390564918518, 0.05731578543782234, -0.0038195757661014795, 0.013171887956559658, -0.03383830562233925, -0.09896382689476013, 0.013824623078107834, 0.13817466795444489, 0.0034514935687184334, 0.00682973163202405, -0.03995988517999649, -0.07027145475149155, -0.05825701728463173, -0.07912654429674149, 0.057147104293107986, 0.187900573015213, -0.09512355923652649, 0.1602867990732193, 0.12431421875953674, -0.06468851119279861, -0.2306901067495346, 0.03996593505144119, 0.04701630026102066, 0.007666614837944508, 0.022401191294193268, -0.19138796627521515, 0.09788824617862701, 0.0009011493530124426, -0.06807263940572739, 0.14616990089416504, -0.16564498841762543, -0.1461436152458191, 0.08002161979675293, 0.025075770914554596, -0.22560662031173706, -0.14821304380893707, -0.1037549376487732, -0.03735695406794548, -0.13707835972309113, 0.048581719398498535, 0.02614329755306244, 0.019834673032164574, 0.025222565978765488, 0.005338077899068594, 0.029657263308763504, -0.07272187620401382, 0.1870686560869217, -0.020297454670071602, 0.0072362530045211315, -0.050640691071748734, -0.04617878794670105, 0.09227550774812698, -0.06150037795305252, 0.11741586774587631, 0.018679620698094368, 0.018796883523464203, -0.1431548148393631, -0.049209367483854294, -0.060803934931755066, 0.04456847906112671, -0.07284719496965408, -0.09393193572759628, 
-0.04137463867664337, 0.08888561278581619, 0.07211937010288239, -0.032792408019304276, -0.0027768779546022415, -0.07569456845521927, 0.09405932575464249, 0.184477761387825, 0.17357055842876434, 0.009977072477340698, -0.07020942866802216, 0.024555526673793793, -0.042279548943042755, 0.03349342197179794, -0.24652716517448425, 0.03456863760948181, 0.066053606569767, 0.03803660348057747, 0.08509242534637451, -0.016836483031511307, -0.1781480610370636, -0.04086102172732353, 0.08498652279376984, -0.06206206604838371, -0.19876568019390106, -0.02703288197517395, 0.08424776047468185, -0.20383712649345398, -0.032998621463775635, 0.041543323546648026, -0.03834589570760727, -0.02396267279982567, -0.002415500348433852, 0.06396626681089401, -0.008327016606926918, 0.12156640738248825, 0.06747189164161682, 0.10266115516424179, -0.09284433722496033, 0.08920657634735107, 0.10416955500841141, -0.09140542894601822, 0.03545991703867912, 0.10264154523611069, -0.05670900270342827, -0.04460543021559715, 0.033935222774744034, 0.05925208330154419, -0.028357384726405144, -0.06409841030836105, -0.000502707262057811, -0.0359574519097805, 0.04993389546871185, 0.08058220148086548, 0.036113787442445755, -0.01202210783958435, 0.06544706225395203, 0.028145326301455498, -0.11693570017814636, 0.10949387401342392, 0.04405685141682625, 0.04509059712290764, -0.07182393968105316, -0.012280966155230999, 0.015999672934412956, 0.032540347427129745, -0.019734015688300133, -0.014576527290046215, -0.03146412968635559, -0.007561005651950836, -0.1553635597229004, -0.02064543403685093, -0.06516171246767044, 0.006067827809602022, 0.022207623347640038, -0.03830232471227646, -0.012014663778245449, 0.01381110493093729, -0.07979435473680496, -0.07571027427911758, -0.01700955256819725, 0.08539021760225296, -0.1381402313709259, 0.006627439055591822, 0.07182712107896805, -0.10980239510536194, 0.07347989827394485, -0.0048679932951927185, 0.017079560086131096, 0.010923396795988083, -0.11654401570558548, 0.04386281594634056, -0.005810429807752371, 0.01551580335944891, 0.022556742653250694, -0.171111062169075, 0.011553828604519367, -0.038553636521101, -0.03114982508122921, 0.011926400475203991, -0.025060230866074562, -0.11875922232866287, 0.08676479011774063, -0.028097305446863174, -0.037512701004743576, -0.03292486071586609, 0.06296087801456451, 0.08736220002174377, -0.011740099638700485, 0.09667140990495682, -0.025766119360923767, 0.04818311333656311, -0.1756584197282791, -0.01910574547946453, -0.050167568027973175, 0.02537350542843342, -0.01759655587375164, -0.0070639788173139095, 0.055272240191698074, -0.004191063344478607, 0.20991376042366028, -0.03921036794781685, 0.1548677533864975, 0.05199402943253517, -0.009925156831741333, 0.010884369723498821, 0.05032730847597122, 0.06423956155776978, 0.031145188957452774, 0.00853167474269867, 0.04660189896821976, -0.004552975296974182, -0.020357951521873474, -0.13699717819690704, 0.02791593410074711, 0.16117429733276367, 0.061918217688798904, 0.0392887257039547, 0.03704594820737839, -0.1422400325536728, -0.09538721293210983, 0.10306388139724731, -0.0331864058971405, 0.014331420883536339, -0.08317886292934418, 0.17621558904647827, 0.12328410148620605, -0.1574767529964447, 0.0577850341796875, -0.07234696298837662, -0.05066767707467079, -0.1024852767586708, -0.11832084506750107, -0.06293155997991562, -0.06027044355869293, -0.004747506696730852, -0.042489297688007355, 0.05734556168317795, 0.026751231402158737, -0.003270963439717889, -0.006759525276720524, 0.12665949761867523, -0.0249644722789526, 
-0.004145825747400522, 0.04152364656329155, 0.0326087586581707, 0.019319625571370125, -0.05872373282909393, 0.017997145652770996, 0.018602589145302773, 0.022180357947945595, 0.06835069507360458, 0.0260987039655447, -0.059317342936992645, 0.044286735355854034, 0.00319746439345181, -0.11313364654779434, 0.018146557733416557, -0.00002245741598017048, -0.05020225793123245, 0.13557326793670654, 0.04076748713850975, 0.01548024732619524, -0.029270920902490616, 0.24342355132102966, -0.07199113070964813, -0.08681939542293549, -0.13965600728988647, 0.11511493474245071, -0.023563209921121597, 0.03755274787545204, 0.016542524099349976, -0.12659503519535065, 0.011511262506246567, 0.18531471490859985, 0.12824349105358124, 0.012459068559110165, -0.007656481582671404, 0.05736639350652695, -0.0007639875984750688, -0.05985576659440994, 0.05051197111606598, 0.0664999932050705, 0.16097788512706757, -0.09069112688302994, 0.0652846097946167, -0.008405503816902637, -0.0831485390663147, -0.027498632669448853, 0.11705785244703293, -0.022675158455967903, 0.02148384228348732, -0.03778035193681717, 0.11204422265291214, -0.052532415837049484, -0.2719486355781555, 0.02952493168413639, -0.09503202140331268, -0.13993041217327118, -0.02591860294342041, 0.041448429226875305, -0.03349510580301285, 0.01577647216618061, 0.06254769116640091, -0.045389387756586075, 0.18837277591228485, 0.025987716391682625, -0.08679025620222092, -0.07755549252033234, 0.05874146893620491, -0.08695939928293228, 0.2789687216281891, 0.003863075515255332, 0.04782010242342949, 0.12108923494815826, -0.03053574077785015, -0.18664880096912384, 0.014769754372537136, 0.11989909410476685, -0.09114406257867813, 0.07780203968286514, 0.18139931559562683, -0.005561648402363062, 0.12649618089199066, 0.04705416411161423, -0.03877115994691849, 0.03976387158036232, -0.02721380814909935, -0.03821522742509842, -0.12209630757570267, 0.05661242455244064, -0.0612691193819046, 0.15957388281822205, 0.1158948540687561, -0.05964287370443344, 0.001120698289014399, -0.06126941740512848, 0.06300627440214157, 0.014774397015571594, 0.12115653604269028, 0.018452486023306847, -0.2023056596517563, 0.05087360367178917, -0.03283824771642685, 0.08166342973709106, -0.254973828792572, -0.08186668157577515, 0.07622263580560684, -0.019022729247808456, -0.04275642707943916, 0.12311509251594543, 0.06101066991686821, 0.03676839917898178, -0.03853875398635864, -0.08537755906581879, -0.01412904355674982, 0.15376435220241547, -0.14123432338237762, -0.029574336484074593 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.8.2
{"library_name": "peft", "base_model": "EleutherAI/gpt-neo-1.3B"}
null
KapitalK/GPT-Neo-1.3B
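For this record, the metadata names peft as the library and EleutherAI/gpt-neo-1.3B as the base model, and the card lists PEFT 0.8.2 under framework versions, but the getting-started code is again a placeholder. A minimal sketch, assuming `KapitalK/GPT-Neo-1.3B` holds a PEFT adapter (adapter type not stated in the record) for that base model:

```python
# Sketch only: assumes KapitalK/GPT-Neo-1.3B is a PEFT adapter for the base
# model declared in this record's metadata (EleutherAI/gpt-neo-1.3B).
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")
model = PeftModel.from_pretrained(base, "KapitalK/GPT-Neo-1.3B")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-1.3B")

# Generate a short continuation with the adapted model.
prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```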
[ "peft", "arxiv:1910.09700", "base_model:EleutherAI/gpt-neo-1.3B", "region:us" ]
2024-02-13T19:25:24+00:00
[ "1910.09700" ]
[]
TAGS #peft #arxiv-1910.09700 #base_model-EleutherAI/gpt-neo-1.3B #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.8.2
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ "TAGS\n#peft #arxiv-1910.09700 #base_model-EleutherAI/gpt-neo-1.3B #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ 35, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #arxiv-1910.09700 #base_model-EleutherAI/gpt-neo-1.3B #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2" ]
[ -0.10271888226270676, 0.19527022540569305, -0.0033699532505124807, 0.03445544093847275, 0.09149602055549622, 0.021228857338428497, 0.04508786275982857, 0.12934884428977966, -0.0362875871360302, 0.10663561522960663, 0.06918328255414963, 0.10170444846153259, 0.10278696566820145, 0.20173397660255432, 0.014419346116483212, -0.2038455307483673, 0.030306780710816383, -0.09026747941970825, -0.00997967179864645, 0.12101078033447266, 0.14871783554553986, -0.09674528241157532, 0.08067033439874649, -0.013252629898488522, -0.02018401399254799, -0.03479602187871933, -0.08220858126878738, -0.029651518911123276, 0.04297977313399315, 0.04817818105220795, 0.055762454867362976, -0.003919978626072407, 0.0854543000459671, -0.2667374610900879, 0.016378361731767654, 0.04391380771994591, -0.009256024844944477, 0.08216246962547302, 0.09698007255792618, -0.03900330141186714, 0.13612811267375946, -0.02831815741956234, 0.14423765242099762, 0.0780455619096756, -0.09572290629148483, -0.2105964720249176, -0.06972037255764008, 0.08032239973545074, 0.17306309938430786, 0.08101795613765717, -0.044216569513082504, 0.13051597774028778, -0.1039755791425705, 0.015697995200753212, 0.04618983343243599, -0.07645533233880997, -0.07288352400064468, 0.05805329605937004, 0.10494252294301987, 0.057436708360910416, -0.13376228511333466, -0.03275226056575775, 0.020286090672016144, 0.042183659970760345, 0.07066819071769714, 0.018498992547392845, 0.14151762425899506, 0.03474012762308121, -0.1529158651828766, -0.04076622053980827, 0.14054742455482483, 0.03211649879813194, -0.0361289381980896, -0.21481509506702423, 0.011591927148401737, -0.0908573716878891, -0.025917623192071915, -0.0531003400683403, 0.03856999799609184, 0.0023956617806106806, 0.092180535197258, -0.035920217633247375, -0.0902457982301712, -0.01239502802491188, 0.09221173077821732, 0.05003204941749573, 0.027439597994089127, -0.0215892530977726, 0.008754881098866463, 0.12504470348358154, 0.050880007445812225, -0.12132106721401215, -0.06511104106903076, -0.06868802011013031, -0.042587656527757645, -0.04715564101934433, 0.03082866780459881, 0.03808767348527908, 0.052652958780527115, 0.2449570894241333, -0.019221119582653046, 0.055566638708114624, 0.061534516513347626, 0.02239272929728031, 0.04289219155907631, 0.09438673406839371, -0.05471247807145119, -0.15396635234355927, -0.021844854578375816, 0.09371088445186615, -0.013570205308496952, -0.02016391232609749, -0.056890279054641724, 0.03853460028767586, 0.0406564362347126, 0.10453259199857712, 0.09596221148967743, -0.004790294915437698, -0.07481496781110764, -0.053770724684000015, 0.20467320084571838, -0.14680379629135132, 0.04156503453850746, 0.014495219103991985, -0.02624901384115219, -0.057670943439006805, 0.008372100070118904, 0.018296079710125923, -0.03043478913605213, 0.08997470885515213, -0.06557711213827133, -0.03873911872506142, -0.12226494401693344, -0.019989144057035446, 0.03629684075713158, 0.008647252805531025, -0.026938868686556816, -0.028544608503580093, -0.07026344537734985, -0.09179455041885376, 0.10668272525072098, -0.06881968677043915, -0.06464587897062302, -0.03473949059844017, -0.0911305695772171, 0.018590496852993965, 0.029779786244034767, 0.1154361367225647, -0.025189192965626717, 0.040260568261146545, -0.012537366710603237, 0.06329397112131119, 0.07956895977258682, 0.03973676636815071, -0.0689849928021431, 0.057616353034973145, -0.19635947048664093, 0.09345728904008865, -0.07547280937433243, 0.022198766469955444, -0.15951058268547058, -0.014133617281913757, 0.0003997675084974617, 0.02370622754096985, 
0.03342324495315552, 0.14784079790115356, -0.1990794837474823, -0.032015763223171234, 0.15099991858005524, -0.0989287793636322, -0.11784408986568451, 0.03849948197603226, -0.053430765867233276, 0.1605759859085083, 0.02140435203909874, -0.006892572622746229, 0.08563012629747391, -0.14643891155719757, -0.03159454092383385, -0.027643615379929543, -0.007386492565274239, 0.09976782649755478, 0.08615487813949585, -0.07854831218719482, 0.03443389758467674, 0.015566358342766762, -0.04777666926383972, -0.0294329971075058, -0.05099432170391083, -0.11137814819812775, 0.0029692379757761955, -0.0859692171216011, 0.026323797181248665, -0.012734618969261646, -0.07471495866775513, -0.01044286135584116, -0.1650170534849167, -0.018342291936278343, 0.08298461884260178, 0.01768721640110016, -0.014847426675260067, -0.0928991436958313, 0.03300855681300163, -0.03144996240735054, -0.026481781154870987, -0.15770770609378815, -0.029187213629484177, 0.020983129739761353, -0.14090430736541748, 0.022598661482334137, -0.11162018030881882, 0.06536617875099182, 0.011664122343063354, -0.07007584720849991, -0.032164912670850754, -0.020195351913571358, 0.005318047013133764, -0.05062396824359894, -0.23767533898353577, -0.02352597378194332, -0.053353264927864075, 0.1458010971546173, -0.22157973051071167, 0.037584658712148666, 0.047911714762449265, 0.12691807746887207, -0.0003185951500199735, -0.06205996870994568, 0.02275559864938259, -0.0693281814455986, -0.022439194843173027, -0.07177534699440002, -0.0040907301008701324, -0.0012408505426719785, -0.04472790285944939, 0.012552541680634022, -0.11035829782485962, -0.05376036465167999, 0.10418743640184402, 0.05667370930314064, -0.16828873753547668, -0.013432081788778305, -0.04589236527681351, -0.06922035664319992, -0.08383683115243912, -0.05967210978269577, 0.11386258155107498, 0.044430773705244064, 0.03550637513399124, -0.07294078916311264, -0.07076667994260788, 0.009802164509892464, -0.0196697935461998, -0.02457464672625065, 0.11297494918107986, 0.08157352358102798, -0.11226522922515869, 0.09922992438077927, 0.06182472035288811, 0.022285914048552513, 0.08092299848794937, -0.019543033093214035, -0.11016147583723068, -0.031004395335912704, 0.04242117702960968, 0.009233261458575726, 0.160649836063385, -0.08420465886592865, 0.04809456318616867, 0.042779017239809036, -0.027389178052544594, 0.056502897292375565, -0.10253318399190903, 0.011449486017227173, 0.005668871570378542, -0.013124875724315643, 0.02312570810317993, -0.024311278015375137, 0.006176369730383158, 0.08370307087898254, 0.05810760706663132, 0.03660840168595314, 0.03507623448967934, -0.02824413776397705, -0.1300036609172821, 0.17933553457260132, -0.09721697866916656, -0.23956209421157837, -0.15353114902973175, 0.04460322484374046, 0.05059796944260597, -0.016652453690767288, 0.027821378782391548, -0.05979283154010773, -0.1016482561826706, -0.0747811496257782, 0.006248244550079107, 0.015249701216816902, -0.06030106171965599, -0.07438782602548599, 0.05214116722345352, 0.03531336784362793, -0.11570831388235092, 0.03739805892109871, 0.05869094282388687, -0.013554093427956104, 0.0028109413105994463, 0.05153822526335716, 0.08202166855335236, 0.18336816132068634, -0.007563526276499033, 0.000933682662434876, 0.054312169551849365, 0.28100934624671936, -0.16327570378780365, 0.11263125389814377, 0.11565077304840088, -0.05856498330831528, 0.07676565647125244, 0.1835424154996872, 0.03218408301472664, -0.10208111256361008, 0.03616746887564659, 0.03428449481725693, -0.026329156011343002, -0.26840370893478394, -0.050570182502269745, 
-0.012414764612913132, -0.10504362732172012, 0.08211752027273178, 0.0907789096236229, 0.099402517080307, 0.037229277193546295, -0.062066007405519485, -0.08259307593107224, 0.03110586479306221, 0.10465245693922043, -0.02201828733086586, 0.004938175901770592, 0.08363062888383865, -0.030011294409632683, 0.00799849908798933, 0.09572914242744446, -0.010599629022181034, 0.1776392161846161, 0.0544581338763237, 0.11201811581850052, 0.08087325096130371, 0.0881081372499466, -0.0031380350701510906, 0.015359044075012207, 0.01882648468017578, 0.02099435217678547, 0.015194463543593884, -0.08039263635873795, 0.03812464699149132, 0.11125432699918747, 0.051561545580625534, 0.022811172530055046, 0.012991819530725479, -0.05041986331343651, 0.04681578651070595, 0.18459881842136383, 0.009504660032689571, -0.20121802389621735, -0.07564158737659454, 0.0561656728386879, -0.07766810804605484, -0.1403442919254303, -0.02084929123520851, 0.017741326242685318, -0.17015118896961212, 0.012211514636874199, -0.03890514001250267, 0.10080014169216156, -0.07466121017932892, -0.04258851334452629, 0.10233604162931442, 0.07320214807987213, -0.024789933115243912, 0.06110236793756485, -0.2027638554573059, 0.12889042496681213, 0.024234911426901817, 0.07160376757383347, -0.09118596464395523, 0.100211963057518, -0.0009178656036965549, -0.008843843825161457, 0.16634619235992432, 0.004772318992763758, -0.07735513150691986, -0.05102243646979332, -0.08865182846784592, -0.01623058319091797, 0.10478407889604568, -0.13044482469558716, 0.06575878709554672, -0.015665920451283455, -0.027621891349554062, 0.0041419570334255695, -0.07474660873413086, -0.12003333121538162, -0.17588946223258972, 0.0592651292681694, -0.10508809238672256, 0.030553217977285385, -0.08697041869163513, -0.06518734246492386, 0.01360270194709301, 0.18859030306339264, -0.18118038773536682, -0.0941821038722992, -0.14168618619441986, -0.08476671576499939, 0.16449500620365143, -0.038674045354127884, 0.08421395719051361, 0.003376081120222807, 0.16501374542713165, 0.014029408805072308, -0.0015819292748346925, 0.09790860861539841, -0.08551394194364548, -0.18688884377479553, -0.05505610629916191, 0.16293732821941376, 0.14275696873664856, 0.04250163957476616, -0.015132341533899307, 0.025361547246575356, -0.054871972650289536, -0.1111668199300766, 0.030581818893551826, 0.13322541117668152, 0.08406678587198257, -0.010592077858746052, -0.03662798926234245, -0.08230122923851013, -0.06507614254951477, -0.05861164256930351, 0.005036390386521816, 0.18668591976165771, -0.07103097438812256, 0.16253666579723358, 0.11771479994058609, -0.06129937991499901, -0.201959490776062, 0.05579149350523949, 0.058344170451164246, 0.012369496747851372, 0.023383473977446556, -0.20874349772930145, 0.08683819323778152, 0.004007289186120033, -0.0715911015868187, 0.16410939395427704, -0.1716199368238449, -0.14237919449806213, 0.09782696515321732, 0.032474637031555176, -0.22474411129951477, -0.13851091265678406, -0.09858177602291107, -0.018526121973991394, -0.11399463564157486, 0.06551209837198257, 0.01875706948339939, 0.017452297732234, 0.02743147872388363, 0.024566752836108208, 0.02483227290213108, -0.04607364907860756, 0.2067040503025055, -0.022974591702222824, 0.005967532750219107, -0.051975999027490616, -0.09832221269607544, 0.03426579758524895, -0.05074713006615639, 0.10089464485645294, 0.008808075450360775, 0.020914187654852867, -0.14865918457508087, -0.044754933565855026, -0.06180845946073532, 0.028983378782868385, -0.09646521508693695, -0.0928386002779007, -0.04957341030240059, 0.0998292863368988, 
0.09793656319379807, -0.029254015535116196, 0.0022424738854169846, -0.08724266290664673, 0.06500335782766342, 0.1931239366531372, 0.18501150608062744, 0.0719185471534729, -0.07413624227046967, 0.020794617012143135, -0.03214142471551895, 0.04187725856900215, -0.22738376259803772, 0.03814420849084854, 0.058711498975753784, 0.021806564182043076, 0.09063896536827087, -0.007026664912700653, -0.15516552329063416, -0.0753142312169075, 0.0803043395280838, -0.039028678089380264, -0.16064958274364471, -0.029492413625121117, 0.041178084909915924, -0.20852777361869812, -0.04924628511071205, 0.011803652159869671, -0.021445633843541145, -0.04273570328950882, 0.025823894888162613, 0.07595545053482056, -0.0244743712246418, 0.10493255406618118, 0.08933497965335846, 0.09284022450447083, -0.09939762204885483, 0.08493978530168533, 0.07622935622930527, -0.048287224024534225, 0.025069329887628555, 0.11160042881965637, -0.04596396163105965, -0.03537760302424431, 0.09026583284139633, 0.08943408727645874, 0.016229605302214622, -0.049906887114048004, 0.013988400809466839, -0.051716793328523636, 0.07043661922216415, 0.11282343417406082, 0.03159111365675926, -0.006552307400852442, 0.054932475090026855, 0.04239531233906746, -0.10080348700284958, 0.10230907797813416, 0.05315650627017021, 0.0218729879707098, -0.03988701477646828, -0.028429923579096794, -0.0059728361666202545, -0.009460970759391785, -0.017285294830799103, -0.01066250167787075, -0.08763545751571655, -0.006626291200518608, -0.09483971446752548, 0.026754816994071007, -0.07051923125982285, 0.008824178948998451, 0.026702674105763435, -0.050643302500247955, 0.003927215933799744, 0.006106188520789146, -0.08002579212188721, -0.052181437611579895, -0.014279724098742008, 0.0810820460319519, -0.1215527206659317, 0.036080993711948395, 0.07726311683654785, -0.10459728538990021, 0.06672767549753189, 0.003655904671177268, 0.012402929365634918, 0.01668664813041687, -0.14719273149967194, 0.05834683030843735, -0.024367541074752808, -0.011198407039046288, 0.018783845007419586, -0.20767025649547577, -0.011839919723570347, -0.04848124086856842, -0.04823074862360954, 0.012074884958565235, -0.032894641160964966, -0.1225145161151886, 0.10078942775726318, -0.008761618286371231, -0.0742865726351738, -0.021009810268878937, 0.04462618753314018, 0.09985482692718506, -0.01762528531253338, 0.12390194833278656, -0.02160356380045414, 0.07123368978500366, -0.17080232501029968, -0.0004889495321549475, -0.01269409991800785, 0.04200352355837822, -0.012040166184306145, -0.026236990466713905, 0.058961182832717896, -0.02053704671561718, 0.1841084510087967, -0.024381687864661217, 0.0681288093328476, 0.05437815189361572, 0.013915072195231915, 0.004542439244687557, 0.08372728526592255, 0.0618152990937233, -0.003183289198204875, -0.0038743203040212393, 0.04549738019704819, -0.0023415114264935255, -0.04342805966734886, -0.15102863311767578, 0.07449230551719666, 0.15618577599525452, 0.059410423040390015, 0.024206973612308502, 0.028225917369127274, -0.12202136963605881, -0.0782540887594223, 0.1393209993839264, -0.005024428945034742, -0.034643106162548065, -0.07235150039196014, 0.1890919804573059, 0.13136602938175201, -0.20154888927936554, 0.08373883366584778, -0.05570732429623604, -0.05620025098323822, -0.12935107946395874, -0.16308347880840302, -0.06502500176429749, -0.04713558778166771, -0.0209365114569664, -0.06569118797779083, 0.05596251040697098, 0.05145636573433876, 0.006585097871720791, -0.018333392217755318, 0.10108150541782379, 0.0050308238714933395, -0.025837458670139313, 
0.0428447499871254, 0.05548344552516937, 0.03092857636511326, -0.10175253450870514, 0.01078856736421585, -0.002238289453089237, 0.01712547242641449, 0.06621119379997253, 0.01408971194177866, -0.05672900006175041, 0.01267198659479618, -0.014224288053810596, -0.11077350378036499, 0.04143771156668663, -0.021275296807289124, -0.03517235070466995, 0.1428452879190445, 0.028691884130239487, 0.006719445809721947, -0.02047916315495968, 0.24043984711170197, -0.07426124811172485, -0.07386737316846848, -0.15206339955329895, 0.05632876232266426, -0.06691859662532806, 0.02882184274494648, 0.034763600677251816, -0.11507245898246765, 0.01401499193161726, 0.1639842391014099, 0.13047149777412415, -0.00930862408131361, 0.0072251660749316216, 0.0565757192671299, 0.002024472691118717, -0.030220327898859978, 0.011807632632553577, 0.05370987951755524, 0.13945932686328888, -0.07895497232675552, 0.06118346378207207, -0.014082411304116249, -0.07395875453948975, -0.013036437332630157, 0.10983027517795563, 0.004096006974577904, 0.0012831802014261484, -0.07218842953443527, 0.13870152831077576, -0.09241402894258499, -0.22778034210205078, 0.056384675204753876, -0.07214677333831787, -0.15306729078292847, -0.04542899876832962, 0.012344133108854294, -0.01827598363161087, 0.018087347969412804, 0.07632926106452942, -0.05073400214314461, 0.16583430767059326, 0.04702341556549072, -0.06302731484174728, -0.09262634068727493, 0.06539707630872726, -0.12157639116048813, 0.280399888753891, 0.019037896767258644, 0.05011695623397827, 0.10303166508674622, -0.013957429677248001, -0.13890235126018524, 0.012244418263435364, 0.10373015701770782, -0.06866172701120377, 0.06053752824664116, 0.17952458560466766, -0.002518785884603858, 0.12831518054008484, 0.05488770827651024, -0.05702192336320877, 0.039481718093156815, -0.08259769529104233, -0.049059707671403885, -0.11048151552677155, 0.07668175548315048, -0.08611443638801575, 0.16048398613929749, 0.13442015647888184, -0.06284430623054504, -0.00866981316357851, -0.02235570177435875, 0.08353181183338165, 0.0065396311692893505, 0.11808989197015762, 0.004705366212874651, -0.1841500997543335, 0.03629741445183754, 0.011462558060884476, 0.1012720912694931, -0.20578451454639435, -0.06913269311189651, 0.0550631582736969, -0.02396181784570217, -0.07605168223381042, 0.11703146249055862, 0.04508436843752861, 0.03358806297183037, -0.04193226993083954, -0.05098617821931839, 0.004651195835322142, 0.14566008746623993, -0.11616147309541702, -0.011310935951769352 ]
null
null
transformers
# Model Card for Model ID A 4-bit instruct-tuned version of Mistral 7B Instruct, fine-tuned on Turkish examples from CohereForAI/aya_dataset; it has not been benchmarked or tested extensively. I uploaded the LoRA weights only.
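Since only the LoRA adapter is uploaded, a base Mistral 7B Instruct checkpoint has to be loaded first. Below is a minimal, hedged loading sketch: the exact base revision is not stated in the card, so `mistralai/Mistral-7B-Instruct-v0.2` and the 4-bit bitsandbytes settings are assumptions; only the adapter repository id comes from this record.

```python
# Hedged sketch: base checkpoint and quantisation settings are assumptions;
# only the adapter repository id is taken from this record.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

BASE = "mistralai/Mistral-7B-Instruct-v0.2"      # assumption: card only says "Mistral 7B Instruct"
ADAPTER = "eren23/AYA-Mistral7B-instruct-TR-4b"  # adapter id from this record

bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tokenizer = AutoTokenizer.from_pretrained(BASE)
base = AutoModelForCausalLM.from_pretrained(BASE, quantization_config=bnb, device_map="auto")
model = PeftModel.from_pretrained(base, ADAPTER)  # attach the Turkish LoRA weights

prompt = "[INST] Türkiye'nin başkenti neresidir? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(base.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```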
{"language": ["tr"], "license": "apache-2.0", "library_name": "transformers", "tags": ["turkish", "mistral", "instruct", "sft", "chat"], "datasets": ["CohereForAI/aya_dataset"], "pipeline_tag": "text-generation"}
text-generation
eren23/AYA-Mistral7B-instruct-TR-4b
[ "transformers", "safetensors", "turkish", "mistral", "instruct", "sft", "chat", "text-generation", "tr", "dataset:CohereForAI/aya_dataset", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2024-02-13T19:30:22+00:00
[]
[ "tr" ]
TAGS #transformers #safetensors #turkish #mistral #instruct #sft #chat #text-generation #tr #dataset-CohereForAI/aya_dataset #license-apache-2.0 #endpoints_compatible #region-us
# Model Card for Model ID A 4-bit instruct-tuned version of Mistral 7B Instruct, fine-tuned on Turkish examples from CohereForAI/aya_dataset; it has not been benchmarked or tested extensively. I uploaded the LoRA weights only.
[ "# Model Card for Model ID\n\nA 4bit instruct tuned version of the mistral 7b instruct, finetuned using Turkish examples from CohereForAI/aya_dataset, not benchmarked or tested extensively. I uploaded LoRa weights only." ]
[ "TAGS\n#transformers #safetensors #turkish #mistral #instruct #sft #chat #text-generation #tr #dataset-CohereForAI/aya_dataset #license-apache-2.0 #endpoints_compatible #region-us \n", "# Model Card for Model ID\n\nA 4bit instruct tuned version of the mistral 7b instruct, finetuned using Turkish examples from CohereForAI/aya_dataset, not benchmarked or tested extensively. I uploaded LoRa weights only." ]
[ 64, 62 ]
[ "passage: TAGS\n#transformers #safetensors #turkish #mistral #instruct #sft #chat #text-generation #tr #dataset-CohereForAI/aya_dataset #license-apache-2.0 #endpoints_compatible #region-us \n# Model Card for Model ID\n\nA 4bit instruct tuned version of the mistral 7b instruct, finetuned using Turkish examples from CohereForAI/aya_dataset, not benchmarked or tested extensively. I uploaded LoRa weights only." ]
[ -0.08347539603710175, -0.02689877524971962, -0.0004227626195643097, 0.05518361181020737, 0.1056816428899765, 0.03080461546778679, 0.24239350855350494, 0.06801809370517731, 0.09111659228801727, -0.03564261645078659, 0.10943329334259033, 0.06521859765052795, 0.024818643927574158, 0.12200659513473511, -0.0639314278960228, -0.0880638062953949, 0.08872582763433456, -0.04163366183638573, 0.013168196193873882, 0.08789137750864029, 0.0853639468550682, -0.03016277588903904, 0.12952719628810883, -0.04995374009013176, -0.10780342668294907, 0.049445293843746185, -0.0073092347010970116, -0.08123411238193512, 0.04480372369289398, 0.03211895003914833, 0.11071465909481049, 0.017871441319584846, 0.017643136903643608, -0.11953691393136978, 0.04849604517221451, 0.0015712446765974164, -0.05507763847708702, 0.029010623693466187, 0.015324150212109089, -0.10027947276830673, 0.16297267377376556, -0.03588509559631348, -0.02032429352402687, 0.005441023036837578, -0.071250781416893, -0.17761017382144928, -0.11843926459550858, 0.08200082927942276, 0.11966974288225174, 0.08850482851266861, 0.006477985996752977, 0.08018374443054199, 0.007622454781085253, 0.07127939164638519, 0.07780148833990097, -0.20962433516979218, -0.055300433188676834, 0.09433542191982269, -0.03785146772861481, 0.12413878738880157, 0.039250846952199936, 0.03855229914188385, 0.09810961782932281, -0.04453949257731438, -0.010927626863121986, -0.07498324662446976, -0.026186175644397736, -0.022946765646338463, -0.03406883031129837, -0.010273984633386135, 0.2701472043991089, 0.017233658581972122, -0.04060688614845276, -0.07519941031932831, -0.03573662042617798, 0.005506842862814665, -0.04756579548120499, 0.05034462362527847, -0.034035325050354004, 0.030824843794107437, 0.08548930287361145, 0.009508630260825157, -0.09514274448156357, -0.08969948440790176, -0.10885940492153168, 0.03819352015852928, 0.03791194036602974, 0.09091581404209137, -0.10537859797477722, 0.05925419181585312, -0.04439670965075493, -0.10306790471076965, -0.03494607284665108, -0.03737466037273407, 0.02229798212647438, 0.011853357776999474, -0.03581315279006958, -0.008671039715409279, 0.13262930512428284, 0.12075870484113693, 0.08329708874225616, 0.08520426601171494, 0.0025904658250510693, 0.05367860198020935, -0.01983886957168579, -0.009747612290084362, 0.035450153052806854, -0.07082614302635193, 0.06629077345132828, -0.024640105664730072, 0.08337320387363434, -0.03137192875146866, -0.1281127780675888, -0.021665601059794426, 0.023005545139312744, 0.08568117022514343, -0.04728462174534798, 0.03470617160201073, 0.007601712830364704, 0.013833485543727875, 0.20916016399860382, -0.0827397033572197, -0.03587785363197327, 0.004535572603344917, -0.03921104967594147, 0.009756886400282383, 0.06623558700084686, 0.013381149619817734, 0.04031304270029068, -0.007734853308647871, 0.008440894074738026, 0.016765570268034935, -0.02523394674062729, -0.03510222211480141, 0.02814646251499653, -0.09248987585306168, 0.058232381939888, -0.19277328252792358, -0.3092528283596039, 0.028171472251415253, 0.07244037836790085, -0.03813134506344795, 0.10964303463697433, -0.030911987647414207, 0.008036617189645767, 0.01297811884433031, -0.021083099767565727, 0.016284003853797913, -0.08682753145694733, 0.038756612688302994, 0.004992678761482239, 0.05828383192420006, -0.21990422904491425, 0.011585749685764313, -0.14461293816566467, 0.020474551245570183, -0.08584966510534286, 0.04988335818052292, -0.04138704016804695, 0.09639131277799606, -0.13599708676338196, 0.001595356035977602, -0.02464681677520275, 
-0.0036622497718781233, 0.028993207961320877, 0.0730133131146431, -0.1816389560699463, -0.00967059750109911, 0.058675192296504974, -0.18728920817375183, -0.2017376571893692, 0.11334513872861862, -0.03490512818098068, 0.0930120199918747, 0.11776027083396912, 0.1784902811050415, 0.10762079060077667, -0.061769045889377594, -0.03856220096349716, 0.015530562028288841, -0.02048194594681263, -0.10045110434293747, 0.07081804424524307, 0.0923163890838623, -0.09493055194616318, 0.06206212565302849, -0.037098754197359085, 0.0539902038872242, 0.026859916746616364, -0.08299543708562851, -0.06993558257818222, -0.1040630117058754, -0.006723667494952679, -0.024892127141356468, 0.04756489768624306, -0.06428633630275726, -0.006053057499229908, -0.001377838896587491, 0.15073128044605255, -0.04218997433781624, -0.026412392035126686, -0.07936599850654602, 0.16221699118614197, -0.14401404559612274, -0.0016529522836208344, -0.03409884497523308, -0.019955908879637718, -0.0770559161901474, 0.09973105043172836, 0.016810590401291847, 0.06010761857032776, 0.0996948704123497, -0.0015971428947523236, -0.0670103207230568, 0.004860519431531429, 0.1490895003080368, 0.03280757740139961, -0.030769340693950653, -0.11817604303359985, 0.01788768731057644, -0.056431081146001816, 0.17218640446662903, -0.12004003673791885, 0.027027133852243423, 0.060114894062280655, 0.08584272861480713, -0.009312640875577927, 0.041642382740974426, 0.007391174323856831, -0.027732839807868004, -0.009314563125371933, -0.053363051265478134, 0.05500708892941475, -0.006027408875524998, -0.14872734248638153, 0.08714454621076584, -0.15427233278751373, 0.13577032089233398, 0.185139998793602, 0.06879359483718872, -0.003241865197196603, -0.05800086259841919, 0.03479290008544922, -0.030331926420331, 0.03664480894804001, -0.0018118353327736259, 0.10739459842443466, -0.02721354365348816, 0.12769559025764465, -0.0845022052526474, 0.0008385257096961141, 0.03485386073589325, -0.050094544887542725, -0.04558824375271797, 0.07510297745466232, 0.010039578191936016, -0.1881582885980606, 0.1068820059299469, 0.21024960279464722, -0.012998569756746292, 0.1934371292591095, -0.0029139486141502857, -0.046997904777526855, -0.0002437651710351929, 0.04206952452659607, 0.0016994330799207091, 0.0980781763792038, -0.04690111428499222, 0.0068269758485257626, 0.06855219602584839, 0.008524671196937561, 0.019182806834578514, -0.08472225069999695, -0.03537536785006523, -0.012143757194280624, -0.05305150896310806, -0.06376811861991882, 0.07357867807149887, -0.03969191387295723, 0.08527100831270218, -0.06864193081855774, -0.07020014524459839, 0.08435739576816559, 0.01278019417077303, -0.09145694971084595, 0.12715162336826324, -0.0962030440568924, -0.12593336403369904, -0.09298798441886902, -0.05306586995720863, -0.049270130693912506, -0.02011323906481266, 0.09433154761791229, -0.10208168625831604, -0.03158897906541824, -0.07820163667201996, -0.02885384112596512, 0.014243829995393753, -0.018878638744354248, -0.0042090690694749355, -0.018813390284776688, -0.011085979640483856, -0.1250331699848175, -0.025198791176080704, -0.010011084377765656, -0.08505886048078537, 0.08483558148145676, -0.1506219208240509, 0.046734683215618134, 0.07280369102954865, -0.02190723456442356, 0.040693897753953934, -0.03419659659266472, 0.1666140854358673, -0.06699631363153458, 0.05816285312175751, 0.16878679394721985, 0.007840605452656746, 0.031317826360464096, 0.13234344124794006, 0.007300036493688822, -0.07728023082017899, 0.026577554643154144, -0.02554379403591156, -0.1262504756450653, -0.16985826194286346, 
-0.0868251621723175, -0.0281803198158741, -0.008740397170186043, 0.021976470947265625, 0.07220914214849472, 0.06941545009613037, 0.17073917388916016, -0.03719459846615791, -0.013549006544053555, 0.0476345531642437, 0.044268637895584106, 0.019503561779856682, 0.04444355517625809, 0.0848662406206131, -0.1887635886669159, 0.029622342437505722, 0.13734249770641327, 0.0840696319937706, 0.24115881323814392, -0.0022390796802937984, 0.0466335229575634, 0.05003674328327179, 0.1070246547460556, 0.07715444266796112, 0.08303498476743698, -0.02616794779896736, -0.03675971180200577, -0.054234348237514496, -0.07518714666366577, -0.04104338213801384, 0.040458641946315765, -0.1703096479177475, 0.02148057147860527, 0.004197696223855019, 0.058057744055986404, 0.10686378926038742, 0.04483926296234131, 0.008815336972475052, -0.29794251918792725, -0.03711927309632301, 0.04123924300074577, 0.021369285881519318, -0.035148341208696365, 0.02714252471923828, 0.053642988204956055, -0.02647102251648903, 0.1287211924791336, -0.05686578527092934, 0.09003503620624542, 0.01567201130092144, -0.019842002540826797, -0.08453565835952759, -0.003792336443439126, -0.011591496877372265, 0.09131591022014618, -0.2215447723865509, 0.14549151062965393, 0.023671483621001244, 0.09287715703248978, -0.1047658920288086, -0.030145633965730667, 0.09961600601673126, 0.13369138538837433, 0.11093884706497192, 0.010369393043220043, -0.16491760313510895, -0.06595581024885178, -0.11306358873844147, 0.07575958222150803, -0.03939732536673546, 0.04907282441854477, -0.03500261530280113, -0.04145783558487892, 0.001214473508298397, -0.0007533683092333376, -0.06146959960460663, -0.09004703164100647, -0.13181369006633759, 0.004959688056260347, 0.16963058710098267, -0.08118894696235657, -0.021380798891186714, -0.007262293249368668, -0.10598697513341904, 0.0507437102496624, 0.03372093290090561, -0.09955547749996185, -0.07733888179063797, -0.10421058535575867, 0.08765701204538345, -0.049684539437294006, -0.0008074477664195001, -0.009198516607284546, -0.05591920390725136, -0.048449184745550156, -0.1713883876800537, 0.031973812729120255, -0.10054203122854233, -0.036268677562475204, 0.002616372425109148, 0.05079888179898262, 0.011586464010179043, -0.005694964434951544, 0.06408423185348511, 0.023794494569301605, -0.06542292982339859, -0.09147782623767853, -0.06554611772298813, 0.06995454430580139, 0.038529615849256516, 0.06464342772960663, 0.0354270413517952, -0.11631672829389572, -0.004341464955359697, -0.0604698620736599, 0.019505996257066727, 0.11740162968635559, -0.047206610441207886, 0.08264704048633575, 0.14003333449363708, -0.05749090015888214, -0.17617811262607574, -0.04389018565416336, -0.09410712122917175, -0.022538650780916214, -0.055309493094682693, -0.024817613884806633, 0.16183598339557648, 0.14097736775875092, -0.04543536156415939, 0.13613376021385193, -0.18459473550319672, -0.1152568981051445, 0.13178634643554688, 0.10786832123994827, 0.35759320855140686, -0.1471533179283142, -0.04178837686777115, -0.11211348325014114, -0.12276045233011246, 0.011768670752644539, -0.22579161822795868, 0.03897662088274956, -0.03741798549890518, 0.040951795876026154, -0.033121559768915176, -0.044822536408901215, 0.1825728416442871, 0.006094439886510372, 0.12332932651042938, -0.1164853423833847, -0.0005178028368391097, 0.10451196134090424, -0.04776114597916603, 0.1275196075439453, -0.2529536783695221, 0.042906541377305984, -0.08165464550256729, -0.018306752666831017, 0.019811371341347694, 0.03699735179543495, 0.02114749699831009, -0.04463685676455498, 
-0.0685267448425293, -0.037124428898096085, -0.0019104459788650274, 0.006251972634345293, 0.0930011197924614, -0.039995722472667694, -0.012813527137041092, 0.05259042605757713, 0.06882356852293015, -0.09039884060621262, 0.0465075708925724, -0.04399110749363899, -0.049932513386011124, 0.06989839673042297, -0.23196542263031006, 0.027812859043478966, 0.07782912254333496, -0.06037672981619835, 0.06197267025709152, -0.007062761578708887, 0.011649712920188904, 0.0755569189786911, 0.11009105294942856, -0.04044615849852562, -0.08577023446559906, 0.013435966335237026, 0.02785664051771164, 0.04146910086274147, 0.07878562062978745, 0.1289043128490448, -0.08176922798156738, 0.033497970551252365, -0.023784037679433823, -0.006290751043707132, -0.09274221211671829, 0.09671249985694885, 0.022611266002058983, 0.003213248448446393, -0.13588601350784302, 0.1007661372423172, -0.009399552829563618, -0.026110948994755745, 0.08696352690458298, 0.005986195523291826, -0.14472512900829315, -0.12332115322351456, -0.031689342111349106, 0.26108136773109436, -0.0825568288564682, -0.1583249270915985, -0.051884282380342484, -0.08560709655284882, -0.024545200169086456, 0.1369377225637436, 0.12117442488670349, 0.027263693511486053, 0.010391837917268276, -0.07551512122154236, -0.06360948085784912, 0.055101651698350906, 0.02264401875436306, 0.061417024582624435, -0.16003726422786713, -0.07224751263856888, 0.002263957168906927, -0.004889920819550753, -0.052925024181604385, -0.03313112631440163, -0.029568281024694443, 0.03389539197087288, -0.2699890732765198, 0.11271531134843826, -0.09885136038064957, 0.034977566450834274, 0.024068333208560944, -0.03564776852726936, 0.0031122886575758457, 0.014550486579537392, -0.059370171278715134, 0.07784460484981537, 0.011543305590748787, 0.047020044177770615, -0.1062401533126831, -0.07704699039459229, -0.00010635307990014553, -0.004803933203220367, 0.08383401483297348, 0.12493497133255005, -0.0711122453212738, 0.09851931035518646, -0.19704768061637878, 0.00892528798431158, 0.07521585375070572, 0.02726447768509388, -0.004593767691403627, -0.04421322047710419, 0.009637841023504734, 0.10615650564432144, 0.0145456213504076, 0.07518762350082397, 0.1293921023607254, -0.07015323638916016, -0.020767485722899437, -0.10064239054918289, -0.03496832773089409, -0.04628549888730049, 0.04997534304857254, 0.16597376763820648, 0.057030271738767624, 0.12707823514938354, -0.06601100414991379, -0.023524902760982513, -0.060292936861515045, 0.016320932656526566, -0.02309293858706951, -0.14707313477993011, -0.12951122224330902, -0.04323592409491539, -0.003390721743926406, -0.014707286842167377, 0.19942808151245117, -0.059346362948417664, -0.07469595223665237, -0.021344797685742378, 0.04068680480122566, 0.004403966944664717, -0.02836189977824688, 0.31275391578674316, 0.0039007209707051516, 0.02651374042034149, -0.10076642036437988, 0.030594032257795334, 0.05917036160826683, 0.00022172422904986888, 0.009919391945004463, 0.17020107805728912, 0.011183769442141056, 0.10679739713668823, -0.015328112058341503, 0.003118154127150774, 0.0556623712182045, -0.02796870470046997, -0.0723361074924469, 0.006867732387036085, 0.015306704677641392, 0.04915624484419823, 0.24358361959457397, -0.06862006336450577, -0.04879286140203476, -0.009237962774932384, -0.036000654101371765, -0.07750573009252548, -0.18108996748924255, -0.10098810493946075, -0.1449400782585144, -0.005606611724942923, -0.0406896211206913, -0.03522203862667084, 0.1328454166650772, 0.0533314049243927, -0.01309204287827015, 0.14571458101272583, -0.1060142070055008, 
0.05986945703625679, 0.01381534244865179, -0.02775854617357254, -0.06674536317586899, 0.04916480556130409, -0.05094102397561073, 0.04536247253417969, -0.0685320496559143, -0.007802724838256836, 0.04221796989440918, 0.08652849495410919, 0.07583753764629364, -0.07414791733026505, -0.07730591297149658, -0.07110346108675003, 0.0847155973315239, 0.03223813325166702, 0.14278258383274078, 0.051497407257556915, -0.003509908216074109, 0.004926153924316168, 0.16050565242767334, -0.01794334128499031, -0.12366651743650436, -0.09358341246843338, 0.0871054008603096, -0.06637974083423615, -0.0001592326007084921, 0.01635821908712387, -0.04023108258843422, 0.040765803307294846, 0.1687481850385666, 0.2800508439540863, 0.021166786551475525, 0.02512138895690441, -0.08126036077737808, 0.02173703908920288, -0.07284267991781235, 0.07077435404062271, 0.04168849438428879, 0.0617755651473999, -0.028358792886137962, -0.02800344116985798, -0.041397448629140854, -0.021563051268458366, -0.07334545999765396, 0.012983741238713264, -0.07792282104492188, -0.08725215494632721, 0.005819182377308607, 0.09591666609048843, -0.02951177954673767, -0.0701909288764, 0.05557260289788246, -0.024691326543688774, -0.05799560621380806, -0.040375493466854095, 0.08627752959728241, 0.07838018238544464, -0.019982008263468742, -0.05920880660414696, 0.01712971366941929, 0.03137141466140747, 0.0179628636687994, -0.1761549711227417, -0.13630348443984985, 0.04534437879920006, -0.005885137245059013, 0.1395544707775116, 0.004985294304788113, 0.08332213759422302, 0.08107785135507584, -0.0008698648889549077, -0.09969401359558105, 0.15671306848526, -0.0013817560393363237, 0.06203983724117279, 0.06077464669942856, -0.01916196383535862, -0.07954142987728119, 0.044059693813323975, 0.04674844071269035, -0.05136650428175926, -0.012251373380422592, 0.0615353062748909, -0.039126764982938766, -0.047312818467617035, 0.1072724461555481, -0.07745176553726196, 0.1177530288696289, 0.08019609749317169, -0.004017546307295561, 0.021973740309476852, -0.017058631405234337, 0.07484055310487747, 0.04179338738322258, -0.14922569692134857, 0.01643679104745388, -0.09140356630086899, -0.04840269312262535, 0.05173501372337341, 0.011912994086742401, -0.18837429583072662, -0.018562553450465202, -0.17490853369235992, -0.04772689566016197, -0.046871256083250046, 0.040574412792921066, 0.15235964953899384, 0.025475453585386276, -0.030194329097867012, -0.023470096290111542, -0.005223582498729229, 0.05767076835036278, -0.06254883855581284, -0.07192040979862213 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
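The card above is still the unfilled template, but the record's metadata (transformers, bert, feature-extraction) is enough for a minimal, hedged getting-started sketch; it assumes the repository ships a tokenizer alongside the weights, and everything beyond the repository id is an assumption.

```python
# Minimal feature-extraction sketch; apart from the repository id, the details here
# are assumptions, since the model card itself contains no usage information.
import torch
from transformers import AutoModel, AutoTokenizer

REPO = "furrutiav/bert_qa_extractor_cockatiel_2022_nllf_mixtral_v2_over_subsample_it_141"

tokenizer = AutoTokenizer.from_pretrained(REPO)  # assumes a tokenizer is bundled with the repo
model = AutoModel.from_pretrained(REPO)

inputs = tokenizer("Example sentence to embed.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden state into one vector per input sentence.
sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # (1, hidden_size)
```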
{"library_name": "transformers", "tags": []}
feature-extraction
furrutiav/bert_qa_extractor_cockatiel_2022_nllf_mixtral_v2_over_subsample_it_141
[ "transformers", "safetensors", "bert", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T19:31:52+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 39, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.052746038883924484, 0.20255789160728455, -0.0045078229159116745, 0.0248473659157753, 0.10497838258743286, 0.00675728265196085, 0.06521498411893845, 0.11486967653036118, -0.0023755673319101334, 0.12028469145298004, 0.027631845325231552, 0.08119397610425949, 0.12110675126314163, 0.15393014252185822, 0.005160121712833643, -0.24253977835178375, 0.05344875901937485, -0.09366832673549652, 0.004077504388988018, 0.11452110856771469, 0.1343945860862732, -0.10780399292707443, 0.08976872265338898, -0.00683097867295146, -0.01712046191096306, -0.015751034021377563, -0.07134060561656952, -0.06668227165937424, 0.05541034787893295, 0.07649129629135132, 0.0725555345416069, 0.010986946523189545, 0.07830587029457092, -0.2806258797645569, 0.014425364322960377, 0.08005264401435852, 0.0010765197221189737, 0.06795802712440491, 0.08151742070913315, -0.06789936870336533, 0.1251654475927353, -0.0605485662817955, 0.14059753715991974, 0.07639917731285095, -0.08928128331899643, -0.19590547680854797, -0.06669555604457855, 0.07481247186660767, 0.129872128367424, 0.05026249960064888, -0.02990107797086239, 0.1371748298406601, -0.09688840061426163, 0.00786701962351799, 0.12302009761333466, -0.07360870391130447, -0.05524582043290138, 0.031063849106431007, 0.10805318504571915, 0.09297362715005875, -0.11762315034866333, -0.008467874489724636, 0.029582185670733452, 0.022175652906298637, 0.08627551048994064, 0.015828849747776985, 0.1525639444589615, 0.041341137140989304, -0.14141254127025604, -0.0526716373860836, 0.09056255221366882, 0.03701045364141464, -0.050960201770067215, -0.23367193341255188, -0.026245610788464546, -0.012442239560186863, -0.03079850971698761, -0.04234880208969116, 0.053594592958688736, -0.03630254790186882, 0.07596245408058167, -0.007196845952421427, -0.07732249796390533, -0.031211229041218758, 0.05230424553155899, 0.06785056740045547, 0.018615471199154854, -0.006994647905230522, 0.019442738965153694, 0.11387838423252106, 0.07708574831485748, -0.13029205799102783, -0.07214002311229706, -0.0739525631070137, -0.09558356553316116, -0.04332297295331955, 0.03707554563879967, 0.07106684148311615, 0.04390906170010567, 0.20283061265945435, -0.017690327018499374, 0.046562306582927704, 0.0476159006357193, 0.005842953454703093, 0.07147589325904846, 0.10925443470478058, -0.06689215451478958, -0.14432233572006226, -0.06022803485393524, 0.08875485509634018, -0.009834992699325085, -0.03670760244131088, -0.049119677394628525, 0.04676154628396034, 0.03209913894534111, 0.11318106204271317, 0.08643888682126999, -0.003593706525862217, -0.0628826767206192, -0.042073074728250504, 0.22331053018569946, -0.14625342190265656, 0.043256524950265884, 0.007445589639246464, -0.0429743155837059, -0.0076383077539503574, 0.005870272871106863, 0.014089803211390972, -0.03238216042518616, 0.10351061820983887, -0.0778173878788948, -0.035906463861465454, -0.1116463914513588, -0.06868703663349152, 0.024910317733883858, 0.0025890374090522528, -0.018393149599432945, -0.04424213990569115, -0.11253650486469269, -0.051282741129398346, 0.0724339634180069, -0.07579848170280457, -0.05524555593729019, 0.009976830333471298, -0.04834962263703346, 0.0031978494953364134, 0.00010397454752819613, 0.11258035898208618, -0.03314845636487007, 0.025259260088205338, -0.04850656911730766, 0.06803499162197113, 0.10959596186876297, 0.038730688393116, -0.0804535374045372, 0.07286878675222397, -0.22788093984127045, 0.10223092138767242, -0.09346398711204529, 0.025767935439944267, -0.14578653872013092, -0.04199126362800598, 0.02854149229824543, 0.02887420728802681, 
-0.010361229069530964, 0.1268649846315384, -0.1982942521572113, -0.035082314163446426, 0.15190726518630981, -0.11336656659841537, -0.09347330778837204, 0.065653957426548, -0.05610617995262146, 0.11296144872903824, 0.04835578054189682, -0.019556574523448944, 0.06953749805688858, -0.1281629204750061, -0.04506009817123413, -0.021473335102200508, -0.008493004366755486, 0.14857245981693268, 0.06750676780939102, -0.05737153813242912, 0.07104712724685669, 0.02051553688943386, -0.037109848111867905, -0.03301886469125748, -0.03470754995942116, -0.09331934154033661, 0.009520708583295345, -0.07244295626878738, 0.03737799823284149, -0.02224314957857132, -0.08870045095682144, -0.030656753107905388, -0.17619828879833221, 0.043274905532598495, 0.08050142228603363, 0.008233942091464996, -0.021131468936800957, -0.09287237375974655, 0.02556683123111725, -0.009385489858686924, -0.021018607541918755, -0.1641797423362732, -0.044834475964307785, 0.04416196420788765, -0.1971662938594818, 0.023802341893315315, -0.03283040598034859, 0.05093098804354668, 0.03247829154133797, -0.04019762575626373, -0.005096070934087038, 0.0028117431793361902, 0.01809627003967762, -0.026984719559550285, -0.200385183095932, -0.031109308823943138, -0.029154371470212936, 0.1362139731645584, -0.22226740419864655, 0.028292208909988403, 0.07483648508787155, 0.13521188497543335, 0.0009690870065242052, -0.04426588490605354, 0.010693409480154514, -0.05366935580968857, -0.053671274334192276, -0.06512755900621414, -0.007102466654032469, -0.03287021815776825, -0.04422381520271301, 0.06460095942020416, -0.19425635039806366, -0.03641216829419136, 0.10608077049255371, 0.10164625942707062, -0.14719000458717346, -0.028969714418053627, -0.04096706584095955, -0.06081128865480423, -0.09094393998384476, -0.0630471333861351, 0.14371246099472046, 0.04861542955040932, 0.048413511365652084, -0.08624191582202911, -0.0630124881863594, 0.00895135197788477, 0.0006565740332007408, -0.03649118170142174, 0.08907787501811981, 0.08782777935266495, -0.10737399011850357, 0.08881597965955734, 0.08605224639177322, 0.06605713814496994, 0.10539878904819489, 0.001256609451957047, -0.10750970244407654, -0.029154706746339798, 0.005644100718200207, 0.01547710970044136, 0.14092515408992767, -0.044270921498537064, 0.04743899777531624, 0.05656488984823227, -0.027443327009677887, 0.01715722121298313, -0.10313762724399567, 0.02984124980866909, 0.046840768307447433, -0.010507673025131226, 0.012429861351847649, -0.03895113617181778, 0.025837475433945656, 0.08796556293964386, 0.03584056720137596, 0.027896199375391006, 0.0029043578542768955, -0.03437814116477966, -0.10392027348279953, 0.17429527640342712, -0.0878753736615181, -0.28357240557670593, -0.1356295943260193, -0.00747122336179018, 0.05167245492339134, -0.022715993225574493, 0.013256389647722244, -0.04903135821223259, -0.11467588692903519, -0.10348290205001831, 0.008818334899842739, 0.0437844917178154, -0.07700283080339432, -0.07256268709897995, 0.046553414314985275, 0.033613573759794235, -0.14174877107143402, 0.022300107404589653, 0.048012908548116684, -0.03855963796377182, -0.015413837507367134, 0.07170835882425308, 0.10258439928293228, 0.17387451231479645, -0.004228805657476187, -0.01945391111075878, 0.023280048742890358, 0.24459126591682434, -0.14296141266822815, 0.10647262632846832, 0.15432609617710114, -0.06630013138055801, 0.1025824174284935, 0.19176462292671204, 0.02610800787806511, -0.07571171224117279, 0.03370760753750801, 0.03715203329920769, -0.053104497492313385, -0.23274335265159607, -0.060641512274742126, 
0.0011178229469805956, -0.06850682199001312, 0.09104112535715103, 0.08915619552135468, 0.11183936148881912, 0.0454646460711956, -0.08415863662958145, -0.06847929954528809, 0.019614145159721375, 0.10642454773187637, -0.03275766968727112, 0.007264797575771809, 0.09054313600063324, -0.04184457287192345, -0.005177726969122887, 0.10835286974906921, 0.007426192983984947, 0.1962665617465973, 0.031048519536852837, 0.15333782136440277, 0.07211130857467651, 0.0342402458190918, 0.026680786162614822, 0.025636766105890274, 0.023090654984116554, 0.009547512046992779, -0.01598707027733326, -0.08795502036809921, 0.027014199644327164, 0.13500221073627472, 0.07871367782354355, 0.029795078560709953, 0.020392734557390213, -0.0429922379553318, 0.062152985483407974, 0.15964233875274658, 0.006258485373109579, -0.2136749029159546, -0.03950631618499756, 0.08867984265089035, -0.0793125256896019, -0.1237078458070755, -0.02518491819500923, 0.03823186457157135, -0.1809074580669403, 0.04127289727330208, -0.01795332506299019, 0.11453432589769363, -0.11700457334518433, -0.028958700597286224, 0.039744846522808075, 0.08327627927064896, -0.03253408893942833, 0.07922478020191193, -0.1647184044122696, 0.1165376752614975, 0.012328862212598324, 0.05802180990576744, -0.11617794632911682, 0.09878876805305481, 0.012594180181622505, -0.009003117680549622, 0.16720694303512573, -0.0008162438753060997, -0.07339610159397125, -0.06517832726240158, -0.07867198437452316, -0.022016214206814766, 0.09116258472204208, -0.11647430807352066, 0.08271238952875137, -0.012302344664931297, -0.03819865360856056, 0.002976413816213608, -0.1073245257139206, -0.12343364208936691, -0.191313698887825, 0.05862122401595116, -0.11746024340391159, 0.00024363139527849853, -0.10003595799207687, -0.05551697313785553, -0.04721582680940628, 0.19990667700767517, -0.14306047558784485, -0.09675363451242447, -0.1526252180337906, -0.09468596428632736, 0.1679719239473343, -0.04768168181180954, 0.08716544508934021, -0.00014324963558465242, 0.22273695468902588, 0.00589721417054534, -0.010143720544874668, 0.07824880629777908, -0.08608578145503998, -0.17828822135925293, -0.07740302383899689, 0.12055730819702148, 0.12802201509475708, 0.05279289186000824, -0.012038013897836208, 0.020934196189045906, -0.036648161709308624, -0.11678951978683472, 0.003050430677831173, 0.1217387318611145, 0.05949230119585991, 0.039503831416368484, -0.002558275358751416, -0.10200468450784683, -0.07551230490207672, -0.0352395698428154, 0.02261841483414173, 0.18903005123138428, -0.08441178500652313, 0.15781226754188538, 0.13112787902355194, -0.05333179607987404, -0.21253353357315063, 0.030583804473280907, 0.043237145990133286, 0.004318034742027521, 0.0612679123878479, -0.17720702290534973, 0.08167627453804016, 0.025727098807692528, -0.05116020143032074, 0.15224720537662506, -0.16569727659225464, -0.15514664351940155, 0.0824643224477768, 0.05010354146361351, -0.22108957171440125, -0.12386278063058853, -0.0879128947854042, -0.06589758396148682, -0.1396872103214264, 0.08584427833557129, 0.014041651971638203, -0.0018043812597170472, 0.05013851076364517, 0.033740755170583725, 0.018914686515927315, -0.048698488622903824, 0.21615906059741974, -0.0022440196480602026, 0.03326340764760971, -0.07553089410066605, -0.10180798172950745, 0.06950566172599792, -0.05141735449433327, 0.08518881350755692, -0.03099823370575905, 0.005753061734139919, -0.08320630341768265, -0.057475052773952484, -0.05255331099033356, 0.03318103775382042, -0.08139406144618988, -0.10520965605974197, -0.06759276986122131, 0.09429939836263657, 
0.09139011800289154, -0.03298058733344078, -0.04032526910305023, -0.08896728605031967, 0.039150089025497437, 0.20617929100990295, 0.17360219359397888, 0.05333937704563141, -0.10111589729785919, 0.002542630536481738, -0.01915728859603405, 0.040264517068862915, -0.21200114488601685, 0.04798245429992676, 0.04617756977677345, 0.024147402495145798, 0.12109645456075668, -0.0176423080265522, -0.1646004468202591, -0.047221194952726364, 0.0562983863055706, -0.03494611009955406, -0.20504815876483917, -0.01314060389995575, 0.04864202439785004, -0.18736153841018677, -0.06957933306694031, 0.016700902953743935, -0.014444489032030106, -0.027432914823293686, 0.013032985851168633, 0.06286440044641495, 0.025481918826699257, 0.10238313674926758, 0.05989401787519455, 0.1000840812921524, -0.112981878221035, 0.0795830711722374, 0.09043775498867035, -0.08344172686338425, 0.009394102729856968, 0.06964189559221268, -0.05280066654086113, -0.02294989861547947, 0.022772129625082016, 0.06757686287164688, -0.003049787599593401, -0.057536181062459946, -0.02079189568758011, -0.10809285193681717, 0.06586270034313202, 0.1269281655550003, 0.0400845967233181, -0.006831571459770203, 0.04905473813414574, 0.02419281378388405, -0.07880669087171555, 0.11321208626031876, 0.03362756222486496, 0.03722309693694115, -0.05989459529519081, -0.01674187369644642, 0.04316421225667, 0.005734616424888372, -0.02047782577574253, -0.025104478001594543, -0.05658029392361641, -0.013948953710496426, -0.18932224810123444, 0.014544147998094559, -0.07588981091976166, 0.005138450767844915, 0.014814606867730618, -0.040141742676496506, -0.018671197816729546, 0.012856033630669117, -0.08163223415613174, -0.05027473345398903, -0.0038707295898348093, 0.09766460955142975, -0.1400173306465149, 0.008230311796069145, 0.09175591170787811, -0.11852382868528366, 0.06848865002393723, -0.019968708977103233, -0.014717686921358109, 0.0038272906094789505, -0.1270400881767273, 0.04572216048836708, -0.004586559720337391, 0.02062096633017063, 0.04444560408592224, -0.17065683007240295, 0.004877567756921053, -0.0423397533595562, -0.0478336401283741, -0.015323328785598278, -0.08405033499002457, -0.11406292766332626, 0.10921793431043625, 0.002206311793997884, -0.08430022746324539, -0.010287429206073284, 0.04696008190512657, 0.10919637978076935, -0.03898061811923981, 0.124757781624794, 0.0047785635106265545, 0.06639395654201508, -0.18268363177776337, -0.024298490956425667, -0.014514438807964325, 0.007352736312896013, 0.027192458510398865, -0.016180848702788353, 0.04238643869757652, -0.01372526679188013, 0.2601816952228546, -0.021822240203619003, 0.07231466472148895, 0.0637383759021759, 0.042024899274110794, 0.016651110723614693, 0.08318763226270676, 0.06755662709474564, 0.016758481040596962, 0.004258559085428715, 0.02265608124434948, -0.03241465613245964, -0.016654497012495995, -0.15768693387508392, 0.07677853107452393, 0.14623822271823883, 0.08591317385435104, 0.007676990237087011, 0.06586159020662308, -0.10330242663621902, -0.10554943233728409, 0.08015866577625275, -0.03888537734746933, -0.0009790018666535616, -0.058588381856679916, 0.15355949103832245, 0.14971502125263214, -0.17422176897525787, 0.08231138437986374, -0.03791337087750435, -0.04883022606372833, -0.11436772346496582, -0.15839459002017975, -0.06608819216489792, -0.029153592884540558, -0.0041826991364359856, -0.05528274551033974, 0.06748054921627045, 0.10802645981311798, -0.0021057529374957085, -0.00038325722562149167, 0.09545762091875076, -0.026331622153520584, -0.01757199876010418, 0.03465426340699196, 
0.04817976430058479, 0.033562518656253815, -0.04831063002347946, 0.020485511049628258, 0.004976877011358738, 0.03976510092616081, 0.05864322930574417, 0.023703020066022873, -0.03892989084124565, 0.014479226432740688, -0.01092575490474701, -0.1049860492348671, 0.022427968680858612, -0.029776830226182938, -0.07360642403364182, 0.13104131817817688, 0.029177764430642128, 0.019099419936537743, -0.03228067234158516, 0.20109383761882782, -0.07107947021722794, -0.06925153732299805, -0.14109766483306885, 0.10889512300491333, -0.03372858464717865, 0.06323269009590149, 0.058447178453207016, -0.1133023053407669, -0.002398417331278324, 0.1314154714345932, 0.133079394698143, -0.033533163368701935, 0.005780258681625128, 0.03008044883608818, 0.00756559893488884, -0.0482633113861084, 0.045497048646211624, 0.031092669814825058, 0.15440985560417175, -0.06949599832296371, 0.07780899107456207, 0.00008295764564536512, -0.08774317800998688, -0.036128852516412735, 0.1405542492866516, 0.006535779219120741, 0.03079606406390667, -0.06559351831674576, 0.10371401906013489, -0.07252706587314606, -0.23936228454113007, 0.045033879578113556, -0.07753164321184158, -0.15683837234973907, -0.013978141359984875, 0.02726292423903942, -0.009009851142764091, 0.02702206000685692, 0.0654432401061058, -0.06469112634658813, 0.161378413438797, 0.03472336754202843, -0.08781957626342773, -0.05673113837838173, 0.07957270741462708, -0.09192227572202682, 0.2958409786224365, 0.013188840821385384, 0.029593972489237785, 0.10327941924333572, -0.019989576190710068, -0.13285429775714874, 0.030561091378331184, 0.10066051781177521, -0.09982595592737198, 0.06684590131044388, 0.18159176409244537, -0.009470577351748943, 0.10021016746759415, 0.07437440752983093, -0.061603669077157974, 0.05807222053408623, -0.0826035663485527, -0.06770919263362885, -0.09389114379882812, 0.05970105528831482, -0.06468918174505234, 0.14543601870536804, 0.1228262409567833, -0.04243761673569679, -0.004415105562657118, -0.02816380001604557, 0.043726447969675064, 0.012194468639791012, 0.12871193885803223, 0.008576037362217903, -0.1618158370256424, 0.026840461418032646, 0.0030557403806596994, 0.10387714207172394, -0.21997274458408356, -0.08367477357387543, 0.04838619381189346, -0.029553698375821114, -0.05334814265370369, 0.10579082369804382, 0.06295353919267654, 0.0504634715616703, -0.04548325017094612, -0.05543007701635361, -0.008723298087716103, 0.14979462325572968, -0.1187625601887703, -0.006005466915667057 ]
null
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# whisper-med-JP

This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the common_voice_16_1 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5223

## Model description

A medium Whisper model fine-tuned for Japanese on three datasets: Common Voice, Google FLEURS, and anime speech. Use this adapter with adapter_model_merger and whisper-medium.

## Intended uses & limitations

For transcribing and translating Japanese-language pop-culture TV shows, anime, and movies. (Work in progress)

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 3000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.1878        | 0.52  | 1569 | 0.5298          |
| 0.4394        | 1.48  | 3000 | 0.5223          |

### Framework versions

- PEFT 0.8.2
- Transformers 4.37.2
- Pytorch 2.2.0+cu118
- Datasets 2.17.0
- Tokenizers 0.15.2
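The description above says to use this adapter together with whisper-medium but gives no loading code. Below is a minimal sketch, assuming a standard PEFT (LoRA-style) adapter layout; it uses the generic `peft` API rather than the `adapter_model_merger` tool mentioned above, whose interface is not documented in this card.

```python
# Minimal sketch (assumptions noted above): load openai/whisper-medium, apply the
# sin2piusc/whisper-med-JP adapter with peft, and merge the weights for inference.
from transformers import WhisperForConditionalGeneration, WhisperProcessor
from peft import PeftModel

base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-medium")
model = PeftModel.from_pretrained(base, "sin2piusc/whisper-med-JP")
model = model.merge_and_unload()  # assumes a LoRA-style adapter that supports merging

processor = WhisperProcessor.from_pretrained(
    "openai/whisper-medium", language="japanese", task="transcribe"
)
```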
{"language": ["ja"], "license": "apache-2.0", "library_name": "peft", "tags": ["generated_from_trainer"], "datasets": ["google/fleurs", "joujiboi/japanese-anime-speech", "mozilla-foundation/common_voice_16_1"], "metrics": ["wer"], "base_model": "openai/whisper-medium", "pipeline_tag": "automatic-speech-recognition", "model-index": [{"name": "whisper-med-JP", "results": []}]}
automatic-speech-recognition
sin2piusc/whisper-med-JP
[ "peft", "tensorboard", "safetensors", "generated_from_trainer", "automatic-speech-recognition", "ja", "dataset:google/fleurs", "dataset:joujiboi/japanese-anime-speech", "dataset:mozilla-foundation/common_voice_16_1", "base_model:openai/whisper-medium", "license:apache-2.0", "region:us" ]
2024-02-13T19:32:43+00:00
[]
[ "ja" ]
TAGS #peft #tensorboard #safetensors #generated_from_trainer #automatic-speech-recognition #ja #dataset-google/fleurs #dataset-joujiboi/japanese-anime-speech #dataset-mozilla-foundation/common_voice_16_1 #base_model-openai/whisper-medium #license-apache-2.0 #region-us
whisper-med-JP
==============

This model is a fine-tuned version of openai/whisper-medium on the common\_voice\_16\_1 dataset.
It achieves the following results on the evaluation set:

* Loss: 0.5223

Model description
-----------------

A medium Whisper model fine-tuned for Japanese on three datasets: Common Voice, Google FLEURS, and anime speech. Use this adapter with adapter\_model\_merger and whisper-medium.

Intended uses & limitations
---------------------------

For transcribing and translating Japanese-language pop-culture TV shows, anime, and movies. (Work in progress)

Training and evaluation data
----------------------------

More information needed

Training procedure
------------------

### Training hyperparameters

The following hyperparameters were used during training:

* learning\_rate: 0.001
* train\_batch\_size: 4
* eval\_batch\_size: 1
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* training\_steps: 3000
* mixed\_precision\_training: Native AMP

### Training results

### Framework versions

* PEFT 0.8.2
* Transformers 4.37.2
* Pytorch 2.2.0+cu118
* Datasets 2.17.0
* Tokenizers 0.15.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 3000\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ "TAGS\n#peft #tensorboard #safetensors #generated_from_trainer #automatic-speech-recognition #ja #dataset-google/fleurs #dataset-joujiboi/japanese-anime-speech #dataset-mozilla-foundation/common_voice_16_1 #base_model-openai/whisper-medium #license-apache-2.0 #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 3000\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ 103, 139, 4, 39 ]
[ "passage: TAGS\n#peft #tensorboard #safetensors #generated_from_trainer #automatic-speech-recognition #ja #dataset-google/fleurs #dataset-joujiboi/japanese-anime-speech #dataset-mozilla-foundation/common_voice_16_1 #base_model-openai/whisper-medium #license-apache-2.0 #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 3000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ -0.11287938058376312, 0.16398684680461884, -0.004255855455994606, 0.07250946015119553, 0.11487773060798645, 0.0007199051906354725, 0.11997254937887192, 0.164226695895195, -0.05555647984147072, 0.1122870072722435, 0.07726816833019257, 0.04886918142437935, 0.08716985583305359, 0.20147019624710083, -0.018106652423739433, -0.3205914795398712, 0.007979720830917358, -0.022434020414948463, -0.0702187642455101, 0.11081718653440475, 0.09876611828804016, -0.10588483512401581, 0.014531462453305721, -0.01321339514106512, -0.09804186224937439, -0.012510638684034348, -0.02066829986870289, -0.0636921375989914, 0.08502419292926788, 0.02034047059714794, 0.060083016753196716, 0.017873192206025124, 0.09924454241991043, -0.25569382309913635, 0.010114414617419243, 0.05258368328213692, 0.04872901737689972, 0.05378051847219467, 0.11259767413139343, -0.014722480438649654, 0.0463869571685791, -0.08702940493822098, 0.05976775288581848, 0.05331746116280556, -0.1102481484413147, -0.2871118485927582, -0.07099540531635284, 0.023495711386203766, 0.1583774983882904, 0.08223580569028854, -0.05482262372970581, 0.023979809135198593, -0.07154209911823273, 0.07588719576597214, 0.19824323058128357, -0.22063151001930237, -0.08602561801671982, -0.01840837672352791, 0.0335860550403595, 0.07934100925922394, -0.13209909200668335, -0.024313081055879593, 0.024009088054299355, 0.012418737635016441, 0.08810997009277344, -0.00682326965034008, 0.004850743804126978, 0.002619144506752491, -0.1188344955444336, -0.008873429149389267, 0.1663670539855957, 0.07995922118425369, -0.024150501936674118, -0.14394502341747284, -0.010025781579315662, -0.1325918436050415, -0.06681269407272339, 0.031405407935380936, 0.030527908354997635, -0.032947130501270294, -0.05054362118244171, 0.009116539731621742, -0.05525336414575577, -0.07437080144882202, 0.057088494300842285, 0.08955568075180054, 0.046217791736125946, -0.04489430412650108, 0.02774825692176819, 0.0848645567893982, 0.039818085730075836, -0.15964238345623016, -0.0027920277789235115, 0.0336940661072731, -0.08713547885417938, -0.011625515297055244, 0.009505657479166985, 0.003790496615692973, 0.04163069650530815, 0.13570135831832886, -0.007130264770239592, 0.09545332193374634, -0.0007395778666250408, 0.011997684836387634, -0.0683681070804596, 0.15848660469055176, -0.05535648763179779, -0.09693952649831772, -0.03126867115497589, 0.14198051393032074, 0.0026394061278551817, -0.010770458728075027, -0.06442537903785706, 0.03504408895969391, 0.06478137522935867, 0.0698263943195343, -0.009032627567648888, 0.01433495432138443, -0.0831134244799614, -0.02400892786681652, 0.00800707284361124, -0.1283026933670044, 0.06189500540494919, 0.05413118749856949, -0.06894047558307648, -0.021952787414193153, -0.02727438136935234, 0.030584903433918953, -0.01810089498758316, 0.06849101930856705, -0.04124689847230911, 0.018982309848070145, -0.07492148131132126, -0.07909133285284042, 0.032849691808223724, -0.05351705104112625, -0.005122017115354538, -0.056730132550001144, -0.1059032678604126, -0.079795241355896, 0.04056647792458534, -0.08291162550449371, -0.06386427581310272, -0.09679500758647919, -0.06863454729318619, 0.03692781925201416, -0.00930111575871706, 0.1322001963853836, -0.054344289004802704, 0.10501992702484131, 0.01791243813931942, 0.05448922887444496, 0.1008821651339531, 0.06725948303937912, -0.0311531201004982, 0.04864579811692238, -0.15856023132801056, 0.09634823352098465, -0.11111114919185638, 0.05680405721068382, -0.14890369772911072, -0.07926980406045914, 0.012557989917695522, 0.002436550334095955, 
0.08273302018642426, 0.1505376249551773, -0.19862979650497437, -0.07458207756280899, 0.1872597336769104, -0.051206544041633606, -0.07952418923377991, 0.14026987552642822, -0.03019106574356556, 0.019038449972867966, 0.03937901183962822, 0.23245713114738464, 0.08197301626205444, -0.0933588370680809, 0.03335957229137421, -0.07705561071634293, 0.14532969892024994, 0.06122260168194771, 0.09821468591690063, -0.06569118797779083, 0.03826550394296646, 0.009709569625556469, -0.041475068777799606, 0.06257130205631256, -0.06480760127305984, -0.07942222058773041, 0.007753279991447926, -0.0820126011967659, 0.031361039727926254, 0.0669182762503624, 0.016865603625774384, -0.0815361738204956, -0.10498637706041336, -0.01070924662053585, 0.09560643136501312, -0.10193829238414764, 0.011254465207457542, -0.05353359878063202, 0.05522099509835243, -0.009560263715684414, -0.012796743772923946, -0.1206006407737732, 0.002552608260884881, 0.0584065243601799, -0.0341462567448616, 0.013067075982689857, -0.06698330491781235, 0.06687059253454208, 0.03583672642707825, -0.06396014988422394, -0.06804323941469193, -0.01872243732213974, -0.0006710428278893232, -0.06216776371002197, -0.2244759202003479, -0.026736551895737648, -0.037670545279979706, 0.18801283836364746, -0.22049790620803833, 0.012187626212835312, 0.008834147825837135, 0.13024689257144928, 0.05053890496492386, -0.06351116299629211, 0.04778866842389107, 0.05931142345070839, -0.005869263783097267, -0.09214349091053009, 0.0317121297121048, -0.001434836070984602, -0.0944763645529747, 0.027064040303230286, -0.13465383648872375, 0.08117920905351639, 0.08452615141868591, 0.030676651746034622, -0.09247126430273056, -0.0672040656208992, -0.04207081347703934, -0.04513327404856682, -0.0526859350502491, 0.008350561372935772, 0.14489790797233582, 0.0037006859201937914, 0.11110547930002213, -0.09049326181411743, -0.03573448210954666, 0.02669956535100937, -0.014185470528900623, -0.00870441272854805, 0.15770293772220612, 0.06496968865394592, -0.05660645291209221, 0.11736272275447845, 0.04575157165527344, -0.04546431452035904, 0.1728852093219757, -0.07951772212982178, -0.08518984913825989, -0.03872515633702278, 0.05383825674653053, 0.027862459421157837, 0.13574457168579102, -0.13954024016857147, -0.02193107269704342, 0.023071296513080597, 0.010036168619990349, 0.0059189507737755775, -0.17355018854141235, -0.031119761988520622, 0.02472875267267227, -0.0835043340921402, 0.01884915679693222, -0.01629507914185524, -0.007228184957057238, 0.09838058054447174, -0.00131410441827029, -0.03993130475282669, -0.023510156199336052, -0.030569899827241898, -0.09512799233198166, 0.16141663491725922, -0.11266333609819412, -0.1569865643978119, -0.08591371774673462, -0.029506344348192215, 0.0047330171801149845, -0.005134962499141693, 0.027714351192116737, -0.12059888988733292, -0.033095408231019974, -0.09066534042358398, -0.0016347839264199138, -0.019661016762256622, 0.0011726667871698737, -0.0198996439576149, 0.022771071642637253, 0.06398968398571014, -0.06984450668096542, 0.02437729574739933, -0.00211392086930573, -0.012051118537783623, 0.015571697615087032, -0.0061691212467849255, 0.10472288727760315, 0.15639230608940125, 0.061934638768434525, 0.02716139145195484, -0.04633968695998192, 0.1653088927268982, -0.15279057621955872, 0.041464149951934814, 0.10351277887821198, 0.01291996892541647, 0.03184331953525543, 0.19214962422847748, 0.04125707224011421, -0.0911111980676651, 0.024859320372343063, 0.03446453437209129, -0.01027456670999527, -0.22168302536010742, -0.021380478516221046, 
-0.07759787887334824, 0.00929495133459568, 0.11292953789234161, 0.03680030629038811, -0.003967340104281902, 0.022209133952856064, -0.02374083735048771, -0.009519193321466446, 0.0675155520439148, 0.05526111274957657, 0.033893439918756485, 0.0218027513474226, 0.10131493210792542, -0.009296098724007607, -0.041439238935709, 0.03214431554079056, 0.008119874633848667, 0.24015749990940094, -0.0012991683324798942, 0.1863209754228592, 0.03990538790822029, 0.15507060289382935, 0.00019272399367764592, 0.009032384492456913, 0.029029859229922295, -0.005965206306427717, 0.01574384979903698, -0.05498292297124863, 0.00014836130139883608, 0.04691559448838234, 0.09905396401882172, 0.015443379990756512, -0.08204007893800735, -0.019251493737101555, 0.041191115975379944, 0.28786197304725647, 0.08576449751853943, -0.2606228291988373, -0.037781957536935806, 0.03486275672912598, -0.07174942642450333, -0.047256723046302795, 0.030506417155265808, 0.14000996947288513, -0.0909011960029602, 0.07672948390245438, -0.06938771903514862, 0.07850023359060287, -0.06377087533473969, -0.016299812123179436, 0.0900261327624321, 0.05015485733747482, -0.01383433397859335, 0.05232013761997223, -0.2692897319793701, 0.27502942085266113, -0.0030238197650760412, 0.07939957082271576, -0.035535961389541626, 0.0278292465955019, 0.034645386040210724, -0.03702372685074806, 0.11019245535135269, -0.01038180198520422, -0.07882991433143616, -0.17872552573680878, -0.10345923900604248, 0.006614672485738993, 0.12146454304456711, -0.06913375854492188, 0.10359245538711548, -0.022642867639660835, -0.04238557443022728, 0.022506849840283394, -0.10212317109107971, -0.12618638575077057, -0.11603027582168579, 0.009142756462097168, 0.021073943004012108, 0.04799190163612366, -0.08603103458881378, -0.07935217767953873, -0.05550047382712364, 0.11314966529607773, -0.09629414230585098, -0.03909752517938614, -0.13913856446743011, 0.07369960844516754, 0.14782826602458954, -0.06135593727231026, 0.049154315143823624, 0.013484496623277664, 0.11698409914970398, 0.023009950295090675, 0.017861276865005493, 0.10268064588308334, -0.060426387935876846, -0.20244956016540527, -0.07682720571756363, 0.1703740656375885, 0.032120730727910995, 0.07215061038732529, -0.02298041805624962, 0.035721637308597565, 0.011167293414473534, -0.06178971007466316, 0.06464850902557373, 0.05335453525185585, 0.004927532281726599, 0.04838358238339424, -0.03643134981393814, -0.031226743012666702, -0.10704541206359863, -0.08106864988803864, 0.10127415508031845, 0.2535496652126312, -0.07926411926746368, 0.06733012199401855, 0.03012341447174549, -0.06042710319161415, -0.14859585464000702, 0.013251620344817638, 0.11754552274942398, 0.046532295644283295, -0.0014311419799923897, -0.20197831094264984, 0.012205507606267929, 0.05747319385409355, -0.034565720707178116, 0.07892295718193054, -0.3175182640552521, -0.13556280732154846, 0.05178255960345268, 0.08175228536128998, -0.06698478758335114, -0.16506804525852203, -0.06746172159910202, 0.012827660888433456, -0.03151923418045044, 0.01141173206269741, -0.008110055699944496, 0.12168455123901367, 0.009061402641236782, 0.01073843240737915, 0.024641860276460648, -0.05380385369062424, 0.14685334265232086, -0.0136475944891572, 0.04698633402585983, -0.019931284710764885, -0.020867878571152687, -0.031457651406526566, -0.057154279202222824, 0.02261536382138729, -0.07070153951644897, 0.007267991080880165, -0.15121491253376007, -0.037890393286943436, -0.08313055336475372, 0.002320398110896349, -0.052812814712524414, -0.032587680965662, -0.030455509200692177, 
0.07737833261489868, 0.06638219952583313, 0.022823844105005264, 0.08497469127178192, -0.06377542018890381, 0.12450682371854782, 0.09168675541877747, 0.09484926611185074, 0.033356908708810806, -0.07624439895153046, -0.021206295117735863, -0.019982464611530304, 0.02389874868094921, -0.16394451260566711, 0.030515601858496666, 0.1216142401099205, 0.03744404390454292, 0.17234940826892853, 0.02971329540014267, -0.09096326678991318, 0.02818218246102333, 0.05921970680356026, -0.06462042778730392, -0.14511246979236603, -0.017084913328289986, 0.008116588927805424, -0.12573619186878204, -0.043288856744766235, 0.09506209939718246, -0.03799699991941452, -0.016440194100141525, 0.008680665865540504, 0.05385817587375641, -0.0088098319247365, 0.20681829750537872, 0.04756028950214386, 0.06742749363183975, -0.08988543599843979, 0.07674546539783478, 0.07433119416236877, -0.09199412912130356, 0.0465063750743866, 0.12357769161462784, -0.05453450232744217, -0.019451025873422623, 0.059998542070388794, 0.11933261901140213, 0.0995812714099884, -0.03683597221970558, -0.12020202726125717, -0.14644739031791687, 0.0718354880809784, 0.053833525627851486, 0.022558966651558876, -0.0030075518880039454, -0.0011729573598131537, 0.02431778982281685, -0.0982450321316719, 0.13066734373569489, 0.11087426543235779, 0.044070180505514145, -0.11273039132356644, 0.0968279242515564, 0.022767169401049614, -0.016520872712135315, 0.0015125295612961054, 0.007908402010798454, -0.12010816484689713, 0.025399571284651756, -0.06664375215768814, 0.005001123994588852, -0.05169457197189331, -0.005359480157494545, -0.00808853842318058, -0.06585969775915146, -0.0444430336356163, 0.010700642131268978, -0.09664832800626755, -0.04926680028438568, -0.025997349992394447, 0.06989073008298874, -0.09027635306119919, -0.0377604104578495, 0.040806032717227936, -0.11016446352005005, 0.08518770337104797, 0.015325179323554039, 0.00499099213629961, 0.001978127285838127, -0.09672640264034271, 0.013710500672459602, 0.028562964871525764, -0.007485026493668556, 0.021547386422753334, -0.16492652893066406, -0.011483478359878063, -0.031173989176750183, 0.02301432378590107, 0.0014277775771915913, 0.019831620156764984, -0.11970436573028564, 0.009468941017985344, -0.06110316142439842, -0.07475809752941132, -0.05235777795314789, 0.07782702893018723, 0.06397175788879395, 0.007261552847921848, 0.1222926676273346, -0.09424645453691483, 0.0855572521686554, -0.22867688536643982, -0.004247758537530899, -0.02498743310570717, -0.06610787659883499, -0.07853502035140991, -0.012096197344362736, 0.10109671950340271, -0.05300658941268921, 0.07312396913766861, 0.00462234066799283, 0.03213313966989517, 0.022297397255897522, -0.1172948032617569, 0.05262167006731033, 0.06910032033920288, 0.1516578048467636, 0.027862342074513435, -0.03808126226067543, 0.07498081773519516, -0.005059418734163046, 0.06124275177717209, 0.13546963036060333, 0.09805474430322647, 0.1792062520980835, 0.07045198976993561, 0.057115815579891205, 0.04641522094607353, -0.12663404643535614, -0.11532284319400787, 0.1652570217847824, -0.05482428893446922, 0.11871232837438583, -0.02201986126601696, 0.18925438821315765, 0.10618727654218674, -0.21536140143871307, 0.053492531180381775, -0.038117025047540665, -0.09857475757598877, -0.08962168544530869, -0.0883437767624855, -0.09684158861637115, -0.160241961479187, 0.015531052835285664, -0.10263971984386444, 0.05139244347810745, 0.042450882494449615, 0.03432838246226311, 0.024465644732117653, 0.11018475890159607, 0.08863402903079987, -0.0011796224862337112, 0.14556580781936646, 
0.023121057078242302, -0.006198683753609657, -0.05435100570321083, -0.08752520382404327, 0.031771380454301834, -0.039916232228279114, 0.050305068492889404, -0.018648158758878708, -0.09701737016439438, 0.052264295518398285, -0.007186351343989372, -0.10701469331979752, 0.03588356450200081, -0.019742518663406372, 0.06321307271718979, 0.10474111139774323, 0.05120394751429558, -0.011646613478660583, -0.011064433492720127, 0.23581580817699432, -0.06442049145698547, -0.035985883325338364, -0.14196352660655975, 0.1577209085226059, -0.015553588978946209, -0.014977927319705486, 0.01626688987016678, -0.08267835527658463, 0.009028679691255093, 0.19139280915260315, 0.1416812241077423, -0.0415031835436821, -0.02185659669339657, 0.012960361316800117, -0.01273901853710413, -0.02637709490954876, 0.09009717404842377, 0.09797258675098419, 0.0449855737388134, -0.04912872985005379, -0.019285444170236588, -0.018220365047454834, -0.07734429091215134, -0.05900426208972931, 0.08838213235139847, 0.04577038437128067, -0.0002857635263353586, -0.047808937728405, 0.13314929604530334, -0.06268612295389175, -0.09671927988529205, 0.04065532237291336, -0.2024110108613968, -0.19905459880828857, -0.03275030851364136, 0.053263768553733826, 0.04645831882953644, 0.031807731837034225, -0.0019568956922739744, -0.00020257926371414214, 0.10317879170179367, 0.003552878275513649, -0.016995426267385483, -0.08605019003152847, 0.05665791034698486, -0.15543197095394135, 0.20855751633644104, -0.03599833697080612, 0.006800797302275896, 0.11311902850866318, 0.029613425955176353, -0.08486474305391312, 0.03906211629509926, 0.08365882933139801, -0.06439279764890671, 0.04566891863942146, 0.19498585164546967, -0.03971963748335838, 0.13740091025829315, 0.06732352077960968, -0.052331991493701935, 0.035698480904102325, -0.08079230040311813, -0.06509282439947128, -0.09154362231492996, 0.03169999644160271, -0.061794839799404144, 0.13285501301288605, 0.19325214624404907, -0.08434537053108215, -0.008137460798025131, -0.028834054246544838, 0.015953047201037407, 0.033494699746370316, 0.1284283697605133, -0.028068220242857933, -0.2456074357032776, -0.0008149747154675424, -0.00025892799021676183, 0.030348841100931168, -0.22617459297180176, -0.074334055185318, -0.009891184978187084, -0.057038530707359314, -0.07651038467884064, 0.13226135075092316, 0.056584250181913376, 0.03521724045276642, -0.05415448173880577, -0.15945912897586823, -0.007662971504032612, 0.18563398718833923, -0.14799533784389496, -0.0384419746696949 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Whisper Small tr Beta - tgrhn This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Common Voice 11.0 dataset. It achieves the following results on the evaluation set: - Loss: 0.5358 - Wer: 20.7834 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 64 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 1000 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:-------:| | 0.0005 | 28.03 | 1000 | 0.5358 | 20.7834 | ### Framework versions - Transformers 4.37.2 - Pytorch 2.1.1 - Datasets 2.16.1 - Tokenizers 0.15.2
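As a usage illustration for the card above, here is a minimal sketch that runs the published checkpoint through the `transformers` ASR pipeline; the audio filename is a hypothetical placeholder.

```python
# Minimal sketch: transcribe a local Turkish audio file with the fine-tuned checkpoint.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="tgrhn/whisper-tr-small")
print(asr("sample_tr.wav")["text"])  # "sample_tr.wav" is a hypothetical local file
```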
{"language": ["tr"], "license": "apache-2.0", "tags": ["whisper-event", "generated_from_trainer"], "datasets": ["mozilla-foundation/common_voice_11_0"], "metrics": ["wer"], "base_model": "openai/whisper-small", "model-index": [{"name": "Whisper Small tr Beta - tgrhn", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 11.0", "type": "mozilla-foundation/common_voice_11_0", "config": "hi", "split": "test", "args": "hi"}, "metrics": [{"type": "wer", "value": 20.783353172786377, "name": "Wer"}]}]}]}
automatic-speech-recognition
tgrhn/whisper-tr-small
[ "transformers", "tensorboard", "safetensors", "whisper", "automatic-speech-recognition", "whisper-event", "generated_from_trainer", "tr", "dataset:mozilla-foundation/common_voice_11_0", "base_model:openai/whisper-small", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
2024-02-13T19:32:45+00:00
[]
[ "tr" ]
TAGS #transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #whisper-event #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us
Whisper Small tr Beta - tgrhn ============================= This model is a fine-tuned version of openai/whisper-small on the Common Voice 11.0 dataset. It achieves the following results on the evaluation set: * Loss: 0.5358 * Wer: 20.7834 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 1e-05 * train\_batch\_size: 64 * eval\_batch\_size: 32 * seed: 42 * gradient\_accumulation\_steps: 2 * total\_train\_batch\_size: 128 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 500 * training\_steps: 1000 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.37.2 * Pytorch 2.1.1 * Datasets 2.16.1 * Tokenizers 0.15.2
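For readers reproducing this run, the hyperparameters listed above map onto `transformers` `Seq2SeqTrainingArguments` roughly as in the sketch below; the output directory is a placeholder and the author's exact Trainer setup is not documented here, so treat this as an assumption.

```python
# Sketch (assumption): Seq2SeqTrainingArguments mirroring the hyperparameters above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tr-small",   # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,     # 64 x 2 = 128 effective train batch size
    warmup_steps=500,
    max_steps=1000,
    lr_scheduler_type="linear",
    fp16=True,                         # "Native AMP" mixed precision
    seed=42,
)
```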
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 1000\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1\n* Datasets 2.16.1\n* Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #whisper-event #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 1000\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1\n* Datasets 2.16.1\n* Tokenizers 0.15.2" ]
[ 100, 158, 4, 30 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #whisper-event #generated_from_trainer #tr #dataset-mozilla-foundation/common_voice_11_0 #base_model-openai/whisper-small #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 1000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.1\n* Datasets 2.16.1\n* Tokenizers 0.15.2" ]
[ -0.11554069072008133, 0.12935423851013184, -0.005424911621958017, 0.077652208507061, 0.08963777869939804, 0.011674660257995129, 0.11356882005929947, 0.15164794027805328, -0.041874077171087265, 0.09740704298019409, 0.08346117287874222, 0.0843823179602623, 0.06948284059762955, 0.13939327001571655, -0.020064305514097214, -0.29499876499176025, -0.0015757689252495766, -0.0358487144112587, -0.10954809933900833, 0.1086074486374855, 0.08446581661701202, -0.10089071094989777, 0.02923865243792534, -0.009170261211693287, -0.04260419309139252, -0.012220146134495735, -0.03431643545627594, -0.04599585384130478, 0.10279358923435211, 0.0184266846626997, 0.03503434732556343, 0.03760066255927086, 0.106696218252182, -0.24703338742256165, 0.004771883599460125, 0.04727974906563759, 0.03770389035344124, 0.055349063128232956, 0.09292127192020416, -0.022848492488265038, 0.07441022992134094, -0.09155946224927902, 0.05332541465759277, 0.046926990151405334, -0.10105396807193756, -0.2871215343475342, -0.05775625631213188, 0.03716946765780449, 0.14014017581939697, 0.06549747288227081, -0.028679409995675087, 0.04432215169072151, -0.06461193412542343, 0.09425672888755798, 0.24767635762691498, -0.24751628935337067, -0.06412723660469055, -0.024708591401576996, 0.03072553128004074, 0.04900197312235832, -0.11026114225387573, -0.011240296997129917, 0.010898880660533905, 0.014058539643883705, 0.10794615000486374, 0.002186503494158387, 0.03284089267253876, -0.008468474261462688, -0.13390888273715973, -0.046545322984457016, 0.12322062253952026, 0.06451760977506638, -0.022167546674609184, -0.11858780682086945, -0.03844687342643738, -0.1539815068244934, -0.054856982082128525, 0.021456902846693993, 0.03471418470144272, -0.038551971316337585, -0.0581313818693161, 0.014195691794157028, -0.05280875787138939, -0.08075103908777237, 0.05930934473872185, 0.1319846510887146, 0.04641620069742203, -0.030882228165864944, 0.02387416362762451, 0.09509170055389404, 0.05319979414343834, -0.16639916598796844, -0.009875746443867683, 0.03522033616900444, -0.0976334810256958, 0.0002515471132937819, -0.004605663940310478, 0.03011416830122471, 0.04945380613207817, 0.15482237935066223, 0.0007327431230805814, 0.08077648282051086, 0.04483942314982414, 0.00947188213467598, -0.085891492664814, 0.1450856626033783, -0.056324947625398636, -0.10061128437519073, -0.030196910724043846, 0.13686145842075348, 0.0249947439879179, -0.007059169467538595, -0.06134335696697235, 0.028187371790409088, 0.08769086748361588, 0.07174655050039291, -0.003399942070245743, 0.0364522747695446, -0.06580890715122223, -0.01764398068189621, -0.012865481898188591, -0.12757229804992676, 0.03198970854282379, 0.0526215024292469, -0.07456637918949127, -0.04666532203555107, -0.006267349701374769, 0.006040928419679403, -0.02954382821917534, 0.07694783806800842, -0.042452942579984665, -0.012459168210625648, -0.05549595132470131, -0.08138809353113174, 0.01875460147857666, -0.041968345642089844, 0.001686872448772192, -0.04590519890189171, -0.12396904826164246, -0.0568472295999527, 0.0627378597855568, -0.07007691264152527, -0.06815867125988007, -0.08077802509069443, -0.08514747768640518, 0.038423288613557816, -0.008691610768437386, 0.13535436987876892, -0.05346029996871948, 0.09600231051445007, 0.008423631079494953, 0.06663263589143753, 0.11609448492527008, 0.05767909809947014, -0.05327778682112694, 0.06538058072328568, -0.14898699522018433, 0.09415698051452637, -0.1046307384967804, 0.06673680245876312, -0.13983942568302155, -0.0968436449766159, 0.014740840531885624, -0.0052810898050665855, 
0.09017907083034515, 0.14903873205184937, -0.17946666479110718, -0.0740375816822052, 0.17485080659389496, -0.0743340328335762, -0.0842069610953331, 0.13897305727005005, -0.010663517750799656, -0.025264166295528412, 0.028631437569856644, 0.2095661610364914, 0.12231598049402237, -0.08387688547372818, 0.024400601163506508, -0.02681647427380085, 0.10705497860908508, 0.043833501636981964, 0.09435416013002396, -0.039588626474142075, 0.02941003069281578, 0.0072551025077700615, -0.04752296954393387, 0.04304748401045799, -0.0832628384232521, -0.0887681096792221, -0.011440382339060307, -0.08269213885068893, 0.03376408666372299, 0.053389810025691986, 0.008261057548224926, -0.086685411632061, -0.12475654482841492, -0.022217199206352234, 0.10645595192909241, -0.09773843735456467, 0.008560353890061378, -0.07606794685125351, 0.06886347383260727, 0.013090487569570541, -0.007268285844475031, -0.12937496602535248, -0.007929378189146519, 0.03990400955080986, -0.06883201748132706, 0.010805127210915089, -0.040510304272174835, 0.08776339143514633, 0.05918963998556137, -0.046524278819561005, -0.07428613305091858, -0.014424093998968601, -0.005470167845487595, -0.06521770358085632, -0.23586364090442657, -0.07515595108270645, -0.023198461160063744, 0.15991979837417603, -0.20855723321437836, 0.01718495972454548, 0.008004930801689625, 0.11492961645126343, 0.027317427098751068, -0.045428771525621414, 0.0322456955909729, 0.03135247901082039, -0.006610166281461716, -0.08219049125909805, 0.0364246666431427, 0.0026276353746652603, -0.11247053742408752, 0.00618031295016408, -0.1453731805086136, 0.08649317175149918, 0.06104416027665138, 0.03200291842222214, -0.07847733795642853, -0.052645131945610046, -0.060766831040382385, -0.048949167132377625, -0.014927343465387821, -0.0022689031902700663, 0.16072149574756622, 0.011092395521700382, 0.09703429043292999, -0.08028911054134369, -0.06108102947473526, 0.02463417872786522, -0.018978066742420197, -0.0005556299583986402, 0.1716691106557846, 0.025339271873235703, -0.0757034569978714, 0.0879097729921341, 0.0653088390827179, -0.051265668123960495, 0.13757535815238953, -0.07865319401025772, -0.07456671446561813, -0.042611051350831985, 0.046825237572193146, 0.03140087053179741, 0.10835807770490646, -0.14521130919456482, -0.0067826793529093266, 0.02663048915565014, 0.0011382469674572349, 0.0024858657270669937, -0.1716882437467575, -0.015278918668627739, 0.04089733958244324, -0.08892299979925156, -0.0017376192845404148, -0.008589089848101139, -0.00421673571690917, 0.09720192849636078, -0.009311902336776257, -0.07521457225084305, -0.027913618832826614, -0.040216077119112015, -0.07663285732269287, 0.1810268610715866, -0.08896387368440628, -0.1342693567276001, -0.09896492958068848, -0.00006353774369927123, -0.004316316451877356, -0.011720878072082996, 0.03938053548336029, -0.09115397185087204, -0.03526811674237251, -0.08583401143550873, 0.01905839331448078, -0.005591405089944601, 0.030295008793473244, 0.02360423095524311, 0.01048338320106268, 0.07687801867723465, -0.09098696708679199, 0.011091243475675583, -0.021691743284463882, -0.03609411045908928, 0.01607625000178814, 0.024164721369743347, 0.07932332158088684, 0.15401455760002136, 0.04944392293691635, 0.030801698565483093, -0.022070152685046196, 0.17722001671791077, -0.11303967982530594, 0.024232584983110428, 0.12212666124105453, 0.0036949252244085073, 0.05504994094371796, 0.17517195641994476, 0.040402598679065704, -0.08834459632635117, 0.00755753880366683, 0.041498370468616486, -0.02618265710771084, -0.2169361561536789, 
-0.0330115370452404, -0.05357532948255539, 0.008514084853231907, 0.10553114861249924, 0.036010850220918655, -0.02440596930682659, 0.026048189029097557, -0.023784564808011055, -0.04948300123214722, 0.05360567569732666, 0.048752639442682266, 0.04863500967621803, 0.027224093675613403, 0.10551613569259644, -0.008782370947301388, -0.04177650436758995, 0.02267787978053093, -0.01224992610514164, 0.2326139360666275, -0.031623512506484985, 0.17769066989421844, 0.04352382570505142, 0.13253110647201538, -0.012502681463956833, 0.04956040903925896, 0.003331357380375266, -0.0039869449101388454, 0.01478132139891386, -0.056128114461898804, -0.01680339314043522, 0.026812292635440826, 0.06292420625686646, 0.036483388394117355, -0.10911399126052856, 0.03441368415951729, 0.039215199649333954, 0.32042524218559265, 0.08855534344911575, -0.27014029026031494, -0.07801053673028946, 0.024688921868801117, -0.0689259022474289, -0.026316773146390915, 0.03388195484876633, 0.13656282424926758, -0.07018568366765976, 0.07239246368408203, -0.05910508334636688, 0.07006801664829254, -0.0724928230047226, 0.010329141281545162, 0.07523468136787415, 0.09359928220510483, 0.00016539558419026434, 0.0613122284412384, -0.2262210249900818, 0.2945067882537842, -0.010483667254447937, 0.061968814581632614, -0.05279197171330452, 0.03166744112968445, 0.02272246964275837, -0.027479955926537514, 0.12083947658538818, -0.0010165289277210832, -0.09435626119375229, -0.16555142402648926, -0.10914687067270279, 0.009707334451377392, 0.12860877811908722, -0.07277608662843704, 0.1136622503399849, -0.03890962898731232, -0.04799080640077591, 0.03129251301288605, -0.09019594639539719, -0.0830584317445755, -0.08710618317127228, 0.026304705068469048, -0.0008116972749121487, 0.02398110181093216, -0.09228499233722687, -0.09108079224824905, -0.043676335364580154, 0.14132153987884521, -0.08721497654914856, -0.04739255830645561, -0.12935848534107208, 0.05159186199307442, 0.1705978363752365, -0.07787815481424332, 0.03686941787600517, 0.019536716863512993, 0.10539738088846207, 0.030176931992173195, -0.019742310047149658, 0.10347966849803925, -0.08573122322559357, -0.2177792489528656, -0.046059608459472656, 0.19867613911628723, 0.0337567999958992, 0.07134290784597397, -0.015116161666810513, 0.018663369119167328, 0.0059965080581605434, -0.07479272782802582, 0.0723358765244484, 0.028367120772600174, -0.008198829367756844, 0.03703843429684639, -0.02968261018395424, 0.006770376581698656, -0.0748160183429718, -0.03993461653590202, 0.11018162965774536, 0.28777003288269043, -0.08878400176763535, 0.0683186873793602, 0.04834632948040962, -0.06121193617582321, -0.16985704004764557, -0.039528485387563705, 0.1117040291428566, 0.030542448163032532, 0.01271094474941492, -0.18295511603355408, 0.030837617814540863, 0.05945560708642006, -0.028645865619182587, 0.0590282566845417, -0.3607606589794159, -0.13345867395401, 0.0952046811580658, 0.0831415206193924, -0.003133708145469427, -0.16444718837738037, -0.06682521104812622, 0.004708500113338232, -0.02557937055826187, 0.021768715232610703, -0.015411436557769775, 0.12840263545513153, -0.0016401201719418168, 0.008412807248532772, 0.03218657895922661, -0.05051505193114281, 0.13846242427825928, 0.0005374333122745156, 0.053770579397678375, -0.034319061785936356, 0.011690549552440643, 0.007468188647180796, -0.06558076292276382, 0.018521204590797424, -0.1006496325135231, 0.033931292593479156, -0.12317946553230286, -0.027442211285233498, -0.08298467099666595, 0.023327220231294632, -0.04749806597828865, -0.03751781955361366, 
-0.007330968976020813, 0.059390608221292496, 0.09745845198631287, 0.010655966587364674, 0.08147542178630829, -0.05021978169679642, 0.11855167895555496, 0.1296095848083496, 0.11385061591863632, 0.021746499463915825, -0.09401324391365051, -0.00652722455561161, 0.005608625244349241, 0.02294653095304966, -0.1131497323513031, 0.04432493820786476, 0.1414983570575714, 0.05318267643451691, 0.1280653327703476, 0.044213902205228806, -0.06286410987377167, -0.0053365700878202915, 0.05916042625904083, -0.08817847818136215, -0.17399291694164276, -0.01720570959150791, 0.000011479063687147573, -0.1410866379737854, 0.012231571599841118, 0.10758958011865616, -0.03529783710837364, -0.00416028406471014, 0.0071727922186255455, 0.06225550174713135, -0.01707916334271431, 0.23569592833518982, 0.024983210489153862, 0.09530024975538254, -0.09825678169727325, 0.08153107762336731, 0.03852323815226555, -0.07897354662418365, 0.03968251124024391, 0.1185658648610115, -0.05460425093770027, -0.027036497369408607, 0.05449296906590462, 0.07146356254816055, 0.06409864127635956, -0.022837452590465546, -0.12831692397594452, -0.1453457921743393, 0.08215449750423431, 0.07903856039047241, 0.02867143042385578, 0.012682982720434666, -0.017936946824193, 0.026095883920788765, -0.0809740275144577, 0.13445539772510529, 0.11797810345888138, 0.04630303010344505, -0.1281803846359253, 0.1250956952571869, -0.004455286078155041, 0.006187478546053171, -0.0063119265250861645, 0.00994451716542244, -0.11752817779779434, 0.01770854741334915, -0.09138906747102737, 0.0016910586273297668, -0.05042547732591629, 0.004510378930717707, -0.0017322502098977566, -0.059652820229530334, -0.04564107954502106, 0.02298608236014843, -0.10380478948354721, -0.04374271631240845, -0.021518616005778313, 0.06107313930988312, -0.09690757095813751, -0.04625282436609268, 0.03282741457223892, -0.12045595794916153, 0.10356694459915161, 0.03674545884132385, 0.010207138955593109, 0.008690855465829372, -0.10551594197750092, 0.0073775360360741615, 0.023977812379598618, 0.00494637293741107, 0.017923910170793533, -0.16772234439849854, -0.019721195101737976, -0.041609667241573334, -0.0035536736249923706, -0.011949136853218079, 0.017321621999144554, -0.12159664183855057, 0.020681703463196754, -0.0362163670361042, -0.05286988243460655, -0.04885483905673027, 0.04362516850233078, 0.05684328451752663, 0.007002446800470352, 0.13905642926692963, -0.08666391670703888, 0.07578316330909729, -0.2372884899377823, -0.0007264314335770905, -0.005931723862886429, -0.06951893866062164, -0.06533464789390564, -0.019325319677591324, 0.10229893028736115, -0.0549834705889225, 0.07385766506195068, -0.022287484258413315, 0.03598771616816521, 0.016383176669478416, -0.07577582448720932, 0.06300893425941467, 0.05975585803389549, 0.15599504113197327, 0.02319972962141037, -0.035087261348962784, 0.07238668203353882, -0.016840830445289612, 0.053614165633916855, 0.10602621734142303, 0.15024840831756592, 0.1535993069410324, 0.06269092112779617, 0.04619117081165314, 0.08093693852424622, -0.12744928896427155, -0.1605430245399475, 0.12484398484230042, -0.036375775933265686, 0.1330818086862564, -0.032548971474170685, 0.19638846814632416, 0.10747874528169632, -0.18611444532871246, 0.06904329359531403, -0.03290591016411781, -0.09225168079137802, -0.10990917682647705, -0.12553879618644714, -0.08671499043703079, -0.14621493220329285, 0.007243334781378508, -0.10390505194664001, 0.05804207921028137, 0.04784851148724556, 0.03710804134607315, 0.03885564208030701, 0.1105925664305687, 0.052164871245622635, 0.01899798773229122, 
0.10158520936965942, 0.020517554134130478, -0.0237998366355896, -0.022434400394558907, -0.0947413444519043, 0.0398145467042923, -0.02045651525259018, 0.03357242792844772, -0.03976014256477356, -0.08944466710090637, 0.04857168346643448, -0.00021045669564045966, -0.1034153401851654, 0.020073099061846733, -0.02255593053996563, 0.05955369770526886, 0.0657607913017273, 0.042814262211322784, -0.020830122753977776, -0.016794098541140556, 0.24864371120929718, -0.09469227492809296, -0.07948029786348343, -0.11177939176559448, 0.21442769467830658, -0.0167374424636364, -0.00602442491799593, 0.03015735186636448, -0.06277327984571457, -0.0001393632555846125, 0.16429787874221802, 0.17289623618125916, -0.023076102137565613, -0.005265654530376196, 0.0009252637973986566, -0.006213480141013861, -0.023199953138828278, 0.07478633522987366, 0.11485479772090912, 0.060095008462667465, -0.06127755343914032, -0.012635665014386177, -0.02170543000102043, -0.05659961700439453, -0.06431075185537338, 0.10180879384279251, 0.027301505208015442, 0.010201024822890759, -0.022242682054638863, 0.11028283089399338, -0.07306403666734695, -0.13702638447284698, 0.019946668297052383, -0.18910129368305206, -0.1838311403989792, -0.04973171651363373, 0.05116628482937813, 0.06127416342496872, 0.056077517569065094, 0.004127140156924725, -0.02361372858285904, 0.08438805490732193, 0.006112617906183004, -0.05261891707777977, -0.08767276257276535, 0.06671212613582611, -0.1303965002298355, 0.20569920539855957, -0.030256200581789017, 0.011394691653549671, 0.11709188669919968, 0.02501445636153221, -0.10134643316268921, 0.03991612046957016, 0.09057378023862839, -0.11965879052877426, 0.05400868505239487, 0.18767082691192627, -0.03724497929215431, 0.1279297024011612, 0.04041724652051926, -0.0899691954255104, 0.00455518951639533, -0.03440837189555168, -0.047827206552028656, -0.057395532727241516, 0.005118008702993393, -0.04236309602856636, 0.1368597149848938, 0.21031658351421356, -0.07710758596658707, -0.012019813992083073, -0.04434937611222267, 0.0023490828461945057, 0.03910909965634346, 0.09771288186311722, -0.02033940702676773, -0.2550520598888397, 0.012488925829529762, -0.01028492208570242, 0.03223114833235741, -0.1884065419435501, -0.09069116413593292, 0.011740917339920998, -0.04609151929616928, -0.07981324940919876, 0.11324784904718399, 0.08385737240314484, 0.04179835319519043, -0.04424101114273071, -0.08888058364391327, -0.023173773661255836, 0.18192166090011597, -0.17511847615242004, -0.04747442528605461 ]
null
null
null
Mirror for https://github.com/dcharatan/pixelsplat/
{"license": "mit"}
null
mileleap/pixelsplat
[ "license:mit", "region:us" ]
2024-02-13T19:33:23+00:00
[]
[]
TAGS #license-mit #region-us
Mirror for URL
[]
[ "TAGS\n#license-mit #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-mit #region-us \n" ]
[ 0.026221778243780136, -0.033018264919519424, -0.008281232789158821, -0.05295303836464882, 0.052470896393060684, 0.06768012046813965, 0.1598525494337082, 0.04655371606349945, 0.23683255910873413, -0.05407243221998215, 0.11752297729253769, 0.08923697471618652, 0.004284696187824011, -0.0009730930323712528, 0.014216204173862934, -0.17134642601013184, 0.04864625632762909, -0.02878100797533989, 0.08764812350273132, 0.032233644276857376, -0.006205103360116482, -0.03845774009823799, -0.0022142508532851934, -0.03178790956735611, -0.057939812541007996, 0.03869890421628952, 0.045729056000709534, -0.02754949778318405, 0.14189864695072174, -0.021783310920000076, 0.13335508108139038, 0.046146418899297714, -0.011738095432519913, -0.2486042082309723, 0.008575023151934147, -0.07252951711416245, -0.11333522200584412, 0.016201216727495193, 0.035761721432209015, -0.010069100186228752, 0.032174937427043915, 0.11049123108386993, -0.011680051684379578, 0.06288356333971024, -0.2015703022480011, -0.20486389100551605, -0.07508610188961029, -0.07555478066205978, 0.0589042492210865, 0.030872387811541557, 0.05628744140267372, 0.1426718831062317, -0.18022038042545319, -0.0018841808196157217, 0.04129622131586075, -0.3510737717151642, 0.09011197835206985, 0.19666501879692078, 0.06407395005226135, 0.07872317731380463, -0.04774639382958412, 0.06726468354463577, 0.07745297998189926, -0.02402484230697155, -0.10679105669260025, -0.06142130121588707, 0.040939174592494965, 0.15604156255722046, -0.03852643445134163, -0.10356393456459045, 0.2591084837913513, -0.023262828588485718, -0.04234466329216957, 0.08201269060373306, -0.02980397455394268, -0.040379155427217484, 0.04404358193278313, 0.044016025960445404, 0.036236923187971115, 0.182089164853096, 0.1260262131690979, -0.03375067934393883, -0.16269677877426147, -0.030629513785243034, -0.2528207004070282, 0.07418664544820786, -0.003647059667855501, 0.10666298121213913, -0.20037521421909332, 0.03286786004900932, -0.15483668446540833, -0.009493621066212654, -0.02952384203672409, -0.059835705906152725, 0.05229754373431206, -0.0237403754144907, -0.04600388556718826, 0.07238677144050598, 0.08390641957521439, 0.2046167105436325, 0.023024363443255424, 0.016697337850928307, -0.10405295342206955, 0.15052515268325806, 0.019140364602208138, 0.024860305711627007, 0.179348424077034, 0.07677878439426422, -0.04891882464289665, -0.2251969277858734, 0.027894439175724983, -0.03671982139348984, -0.1441805064678192, 0.015881337225437164, -0.1542915552854538, 0.1736440360546112, -0.04078168794512749, -0.06919530034065247, -0.08578147739171982, 0.09790384024381638, 0.07768166810274124, -0.021921472623944283, -0.023105677217245102, -0.01381723117083311, 0.03522264584898949, -0.048196230083703995, -0.11687057465314865, 0.018241960555315018, 0.11869648098945618, 0.12573401629924774, -0.1483907401561737, -0.008189842104911804, -0.017200417816638947, 0.019065292552113533, 0.09696817398071289, -0.112403005361557, 0.028845038264989853, -0.09672309458255768, -0.13033071160316467, 0.036653537303209305, 0.017736904323101044, -0.019008556380867958, 0.1340927630662918, 0.061849117279052734, 0.056560322642326355, -0.011025321669876575, -0.07250872999429703, -0.14035539329051971, -0.08679798245429993, 0.1058693379163742, -0.046787332743406296, 0.010320915840566158, -0.24556252360343933, -0.014234079979360104, -0.14995723962783813, 0.059662189334630966, -0.0037668521981686354, -0.08819212019443512, -0.07740068435668945, 0.21408265829086304, 0.0018596589798107743, 0.04301392287015915, -0.1078512966632843, 
0.054903753101825714, -0.06764797121286392, 0.10065380483865738, -0.12895582616329193, -0.06441528350114822, 0.1613781899213791, -0.13135331869125366, -0.14002031087875366, 0.0033312994055449963, -0.009472889825701714, 0.12053907662630081, 0.0802001804113388, 0.44566696882247925, -0.058881040662527084, -0.16201181709766388, 0.1270403116941452, 0.17969723045825958, -0.13685379922389984, -0.25928929448127747, 0.12393020838499069, -0.1636963188648224, -0.16647985577583313, 0.0040023741312325, -0.006962866988033056, 0.08049977570772171, -0.03446655720472336, -0.056274134665727615, 0.042339932173490524, 0.024350708350539207, 0.029094615951180458, 0.01740112341940403, 0.07037191838026047, -0.1023021712899208, 0.08444856107234955, 0.058610700070858, -0.014111426658928394, 0.15077349543571472, 0.011494536884129047, -0.05393160134553909, 0.014761670492589474, 0.044013332575559616, -0.015627963468432426, -0.05899091437458992, -0.09661509096622467, 0.019826244562864304, -0.031149597838521004, 0.08229395002126694, 0.1699674129486084, 0.023824702948331833, -0.02797185815870762, 0.028922779485583305, 0.028606392443180084, 0.1009954959154129, 0.06960704177618027, 0.03099375218153, -0.04839283227920532, 0.04952205345034599, -0.0417071171104908, -0.11430390179157257, -0.004862460307776928, -0.011735930107533932, 0.11975742131471634, -0.08906009048223495, -0.01223952230066061, 0.05951591953635216, -0.04513183981180191, 0.0019881438929587603, 0.0428374819457531, 0.0035966038703918457, 0.1388600617647171, 0.004440935328602791, -0.04352007433772087, 0.17440910637378693, -0.05288633331656456, 0.15533447265625, 0.1715822070837021, -0.07049662619829178, 0.015605369582772255, -0.1273636519908905, 0.003230511210858822, -0.014480113983154297, 0.05292887985706329, -0.05400136485695839, -0.05201306566596031, -0.01274962443858385, 0.014292534440755844, -0.03134604170918465, 0.01711403578519821, -0.06057267636060715, -0.08167021721601486, -0.10849859565496445, 0.018649224191904068, 0.20683221518993378, -0.22544461488723755, 0.1609548032283783, 0.40251004695892334, 0.15190774202346802, 0.21155193448066711, -0.12478897720575333, -0.002471078187227249, -0.06630261242389679, 0.026115071028470993, -0.024814706295728683, 0.13782677054405212, -0.13174867630004883, -0.01413064356893301, 0.03880728408694267, 0.0454997681081295, 0.0661163181066513, -0.17195898294448853, -0.15260353684425354, -0.0034879595041275024, -0.020591814070940018, -0.1749730259180069, 0.04874620959162712, -0.07595308125019073, 0.02181261032819748, 0.018216799944639206, -0.10832522064447403, 0.16837291419506073, -0.033566512167453766, -0.06695768237113953, 0.052613962441682816, -0.20581911504268646, -0.07900715619325638, -0.17772749066352844, -0.18375012278556824, 0.06050071492791176, 0.05760138854384422, 0.07903145253658295, -0.05951719731092453, -0.01922747679054737, 0.061719246208667755, -0.009363299235701561, -0.13802112638950348, -0.04235544428229332, -0.06993678212165833, 0.08744155615568161, -0.09474305808544159, -0.07518411427736282, -0.07833878695964813, -0.046996138989925385, -0.020961694419384003, 0.08125963062047958, -0.1039251759648323, 0.08903530240058899, 0.1493726521730423, 0.03651920333504677, 0.05440247058868408, -0.08271230012178421, 0.12693379819393158, -0.037743739783763885, -0.09459595382213593, 0.07307634502649307, 0.004350725095719099, 0.04920351505279541, 0.24039287865161896, 0.08962162584066391, -0.10578162968158722, -0.01780811697244644, -0.0968487411737442, -0.16405464708805084, -0.2553846538066864, -0.06823288649320602, 
-0.08744750916957855, 0.14417944848537445, 0.014636521227657795, 0.10712126642465591, 0.14313316345214844, 0.01343101728707552, 0.10255914181470871, -0.08983208239078522, -0.018939344212412834, 0.031209396198391914, 0.2135104089975357, -0.05208220332860947, 0.00838248711079359, -0.13684824109077454, -0.0256142970174551, 0.14601100981235504, 0.13798639178276062, 0.14503207802772522, 0.31421369314193726, 0.15292863547801971, 0.13410434126853943, 0.13474710285663605, 0.12333164364099503, 0.07403261214494705, 0.03444362059235573, -0.015304201282560825, -0.06035377085208893, -0.003846159903332591, 0.02816268615424633, 0.05421729013323784, 0.06724072247743607, -0.22906480729579926, 0.041139665991067886, -0.2661744952201843, 0.03544611483812332, -0.0854712724685669, 0.1161833181977272, -0.028890252113342285, 0.11051984131336212, 0.11386284977197647, 0.05553818494081497, -0.023278791457414627, 0.16036942601203918, 0.032686375081539154, -0.07703183591365814, 0.020292721688747406, 0.024695809930562973, 0.06633034348487854, 0.08606193959712982, 0.09550496190786362, -0.020778406411409378, -0.1831783503293991, 0.025963841006159782, 0.12212833017110825, -0.20747940242290497, 0.289523184299469, 0.013651901856064796, -0.0743619054555893, -0.01690039224922657, -0.06958060711622238, 0.008433517068624496, 0.12829731404781342, 0.10406835377216339, 0.05508929491043091, -0.2613787055015564, -0.13299626111984253, 0.046764206141233444, -0.00873907096683979, 0.11356569826602936, -0.0052223424427211285, -0.14201195538043976, -0.06640999764204025, 0.05814211815595627, -0.006591420155018568, 0.13023322820663452, -0.018290361389517784, -0.08173255622386932, -0.010230090469121933, 0.055564697831869125, -0.001312803477048874, -0.04580084979534149, 0.07523149996995926, 0.009008137509226799, 0.02259289287030697, -0.08178020268678665, 0.03887253627181053, -0.08071476966142654, -0.25375792384147644, 0.019298138096928596, -0.04987313598394394, 0.004092312417924404, -0.04684043675661087, -0.15448936820030212, -0.1129264086484909, -0.15445278584957123, 0.13100723922252655, -0.03675999864935875, 0.091565802693367, -0.0817658007144928, 0.13736046850681305, -0.08521489799022675, 0.05375019088387489, 0.00614814180880785, 0.03918716683983803, -0.017955513671040535, -0.1031481996178627, 0.09334362298250198, -0.1874227225780487, 0.023863423615694046, 0.010427716188132763, -0.056847453117370605, -0.01354232057929039, 0.03918023407459259, -0.08763083070516586, 0.21879427134990692, 0.3331502079963684, -0.011948764324188232, 0.22546616196632385, 0.35863226652145386, -0.13763751089572906, -0.23258967697620392, -0.1205512136220932, -0.3263251483440399, -0.09005610644817352, 0.17321562767028809, -0.18057219684123993, 0.04850830137729645, 0.16150830686092377, -0.10868281871080399, 0.22499866783618927, -0.22723928093910217, -0.04793389141559601, 0.1823979914188385, -0.038322996348142624, 0.4527989625930786, -0.1144307404756546, -0.1784561723470688, -0.03637253865599632, -0.16285361349582672, 0.12426037341356277, -0.026553882285952568, 0.06700495630502701, 0.02416347898542881, -0.011372359469532967, -0.009014161303639412, -0.04529716446995735, 0.2216065675020218, 0.0522729866206646, 0.10468899458646774, -0.09159468114376068, -0.17199653387069702, 0.1907423883676529, -0.0004908236442133784, -0.003372655250132084, -0.05411549657583237, -0.04850282520055771, -0.06871756166219711, 0.033092137426137924, -0.0334564633667469, 0.06195882335305214, 0.03364093229174614, -0.11903523653745651, -0.10248823463916779, 0.034111104905605316, 
-0.13155671954154968, -0.054850947111845016, 0.26421889662742615, -0.02080743946135044, 0.09609334170818329, 0.04959092289209366, -0.05474294349551201, -0.13538943231105804, 0.005736751481890678, -0.07534020394086838, -0.05711410939693451, 0.06573604047298431, -0.11453206837177277, -0.024341827258467674, 0.1293732225894928, -0.029497180134058, 0.09674722701311111, 0.08061115443706512, -0.07585363835096359, 0.02032829262316227, 0.15617427229881287, -0.07247176766395569, -0.10849180817604065, 0.04999847710132599, 0.04640531167387962, 0.17256882786750793, 0.004101871978491545, 0.02018604800105095, 0.08726977556943893, 0.045959215611219406, -0.007486662827432156, 0.007311292923986912, -0.11321697384119034, -0.04241771996021271, 0.0387241393327713, -0.005273692775517702, -0.10946331918239594, 0.16008898615837097, 0.056837860494852066, 0.004653505515307188, -0.06027700752019882, 0.09720424562692642, -0.06709636747837067, -0.07046061009168625, -0.1753035932779312, 0.018511172384023666, -0.12734080851078033, -0.09874535351991653, 0.06846235692501068, -0.09371624886989594, -0.04084605351090431, 0.08152704685926437, 0.046927981078624725, 0.14401860535144806, -0.006597559433430433, -0.023080874234437943, 0.149825319647789, -0.0884878933429718, -0.2241756170988083, 0.01969664730131626, -0.04083063453435898, -0.07065816223621368, -0.0007070365245454013, 0.06069544702768326, -0.0663156732916832, -0.11958606541156769, -0.20477768778800964, 0.10412076860666275, -0.12043121457099915, -0.03954985365271568, -0.1041841059923172, -0.053260523825883865, 0.07891252636909485, -0.02613759972155094, -0.04122013971209526, -0.047595683485269547, -0.16630595922470093, 0.054254453629255295, 0.07140932232141495, 0.11125344783067703, -0.0759999230504036, -0.018354382365942, 0.1398727148771286, 0.048581548035144806, 0.08479110151529312, 0.07578440010547638, 0.026255371049046516, 0.16728560626506805, -0.1708206981420517, -0.0542997270822525, 0.1068294569849968, -0.026716172695159912, 0.01994573324918747, 0.10631280392408371, -0.04839588701725006, 0.07042654603719711, -0.05095988139510155, 0.05859163776040077, -0.15704534947872162, -0.13073866069316864, -0.04184387996792793, 0.023728877305984497, -0.2260182797908783, 0.015071595087647438, -0.1769561767578125, 0.19692228734493256, -0.024228032678365707, 0.11490963399410248, 0.08052190393209457, 0.02052290178835392, 0.03539382666349411, -0.006019921973347664, 0.00946811307221651, -0.10524865239858627, -0.05784677714109421, -0.07560300827026367, -0.1168874129652977, -0.009665017947554588, 0.36614301800727844, 0.02430291846394539, -0.19682736694812775, 0.051222387701272964, 0.18285293877124786, 0.023639049381017685, -0.0073763905093073845, 0.26180747151374817, 0.08150359988212585, -0.023175053298473358, -0.1782374382019043, 0.0396091528236866, -0.08699734508991241, -0.15269799530506134, 0.11385007947683334, 0.09347525984048843, 0.05813581123948097, 0.022930078208446503, 0.10404518246650696, -0.035940010100603104, -0.05509711429476738, -0.13301853835582733, 0.13368983566761017, -0.001790675800293684, 0.0193882267922163, 0.0897885113954544, 0.19249756634235382, -0.045275162905454636, 0.05437124893069267, -0.07336640357971191, -0.001598604372702539, -0.15740543603897095, -0.13358698785305023, 0.06194563955068588, -0.08269550651311874, 0.06342913210391998, 0.050261519849300385, 0.04341990500688553, 0.31786394119262695, 0.039095040410757065, -0.046439893543720245, 0.003166865324601531, -0.14845187962055206, -0.08075450360774994, -0.06024569645524025, -0.03110554814338684, 
0.028620192781090736, -0.13928957283496857, -0.09898591786623001, -0.06917677819728851, -0.130235955119133, -0.06539803743362427, 0.025270747020840645, 0.014251931570470333, -0.053083837032318115, -0.17625881731510162, -0.04808593541383743, -0.06644169986248016, 0.10105955600738525, -0.08462738990783691, 0.1516820639371872, 0.0022449472453445196, 0.030281953513622284, 0.07627002149820328, 0.09585131704807281, 0.018900424242019653, -0.06975197046995163, 0.05599058046936989, 0.12436293810606003, 0.01323844213038683, 0.1259988248348236, -0.06034265458583832, -0.019420607015490532, -0.014145253226161003, 0.14038437604904175, 0.304447740316391, -0.01856905221939087, -0.013814439997076988, -0.022110093384981155, 0.021388787776231766, 0.10893569141626358, 0.19800719618797302, -0.03437356278300285, 0.2551359534263611, -0.058974795043468475, 0.0756678432226181, -0.013180435635149479, -0.005362013820558786, -0.053146667778491974, 0.06074550002813339, 0.06268858164548874, -0.06877048313617706, -0.10191375762224197, 0.15178529918193817, -0.14985080063343048, 0.13306055963039398, 0.14678068459033966, -0.06057753041386604, 0.03797250986099243, 0.0007459368789568543, 0.19896264374256134, -0.03570213168859482, 0.0984780564904213, -0.10653308779001236, -0.10261140763759613, -0.14764924347400665, 0.037690844386816025, -0.36797797679901123, -0.1756322830915451, 0.11731542646884918, 0.14115898311138153, 0.1759258657693863, -0.012341637164354324, 0.056479312479496, 0.0033020609989762306, 0.08296097069978714, -0.04232487455010414, 0.1519634872674942, 0.0612073615193367, -0.017103128135204315, -0.15296664834022522, -0.20328094065189362, -0.0012039330322295427, -0.058561209589242935, 0.055583830922842026, -0.02269243635237217, 0.025347469374537468, 0.07746459543704987, -0.06768939644098282, -0.029180381447076797, -0.02352982573211193, -0.13262848556041718, 0.052229251712560654, -0.04354005306959152, 0.0320255309343338, -0.03958037868142128, -0.022394726052880287, -0.039987675845623016, 0.10721533745527267, -0.22402705252170563, -0.08517231047153473, 0.1422796994447708, -0.03421911224722862, 0.1542559564113617, -0.02848726324737072, -0.12159585952758789, -0.024955326691269875, -0.06977712363004684, 0.10887379199266434, -0.1419300138950348, 0.038592495024204254, 0.13747453689575195, 0.008710617199540138, 0.031119761988520622, -0.2533661723136902, 0.050644006580114365, -0.03556957095861435, -0.016733208671212196, -0.057031940668821335 ]
null
null
null
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # deberta-v3-base-QuestionAns This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 5.0560 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 30 | 5.2473 | | No log | 2.0 | 60 | 5.1047 | | No log | 3.0 | 90 | 5.0560 | ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.1+cu118 - Datasets 2.15.0 - Tokenizers 0.15.0
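The hyperparameters reported in the card above map directly onto the Hugging Face `TrainingArguments` API. The sketch below is illustrative only: the output directory name is a placeholder, and the dataset and task head are not specified by the card, so only the optimizer/scheduler settings are taken from it.

```python
from transformers import TrainingArguments

# Minimal sketch reproducing the reported hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="deberta-v3-base-QuestionAns",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=3.0,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,      # Adam betas and epsilon as reported in the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```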
{"license": "mit", "tags": ["generated_from_trainer"], "base_model": "microsoft/deberta-v3-base", "model-index": [{"name": "deberta-v3-base-QuestionAns", "results": []}]}
null
alitolga/deberta-v3-base-QuestionAns
[ "safetensors", "generated_from_trainer", "base_model:microsoft/deberta-v3-base", "license:mit", "region:us" ]
2024-02-13T19:34:30+00:00
[]
[]
TAGS #safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us
deberta-v3-base-QuestionAns =========================== This model is a fine-tuned version of microsoft/deberta-v3-base on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 5.0560 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 3.0 ### Training results ### Framework versions * Transformers 4.35.2 * Pytorch 2.1.1+cu118 * Datasets 2.15.0 * Tokenizers 0.15.0
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0" ]
[ "TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0" ]
[ 37, 98, 4, 33 ]
[ "passage: TAGS\n#safetensors #generated_from_trainer #base_model-microsoft/deberta-v3-base #license-mit #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.1+cu118\n* Datasets 2.15.0\n* Tokenizers 0.15.0" ]
[ -0.10977909713983536, -0.02383926510810852, -0.000272897508693859, 0.0962144061923027, 0.18739436566829681, 0.03188342973589897, 0.1521315574645996, 0.04621896147727966, -0.11039812862873077, 0.026030896231532097, 0.10344589501619339, 0.12346210330724716, -0.01214214600622654, 0.12621824443340302, -0.05480639636516571, -0.22153477370738983, 0.015339897945523262, 0.017923252657055855, -0.055215757340192795, 0.11165095120668411, 0.10440938174724579, -0.16147270798683167, 0.0682869404554367, -0.013526142574846745, -0.23760250210762024, 0.03791137784719467, 0.04424617439508438, -0.06407684087753296, 0.14295685291290283, -0.009633136913180351, 0.18386055529117584, 0.012823836877942085, 0.13880614936351776, -0.16603654623031616, 0.01076443400233984, 0.06743723154067993, 0.0072526587173342705, 0.05671124532818794, 0.06414835900068283, 0.008718214929103851, 0.08650407195091248, -0.10005319863557816, 0.0757589116692543, 0.031208310276269913, -0.13932667672634125, -0.27417436242103577, -0.08789606392383575, 0.0025896879378706217, 0.07555236667394638, 0.08566202968358994, -0.013975093141198158, 0.17519612610340118, -0.10044020414352417, 0.08535193651914597, 0.25552237033843994, -0.25281932950019836, -0.07021211832761765, 0.071146659553051, 0.023459527641534805, 0.08894380927085876, -0.10143274813890457, -0.02900632470846176, 0.0903119370341301, 0.04442868381738663, 0.12188786268234253, -0.012826957739889622, -0.11815144866704941, 0.009122032672166824, -0.14692682027816772, -0.004426182713359594, 0.0912783071398735, 0.03500741720199585, -0.054377976804971695, -0.004236732143908739, -0.07277829945087433, -0.1371414214372635, -0.051367513835430145, -0.04145136475563049, 0.06730754673480988, -0.06141535937786102, -0.08764902502298355, 0.01499799732118845, -0.11625413596630096, -0.11828317493200302, -0.024499638006091118, 0.24395588040351868, 0.052924994379282, 0.028124088421463966, -0.04764493182301521, 0.1291472613811493, -0.05694502219557762, -0.1366644948720932, 0.024524841457605362, 0.019550397992134094, 0.024960318580269814, -0.05345170944929123, -0.07289297133684158, -0.0356169231235981, 0.027074923738837242, 0.1499943733215332, -0.1311299353837967, 0.0237225741147995, 0.04559052735567093, 0.028581978753209114, -0.11121973395347595, 0.16195543110370636, -0.035876959562301636, 0.010333948768675327, 0.028275663033127785, 0.07690852880477905, 0.03842984512448311, 0.0128098726272583, -0.07406949251890182, 0.005984250921756029, 0.1138528510928154, 0.03108055889606476, -0.10245327651500702, 0.03943431377410889, -0.04984300583600998, 0.02609124220907688, 0.022958341985940933, -0.09585303068161011, 0.02446972019970417, 0.01637401059269905, -0.0685202106833458, -0.0464635044336319, 0.017798306420445442, 0.028543973341584206, 0.041776999831199646, 0.1314091831445694, -0.09103567898273468, 0.033552948385477066, -0.1253967434167862, -0.11012474447488785, -0.012358481995761395, -0.013520389795303345, 0.038584187626838684, -0.11949725449085236, -0.1693306863307953, -0.004231118597090244, 0.03667663410305977, -0.015293833799660206, 0.01881149411201477, -0.03158961609005928, -0.0996800884604454, -0.012826534919440746, -0.026565516367554665, 0.11635850369930267, -0.0694487988948822, 0.11646972596645355, 0.09728671610355377, 0.05548146739602089, -0.09908585995435715, 0.02373643033206463, -0.10591994971036911, 0.0024640432093292475, -0.2569774389266968, -0.005124390125274658, -0.06571763008832932, 0.06924369931221008, -0.03966303914785385, -0.1075524166226387, 0.022764498367905617, 0.005322130396962166, 
0.10874362289905548, 0.1000693067908287, -0.17926017940044403, -0.05148404836654663, 0.12636883556842804, -0.1271693855524063, -0.1113729402422905, 0.08568505942821503, -0.04757939279079437, 0.06583791971206665, 0.07703462988138199, 0.16356876492500305, -0.00956286396831274, -0.15184687077999115, 0.010717145167291164, -0.0590687021613121, 0.04493590071797371, -0.05465210974216461, 0.0488610714673996, 0.0008339445339515805, -0.021729113534092903, 0.032641030848026276, -0.06953171640634537, 0.04686718434095383, -0.1324886828660965, -0.07485809177160263, -0.06488070636987686, -0.1212497353553772, 0.041503455489873886, 0.06225567311048508, 0.06155192852020264, -0.1368376612663269, -0.061608269810676575, 0.1359492540359497, 0.0928584486246109, -0.05217166990041733, 0.004119411576539278, -0.05063227564096451, 0.05520080775022507, -0.041433896869421005, -0.04931100085377693, -0.1645110845565796, -0.08925328403711319, 0.015393294394016266, -0.0047068120911717415, 0.00327594974078238, -0.044142138212919235, 0.07455213367938995, 0.11798553168773651, -0.07449058443307877, -0.04214201495051384, -0.10275856405496597, 0.019334375858306885, -0.1075754463672638, -0.19162553548812866, -0.03287539631128311, -0.00888550654053688, 0.07661817222833633, -0.20160453021526337, 0.04192623123526573, -0.029132604598999023, 0.09545008093118668, 0.02637353353202343, -0.005475367419421673, -0.06444989144802094, 0.09747123718261719, 0.0045448471792042255, -0.0580558106303215, 0.02937583066523075, -0.021635031327605247, -0.04795526713132858, -0.09300464391708374, -0.09899953752756119, 0.19842319190502167, 0.14743494987487793, -0.10093656182289124, -0.09002619981765747, 0.0297528188675642, -0.06982388347387314, -0.016209444031119347, -0.06776061654090881, 0.020620150491595268, 0.11483170837163925, -0.02282099425792694, 0.11661163717508316, -0.09248635172843933, -0.022356461733579636, 0.0014529010513797402, -0.041367556899785995, 0.057399142533540726, 0.06753043085336685, 0.11225567758083344, -0.053833018988370895, 0.11423299461603165, 0.15965376794338226, -0.11110177636146545, 0.10334773361682892, -0.06334836781024933, -0.07238177955150604, -0.015347606502473354, -0.01436277199536562, -0.007964075542986393, 0.1776101291179657, -0.04427338019013405, 0.03807692229747772, -0.009708701632916927, 0.017880810424685478, 0.027409786358475685, -0.2506949007511139, -0.052766069769859314, -0.006493645720183849, -0.04180464893579483, -0.046832308173179626, -0.01916314661502838, 0.020225372165441513, 0.10081995278596878, -0.04058639332652092, -0.057044293731451035, 0.0223590936511755, 0.00557175325229764, -0.07168775796890259, 0.22610197961330414, -0.07881317287683487, -0.04608485847711563, -0.10444154590368271, 0.012919694185256958, -0.04146885499358177, 0.009725523181259632, 0.0729912519454956, -0.11279396712779999, -0.044001005589962006, -0.06022777780890465, 0.05084332823753357, 0.07876109331846237, 0.039546042680740356, 0.007717863656580448, 0.01782396249473095, 0.09497642517089844, -0.14543552696704865, 0.0006467378116212785, -0.07175392657518387, -0.08312161266803741, 0.056910187005996704, 0.08347267657518387, 0.11983032524585724, 0.13179658353328705, -0.040337517857551575, -0.02079136297106743, -0.02325625531375408, 0.27377429604530334, -0.06932184845209122, -0.062200676649808884, 0.14498460292816162, -0.0005378815694712102, 0.03789127990603447, 0.11220243573188782, 0.08702115714550018, -0.13622024655342102, 0.029426908120512962, 0.025967992842197418, -0.021020567044615746, -0.21206539869308472, -0.028446493670344353, 
-0.0029190380591899157, -0.07482894510030746, 0.0390218086540699, 0.021479828283190727, -0.0022142441011965275, 0.06011484935879707, 0.02166934125125408, 0.048596709966659546, -0.0249907448887825, 0.06053663790225983, 0.0602521076798439, 0.046088092029094696, 0.10996872186660767, -0.04331322759389877, -0.06801527738571167, 0.02170231193304062, -0.04719885438680649, 0.22817444801330566, 0.00001491811781306751, 0.041673850268125534, 0.07939895242452621, 0.17735464870929718, -0.002462110947817564, 0.10430179536342621, 0.019770093262195587, -0.06619296967983246, 0.011421018280088902, -0.07086790353059769, -0.003634663764387369, 0.02175724133849144, -0.14258019626140594, 0.08539474755525589, -0.12784256041049957, -0.004376190714538097, 0.0801885724067688, 0.22792589664459229, 0.02038872055709362, -0.33866608142852783, -0.07218042016029358, 0.0015513667603954673, -0.007956861518323421, -0.023518022149801254, 0.011211591772735119, 0.1214594691991806, -0.04184851422905922, 0.027020303532481194, -0.03522983193397522, 0.06242155283689499, 0.031152624636888504, 0.04701671749353409, 0.05710339918732643, 0.13777777552604675, -0.01353135984390974, 0.03180978074669838, -0.2967328429222107, 0.2733897864818573, 0.019609946757555008, 0.13773353397846222, -0.023160841315984726, -0.02387249656021595, 0.020564470440149307, 0.06624981760978699, 0.01824108138680458, -0.030782584100961685, -0.05063082277774811, -0.23543617129325867, -0.039089519530534744, 0.06226648762822151, 0.12959490716457367, 0.01751827634871006, 0.09988957643508911, -0.00317860534414649, 0.012835978530347347, 0.10463982820510864, -0.06836505234241486, -0.19078455865383148, -0.010738518089056015, -0.0613342821598053, 0.02804485149681568, 0.016211025416851044, -0.11761446297168732, -0.1024993360042572, -0.0700412392616272, 0.06540956348180771, 0.0037123053334653378, -0.03701884299516678, -0.11041536182165146, 0.07908197492361069, 0.09566164761781693, -0.048732247203588486, 0.07471208274364471, 0.041078776121139526, 0.056677158921957016, 0.024378404021263123, -0.03807076811790466, 0.10815038532018661, -0.06901469081640244, -0.2009645253419876, -0.05414474010467529, 0.08314872533082962, 0.05137045681476593, 0.03196307271718979, -0.01318739727139473, 0.015162421390414238, 0.008538338355720043, -0.09961853921413422, 0.0038757321890443563, -0.02996899001300335, 0.05577516183257103, 0.028683772310614586, -0.051311735063791275, -0.042351674288511276, -0.050810445100069046, -0.0363025926053524, 0.10146798938512802, 0.32721322774887085, -0.06988053023815155, -0.04230755195021629, 0.08364763855934143, -0.03463691845536232, -0.17070871591567993, 0.09220478683710098, 0.06034506857395172, -0.0038430248387157917, 0.07597258687019348, -0.11784222722053528, 0.12991152703762054, 0.14157889783382416, -0.028757434338331223, 0.1240784227848053, -0.28924304246902466, -0.1585167944431305, 0.11064843088388443, 0.21439293026924133, 0.1088043749332428, -0.1557728499174118, -0.01764458417892456, -0.04504897817969322, -0.13108333945274353, 0.09324704110622406, -0.15120381116867065, 0.08701140433549881, -0.01612214185297489, 0.06821642071008682, -0.0008300155750475824, -0.06112472340464592, 0.14311541616916656, 0.0007622124394401908, 0.1586318016052246, -0.03935972973704338, -0.012870344333350658, 0.09020872414112091, -0.021951651200652122, 0.0271515604108572, -0.01850840076804161, 0.04274798557162285, 0.009107017889618874, -0.023169470950961113, -0.08148299902677536, 0.049713414162397385, -0.0533994697034359, -0.08378700911998749, -0.024846956133842468, 
0.018639277666807175, -0.024928512051701546, -0.04630934074521065, 0.09266750514507294, 0.03939923644065857, 0.16225983202457428, 0.07548313587903976, 0.029977168887853622, -0.0779544860124588, -0.00511569669470191, 0.03007619082927704, -0.0344298779964447, 0.07267012447118759, -0.13134963810443878, 0.012572791427373886, 0.1029602587223053, 0.020421480759978294, 0.10135439038276672, 0.08415261656045914, -0.04837226867675781, 0.028388533741235733, 0.09198541939258575, -0.16636347770690918, -0.0741669088602066, 0.008622251451015472, -0.039275527000427246, -0.07221722602844238, 0.10561412572860718, 0.0799483135342598, -0.09373471885919571, -0.007967788726091385, -0.03593408688902855, -0.012631812132894993, -0.07303313910961151, 0.2227294147014618, 0.10064034163951874, 0.05390103533864021, -0.09364035725593567, 0.09540357440710068, 0.037226930260658264, -0.019714446738362312, -0.02629864402115345, 0.050324659794569016, -0.0708092525601387, -0.0081774378195405, 0.11950493603944778, 0.19487260282039642, -0.07515739649534225, -0.06573905795812607, -0.18015339970588684, -0.11609358340501785, 0.016829947009682655, 0.19300605356693268, 0.10496100038290024, 0.005628860089927912, 0.018334483727812767, 0.04501958191394806, -0.13519175350666046, 0.08371682465076447, 0.013451937586069107, 0.09526120126247406, -0.15351462364196777, 0.1688528209924698, 0.02303115464746952, -0.0004302372981328517, -0.03028959222137928, 0.06828100979328156, -0.1316944658756256, 0.024528106674551964, -0.1442081779241562, -0.04870409145951271, 0.009894388727843761, -0.0011268117232248187, 0.009574041701853275, -0.07749912887811661, -0.07624939829111099, 0.047325003892183304, -0.10701669752597809, -0.00812611822038889, 0.05367180332541466, 0.03715867921710014, -0.15122228860855103, -0.03743722289800644, 0.019229386001825333, -0.06613703072071075, 0.03823935240507126, 0.0487651601433754, 0.03849530592560768, 0.09630363434553146, -0.20997434854507446, -0.0029430179856717587, 0.08880948275327682, -0.016367431730031967, 0.0696609690785408, -0.05353253334760666, -0.02602485939860344, -0.007858239114284515, 0.0978042483329773, 0.008416155353188515, 0.0867205411195755, -0.13997472822666168, -0.006191175431013107, -0.02968805655837059, -0.06472725421190262, -0.04450218006968498, -0.0349254235625267, 0.09429186582565308, -0.013142097741365433, 0.18351885676383972, -0.09520009905099869, 0.011849514208734035, -0.21236686408519745, -0.014442931860685349, -0.028647301718592644, -0.09178724139928818, -0.15416806936264038, -0.026780616492033005, 0.06469432264566422, -0.04646223038434982, 0.14698436856269836, -0.0015701946103945374, 0.06641637533903122, 0.0374632254242897, -0.03034866787493229, 0.026866896077990532, 0.0457080602645874, 0.24175049364566803, 0.031129976734519005, -0.00613299710676074, 0.03124288097023964, 0.0734565481543541, 0.11419452726840973, 0.01158747635781765, 0.22155217826366425, 0.18127477169036865, -0.07043489068746567, 0.1019318550825119, 0.0614655576646328, -0.06218587979674339, -0.11618132889270782, 0.0043325223959982395, -0.024654695764183998, 0.025565601885318756, -0.037003397941589355, 0.18632014095783234, 0.10361658036708832, -0.16563965380191803, 0.013917487114667892, -0.0577508844435215, -0.07344912737607956, -0.09634224325418472, 0.07478474825620651, -0.08116906881332397, -0.19115056097507477, 0.026669880375266075, -0.11170000582933426, -0.01101001352071762, 0.1307188719511032, -0.015901604667305946, -0.008357066661119461, 0.235883891582489, 0.07531672716140747, 0.062061015516519547, 0.01887267641723156, 
0.002806885400786996, -0.03576918691396713, -0.07197372615337372, -0.1012999638915062, 0.005275350995361805, -0.0329875648021698, 0.015152910724282265, -0.05596904084086418, -0.12911638617515564, 0.0355282686650753, 0.015552476979792118, -0.09444280713796616, 0.024008115753531456, 0.033259183168411255, 0.050202906131744385, 0.02025768719613552, 0.011496256105601788, 0.022110527381300926, -0.0040362440049648285, 0.2072332501411438, -0.057010989636182785, -0.13155171275138855, -0.06247086822986603, 0.26245516538619995, 0.04406530410051346, 0.021840333938598633, 0.02593136578798294, -0.12180887162685394, 0.010624225251376629, 0.15366508066654205, 0.14873500168323517, -0.08162666112184525, -0.0008891946054063737, -0.04128885641694069, -0.02712344564497471, -0.09549082070589066, 0.12738096714019775, 0.1272391825914383, 0.04202356934547424, -0.09513653069734573, -0.020700950175523758, -0.05248567834496498, -0.008205144666135311, -0.04144055396318436, 0.02868380770087242, 0.03159588947892189, 0.009482614696025848, -0.059001702815294266, 0.07555936276912689, -0.01746372878551483, -0.11142945289611816, 0.0962703675031662, -0.15968038141727448, -0.14188013970851898, -0.010838326066732407, 0.1389695703983307, -0.023466376587748528, 0.06749363988637924, -0.052638206630945206, -0.005400174763053656, 0.04293496161699295, -0.02973678521811962, -0.051605794578790665, -0.127696231007576, 0.0767015814781189, -0.0755990594625473, 0.236514151096344, -0.0324367955327034, 0.12624210119247437, 0.10700509697198868, 0.03140915557742119, -0.09085901826620102, 0.12035630643367767, 0.03917442262172699, -0.1436665952205658, -0.0003976380976382643, 0.0872146487236023, -0.052429888397455215, 0.0752221941947937, 0.01663416437804699, -0.1661776900291443, 0.028236106038093567, 0.006344062741845846, -0.0737876445055008, -0.057386551052331924, -0.06284373253583908, -0.06381191313266754, 0.09128324687480927, 0.1598876714706421, -0.03899944946169853, 0.04929110035300255, -0.0702781155705452, 0.07627421617507935, 0.0825834795832634, 0.04035616293549538, -0.028585845604538918, -0.2825299799442291, 0.058108292520046234, 0.14923399686813354, -0.056315988302230835, -0.2234940379858017, -0.07451563328504562, 0.01051599532365799, -0.0654032900929451, -0.09339899569749832, 0.06626714020967484, 0.11650731414556503, 0.05525080859661102, -0.045404739677906036, -0.1677107810974121, -0.10386636853218079, 0.1699165403842926, -0.14799459278583527, -0.12518051266670227 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-finetuned-ner This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0568 - Precision: 0.9372 - Recall: 0.9525 - F1: 0.9448 - Accuracy: 0.9867 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | 0.0781 | 1.0 | 1756 | 0.0792 | 0.9066 | 0.9327 | 0.9195 | 0.9800 | | 0.0403 | 2.0 | 3512 | 0.0600 | 0.9265 | 0.9461 | 0.9362 | 0.9849 | | 0.0264 | 3.0 | 5268 | 0.0568 | 0.9372 | 0.9525 | 0.9448 | 0.9867 | ### Framework versions - Transformers 4.37.2 - Pytorch 2.2.0+cu118 - Datasets 2.16.1 - Tokenizers 0.15.1
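Since the record's pipeline tag marks this checkpoint as a token-classification model, a minimal inference sketch with the `transformers` pipeline could look like the following. The example sentence is illustrative only, and it assumes the checkpoint is published on the Hub under the id `Fm505/bert-finetuned-ner`.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned NER checkpoint and tag an example sentence.
ner = pipeline(
    "token-classification",
    model="Fm505/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

print(ner("Hugging Face was founded in New York City."))
```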
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["precision", "recall", "f1", "accuracy"], "base_model": "bert-base-cased", "model-index": [{"name": "bert-finetuned-ner", "results": []}]}
token-classification
Fm505/bert-finetuned-ner
[ "transformers", "safetensors", "bert", "token-classification", "generated_from_trainer", "base_model:bert-base-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T19:36:16+00:00
[]
[]
TAGS #transformers #safetensors #bert #token-classification #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
bert-finetuned-ner ================== This model is a fine-tuned version of bert-base-cased on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.0568 * Precision: 0.9372 * Recall: 0.9525 * F1: 0.9448 * Accuracy: 0.9867 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 3 ### Training results ### Framework versions * Transformers 4.37.2 * Pytorch 2.2.0+cu118 * Datasets 2.16.1 * Tokenizers 0.15.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ "TAGS\n#transformers #safetensors #bert #token-classification #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ 64, 98, 4, 33 ]
[ "passage: TAGS\n#transformers #safetensors #bert #token-classification #generated_from_trainer #base_model-bert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu118\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ -0.10009071975946426, 0.08769192546606064, -0.0016157154459506273, 0.11102181673049927, 0.16385821998119354, 0.02851385623216629, 0.13268503546714783, 0.0976686030626297, -0.08472595363855362, 0.03126371651887894, 0.1266617327928543, 0.14024589955806732, 0.0001820201287046075, 0.14190176129341125, -0.061045315116643906, -0.2218911051750183, 0.023931141942739487, 0.02623043768107891, -0.06859871000051498, 0.11714708805084229, 0.10239746421575546, -0.13512645661830902, 0.09655507653951645, -0.009065021760761738, -0.18821148574352264, 0.023587042465806007, 0.03330840915441513, -0.05428093671798706, 0.14034920930862427, 0.03185415640473366, 0.1327906847000122, 0.01094221044331789, 0.10545779019594193, -0.20471593737602234, 0.0058305454440414906, 0.05764922872185707, 0.0021784892305731773, 0.07582955062389374, 0.03182327002286911, 0.0016785975312814116, 0.07444203644990921, -0.06757175177335739, 0.0664314329624176, 0.020871944725513458, -0.12391071766614914, -0.23080581426620483, -0.07891109585762024, 0.04989631474018097, 0.10320577770471573, 0.08168504387140274, -0.0009011725778691471, 0.13907502591609955, -0.09130612760782242, 0.07651053369045258, 0.21842096745967865, -0.31165891885757446, -0.05733935907483101, 0.04584121331572533, 0.012695706449449062, 0.0715077593922615, -0.10827755928039551, -0.031670503318309784, 0.0721534788608551, 0.02535345032811165, 0.12718035280704498, -0.029627250507473946, -0.08211605250835419, 0.02027849107980728, -0.14524821937084198, -0.014456487260758877, 0.1484009474515915, 0.0581509992480278, -0.05041354522109032, -0.028290877118706703, -0.05905558168888092, -0.13752588629722595, -0.040470268577337265, -0.02832355722784996, 0.050093308091163635, -0.02654358185827732, -0.06340846419334412, 0.024808606132864952, -0.09685409069061279, -0.09426962584257126, -0.05567176267504692, 0.16793566942214966, 0.042018961161375046, 0.009362861514091492, -0.003618484130129218, 0.10780469328165054, -0.038626838475465775, -0.12220712751150131, 0.010534338653087616, 0.019757216796278954, 0.0028811106458306313, -0.06394218653440475, -0.05517249181866646, 0.002215888351202011, 0.029717281460762024, 0.1713271141052246, -0.05413440614938736, 0.032279688864946365, 0.032602835446596146, 0.029864629730582237, -0.10230092704296112, 0.1650603711605072, -0.04709057882428169, -0.03738033398985863, 0.017480744048953056, 0.07139770686626434, 0.030933741480112076, 0.008821179158985615, -0.11194193363189697, 0.01878562569618225, 0.1172320619225502, 0.016521140933036804, -0.08968336135149002, 0.07265915721654892, -0.058443572372198105, -0.0002092416543746367, 0.04251764342188835, -0.08743640035390854, 0.028225257992744446, 0.0035417620092630386, -0.061111968010663986, -0.07638445496559143, 0.020312104374170303, 0.026831749826669693, 0.01756954751908779, 0.10688328742980957, -0.09394804388284683, 0.007307254243642092, -0.09134241193532944, -0.11377080529928207, 0.004709092900156975, -0.04663493484258652, 0.042548928409814835, -0.11183429509401321, -0.16422326862812042, -0.0016196906799450517, 0.049741704016923904, -0.017269998788833618, -0.04488066956400871, -0.047545552253723145, -0.07293658703565598, 0.0013510157587006688, -0.02456735074520111, 0.08685116469860077, -0.0732116624712944, 0.08966372162103653, 0.055948998779058456, 0.06083142012357712, -0.053438879549503326, 0.03871242329478264, -0.11316631734371185, 0.022559355944395065, -0.1889367550611496, 0.007965253666043282, -0.07174227386713028, 0.06567912548780441, -0.08216999471187592, -0.08073493838310242, 0.02403496764600277, 
0.010368559509515762, 0.06793373078107834, 0.10168828070163727, -0.13751697540283203, -0.06850440800189972, 0.14927345514297485, -0.10054302960634232, -0.14897389709949493, 0.11014774441719055, -0.057700905948877335, 0.04283025860786438, 0.07228647172451019, 0.15602190792560577, 0.06626985967159271, -0.08949083089828491, 0.006677582859992981, -0.0002178242866648361, 0.05732782930135727, -0.05449214205145836, 0.08035517483949661, 0.00683344341814518, -0.04040957987308502, 0.020787283778190613, -0.06815241277217865, 0.05736026167869568, -0.09913409501314163, -0.08697603642940521, -0.03659661486744881, -0.11103831976652145, 0.04614110663533211, 0.053753890097141266, 0.060911402106285095, -0.11669214069843292, -0.08431807905435562, 0.08941271156072617, 0.09092549234628677, -0.056568264961242676, 0.01207494642585516, -0.07490012049674988, 0.0680510476231575, -0.05614712834358215, -0.02880200184881687, -0.1473466157913208, -0.054655566811561584, 0.015371440909802914, 0.009864901192486286, 0.0012891223886981606, -0.007786777336150408, 0.0675685927271843, 0.08678890764713287, -0.07084585726261139, -0.04272060841321945, -0.02201768569648266, 0.024521097540855408, -0.1282244622707367, -0.20853619277477264, -0.029129009693861008, -0.02175002358853817, 0.13838353753089905, -0.2368655651807785, 0.04027428478002548, -0.02419598028063774, 0.08896061033010483, 0.027782421559095383, 0.0010560652008280158, -0.06128312274813652, 0.07784757018089294, -0.03955258056521416, -0.05305897071957588, 0.050926465541124344, -0.0011073742061853409, -0.07947378605604172, -0.04851547256112099, -0.11495152115821838, 0.22095268964767456, 0.13426616787910461, -0.09394356608390808, -0.08163954317569733, -0.009742307476699352, -0.04515901580452919, -0.02749420702457428, -0.049651890993118286, 0.013301064260303974, 0.11968283355236053, -0.02424348145723343, 0.1444554179906845, -0.07167740911245346, -0.029388973489403725, 0.01559029147028923, -0.04436841234564781, 0.022761423140764236, 0.0993872582912445, 0.1067827120423317, -0.10576633363962173, 0.15670733153820038, 0.18208186328411102, -0.08659370988607407, 0.09929535537958145, -0.04175976291298866, -0.062263015657663345, -0.018747150897979736, -0.013092481531202793, -0.0033560830634087324, 0.12353993207216263, -0.11347285658121109, 0.006294716615229845, 0.01045911107212305, 0.027032846584916115, 0.00739213777706027, -0.21803103387355804, -0.03774625062942505, 0.035464849323034286, -0.04120468348264694, -0.00891942624002695, -0.03568720072507858, -0.009411032311618328, 0.09689435362815857, 0.004960527643561363, -0.1035248264670372, 0.046988729387521744, 0.0020305218640714884, -0.08512170612812042, 0.21435831487178802, -0.09001746028661728, -0.09718293696641922, -0.12852175533771515, -0.0769294947385788, -0.03901805356144905, 0.03120225854218006, 0.06789837777614594, -0.08008552342653275, -0.045667439699172974, -0.08984458446502686, 0.014449109323322773, 0.04096229001879692, 0.03718806058168411, 0.009717599488794804, -0.005188105162233114, 0.08768551051616669, -0.1024312898516655, -0.014811867848038673, -0.04858053848147392, -0.07313374429941177, 0.0333389937877655, 0.028623931109905243, 0.11161835491657257, 0.14448679983615875, -0.023325515910983086, -0.004349452443420887, -0.03118256665766239, 0.24501417577266693, -0.04686775431036949, -0.037602681666612625, 0.1279333233833313, -0.015959467738866806, 0.037186913192272186, 0.1451433002948761, 0.06241375952959061, -0.10115495324134827, 0.024590419605374336, 0.04068136587738991, -0.017482934519648552, -0.19812260568141937, 
-0.03971227630972862, -0.026613211259245872, -0.02327127754688263, 0.09610050171613693, 0.029839621856808662, 0.019063180312514305, 0.07365351915359497, 0.026354176923632622, 0.06896045804023743, -0.02046537958085537, 0.07036693394184113, 0.10307995229959488, 0.04758204147219658, 0.12501883506774902, -0.038445454090833664, -0.060004644095897675, 0.03309561312198639, -0.010424936190247536, 0.19686977565288544, 0.02406918816268444, 0.08905886858701706, 0.05887610465288162, 0.18040665984153748, -0.00047160364920273423, 0.0757846012711525, -0.004807652905583382, -0.052166521549224854, -0.013842928223311901, -0.04691098630428314, -0.034622661769390106, 0.04020717367529869, -0.10172334313392639, 0.07736808061599731, -0.12795616686344147, 0.005315841641277075, 0.06175883486866951, 0.24094869196414948, 0.0565500445663929, -0.33476364612579346, -0.09517239034175873, 0.01941194199025631, -0.027120010927319527, -0.023949474096298218, 0.03545645996928215, 0.10132370889186859, -0.058667585253715515, 0.022081971168518066, -0.0457371287047863, 0.08072098344564438, -0.012779616750776768, 0.04693781957030296, 0.0708581954240799, 0.09145539253950119, -0.0068342736922204494, 0.06987447291612625, -0.2756858468055725, 0.27029457688331604, 0.009052994661033154, 0.07657089084386826, -0.04457295686006546, -0.001079912530258298, 0.03337167948484421, 0.11846649646759033, 0.06615021824836731, -0.015519826672971249, -0.056897737085819244, -0.22689016163349152, -0.040490057319402695, 0.041518595069646835, 0.0796683132648468, -0.040978141129016876, 0.09984930604696274, -0.042990412563085556, 0.004235860425978899, 0.09348119795322418, -0.006544517818838358, -0.08963377773761749, -0.07564465701580048, -0.037398047745227814, 0.030571313574910164, 0.015492348000407219, -0.09327703714370728, -0.1001618430018425, -0.12140534818172455, 0.15145574510097504, -0.033799368888139725, -0.014825155027210712, -0.09944954514503479, 0.066172756254673, 0.05634661018848419, -0.08065054565668106, 0.05460469052195549, 0.015896586701273918, 0.07579638063907623, 0.04297478124499321, -0.0562175028026104, 0.1320030391216278, -0.08067906647920609, -0.17175012826919556, -0.06952811032533646, 0.09418740123510361, 0.029672935605049133, 0.046307194977998734, 0.0026448294520378113, 0.011364751495420933, -0.012157989665865898, -0.08101264387369156, 0.010014890693128109, -0.00789319071918726, 0.06757578998804092, 0.030649766325950623, -0.07518626749515533, -0.005355354398488998, -0.0567534901201725, -0.03218908980488777, 0.1478985846042633, 0.28912290930747986, -0.09229858964681625, -0.010112988762557507, 0.07130666822195053, -0.056675925850868225, -0.20552770793437958, 0.03346061706542969, 0.028453581035137177, -0.0015820475528016686, 0.04998600110411644, -0.13787642121315002, 0.1518363505601883, 0.1168900579214096, -0.028440535068511963, 0.0825655609369278, -0.2685021162033081, -0.12947188317775726, 0.15455396473407745, 0.16037757694721222, 0.12700027227401733, -0.1485721617937088, -0.02000671997666359, -0.04333379492163658, -0.13043606281280518, 0.10353287309408188, -0.11369399726390839, 0.09740863740444183, -0.015598447993397713, 0.04239001125097275, -0.00009531267278362066, -0.049720123410224915, 0.1291455626487732, 0.009338591247797012, 0.12430606782436371, -0.05587777867913246, -0.0336894653737545, 0.03240102156996727, -0.04823123291134834, 0.015325700864195824, -0.08951641619205475, 0.044320885092020035, -0.06014735624194145, -0.0290147103369236, -0.05567258223891258, 0.03842516243457794, -0.03304681554436684, -0.07367701828479767, 
-0.03665695711970329, 0.03480544313788414, 0.03343726322054863, -0.020325373858213425, 0.14030010998249054, 0.01794535666704178, 0.15785372257232666, 0.11248478293418884, 0.06596355885267258, -0.0859660729765892, -0.01962919533252716, -0.0011290207039564848, -0.03873101994395256, 0.07907737791538239, -0.1371309906244278, 0.0496651791036129, 0.11782227456569672, 0.00678205257281661, 0.14574213325977325, 0.07954422384500504, -0.019587863236665726, -0.0014002311509102583, 0.0758146196603775, -0.16084635257720947, -0.07943017035722733, 0.0012648459523916245, -0.04089124873280525, -0.11156831681728363, 0.07719003409147263, 0.10242579132318497, -0.0789107084274292, -0.0025180946104228497, -0.01753787323832512, 0.007669514510780573, -0.06677570194005966, 0.19422337412834167, 0.07364116609096527, 0.05089425668120384, -0.08952423930168152, 0.07252153009176254, 0.03863031789660454, -0.04686469957232475, -0.0036662565544247627, 0.026316983625292778, -0.08902907371520996, -0.045954134315252304, 0.07785005867481232, 0.19447803497314453, -0.05352356284856796, -0.06168488785624504, -0.13541406393051147, -0.1268724948167801, 0.055448979139328, 0.17588646709918976, 0.1173093169927597, 0.020381882786750793, -0.0146589744836092, 0.016523875296115875, -0.11281248182058334, 0.09675681591033936, 0.022701246663928032, 0.08605555444955826, -0.1646171361207962, 0.1247926577925682, -0.0027699447236955166, 0.013798585161566734, -0.027714883908629417, 0.047452423721551895, -0.13140012323856354, -0.0012104206252843142, -0.1357278674840927, -0.025236668065190315, -0.03287980332970619, 0.01864575408399105, 0.018923921510577202, -0.07014340907335281, -0.06874843686819077, 0.022551394999027252, -0.10572142153978348, -0.013842205516994, 0.0448882132768631, 0.06933402270078659, -0.12813226878643036, -0.039631545543670654, 0.028184441849589348, -0.0662858858704567, 0.06235843524336815, 0.03150881081819534, 0.03335394710302353, 0.06946496665477753, -0.18048039078712463, 0.011200468055903912, 0.07461462169885635, 0.006510418839752674, 0.06436823308467865, -0.09509222209453583, -0.009647856466472149, 0.0034051930997520685, 0.04712267965078354, 0.015023750253021717, 0.09105932712554932, -0.12799569964408875, -0.01506174635142088, -0.02838718332350254, -0.07596425712108612, -0.047575149685144424, 0.004480012692511082, 0.1067098006606102, -0.016526658087968826, 0.21565356850624084, -0.0939059928059578, 0.0019554970785975456, -0.2026534527540207, 0.002109029097482562, -0.017379796132445335, -0.10480287671089172, -0.15411417186260223, -0.055280596017837524, 0.0466759018599987, -0.04712562635540962, 0.15538719296455383, -0.0010063790250569582, 0.04302164167165756, 0.035891592502593994, -0.04772001877427101, 0.055649787187576294, 0.03492860123515129, 0.23956701159477234, 0.042997024953365326, -0.035791732370853424, 0.028171028941869736, 0.044086918234825134, 0.11147727817296982, 0.06442892551422119, 0.1595364511013031, 0.16652853786945343, -0.04811067879199982, 0.09713578224182129, 0.04360225051641464, -0.07220903784036636, -0.1306845098733902, 0.02754896879196167, -0.04950414597988129, 0.0715624988079071, -0.013875484466552734, 0.21738731861114502, 0.09140989184379578, -0.16409625113010406, 0.01132271159440279, -0.07001364231109619, -0.0731973722577095, -0.11720192432403564, -0.017480550333857536, -0.09877797216176987, -0.17964290082454681, 0.0007608439773321152, -0.1117367148399353, -0.005810802802443504, 0.12789180874824524, -0.0012678380589932203, -0.009448636323213577, 0.17211122810840607, 0.008312218822538853, 
0.04856165498495102, 0.019754944369196892, -0.003711782395839691, -0.03814193233847618, -0.08911185711622238, -0.09012654423713684, 0.005466078873723745, -0.03244258090853691, 0.017614630982279778, -0.06674059480428696, -0.0509110763669014, 0.04913444072008133, -0.005620410665869713, -0.09563358873128891, 0.01997431553900242, 0.01686648651957512, 0.04256834462285042, 0.03831823915243149, 0.008100347593426704, 0.018322063609957695, 0.006109550129622221, 0.21904020011425018, -0.08005909621715546, -0.08628327399492264, -0.10810822993516922, 0.28505468368530273, 0.049683958292007446, 0.03335549309849739, 0.018389390781521797, -0.08444167673587799, 0.02145443856716156, 0.20902185142040253, 0.17693087458610535, -0.08963834494352341, 0.00013770052464678884, -0.011492175050079823, -0.01854657754302025, -0.048497240990400314, 0.09744010120630264, 0.1274157166481018, 0.005634080618619919, -0.07660135626792908, -0.04036208242177963, -0.04125681519508362, -0.004567612428218126, -0.03634600713849068, 0.045013703405857086, 0.03854648023843765, 0.011843682266771793, -0.04947265237569809, 0.046701643615961075, -0.016939617693424225, -0.09916042536497116, 0.07780112326145172, -0.17382054030895233, -0.14646850526332855, -0.019195588305592537, 0.10693547874689102, -0.0037757279351353645, 0.05415411293506622, -0.03565802797675133, -0.005050600040704012, 0.06421199440956116, -0.01775294914841652, -0.07235291600227356, -0.10819663852453232, 0.08172015100717545, -0.0698651596903801, 0.24854931235313416, -0.03356045484542847, 0.05994328856468201, 0.13046012818813324, 0.042554281651973724, -0.08313065767288208, 0.09456472843885422, 0.04224560782313347, -0.09534749388694763, 0.024237696081399918, 0.051703207194805145, -0.0446707084774971, 0.12534824013710022, 0.039663784205913544, -0.1523486226797104, 0.016577662900090218, -0.06595898419618607, -0.08740202337503433, -0.05084645003080368, -0.04747352749109268, -0.056073520332574844, 0.13207979500293732, 0.18574529886245728, -0.03749602288007736, 0.013347338885068893, -0.05624067783355713, 0.04364204406738281, 0.06526226550340652, 0.030800336971879005, -0.03961940482258797, -0.23507162928581238, 0.041840679943561554, 0.0799521952867508, -0.020948071032762527, -0.2456197440624237, -0.08421734720468521, -0.004500635899603367, -0.054083265364170074, -0.09391871094703674, 0.08548818528652191, 0.11084304004907608, 0.05762499198317528, -0.06569062918424606, -0.13209830224514008, -0.0821789801120758, 0.16097521781921387, -0.1290777176618576, -0.11425619572401047 ]
null
null
stable-baselines3
# **PPO** Agent playing **LunarLander-v2** This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) TODO: Add your code ```python from stable_baselines3 import ... from huggingface_sb3 import load_from_hub ... ```
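As a hedged sketch of what the TODO block above might contain, the snippet below downloads the checkpoint with `huggingface_sb3.load_from_hub` and evaluates it. The checkpoint filename follows the usual SB3 naming convention and is an assumption, as is the environment id (older Gym/Gymnasium releases expose `LunarLander-v2`, newer ones `LunarLander-v3`).

```python
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

# Assumed checkpoint filename; adjust if the repository stores it under a different name.
checkpoint = load_from_hub(
    repo_id="huxin/ppo-LunarLander-v2",
    filename="ppo-LunarLander-v2.zip",
)
model = PPO.load(checkpoint)

# Evaluate the loaded agent for a few episodes.
env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```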
{"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "289.22 +/- 18.62", "name": "mean_reward", "verified": false}]}]}]}
reinforcement-learning
huxin/ppo-LunarLander-v2
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
2024-02-13T19:36:39+00:00
[]
[]
TAGS #stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# PPO Agent playing LunarLander-v2 This is a trained model of a PPO agent playing LunarLander-v2 using the stable-baselines3 library. ## Usage (with Stable-baselines3) TODO: Add your code
[ "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 39, 41, 17 ]
[ "passage: TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 0.03942384943366051, 0.04900386184453964, -0.005304091144353151, 0.026427261531352997, 0.107408307492733, -0.026511888951063156, 0.11188238859176636, 0.0814051404595375, 0.10722193866968155, 0.04762078449130058, 0.08338645845651627, 0.06030960753560066, 0.05080918222665787, 0.2571701407432556, 0.04754156619310379, -0.22987541556358337, 0.036159250885248184, -0.04869936779141426, 0.12395193427801132, 0.07178173214197159, -0.0038484656251966953, -0.06485428661108017, 0.020415637642145157, -0.013290755450725555, 0.05367108806967735, 0.04282612353563309, -0.01716216839849949, -0.08207534998655319, 0.07169748842716217, -0.06345846503973007, 0.06986866891384125, 0.07677983492612839, 0.13218913972377777, -0.17832116782665253, 0.029566360637545586, 0.02571309357881546, -0.07189024239778519, 0.01342033501714468, 0.008019951172173023, 0.05120139941573143, 0.17303818464279175, 0.019879888743162155, 0.07844575494527817, -0.0025605305563658476, -0.15412317216396332, -0.018950799480080605, 0.0436202734708786, 0.12546207010746002, 0.08808347582817078, 0.04605821147561073, 0.01970590092241764, 0.17503218352794647, -0.054352790117263794, -0.028833400458097458, 0.21759237349033356, -0.2881564497947693, -0.031460098922252655, 0.321048766374588, 0.06997483223676682, 0.09725230932235718, -0.07540661096572876, -0.03619609400629997, 0.007783263456076384, -0.013137873262166977, -0.028666524216532707, -0.07447073608636856, 0.17313385009765625, 0.05152064561843872, -0.05057951435446739, -0.09541505575180054, 0.16948209702968597, 0.006921638268977404, 0.0018855923553928733, -0.019282981753349304, 0.009060598909854889, 0.07402525842189789, -0.016097044572234154, -0.07255112379789352, 0.057438433170318604, 0.05330665782094002, 0.019649166613817215, -0.1435653269290924, -0.10762494057416916, -0.022740179672837257, -0.008012006990611553, 0.17786912620067596, -0.009255532175302505, 0.042902372777462006, 0.003065188182517886, 0.10384012013673782, -0.12480384111404419, -0.03354184702038765, -0.0454259067773819, -0.07565800100564957, -0.0223417766392231, -0.02058211714029312, -0.03580251708626747, 0.07184842973947525, 0.11971849203109741, 0.027368178591132164, 0.09350208193063736, 0.047715865075588226, -0.03206788748502731, 0.06343851238489151, 0.05555703118443489, 0.14222665131092072, 0.05807621404528618, 0.012854371219873428, 0.13179877400398254, 0.055213116109371185, 0.033023182302713394, -0.0613492950797081, -0.18252409994602203, 0.07489913702011108, -0.07031869143247604, 0.007941240444779396, 0.12051256000995636, -0.04480670019984245, -0.1183447614312172, -0.037500523030757904, -0.017392054200172424, -0.06224250793457031, -0.025395862758159637, 0.0547584593296051, -0.02883218228816986, -0.03973718360066414, 0.0011496668448671699, 0.09384800493717194, 0.00953749567270279, -0.1752052903175354, 0.03303423151373863, -0.025042934343218803, -0.10782608389854431, 0.009975161403417587, 0.0022444494534283876, 0.03394931182265282, 0.04408763721585274, -0.11822668462991714, -0.30899152159690857, -0.07652641832828522, 0.05490870401263237, -0.06516939401626587, -0.18425025045871735, -0.13193942606449127, 0.02454492449760437, -0.09037084132432938, -0.044885024428367615, -0.12759265303611755, -0.028549788519740105, 0.01743689924478531, 0.011519349180161953, 0.10758619755506516, -0.0106219332665205, -0.012188062071800232, -0.1571401208639145, 0.008273907005786896, -0.20951123535633087, 0.0890483483672142, -0.019150104373693466, 0.037884220480918884, -0.032381169497966766, -0.07404014468193054, 0.030707746744155884, 
0.052499737590551376, -0.01474119070917368, 0.13510210812091827, -0.15592676401138306, -0.03691192343831062, -0.007996266707777977, -0.13611900806427002, -0.04786273464560509, -0.10358831286430359, -0.04357128217816353, 0.13354332745075226, 0.018664736300706863, 0.15356586873531342, -0.08709818124771118, -0.0722038671374321, 0.20489206910133362, -0.010411538183689117, -0.12820468842983246, -0.076752208173275, 0.10165707021951675, 0.021510310471057892, -0.056606587022542953, -0.02523270808160305, -0.1839766949415207, -0.0152357779443264, -0.04550420492887497, -0.047039128839969635, 0.01796751655638218, -0.010888241231441498, 0.13837894797325134, 0.08494598418474197, 0.05018039792776108, -0.06086122244596481, -0.006730288732796907, 0.10779471695423126, 0.08823856711387634, 0.008680110797286034, 0.023406028747558594, -0.05774238705635071, 0.09552932530641556, -0.04003755748271942, -0.0142367510125041, -0.08283266425132751, -0.036246106028556824, -0.026256313547492027, 0.17507147789001465, 0.09440762549638748, 0.2257927656173706, 0.09567736834287643, 0.039160262793302536, 0.031270865350961685, -0.13181598484516144, -0.1425403207540512, -0.0017254541162401438, 0.09020978957414627, -0.14270411431789398, -0.04119925573468208, -0.08974775671958923, -0.17768175899982452, -0.12202505767345428, 0.0006432619411498308, -0.17960017919540405, 0.06390921026468277, 0.05408334732055664, -0.035177867859601974, 0.03272094577550888, 0.13032332062721252, -0.011533179320394993, -0.03967514634132385, 0.0831870287656784, 0.0379033200442791, -0.041234664618968964, -0.021742934361100197, 0.11885567009449005, 0.15673065185546875, 0.13124459981918335, -0.03511447086930275, 0.004914294462651014, 0.07076404243707657, -0.02309088408946991, 0.06539414077997208, 0.0558244064450264, 0.20973342657089233, 0.188301220536232, 0.038996949791908264, 0.008822928182780743, -0.07048165798187256, 0.0855446457862854, -0.0742373839020729, -0.14302679896354675, -0.05579735338687897, 0.08729292452335358, 0.016605578362941742, 0.023469142615795135, 0.08711627870798111, 0.024545932188630104, 0.09132762253284454, 0.15968108177185059, 0.01990218088030815, -0.09659269452095032, -0.050218869000673294, 0.01175848301500082, 0.027713103219866753, 0.04794301092624664, -0.04514073207974434, -0.00937939714640379, 0.017020760104060173, -0.10303554683923721, 0.031789086759090424, -0.1413339376449585, -0.1358717679977417, 0.044326696544885635, 0.003906996920704842, 0.010907664895057678, 0.02786896750330925, -0.0038291432429105043, 0.019039705395698547, 0.04351753741502762, -0.06975466758012772, 0.047416772693395615, -0.024745507165789604, -0.020031947642564774, 0.03340689837932587, -0.057257164269685745, -0.205775648355484, -0.17696654796600342, 0.00013708483311347663, -0.09910997003316879, 0.10194740444421768, 0.018308809027075768, -0.12373185902833939, 0.047737859189510345, -0.05822649225592613, 0.027574289590120316, -0.01875593699514866, -0.049130141735076904, 0.10507171601057053, 0.1525275856256485, -0.016146350651979446, 0.018018173053860664, -0.04865182936191559, -0.10157987475395203, -0.19632206857204437, 0.0691583976149559, 0.04680244252085686, 0.014610917307436466, 0.10669491440057755, 0.018072687089443207, 0.02367905154824257, -0.007674071006476879, -0.016521066427230835, -0.011659215204417706, -0.08781040459871292, 0.31909599900245667, 0.04510033503174782, -0.025173069909214973, 0.02041010931134224, -0.0043001663871109486, -0.028083480894565582, 0.03263787180185318, -0.0985708013176918, -0.07548979669809341, -0.08774089068174362, 
-0.04367410019040108, -0.09784720093011856, 0.053299110382795334, 0.05916472524404526, 0.003188040340319276, -0.07727594673633575, 0.04221395403146744, 0.11369874328374863, -0.0923808291554451, -0.07137343287467957, 0.07477962225675583, 0.0972946360707283, -0.07331304252147675, 0.00012658814375754446, 0.00874367356300354, 0.023951783776283264, 0.037102166563272476, 0.06778035312891006, -0.03966575115919113, 0.08589404821395874, -0.19917890429496765, 0.0372927263379097, 0.106058269739151, 0.023754918947815895, 0.0638108178973198, 0.07643651217222214, -0.1058402881026268, -0.008500572293996811, -0.032518330961465836, -0.21341575682163239, 0.1668180525302887, 0.1355515867471695, 0.06788124144077301, -0.025637222453951836, -0.00461410591378808, -0.0649740919470787, 0.05773647129535675, 0.02723747305572033, -0.14758841693401337, 0.004883295856416225, 0.06064270809292793, 0.026899009943008423, 0.01614922471344471, 0.07971042394638062, 0.014697225764393806, -0.1801026314496994, -0.014406266622245312, 0.10730406641960144, 0.002390873385593295, 0.0053148469887673855, -0.03175045922398567, -0.1755964607000351, 0.0751047357916832, 0.004285442177206278, 0.07233936339616776, -0.1676585078239441, 0.14297930896282196, -0.10089799761772156, 0.07726949453353882, -0.004285062663257122, -0.021311495453119278, 0.02507244050502777, -0.0541163794696331, 0.15163759887218475, 0.01058570109307766, -0.021810131147503853, -0.1200498715043068, -0.1717042326927185, -0.019227758049964905, -0.11788936704397202, -0.11679866164922714, 0.050424277782440186, 0.062185097485780716, 0.04923136904835701, -0.061147067695856094, 0.1518532931804657, -0.047422297298908234, 0.060713399201631546, -0.06893875449895859, -0.06755045056343079, 0.03764858841896057, -0.12588608264923096, -0.08176055550575256, 0.05573027580976486, 0.19166934490203857, 0.15833087265491486, -0.02816431224346161, -0.03472423925995827, -0.047419581562280655, -0.006212298292666674, -0.007802055217325687, 0.0275666993111372, 0.023223137483000755, 0.07315318286418915, -0.07681374251842499, -0.11649256944656372, 0.033787861466407776, -0.06713802367448807, -0.055589709430933, -0.015439179725944996, 0.1513158082962036, 0.04671623185276985, 0.07720734924077988, -0.018946662545204163, 0.03887668624520302, -0.001724981120787561, -0.056474871933460236, 0.16197094321250916, 0.03885216265916824, -0.05193585529923439, 0.06837689876556396, 0.053174007683992386, 0.043745119124650955, 0.03011113777756691, -0.026783017441630363, 0.206032395362854, 0.1980147808790207, 0.014206883497536182, 0.2175983190536499, 0.03177616000175476, -0.03772832080721855, -0.1300560086965561, -0.065880686044693, -0.006372632458806038, 0.03559038043022156, 0.08070417493581772, -0.18207235634326935, -0.015011128038167953, -0.05689644813537598, -0.034518610686063766, -0.15059494972229004, -0.28553900122642517, -0.05957856774330139, 0.20075850188732147, 0.14706264436244965, 0.27519428730010986, -0.10432573407888412, 0.035197313874959946, 0.02663275972008705, -0.04912831634283066, -0.006501141935586929, 0.00018665487004909664, 0.10268618166446686, -0.15421873331069946, 0.1176437959074974, 0.08486983180046082, -0.019002694636583328, 0.01058861706405878, -0.1619086116552353, 0.00936629343777895, -0.12191236019134521, 0.05354422330856323, 0.1400289237499237, -0.048128653317689896, -0.054873593151569366, 0.14033560454845428, -0.024562934413552284, -0.22685599327087402, -0.04648222774267197, -0.043600670993328094, -0.010640020482242107, 0.026607351377606392, -0.1013401448726654, 0.04101909324526787, 
0.1330099105834961, 0.009380043484270573, 0.1147187277674675, 0.11749245226383209, -0.052566803991794586, 0.10792597383260727, 0.2257719188928604, -0.018785694614052773, 0.04689010605216026, -0.12743118405342102, -0.0012336712097749114, -0.028270328417420387, 0.013657891191542149, -0.09504974633455276, -0.09938385337591171, 0.02366873063147068, 0.02872389927506447, 0.009118586778640747, 0.0921793207526207, -0.029922157526016235, 0.0759170651435852, 0.06817561388015747, -0.13014446198940277, -0.16288450360298157, 0.015828335657715797, -0.007344507612287998, 0.08354310691356659, 0.00027861111448146403, 0.08878035843372345, -0.11932205408811569, -0.018093237653374672, -0.03153328225016594, -0.03319635987281799, -0.130486860871315, -0.07138993591070175, 0.06156524643301964, 0.028095467016100883, -0.06602972000837326, 0.1398407518863678, 0.026440169662237167, 0.15942534804344177, 0.049197953194379807, 0.012499804608523846, 0.07227300107479095, -0.05345509201288223, 0.1283530443906784, 0.13818155229091644, -0.00868943240493536, -0.05460423603653908, -0.1013643890619278, -0.10236792266368866, 0.08925779908895493, -0.05773641914129257, 0.07476430386304855, -0.14885357022285461, -0.06675903499126434, 0.015772046521306038, 0.016141414642333984, -0.09562095999717712, 0.02571965754032135, -0.01625603251159191, -0.18119946122169495, 0.056570518761873245, -0.048285093158483505, 0.0440407395362854, -0.06347788125276566, -0.1110161691904068, -0.17226378619670868, 0.06091433763504028, 0.08593481779098511, -0.053876690566539764, -0.12229149043560028, 0.011023230850696564, -0.00012518465518951416, -0.06341652572154999, -0.05023367330431938, 0.09722746908664703, -0.11020902544260025, 0.031452205032110214, -0.012567701749503613, 0.08853451162576675, -0.03510405123233795, -0.011538895778357983, 0.044220831245183945, -0.08039166033267975, -0.009481523185968399, 0.03534642979502678, -0.026372017338871956, -0.04127239063382149, -0.2689029574394226, 0.0036654395516961813, 0.0341104120016098, 0.02497158572077751, 0.07856601476669312, 0.011906822212040424, 0.021174922585487366, 0.03993808850646019, -0.15396519005298615, -0.013395369984209538, 0.14574195444583893, -0.07689505815505981, -0.022186370566487312, 0.05703273415565491, -0.09054436534643173, 0.013882770203053951, -0.030287226662039757, 0.1345842480659485, 0.023923413828015327, 0.06404478847980499, -0.0851147472858429, 0.10106813907623291, -0.1451139897108078, -0.04998219385743141, -0.01244612317532301, 0.09761348366737366, 0.07019034773111343, -0.10272270441055298, 0.014697125181555748, 0.04210108891129494, 0.19416837394237518, 0.016384804621338844, -0.0356343574821949, -0.03396720811724663, 0.004015897400677204, 0.22076453268527985, 0.03044266067445278, 0.10457023978233337, 0.07281364500522614, -0.026583973318338394, 0.12624378502368927, 0.09929762035608292, 0.11280370503664017, -0.055645186454057693, 0.13904185593128204, 0.04667386785149574, 0.038641396909952164, 0.0614289753139019, 0.06836545467376709, 0.09098632633686066, -0.0008288522367365658, 0.1138714924454689, 0.013811973854899406, -0.02422109805047512, -0.021335409954190254, 0.17759373784065247, 0.10501719266176224, -0.14769648015499115, 0.029047364369034767, -0.01258957851678133, 0.039933037012815475, -0.014194529503583908, -0.15634691715240479, -0.07240267097949982, -0.3315149247646332, 0.1226184144616127, -0.07119352370500565, 0.019930170848965645, 0.007913772016763687, -0.037425633519887924, -0.03296699747443199, -0.04477746784687042, 0.13151589035987854, -0.013641550205647945, 
-0.006079165264964104, -0.04815853759646416, -0.015360191464424133, -0.11607866734266281, -0.11200575530529022, -0.013207737356424332, -0.13671602308750153, -0.010119039565324783, 0.05595948174595833, 0.003977729007601738, 0.01821410097181797, -0.03142618387937546, 0.0024383175186812878, 0.06541839241981506, -0.05751744285225868, 0.056182678788900375, 0.12097269296646118, 0.08766137808561325, -0.1058853268623352, 0.031048951670527458, 0.2011747509241104, 0.04359564557671547, -0.12483977526426315, 0.01449228823184967, 0.1819491684436798, 0.004885740112513304, 0.017068125307559967, -0.006097703706473112, -0.0540788508951664, -0.07554277032613754, 0.1251034289598465, 0.08296554535627365, -0.09985227137804031, 0.015833314508199692, -0.0726347416639328, -0.01594804972410202, -0.06374675035476685, 0.10130585730075836, 0.09538925439119339, 0.04440245032310486, -0.10621760785579681, -0.08487539738416672, -0.10891728103160858, 0.040588874369859695, -0.08629853278398514, -0.07311757653951645, 0.09629398584365845, -0.07057105004787445, -0.07029950618743896, 0.025521177798509598, -0.17978744208812714, -0.009467960335314274, 0.1711762249469757, -0.24654000997543335, -0.0916430801153183, -0.10857923328876495, 0.14477859437465668, 0.016497576609253883, 0.1013975441455841, -0.006207061931490898, -0.007889035157859325, -0.20577777922153473, 0.024890204891562462, -0.05293011665344238, -0.02073732763528824, 0.07814782857894897, -0.09476397186517715, 0.22629831731319427, -0.08276885002851486, 0.020940175279974937, 0.012659613974392414, 0.0870661810040474, -0.030675338581204414, 0.09283176809549332, -0.03660329803824425, -0.12576518952846527, -0.03620953485369682, 0.03001813031733036, 0.013904244638979435, 0.10071761906147003, 0.09772487729787827, -0.03414725139737129, 0.03389119729399681, 0.09747414290904999, 0.04172342270612717, -0.023843804374337196, 0.0360250361263752, -0.17077107727527618, 0.02182629331946373, -0.018498148769140244, -0.06935930997133255, 0.03687669709324837, -0.06603235751390457, 0.1639697551727295, 0.04022442549467087, 0.0670473501086235, -0.036152735352516174, 0.0073931049555540085, -0.014454689808189869, -0.013775371946394444, -0.026180334389209747, -0.17259705066680908, -0.10422050207853317, -0.1347656100988388, -0.012701659463346004, -0.034971047192811966, 0.04591470584273338, 0.023234914988279343, -0.0003200018545612693, -0.014577031135559082, -0.12090865522623062, 0.04360328987240791, 0.11146783083677292, -0.04631396010518074, -0.026193076744675636 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Brain-Tumor-Detection This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0946 - Accuracy: 0.9804 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 0.92 | 3 | 0.4395 | 0.6667 | | No log | 1.85 | 6 | 0.2817 | 0.9020 | | No log | 2.77 | 9 | 0.1354 | 0.9608 | | 0.3994 | 4.0 | 13 | 0.0956 | 0.9804 | | 0.3994 | 4.62 | 15 | 0.0946 | 0.9804 | ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
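The card above records the fine-tuning setup but stops short of an inference example. The following is a minimal sketch, assuming the checkpoint published under this record's id field (ShimaGh/Brain-Tumor-Detection) is used as-is through the standard transformers image-classification pipeline; the input file name is a placeholder, and the label names come from the undisclosed imagefolder dataset, so neither is taken from the card.

```python
# Minimal inference sketch (not part of the original card).
# Assumes the checkpoint id from this record; the image path is a placeholder.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ShimaGh/Brain-Tumor-Detection",
)

# The pipeline accepts a file path, URL, or PIL.Image; the label names depend on
# the (undocumented) imagefolder dataset used for fine-tuning.
for prediction in classifier("mri_slice.jpg"):
    print(prediction["label"], round(prediction["score"], 4))
```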
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["imagefolder"], "metrics": ["accuracy"], "base_model": "microsoft/swin-base-patch4-window7-224", "model-index": [{"name": "Brain-Tumor-Detection", "results": [{"task": {"type": "image-classification", "name": "Image Classification"}, "dataset": {"name": "imagefolder", "type": "imagefolder", "config": "default", "split": "train", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.9803921568627451, "name": "Accuracy"}]}]}]}
image-classification
ShimaGh/Brain-Tumor-Detection
[ "transformers", "tensorboard", "safetensors", "swin", "image-classification", "generated_from_trainer", "dataset:imagefolder", "base_model:microsoft/swin-base-patch4-window7-224", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T19:36:54+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-base-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
Brain-Tumor-Detection ===================== This model is a fine-tuned version of microsoft/swin-base-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set: * Loss: 0.0946 * Accuracy: 0.9804 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 5e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 64 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_ratio: 0.1 * num\_epochs: 5 ### Training results ### Framework versions * Transformers 4.35.2 * Pytorch 2.1.0+cu121 * Datasets 2.17.0 * Tokenizers 0.15.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-base-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ 88, 144, 4, 33 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-base-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ -0.12127736210823059, 0.17222696542739868, -0.0023365537635982037, 0.0888848528265953, 0.11144842952489853, 0.02964683808386326, 0.10316036641597748, 0.13584554195404053, -0.05958261340856552, 0.11695091426372528, 0.14128874242305756, 0.08507499098777771, 0.07044493407011032, 0.15874934196472168, -0.005702747963368893, -0.2904362380504608, 0.019238455221056938, -0.012187603861093521, -0.1405809223651886, 0.10958791524171829, 0.06851233541965485, -0.12572680413722992, 0.09228307008743286, 0.005400280002504587, -0.1451333612203598, -0.026120787486433983, -0.041467778384685516, -0.045205507427453995, 0.09880220890045166, 0.038166992366313934, 0.08114748448133469, 0.0339299775660038, 0.11128184199333191, -0.23614953458309174, 0.007440959569066763, 0.07393530756235123, 0.009683690033853054, 0.09949567914009094, 0.11833741515874863, 0.011011205613613129, 0.13499663770198822, -0.10771817713975906, 0.06917975097894669, 0.04204488918185234, -0.0846465677022934, -0.23523688316345215, -0.06366647034883499, 0.0913097932934761, 0.12668149173259735, 0.052240096032619476, -0.009710308164358139, 0.07197214663028717, -0.06705714762210846, 0.08422546088695526, 0.21898366510868073, -0.2450324445962906, -0.07364803552627563, 0.04015715420246124, 0.02762426808476448, 0.035711560398340225, -0.13517938554286957, -0.006167258135974407, 0.03956267982721329, 0.00007587528671137989, 0.11603742092847824, 0.028769556432962418, 0.05602722615003586, 0.009981842711567879, -0.14031477272510529, -0.04540056362748146, 0.0971541628241539, 0.10858834534883499, -0.016926420852541924, -0.11713989078998566, -0.05427207797765732, -0.19570505619049072, -0.04112212359905243, 0.01214388757944107, 0.039919834583997726, -0.05489402264356613, -0.07521801441907883, 0.027962876483798027, -0.07103373110294342, -0.07877063751220703, 0.042266812175512314, 0.12168623507022858, 0.059807274490594864, -0.002654820680618286, 0.026944279670715332, 0.12092525511980057, 0.0900493785738945, -0.16496963798999786, 0.001285177655518055, 0.006950158625841141, -0.08050132542848587, -0.0006370447808876634, -0.007060878910124302, 0.02606978453695774, 0.04164230450987816, 0.14638566970825195, -0.027287214994430542, 0.07865959405899048, 0.08009808510541916, 0.02144077979028225, -0.07856166362762451, 0.1508493274450302, -0.08486410975456238, -0.09309542924165726, -0.025466086342930794, 0.11634368449449539, 0.029781440272927284, -0.008922521024942398, -0.08692560344934464, 0.019797543063759804, 0.10948143899440765, 0.02674272656440735, -0.001041705021634698, 0.04383867233991623, -0.058554984629154205, -0.0321129709482193, 0.08895234018564224, -0.0859089344739914, 0.044327471405267715, 0.035658374428749084, -0.06727316230535507, -0.015357056632637978, 0.031673770397901535, -0.011954862624406815, 0.009002024307847023, 0.10516221076250076, -0.09854809939861298, -0.027942093089222908, -0.08220620453357697, -0.07761754840612411, 0.0300226379185915, -0.09297078847885132, 0.01640899106860161, -0.08319015800952911, -0.12156954407691956, -0.03879006206989288, 0.06638113409280777, -0.06397140771150589, -0.07263711094856262, -0.05434035509824753, -0.09691892564296722, 0.058073438704013824, 0.0055617718026041985, 0.1375751793384552, -0.052782922983169556, 0.09449418634176254, 0.003944948315620422, 0.07828808575868607, 0.06588014215230942, 0.038232993334531784, -0.06191951781511307, 0.06806552410125732, -0.16797076165676117, 0.05475842207670212, -0.08786608278751373, 0.07070961594581604, -0.12136843800544739, -0.10036510974168777, -0.009264519438147545, 
-0.015487532131373882, 0.06648772209882736, 0.1454543024301529, -0.15117846429347992, -0.06701295077800751, 0.14091166853904724, -0.08866103738546371, -0.12065225839614868, 0.10676848888397217, -0.016088776290416718, -0.06161496788263321, 0.009533662348985672, 0.1673496812582016, 0.08164075016975403, -0.0896504670381546, -0.031242968514561653, 0.0018662051297724247, 0.0943654477596283, 0.002241715556010604, 0.10198939591646194, -0.002132517984136939, 0.008569695986807346, 0.021105337888002396, -0.07773803174495697, 0.07719019055366516, -0.09009873867034912, -0.08010216057300568, -0.038888875395059586, -0.0876576155424118, 0.03001253865659237, 0.059825293719768524, 0.024236012250185013, -0.07612353563308716, -0.1366262584924698, 0.014799466356635094, 0.11764085292816162, -0.09546253085136414, -0.0049644229002296925, -0.051543377339839935, 0.07513695955276489, -0.056855760514736176, -0.009734821505844593, -0.1271875947713852, -0.07421576231718063, 0.031194599345326424, -0.07840698957443237, -0.01570022478699684, -0.014073090627789497, 0.07347125560045242, 0.09066576510667801, -0.05677171051502228, -0.09101974219083786, -0.05469638109207153, 0.011396288871765137, -0.0791085958480835, -0.26151174306869507, -0.07958868891000748, -0.028191767632961273, 0.15398022532463074, -0.25754472613334656, 0.01712418533861637, 0.010365470312535763, 0.14870567619800568, 0.042483337223529816, -0.05910634994506836, 0.006072906777262688, 0.011162009090185165, -0.04479440674185753, -0.09782934188842773, 0.028911378234624863, 0.0016357662389054894, -0.11327583342790604, -0.02096685767173767, -0.11995004117488861, 0.1291588395833969, 0.10404811799526215, 0.010642536915838718, -0.09426255524158478, -0.04545651748776436, -0.07462366670370102, -0.05805254727602005, -0.0221241544932127, 0.01741412840783596, 0.08940352499485016, 0.009613331407308578, 0.1092730313539505, -0.07893047481775284, -0.0566893108189106, 0.04152185097336769, -0.0011807138798758388, -0.02805163897573948, 0.1406477987766266, 0.10478725284337997, -0.07652978599071503, 0.13503281772136688, 0.13028383255004883, -0.05533676594495773, 0.13123828172683716, -0.05618731305003166, -0.09774915128946304, -0.034918878227472305, 0.0233644787222147, 0.015568715520203114, 0.15968206524848938, -0.09933090955018997, 0.009448207914829254, 0.027233107015490532, 0.00835058931261301, 0.011733357794582844, -0.17206954956054688, -0.014215254224836826, 0.042979635298252106, -0.05118868127465248, 0.01326058991253376, -0.03263898938894272, -0.023103386163711548, 0.09107513725757599, 0.003962899092584848, -0.04817409813404083, -0.004298539832234383, -0.006074345670640469, -0.08204236626625061, 0.21090799570083618, -0.07943905889987946, -0.14297625422477722, -0.1246136873960495, 0.0367964431643486, -0.04412313178181648, -0.004646268207579851, 0.017293192446231842, -0.1030772402882576, -0.05453746020793915, -0.0858735740184784, 0.00309355859644711, -0.015927428379654884, 0.049771759659051895, 0.00974994245916605, 0.01381022296845913, 0.08152401447296143, -0.0833565816283226, 0.02344629541039467, -0.01014106348156929, -0.014509644359350204, 0.030032260343432426, 0.0439230315387249, 0.12263800948858261, 0.1288197934627533, 0.017585204914212227, 0.019639136269688606, -0.009629490785300732, 0.191340371966362, -0.0946844145655632, 0.02811327576637268, 0.09626647084951401, -0.006735580042004585, 0.048978306353092194, 0.1373882293701172, 0.04329347237944603, -0.0731111541390419, 0.012867365963757038, 0.03210357204079628, -0.015610160306096077, -0.19327959418296814, 
-0.02950204163789749, -0.02878498286008835, 0.0008023373666219413, 0.1328108012676239, 0.0459713451564312, -0.02776682935655117, 0.0674375668168068, -0.018431156873703003, 0.007542037405073643, -0.01358866598457098, 0.07185724377632141, 0.02451154589653015, 0.046205077320337296, 0.10733946412801743, -0.036082036793231964, -0.01872231811285019, 0.03732813522219658, 0.0004646370653063059, 0.2193073332309723, -0.030749458819627762, 0.1428200602531433, 0.025385258719325066, 0.17211295664310455, 0.004085215274244547, 0.06088855117559433, 0.014735948294401169, -0.03150170296430588, 0.0065011330880224705, -0.05266719311475754, -0.023549508303403854, 0.053584545850753784, 0.020183509215712547, 0.06007607653737068, -0.10651981085538864, 0.0599370077252388, 0.04896686598658562, 0.26045480370521545, 0.0715547502040863, -0.33812135457992554, -0.09042885154485703, 0.01546245813369751, -0.03441189229488373, -0.048210080713033676, 0.020134098827838898, 0.15096469223499298, -0.08610079437494278, 0.07866102457046509, -0.08589942008256912, 0.07010669261217117, -0.06810058653354645, -0.00514519028365612, 0.08591328561306, 0.11011051386594772, 0.001660709735006094, 0.07627668231725693, -0.19787819683551788, 0.2589815855026245, -0.006522234063595533, 0.045700062066316605, -0.057491663843393326, 0.03064604662358761, 0.02748837135732174, 0.027917135506868362, 0.11039384454488754, -0.003335012588649988, -0.10612884908914566, -0.1837037056684494, -0.12279483675956726, 0.01969037391245365, 0.11087334901094437, -0.08666706830263138, 0.11126862466335297, -0.027934959158301353, -0.04034154489636421, 0.04615561664104462, -0.055577997118234634, -0.08245020359754562, -0.12698078155517578, -0.0012205701787024736, -0.042949873954057693, 0.018049331381917, -0.09341724216938019, -0.10167059302330017, -0.09901785105466843, 0.1480409950017929, -0.11761868745088577, -0.03851970285177231, -0.15745441615581512, 0.1068762019276619, 0.1437709927558899, -0.08257364481687546, 0.06370753794908524, -0.009432430379092693, 0.12650452554225922, 0.036055248230695724, -0.048467185348272324, 0.11475881934165955, -0.09640393406152725, -0.22801409661769867, -0.05905279517173767, 0.11268673837184906, 0.039942651987075806, 0.058874957263469696, -0.023909838870167732, 0.023264706134796143, -0.01902562938630581, -0.09587834030389786, 0.0616241954267025, 0.04390410706400871, 0.039866093546152115, 0.01848950982093811, -0.038445599377155304, 0.02742665819823742, -0.03080698847770691, -0.03427593782544136, 0.10142605006694794, 0.27833327651023865, -0.12009349465370178, 0.02830849215388298, 0.030961209908127785, -0.05023054778575897, -0.17660558223724365, 0.01734722964465618, 0.10544867813587189, 0.027049807831645012, 0.03418903052806854, -0.17469798028469086, 0.10324672609567642, 0.08894986659288406, -0.026887422427535057, 0.10171614587306976, -0.2800647020339966, -0.123929463326931, 0.09434658288955688, 0.13237324357032776, -0.033410120755434036, -0.1679443120956421, -0.053657323122024536, -0.0044601475819945335, -0.07420061528682709, 0.08482522517442703, 0.0026759717147797346, 0.09468963742256165, -0.029967213049530983, -0.01577627658843994, 0.02302844263613224, -0.07159233093261719, 0.16159687936306, -0.011374236084520817, 0.08737646043300629, -0.031183643266558647, 0.0201039407402277, -0.0030397544614970684, -0.07951024919748306, 0.03754075616598129, -0.11357790231704712, 0.0555185005068779, -0.10362812876701355, -0.014240548014640808, -0.07779589295387268, 0.02970137633383274, -0.052492741495370865, -0.040029797703027725, -0.03605597838759422, 
0.04718206822872162, 0.07714217156171799, -0.000959150493144989, 0.13835901021957397, 0.013319607824087143, 0.10141375660896301, 0.11997675895690918, 0.05197529122233391, 0.004364187829196453, -0.10130757838487625, -0.03522975742816925, -0.01122859213501215, 0.050321586430072784, -0.15155573189258575, 0.013873547315597534, 0.1289706528186798, 0.04009459912776947, 0.11931883543729782, 0.050394024699926376, -0.056790534406900406, -0.016484932973980904, 0.08295035362243652, -0.11337758600711823, -0.1354711502790451, -0.025425342842936516, 0.0019923883955925703, -0.1601855754852295, 0.01615137979388237, 0.07274913787841797, -0.06866539269685745, 0.004768671467900276, 0.0024854179937392473, 0.05185839533805847, 0.0032001356594264507, 0.18993675708770752, 0.0832068994641304, 0.08097968995571136, -0.08768025785684586, 0.11055664718151093, 0.03340005874633789, -0.1392522007226944, 0.024197306483983994, 0.0613902322947979, -0.0823015570640564, -0.012550324201583862, 0.07961598038673401, 0.10454769432544708, -0.020279135555028915, -0.045967668294906616, -0.12568098306655884, -0.11991330236196518, 0.06915818154811859, 0.0694487988948822, 0.0665203183889389, 0.022655542939901352, -0.0016957344487309456, 0.03095969185233116, -0.1067202091217041, 0.14121447503566742, 0.07823922485113144, 0.0994756743311882, -0.18836545944213867, 0.08272137492895126, 0.011274660006165504, 0.008704595267772675, -0.016557736322283745, 0.049273598939180374, -0.12420736998319626, -0.029117221012711525, -0.06727006286382675, 0.008039966225624084, -0.0695234015583992, 0.00968051329255104, 0.0023049679584801197, -0.05565539374947548, -0.03985009342432022, 0.0070405942387878895, -0.09440550208091736, -0.05830763280391693, 0.0008752673165872693, 0.06085212156176567, -0.10306164622306824, -0.020633839070796967, 0.038950152695178986, -0.12192375212907791, 0.09449096024036407, 0.018115825951099396, 0.04542912170290947, 0.013866709545254707, -0.09458369016647339, 0.03044055588543415, 0.046352971345186234, -0.004767432808876038, 0.02500271238386631, -0.13713990151882172, -0.005656082648783922, -0.04921555146574974, -0.009717028588056564, -0.02084665186703205, 0.044594310224056244, -0.13611935079097748, -0.001246120547875762, -0.05746901035308838, -0.04943787306547165, -0.06097963824868202, 0.05335848405957222, 0.06641656160354614, -0.014300242066383362, 0.16807951033115387, -0.07492413371801376, 0.04023568704724312, -0.2392653226852417, -0.002144210273399949, -0.013036097399890423, -0.06278412789106369, -0.08746249228715897, -0.010410124436020851, 0.0776202529668808, -0.052391018718481064, 0.10187516361474991, -0.03752494230866432, 0.023404179140925407, 0.026652276515960693, -0.028236879035830498, 0.043622903525829315, 0.05198090150952339, 0.19563773274421692, 0.014364107511937618, -0.01100172195583582, 0.06619174778461456, 0.018742505460977554, 0.08262581378221512, 0.0612800158560276, 0.14992524683475494, 0.14919540286064148, -0.051574259996414185, 0.10580366849899292, 0.04944775253534317, -0.12184179574251175, -0.1532772034406662, 0.15221138298511505, -0.07204517722129822, 0.1341995745897293, -0.022210828959941864, 0.16951137781143188, 0.1215149536728859, -0.20628724992275238, 0.00724197318777442, -0.010134530253708363, -0.08120468258857727, -0.09361996501684189, -0.09959852695465088, -0.08957335352897644, -0.17670896649360657, 0.017995605245232582, -0.10498052090406418, 0.010007944889366627, 0.07222268730401993, 0.02511000446975231, 0.023750128224492073, 0.15810714662075043, 0.0699654370546341, 0.02470848150551319, 
0.06143757328391075, 0.0498770996928215, -0.04279448091983795, -0.029910467565059662, -0.08241673558950424, 0.021667039021849632, -0.024695951491594315, 0.03857850283384323, -0.06422647088766098, -0.0646282285451889, 0.08763927221298218, 0.04475923627614975, -0.09991338104009628, 0.023165570572018623, -0.02099217288196087, 0.039590783417224884, 0.06651134043931961, 0.010270769707858562, 0.00878791231662035, -0.04707111045718193, 0.2063126116991043, -0.0913863480091095, -0.00509443087503314, -0.11754430085420609, 0.1683105230331421, -0.011132380925118923, -0.009417615830898285, 0.03300848603248596, -0.08930910378694534, -0.0018064273754134774, 0.15223076939582825, 0.15777890384197235, -0.04386743903160095, -0.02338055893778801, 0.01729240082204342, -0.01596974767744541, -0.03791023790836334, 0.08409316092729568, 0.09181787818670273, 0.05205252766609192, -0.07095195353031158, -0.04515569657087326, -0.03910773992538452, -0.05758054554462433, -0.030261205509305, 0.05780147388577461, 0.03336772695183754, -0.00950878206640482, -0.045969005674123764, 0.07453446090221405, -0.043470270931720734, -0.12195772677659988, 0.08736759424209595, -0.18033066391944885, -0.17476430535316467, -0.036895815283060074, 0.08886871486902237, 0.016430510208010674, 0.04492217302322388, -0.001938513247296214, -0.02055102027952671, 0.10259099304676056, -0.0048811305314302444, -0.08088326454162598, -0.09112078696489334, 0.04291480407118797, -0.04231945425271988, 0.2373243123292923, -0.029252730309963226, 0.0075571415945887566, 0.12550480663776398, 0.036412179470062256, -0.1368613839149475, 0.010587293654680252, 0.07379403710365295, -0.09713710844516754, 0.04806172847747803, 0.15261656045913696, -0.02443406917154789, 0.11912494152784348, 0.04158735275268555, -0.09450346976518631, -0.007448395248502493, -0.08635895699262619, -0.06093301624059677, -0.05438243970274925, 0.006551024038344622, -0.03517378494143486, 0.157082661986351, 0.19900578260421753, -0.061124738305807114, -0.03273932263255119, -0.047806885093450546, 0.03764340654015541, 0.04721739515662193, 0.09678910672664642, 0.007830574177205563, -0.2302035242319107, 0.031002895906567574, -0.025142468512058258, 0.01912589557468891, -0.19243401288986206, -0.09274054318666458, 0.015701433643698692, -0.05084596574306488, -0.0962914377450943, 0.1051553264260292, 0.08053898066282272, 0.05017895624041557, -0.05996853858232498, -0.04476144537329674, -0.05007241666316986, 0.15571723878383636, -0.16387751698493958, -0.07704804092645645 ]
null
null
diffusers
# Adieu <Gallery /> ## Download model [Download](/ekato/Adieu/tree/main) them in the Files & versions tab.
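Since the card only points at the Files & versions tab, a hedged loading sketch may help. It assumes the LoRA weights sit in the repository in a file name diffusers can auto-detect (otherwise pass weight_name explicitly), takes the base model stabilityai/stable-diffusion-xl-base-1.0 from this record's metadata, and uses a purely illustrative prompt because no trigger word is documented.

```python
# Hedged sketch: apply the ekato/Adieu LoRA on top of the SDXL base model named
# in this record's metadata. Repository file layout and prompt are assumptions.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

# If the weight file is not auto-detected, pass weight_name="<file>.safetensors".
pipe.load_lora_weights("ekato/Adieu")

image = pipe("a portrait, Adieu style").images[0]  # illustrative prompt only
image.save("adieu_sample.png")
```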
{"license": "openrail", "tags": ["text-to-image", "stable-diffusion", "lora", "diffusers", "template:sd-lora"], "widget": [{"text": "-", "output": {"url": "images/1000017011.jpg"}}], "base_model": "stabilityai/stable-diffusion-xl-base-1.0"}
text-to-image
ekato/Adieu
[ "diffusers", "text-to-image", "stable-diffusion", "lora", "template:sd-lora", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "license:openrail", "region:us" ]
2024-02-13T19:38:17+00:00
[]
[]
TAGS #diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail #region-us
# Adieu <Gallery /> ## Download model Download them in the Files & versions tab.
[ "# Adieu\n\n<Gallery />", "## Download model\n\n\nDownload them in the Files & versions tab." ]
[ "TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail #region-us \n", "# Adieu\n\n<Gallery />", "## Download model\n\n\nDownload them in the Files & versions tab." ]
[ 62, 8, 14 ]
[ "passage: TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail #region-us \n# Adieu\n\n<Gallery />## Download model\n\n\nDownload them in the Files & versions tab." ]
[ -0.08953890949487686, 0.04790917783975601, -0.002258664695546031, 0.010452648624777794, 0.08714760839939117, 0.030788354575634003, 0.16466259956359863, 0.07422660291194916, 0.12070263922214508, 0.062138862907886505, 0.06002868711948395, 0.076541468501091, -0.002252936363220215, 0.17491571605205536, -0.08895036578178406, -0.22878500819206238, 0.04315491393208504, -0.012123612686991692, -0.07931336015462875, 0.02249227836728096, 0.05460073798894882, -0.025579215958714485, 0.11553304642438889, -0.0843726098537445, 0.022026076912879944, 0.04483165219426155, 0.01720592752099037, -0.05002867802977562, -0.017528196796774864, 0.08243560791015625, -0.02467186190187931, 0.13951224088668823, 0.08727068454027176, -0.14730997383594513, 0.05720885843038559, 0.028606031090021133, -0.0930083617568016, 0.036602191627025604, -0.02132960967719555, -0.08858603239059448, 0.18535242974758148, -0.005168757401406765, -0.034568119794130325, 0.049803584814071655, -0.048699889332056046, -0.07071162760257721, -0.058830443769693375, 0.010820435360074043, 0.07538068294525146, -0.013974916189908981, 0.05402961000800133, 0.012991935014724731, -0.020139487460255623, 0.03211768716573715, 0.23497971892356873, -0.27614855766296387, -0.021067803725600243, 0.2717599868774414, 0.10840799659490585, 0.22268670797348022, -0.04963487386703491, 0.13877251744270325, 0.1068224310874939, -0.06720490753650665, 0.003956320695579052, -0.034441135823726654, 0.1434486210346222, 0.0038086548447608948, -0.04724816605448723, 0.04361751675605774, 0.3774494528770447, 0.05035775527358055, -0.003512696595862508, -0.09076886624097824, -0.035157520323991776, 0.1179957240819931, -0.0926271378993988, 0.06134236231446266, 0.05183052271604538, 0.05552596226334572, 0.0011044878046959639, -0.14941567182540894, -0.0734083279967308, -0.07004595547914505, -0.08274223655462265, 0.18503053486347198, -0.011601686477661133, 0.08441387861967087, 0.004395757801830769, 0.08469866961240768, -0.14242194592952728, -0.13150271773338318, 0.017703570425510406, -0.08698265254497528, 0.119130939245224, 0.10405478626489639, -0.05141710489988327, -0.03940347209572792, 0.10325248539447784, 0.11496239900588989, 0.07664757966995239, -0.02067743055522442, -0.04515436664223671, 0.1639045774936676, 0.0003986396186519414, -0.056480471044778824, -0.05022644251585007, -0.12182636559009552, 0.08473631739616394, 0.02677213028073311, 0.12216208130121231, -0.03372843563556671, -0.1768462359905243, 0.03923135995864868, -0.22003546357154846, 0.03482474759221077, 0.07644645124673843, 0.004586302675306797, -0.06163640320301056, -0.02261922135949135, 0.20073899626731873, 0.01607394404709339, -0.0364290289580822, -0.03145930916070938, -0.07953733950853348, 0.1353147029876709, 0.08988503366708755, -0.012346554547548294, 0.081810861825943, 0.06658558547496796, -0.09320162981748581, -0.05704176425933838, -0.023841917514801025, -0.02262050099670887, 0.018473617732524872, -0.1294485628604889, 0.04454321786761284, -0.1196405291557312, -0.2636055052280426, 0.060298189520835876, 0.039274781942367554, -0.07925470173358917, 0.005764744710177183, 0.015441523864865303, -0.01842569001019001, 0.030322520062327385, -0.04248500615358353, -0.061044950038194656, -0.09704789519309998, 0.07750488072633743, -0.06826712191104889, 0.12090447545051575, -0.16789762675762177, 0.02478170581161976, -0.056293852627277374, 0.026188194751739502, -0.23331543803215027, 0.00022943491057958454, -0.16941682994365692, 0.0337626077234745, -0.05965353548526764, -0.058252234011888504, -0.06628337502479553, 0.038528114557266235, 
-0.0028113422449678183, 0.1793210804462433, -0.11025989055633545, -0.05676855146884918, 0.08329251408576965, -0.1782267540693283, -0.07224539667367935, 0.019115140661597252, 0.04679159075021744, 0.021815702319145203, 0.02488919533789158, 0.16171079874038696, 0.005422758404165506, -0.26121604442596436, 0.052943963557481766, 0.1404980570077896, -0.06704268604516983, -0.10556534677743912, 0.055379755795001984, 0.037647705525159836, 0.06866936385631561, 0.06029064580798149, -0.16624964773654938, 0.09590413421392441, -0.05128157511353493, 0.011652900837361813, -0.03968549519777298, -0.10476060211658478, 0.015100082382559776, 0.04474123939871788, 0.025088883936405182, -0.039343781769275665, -0.028815854340791702, 0.023492539301514626, 0.11727637052536011, 0.0014083432033658028, 0.011856761761009693, -0.029116271063685417, 0.2161165326833725, -0.08896267414093018, -0.01574106514453888, -0.03764208033680916, -0.07518895715475082, 0.003931328188627958, 0.16566333174705505, -0.0009713536128401756, 0.12108814716339111, 0.08018457144498825, -0.023288724943995476, -0.09253130853176117, 0.0023356201127171516, 0.05895444378256798, -0.007769523188471794, 0.00802119541913271, -0.17989619076251984, 0.05119859427213669, -0.03989860415458679, 0.03291761875152588, -0.1578512042760849, 0.038364242762327194, 0.0430523157119751, 0.1279311180114746, 0.049820225685834885, 0.009095713496208191, 0.0300879068672657, -0.05462728068232536, -0.08114426583051682, -0.012120121158659458, 0.06303968280553818, 0.04520559310913086, -0.06162700057029724, 0.22280991077423096, -0.07051511853933334, 0.2139507532119751, 0.20566992461681366, 0.01953805796802044, 0.020576417446136475, -0.09121239930391312, 0.0026554372161626816, 0.022961551323533058, 0.011059499345719814, -0.06994030624628067, -0.13605265319347382, -0.04006219282746315, 0.13299965858459473, -0.08354994654655457, 0.054732516407966614, 0.06647845357656479, -0.05083893612027168, -0.044698361307382584, 0.0381164588034153, 0.18957063555717468, 0.0015252179000526667, 0.12033648788928986, 0.17518314719200134, -0.02605273574590683, 0.1735367625951767, -0.03556525334715843, -0.10061302781105042, 0.028207147493958473, 0.01302677858620882, 0.02655322663486004, 0.22962775826454163, 0.06746885925531387, -0.00045655551366508007, 0.02898833528161049, -0.010264352895319462, 0.03373019024729729, -0.08724043518304825, -0.08271851390600204, -0.0013975913170725107, -0.06947474181652069, 0.0718155950307846, 0.10925092548131943, -0.11658144742250443, 0.08999072015285492, -0.08195167779922485, -0.08369629085063934, -0.025593897327780724, -0.02139931544661522, -0.029463399201631546, 0.04271475598216057, -0.12247668206691742, -0.08648139238357544, -0.1494545340538025, -0.03485455736517906, -0.07137691974639893, 0.006966511718928814, 0.044687412679195404, -0.03022237867116928, -0.0842755064368248, -0.06439395248889923, 0.0027882778085768223, -0.004071424249559641, -0.06984231621026993, -0.047703880816698074, 0.02641049213707447, -0.05279773473739624, -0.12681375443935394, -0.004351367708295584, -0.05643250048160553, 0.011269629932940006, 0.02862805500626564, -0.10265690088272095, 0.11257172375917435, 0.04337991029024124, 0.05545905977487564, 0.022832371294498444, -0.004416523966938257, 0.07172349095344543, -0.03759092465043068, 0.08342360705137253, 0.19031968712806702, 0.08970914781093597, 0.03310561925172806, 0.0561874583363533, 0.05692582204937935, -0.07215063273906708, 0.02082219161093235, -0.0253308042883873, -0.10627838969230652, -0.11843310296535492, -0.14541631937026978, 
-0.09953858703374863, 0.05428995192050934, -0.009879035875201225, 0.049618158489465714, 0.04034532234072685, 0.12438343465328217, -0.028767641633749008, -0.10043860226869583, 0.03122861497104168, 0.07211373001337051, 0.09226341545581818, -0.0654076486825943, 0.07407019287347794, -0.07873603701591492, 0.04139785096049309, 0.2166317105293274, 0.017329035326838493, 0.15786691009998322, -0.003874699119478464, 0.06980311125516891, 0.043747786432504654, 0.06647634506225586, 0.12020822614431381, 0.06573352217674255, -0.029879318550229073, -0.04077479615807533, -0.03008607216179371, -0.07668495923280716, 0.0788927972316742, 0.05035325884819031, 0.0037585641257464886, -0.06017320230603218, -0.009599956683814526, -0.08585607260465622, -0.007580294273793697, 0.03860330581665039, 0.048575036227703094, -0.2017652541399002, 0.06006553769111633, 0.08095524460077286, 0.11561556905508041, 0.002013476099818945, 0.06875040382146835, 0.13173632323741913, -0.04256974533200264, 0.07568395882844925, 0.004904994275420904, 0.10480893403291702, 0.04729101061820984, -0.09725520014762878, -0.022529561072587967, 0.07059149444103241, -0.032139960676431656, -0.009213361889123917, -0.030364319682121277, 0.14007876813411713, 0.014904972165822983, -0.026562310755252838, 0.04375291243195534, -0.032551851123571396, 0.09555098414421082, 0.20474134385585785, 0.13494223356246948, 0.003998130094259977, 0.023435499519109726, -0.006598655600100756, -0.1190924346446991, -0.006845853757113218, 0.06621213257312775, -0.07868699729442596, -0.02754000946879387, -0.002673208946362138, -0.01786220818758011, 0.022097749635577202, -0.003028406761586666, -0.09657174348831177, -0.13454578816890717, -0.012920563109219074, 0.1217166930437088, -0.013451158069074154, -0.06110626459121704, -0.04799750819802284, -0.12181394547224045, 0.14013051986694336, 0.0008414916810579598, -0.10273377597332001, -0.07819006592035294, -0.08287543803453445, 0.07372594624757767, 0.025272900238633156, 0.06576626002788544, -0.021580353379249573, -0.013668719679117203, -0.057337842881679535, -0.1460886299610138, 0.020022183656692505, -0.10992253571748734, -0.09994662553071976, -0.09734422713518143, 0.13220913708209991, -0.057536616921424866, -0.014719313010573387, -0.0266883485019207, 0.01773708499968052, 0.0012381880078464746, -0.10989387333393097, 0.0008138853590935469, 0.08422049134969711, 0.03597194328904152, 0.07236569374799728, -0.10171447694301605, -0.015108304098248482, 0.0521869994699955, -0.04062746465206146, 0.0037539678160101175, 0.26157042384147644, -0.09104979783296585, 0.07715093344449997, 0.15065062046051025, -0.04434824362397194, -0.2020416110754013, -0.06594670563936234, -0.10153313726186752, -0.04655428230762482, 0.07859553396701813, -0.10199372470378876, 0.07818271219730377, 0.11816604435443878, -0.07038842886686325, 0.2509737014770508, -0.3215576708316803, -0.06946757435798645, 0.007098556030541658, 0.084941066801548, 0.2860926687717438, -0.20233796536922455, -0.06497301906347275, -0.028872983530163765, -0.20512326061725616, 0.03547016531229019, -0.02348020114004612, 0.053650304675102234, -0.029264934360980988, -0.05275839939713478, -0.02268439717590809, -0.01176520437002182, 0.19074688851833344, -0.10696619004011154, 0.04229411855340004, -0.09894497692584991, 0.06632965803146362, 0.20525003969669342, -0.02452091872692108, 0.01580633968114853, -0.2783510386943817, 0.05290503054857254, -0.14697420597076416, -0.015840021893382072, 0.03018895350396633, 0.07020290940999985, -0.016108404844999313, -0.04248008504509926, -0.07676774263381958, 
0.019260747358202934, -0.024250078946352005, 0.00998741202056408, 0.09109088033437729, -0.05569630488753319, 0.005394485779106617, 0.16109029948711395, -0.04949146881699562, 0.030817396938800812, -0.14052154123783112, -0.07521142810583115, -0.03662923350930214, 0.07068631052970886, -0.21248240768909454, -0.06204196810722351, 0.09164553880691528, 0.04217909276485443, 0.04368109256029129, 0.018356556072831154, 0.01586868055164814, 0.14087696373462677, 0.1437891572713852, -0.10147826373577118, -0.07323462516069412, -0.03644384816288948, -0.02666410617530346, 0.10376020520925522, 0.05805645510554314, 0.08565608412027359, -0.05512658506631851, 0.04290727153420448, 0.020315373316407204, 0.029621826484799385, -0.0397370383143425, 0.05202793702483177, 0.06606011092662811, -0.025754563510417938, -0.06899400055408478, 0.130326047539711, -0.055076804012060165, -0.07042183727025986, -0.09098583459854126, 0.04936368390917778, -0.09312392771244049, -0.04668711498379707, -0.030779292806982994, 0.015977028757333755, -0.12118154019117355, -0.007993526756763458, -0.09549307823181152, -0.0511704683303833, -0.013527662493288517, 0.10753682255744934, 0.08511993288993835, -0.051796287298202515, 0.03458665311336517, 0.014204324223101139, 0.005924866534769535, 0.0324363075196743, 0.048844803124666214, 0.07007968425750732, -0.1476486772298813, -0.2269071340560913, 0.01756112277507782, -0.026462169364094734, -0.10077550262212753, -0.031886909157037735, -0.10175521671772003, -0.013131622225046158, -0.09724044054746628, 0.07880812138319016, -0.12420901656150818, -0.048273175954818726, -0.06478814035654068, -0.1204967200756073, -0.07103672623634338, 0.009359114803373814, -0.04577651247382164, 0.019313300028443336, 0.03501001000404358, 0.06581699103116989, -0.07023187726736069, -0.042109113186597824, 0.02210160903632641, -0.052109494805336, 0.06802688539028168, 0.05330871045589447, -0.034314729273319244, 0.012251906096935272, -0.18116877973079681, 0.004526512231677771, 0.05730310082435608, 0.056439753621816635, -0.016271285712718964, 0.14056293666362762, 0.051665499806404114, -0.0028847670182585716, 0.004519094713032246, -0.025570912286639214, -0.03624451160430908, -0.13612815737724304, 0.043401848524808884, -0.04914068430662155, 0.02110102027654648, -0.0132062379270792, -0.020987261086702347, 0.1916130632162094, 0.09290704131126404, 0.10693331807851791, -0.08096884936094284, 0.002369315829128027, -0.05903720483183861, 0.0189627967774868, 0.04335547238588333, -0.08498737215995789, 0.03608989343047142, -0.00800985936075449, -0.02582963928580284, -0.011220216751098633, 0.26195207238197327, 0.017537998035550117, -0.16441793739795685, 0.023857181891798973, 0.05279543250799179, 0.1303362250328064, 0.011691354215145111, 0.29495617747306824, 0.06092219427227974, 0.0804884135723114, -0.18778134882450104, 0.0852871686220169, 0.08182637393474579, -0.14351217448711395, -0.062482088804244995, 0.16322089731693268, -0.03476940095424652, 0.046064671128988266, 0.04841345176100731, -0.03537874296307564, 0.007948296144604683, 0.052652668207883835, -0.0042374045588076115, 0.07512301206588745, -0.023861493915319443, 0.058019768446683884, 0.17288759350776672, -0.04962915927171707, -0.010454891249537468, 0.10235483944416046, 0.03139379993081093, -0.09140977263450623, -0.18462210893630981, -0.05099375918507576, -0.29604703187942505, 0.04734421148896217, -0.0612911693751812, 0.0400162935256958, 0.08832857012748718, 0.06038235127925873, 0.0023801655042916536, -0.056330278515815735, -0.09522141516208649, -0.10633130371570587, 
0.0888807624578476, -0.0004992518806830049, -0.06717649847269058, -0.08349378407001495, -0.054202254861593246, 0.05785978585481644, -0.08860732614994049, -0.06359288096427917, 0.08040431141853333, 0.050631795078516006, 0.03524543344974518, -0.03032836504280567, -0.022286908701062202, -0.0413755439221859, 0.03333313390612602, -0.01640314795076847, 0.17003197968006134, 0.0324951596558094, 0.02045767940580845, 0.0005189096555113792, 0.20814527571201324, -0.02313009276986122, -0.03319164738059044, -0.04608472064137459, 0.024969695135951042, -0.06863829493522644, 0.07748367637395859, -0.02758488431572914, -0.08364557474851608, -0.03149054944515228, 0.244804248213768, 0.21121133863925934, -0.07617431879043579, 0.034058038145303726, -0.015803145244717598, -0.016218623146414757, 0.03404494374990463, 0.03243718296289444, 0.03869694098830223, 0.24072295427322388, -0.0054123736917972565, -0.061262067407369614, -0.10775258392095566, 0.008923117071390152, -0.07432501763105392, -0.050463344901800156, -0.02512318082153797, -0.12659431993961334, -0.06437715888023376, 0.09343572705984116, -0.09410478919744492, -0.028041105717420578, 0.07751084119081497, -0.12034841626882553, 0.010495930910110474, -0.09638214111328125, 0.06912821531295776, 0.05447760969400406, -0.03091997466981411, -0.09626547247171402, -0.03593829274177551, -0.015263281762599945, -0.020015951246023178, -0.13374847173690796, -0.10854373872280121, -0.022444799542427063, -0.1745215803384781, 0.1272057592868805, -0.059644896537065506, -0.020405184477567673, 0.0038702876772731543, 0.01143919862806797, -0.05973044037818909, 0.05322220176458359, 0.021419471129775047, -0.18288537859916687, -0.04939999058842659, 0.03191810101270676, -0.04650121182203293, 0.10727322846651077, 0.020936261862516403, -0.033282361924648285, 0.036771904677152634, 0.13402071595191956, -0.11710042506456375, -0.07089848071336746, 0.01461456622928381, -0.11727511882781982, 0.09258316457271576, -0.021984688937664032, -0.009666996076703072, -0.059627749025821686, -0.010346904397010803, 0.06050221249461174, 0.1207566112279892, -0.060622889548540115, 0.10183070600032806, -0.05854595825076103, -0.08451535552740097, 0.08232482522726059, 0.04877019673585892, -0.13664506375789642, 0.009275268763303757, -0.15764972567558289, 0.038602061569690704, -0.007777365390211344, 0.033799752593040466, 0.22503621876239777, -0.01305230800062418, -0.024369079619646072, -0.21420198678970337, 0.05423416569828987, 0.06764069199562073, -0.07818328589200974, -0.04520845040678978 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.8.2
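Because every usage field in this card is still a placeholder, the following is only a sketch of how the adapter under this record's id might be attached to its stated base model, google/vit-base-patch16-224-in21k, with PEFT. The class count of 24 is inferred solely from the repository name and the label mapping is unknown, so both are assumptions rather than documented facts.

```python
# Hedged sketch (not from the card): attach the adapter to the ViT base model.
# num_labels=24 is guessed from the repo name "24classes"; real labels are unknown.
from peft import PeftModel
from transformers import AutoImageProcessor, AutoModelForImageClassification

base = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=24,
    ignore_mismatched_sizes=True,
)
model = PeftModel.from_pretrained(
    base, "mysterious-pie/vit_ft_lora_3_epochs_24classes_vextra"
)
processor = AutoImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")
model.eval()  # ready for processor(images=...) -> model(**inputs) inference
```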
{"library_name": "peft", "base_model": "google/vit-base-patch16-224-in21k"}
null
mysterious-pie/vit_ft_lora_3_epochs_24classes_vextra
[ "peft", "arxiv:1910.09700", "base_model:google/vit-base-patch16-224-in21k", "region:us" ]
2024-02-13T19:40:31+00:00
[ "1910.09700" ]
[]
TAGS #peft #arxiv-1910.09700 #base_model-google/vit-base-patch16-224-in21k #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.8.2
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ "TAGS\n#peft #arxiv-1910.09700 #base_model-google/vit-base-patch16-224-in21k #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ 37, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #arxiv-1910.09700 #base_model-google/vit-base-patch16-224-in21k #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2" ]
[ -0.1097259595990181, 0.20112037658691406, -0.002964325249195099, 0.0377388522028923, 0.09489890933036804, 0.02169959805905819, 0.053980182856321335, 0.13016833364963531, -0.022767681628465652, 0.11059456318616867, 0.07211367040872574, 0.09456948935985565, 0.10866006463766098, 0.22363243997097015, 0.01239712256938219, -0.20151637494564056, 0.023859472945332527, -0.0910334587097168, -0.006923510693013668, 0.12589748203754425, 0.1451474279165268, -0.09365753084421158, 0.08375521749258041, -0.009864133782684803, -0.017208488658070564, -0.03156324476003647, -0.07452031224966049, -0.02700621262192726, 0.041201334446668625, 0.04898554086685181, 0.05366980656981468, -0.0050510563887655735, 0.08421708643436432, -0.26283103227615356, 0.017283933237195015, 0.04720853269100189, -0.006219862960278988, 0.08864717185497284, 0.09746591001749039, -0.04091762378811836, 0.13244576752185822, -0.02647373080253601, 0.1376902163028717, 0.0807318314909935, -0.0920570120215416, -0.22243790328502655, -0.06904194504022598, 0.09174840152263641, 0.17398592829704285, 0.07761646062135696, -0.04239664599299431, 0.12926317751407623, -0.09034091234207153, 0.01932940073311329, 0.04564965143799782, -0.08824452012777328, -0.07276999205350876, 0.06018311530351639, 0.10009733587503433, 0.05373869836330414, -0.1301424652338028, -0.031834036111831665, 0.024617455899715424, 0.0350346639752388, 0.07172506302595139, 0.014319092035293579, 0.15119825303554535, 0.028895609080791473, -0.14606888592243195, -0.04590925946831703, 0.14245480298995972, 0.0298266913741827, -0.03742418810725212, -0.22013337910175323, 0.004250259138643742, -0.09340731054544449, -0.026720749214291573, -0.0475861094892025, 0.04122820496559143, 0.0054879505187273026, 0.10170221328735352, -0.031078526750206947, -0.08632197231054306, -0.013689140789210796, 0.09566066414117813, 0.045678313821554184, 0.0248627457767725, -0.019312571734189987, 0.003161009633913636, 0.12575198709964752, 0.05526517704129219, -0.1314997375011444, -0.06290856748819351, -0.07072122395038605, -0.04348902031779289, -0.04268159717321396, 0.0409819521009922, 0.05109211057424545, 0.05385163426399231, 0.24908262491226196, -0.03541742265224457, 0.06335359066724777, 0.06418327242136002, 0.023186931386590004, 0.045005928725004196, 0.10113146156072617, -0.057711049914360046, -0.15584403276443481, -0.013486159965395927, 0.09505986422300339, -0.00749570969492197, -0.022823994979262352, -0.05561751499772072, 0.04458099603652954, 0.030624236911535263, 0.10528050363063812, 0.099887415766716, -0.007043721619993448, -0.07815098762512207, -0.056391578167676926, 0.20223501324653625, -0.14946788549423218, 0.04646725952625275, 0.020357798784971237, -0.020995939150452614, -0.04688895493745804, 0.014586497098207474, 0.012954581528902054, -0.033045899122953415, 0.08943350613117218, -0.06968899816274643, -0.035974178463220596, -0.12002238631248474, -0.026299839839339256, 0.03471144288778305, 0.001629865844734013, -0.028747431933879852, -0.03128194063901901, -0.06457123160362244, -0.09638290107250214, 0.105136439204216, -0.06850649416446686, -0.06308777630329132, -0.030313335359096527, -0.08949244022369385, 0.022229129448533058, 0.029605189338326454, 0.09895636886358261, -0.025698676705360413, 0.0404244139790535, -0.007964236661791801, 0.06787923723459244, 0.0842549204826355, 0.03735645115375519, -0.07089484483003616, 0.06240484118461609, -0.19676905870437622, 0.08834069222211838, -0.08413110673427582, 0.02793758548796177, -0.1616329550743103, -0.016653800383210182, 0.004417221061885357, 0.02340998873114586, 
0.03530467674136162, 0.16479699313640594, -0.2013486921787262, -0.03756337985396385, 0.1508173942565918, -0.10181991010904312, -0.1179620623588562, 0.03454797342419624, -0.04523365572094917, 0.16406892240047455, 0.01968643255531788, -0.0036475430242717266, 0.10328041017055511, -0.14783723652362823, -0.02521083690226078, -0.019312672317028046, -0.003672140184789896, 0.09680309146642685, 0.08480274677276611, -0.0821189135313034, 0.027291910722851753, 0.016154708340764046, -0.05264858528971672, -0.02463367208838463, -0.04719305783510208, -0.10734828561544418, 0.004328077659010887, -0.07906776666641235, 0.02024509385228157, -0.006333427038043737, -0.07932868599891663, -0.008139350451529026, -0.1621883064508438, -0.03356918692588806, 0.08303488045930862, 0.010130155831575394, -0.02020706608891487, -0.09871784597635269, 0.04305238276720047, -0.031681936234235764, -0.01868441514670849, -0.15227991342544556, -0.018298840150237083, 0.02079528011381626, -0.14152607321739197, 0.008619229309260845, -0.1138877272605896, 0.06864924728870392, 0.004569740034639835, -0.06114518269896507, -0.03482944145798683, -0.010532641783356667, 0.003217757912352681, -0.053163863718509674, -0.24147912859916687, -0.025690991431474686, -0.05054813623428345, 0.15362364053726196, -0.2218954712152481, 0.036814723163843155, 0.052462365478277206, 0.12787339091300964, 0.0005483669228851795, -0.06585515290498734, 0.03006749413907528, -0.06871769577264786, -0.025547359138727188, -0.0786626785993576, -0.004395035095512867, -0.003242376260459423, -0.04022139683365822, 0.016675908118486404, -0.10592113435268402, -0.03924190625548363, 0.10271810740232468, 0.06636706739664078, -0.16066795587539673, -0.01124738808721304, -0.04295684024691582, -0.06481914967298508, -0.08217725902795792, -0.059026334434747696, 0.11565162241458893, 0.05029238387942314, 0.039297692477703094, -0.07360585033893585, -0.07069876790046692, 0.012104821391403675, -0.02039179392158985, -0.021293940022587776, 0.11836212128400803, 0.08108579367399216, -0.10766635090112686, 0.0978405550122261, 0.06716185808181763, 0.030183203518390656, 0.08148030936717987, -0.023545939475297928, -0.10507656633853912, -0.030614808201789856, 0.04629683122038841, 0.011302629485726357, 0.16110871732234955, -0.07662294059991837, 0.048570018261671066, 0.04463629052042961, -0.03677269071340561, 0.04969730228185654, -0.0975797176361084, 0.010617019608616829, 0.01090280245989561, -0.016366232186555862, 0.007492651231586933, -0.023347528651356697, 0.0037071818951517344, 0.08060134202241898, 0.057751987129449844, 0.040098607540130615, 0.026787903159856796, -0.029310310259461403, -0.13489621877670288, 0.1839963048696518, -0.09484463930130005, -0.24697189033031464, -0.16156591475009918, 0.05961168184876442, 0.0536809004843235, -0.018341703340411186, 0.019876830279827118, -0.06153551861643791, -0.1052987352013588, -0.07853434234857559, 0.0038717188872396946, 0.020629892125725746, -0.05754295364022255, -0.06796875596046448, 0.04561891406774521, 0.04763292521238327, -0.11712756752967834, 0.037236765027046204, 0.060237858444452286, -0.01053561270236969, 0.0013734601670876145, 0.05397957190871239, 0.0865190252661705, 0.17902721464633942, -0.00848577544093132, 0.0034034913405776024, 0.05382488667964935, 0.28539109230041504, -0.16293151676654816, 0.11546909064054489, 0.12275584042072296, -0.057245172560214996, 0.07754608243703842, 0.1832781881093979, 0.03307630866765976, -0.10227355360984802, 0.030250778421759605, 0.02994544245302677, -0.028260378167033195, -0.26632410287857056, -0.04603296145796776, 
-0.015508273616433144, -0.09481409192085266, 0.08992050588130951, 0.09184226393699646, 0.07870149612426758, 0.03945247828960419, -0.0664810836315155, -0.07295691221952438, 0.02802068181335926, 0.10141033679246902, -0.014654556289315224, 0.00627922173589468, 0.0814599096775055, -0.03717189282178879, 0.014120247215032578, 0.0981399267911911, -0.01118292286992073, 0.16888853907585144, 0.05438082292675972, 0.11680648475885391, 0.0792485699057579, 0.09226962924003601, -0.0013867683010175824, 0.026178330183029175, 0.0170521829277277, 0.024540040642023087, 0.0167238712310791, -0.08218267560005188, 0.028721408918499947, 0.10824783891439438, 0.04692545533180237, 0.029560888186097145, 0.017076756805181503, -0.044073767960071564, 0.04372342675924301, 0.183323934674263, 0.0082389609888196, -0.20629648864269257, -0.08211466670036316, 0.05824877321720123, -0.0765635296702385, -0.13838806748390198, -0.017525091767311096, 0.029455062001943588, -0.16819553077220917, 0.02039164863526821, -0.0392085425555706, 0.10433682799339294, -0.09291670471429825, -0.04156472533941269, 0.106420136988163, 0.0683426782488823, -0.02112002484500408, 0.056733958423137665, -0.1954938769340515, 0.12874345481395721, 0.02780451625585556, 0.06814580410718918, -0.08625942468643188, 0.10356822609901428, 0.0038543601986020803, -0.004360610153526068, 0.16306599974632263, 0.005409923382103443, -0.05628366023302078, -0.07046989351511002, -0.09139858931303024, -0.016801005229353905, 0.09518960118293762, -0.1391601264476776, 0.06650526821613312, -0.019601356238126755, -0.030006898567080498, -0.005134419538080692, -0.08829499036073685, -0.1264512985944748, -0.17138192057609558, 0.05703684315085411, -0.09893500804901123, 0.027275631204247475, -0.08818625658750534, -0.06624265015125275, 0.005723053589463234, 0.1812020093202591, -0.20920133590698242, -0.10030058771371841, -0.14960932731628418, -0.08030194044113159, 0.16049824655056, -0.044658876955509186, 0.08290005475282669, 0.0008381559164263308, 0.16499775648117065, 0.015680383890867233, -0.012336552143096924, 0.1002080887556076, -0.08881065249443054, -0.18917948007583618, -0.05365609750151634, 0.16410315036773682, 0.13341955840587616, 0.03721458092331886, -0.016766251996159554, 0.026380857452750206, -0.05618448927998543, -0.1211327612400055, 0.030053632333874702, 0.13964460790157318, 0.06143313646316528, -0.016143443062901497, -0.02459084615111351, -0.07869680225849152, -0.06065715476870537, -0.045141249895095825, -0.00670755747705698, 0.18315842747688293, -0.0756714716553688, 0.16123147308826447, 0.10680834203958511, -0.05867578834295273, -0.20546144247055054, 0.05245213955640793, 0.057058390229940414, 0.01879984699189663, 0.03470494598150253, -0.20564286410808563, 0.08198752999305725, -0.003937928006052971, -0.07269444316625595, 0.17107094824314117, -0.17073263227939606, -0.13928936421871185, 0.09609515219926834, 0.03385011851787567, -0.22598440945148468, -0.1372264176607132, -0.10476545989513397, -0.01797422394156456, -0.12315742671489716, 0.049298323690891266, 0.007689354475587606, 0.013611093163490295, 0.02275940775871277, 0.019061213359236717, 0.023073768243193626, -0.04438363015651703, 0.20397162437438965, -0.026622895151376724, 0.007470262236893177, -0.05040504410862923, -0.08797906339168549, 0.029124701395630836, -0.04879721999168396, 0.10673504322767258, 0.004136295523494482, 0.028207315132021904, -0.16338996589183807, -0.04049261286854744, -0.055052004754543304, 0.030086547136306763, -0.09618807584047318, -0.08263115584850311, -0.04898154363036156, 0.09094484895467758, 
0.09037604928016663, -0.025859469547867775, 0.0008461861289106309, -0.08674713969230652, 0.0603070929646492, 0.20430997014045715, 0.18960821628570557, 0.06644872575998306, -0.07888168096542358, 0.02128373086452484, -0.030551450327038765, 0.046557310968637466, -0.2448282390832901, 0.0362483374774456, 0.057747386395931244, 0.02511443756520748, 0.08475914597511292, -0.009147841483354568, -0.15624728798866272, -0.0710115060210228, 0.08314462006092072, -0.04592377692461014, -0.16529880464076996, -0.03098302148282528, 0.020344045013189316, -0.2053796797990799, -0.04694153368473053, 0.021882813423871994, -0.02488253265619278, -0.0390821136534214, 0.026041004806756973, 0.0751487985253334, -0.019914526492357254, 0.1105363741517067, 0.08831103891134262, 0.09470266848802567, -0.10677086561918259, 0.07746650278568268, 0.07673920691013336, -0.04412224516272545, 0.029548941180109978, 0.11306361854076385, -0.05002160370349884, -0.033753976225852966, 0.0806422084569931, 0.0914817526936531, 0.02770897187292576, -0.05208391323685646, 0.01005779579281807, -0.054769858717918396, 0.06596142053604126, 0.10393200814723969, 0.030616765841841698, -0.0016720752464607358, 0.05240030959248543, 0.034837231040000916, -0.09228160232305527, 0.10380960255861282, 0.05706124007701874, 0.017017189413309097, -0.051042865961790085, -0.04379591345787048, -0.003626969875767827, -0.012105810455977917, -0.01950099878013134, -0.00927750114351511, -0.09386655688285828, -0.0065619321539998055, -0.08550219982862473, 0.0246769730001688, -0.06960086524486542, 0.012068483047187328, 0.028463583439588547, -0.05378488823771477, -0.0009935609996318817, 0.008875103667378426, -0.07635853439569473, -0.053863074630498886, -0.013700037263333797, 0.08422953635454178, -0.12501822412014008, 0.04410342127084732, 0.07617317885160446, -0.10349840670824051, 0.07108370959758759, -0.0033358358778059483, 0.009071169421076775, 0.006340810097754002, -0.15207530558109283, 0.058690618723630905, -0.023212991654872894, -0.014738048426806927, 0.02208070456981659, -0.2105827033519745, -0.0024734693579375744, -0.0487726628780365, -0.056619372218847275, 0.010434233583509922, -0.02218296378850937, -0.12405627965927124, 0.09344490617513657, -0.005996065679937601, -0.06602304428815842, -0.020413152873516083, 0.04670941084623337, 0.10808072984218597, -0.022813990712165833, 0.13409088551998138, -0.023571331053972244, 0.07012619078159332, -0.17862391471862793, -0.006682545877993107, -0.012999974191188812, 0.041551485657691956, -0.023404648527503014, -0.02923954278230667, 0.06197798252105713, -0.02035374939441681, 0.17920535802841187, -0.014555197209119797, 0.0754493847489357, 0.053722597658634186, 0.016827749088406563, 0.0048210350796580315, 0.0812978744506836, 0.053787484765052795, -0.0019343679305166006, -0.00252539268694818, 0.041501834988594055, -0.003933870233595371, -0.04076845943927765, -0.15775611996650696, 0.0616869181394577, 0.14460265636444092, 0.04877037554979324, 0.028915781527757645, 0.030182044953107834, -0.11474359780550003, -0.07809372991323471, 0.1361789107322693, -0.02051650732755661, -0.029607320204377174, -0.07088816165924072, 0.17031168937683105, 0.1350930780172348, -0.1980038732290268, 0.07670130580663681, -0.05463641136884689, -0.05044145509600639, -0.13364753127098083, -0.16954344511032104, -0.06212535500526428, -0.05851368233561516, -0.020808687433600426, -0.06523767858743668, 0.05103885754942894, 0.05383980646729469, 0.0048776897601783276, -0.01885969750583172, 0.10594280809164047, 0.01257079467177391, -0.031130168586969376, 0.05236773192882538, 
0.06033419072628021, 0.029436910524964333, -0.1007888913154602, 0.008282424882054329, -0.0024844473227858543, 0.014131707139313221, 0.0690208226442337, 0.018060976639389992, -0.052735961973667145, 0.016604002565145493, -0.01798398606479168, -0.11374250054359436, 0.04079597070813179, -0.014723743312060833, -0.041106462478637695, 0.14667825400829315, 0.03155384212732315, 0.00848542433232069, -0.02311042509973049, 0.22728003561496735, -0.07624024152755737, -0.06295774132013321, -0.1523347944021225, 0.06731240451335907, -0.06584116071462631, 0.030146317556500435, 0.028090886771678925, -0.11587388068437576, 0.010689436458051205, 0.17215490341186523, 0.1295740306377411, -0.007264425978064537, 0.008707896806299686, 0.0489814393222332, 0.004523996263742447, -0.028553888201713562, 0.020616386085748672, 0.05244629457592964, 0.13847346603870392, -0.07078429311513901, 0.06287817656993866, -0.010546709410846233, -0.08384065330028534, -0.014437895268201828, 0.10671242326498032, -0.0012094074627384543, 0.002327580703422427, -0.0733616054058075, 0.14193962514400482, -0.08508593589067459, -0.22235919535160065, 0.0694008618593216, -0.07530980557203293, -0.1471939980983734, -0.047578126192092896, 0.02016761153936386, -0.014743037521839142, 0.011986374855041504, 0.07509658485651016, -0.050754934549331665, 0.17595162987709045, 0.04134046658873558, -0.06076350063085556, -0.0952662006020546, 0.05640912801027298, -0.15193602442741394, 0.2750769853591919, 0.015994472429156303, 0.043239474296569824, 0.10450907051563263, -0.014951551333069801, -0.14452579617500305, 0.0070950365625321865, 0.1061304584145546, -0.07065068930387497, 0.05552811920642853, 0.1742352694272995, 0.0029801970813423395, 0.1260167807340622, 0.055635057389736176, -0.04843522980809212, 0.0348442979156971, -0.09940353035926819, -0.03663933277130127, -0.11427559703588486, 0.07808609306812286, -0.08655110746622086, 0.16069084405899048, 0.13289576768875122, -0.06681426614522934, -0.01150993537157774, -0.022621650248765945, 0.08423087000846863, 0.005939872935414314, 0.1142805814743042, 0.011631800793111324, -0.18298384547233582, 0.03470810502767563, 0.004841253161430359, 0.09900550544261932, -0.1965060979127884, -0.05925530940294266, 0.047973450273275375, -0.0205500740557909, -0.0747053250670433, 0.12421680986881256, 0.04083302617073059, 0.03864976763725281, -0.0404374822974205, -0.05871274322271347, 0.011105986312031746, 0.14799614250659943, -0.11314728111028671, -0.008312908001244068 ]
null
null
transformers
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
{"library_name": "transformers", "tags": []}
automatic-speech-recognition
SpideyDLK/wav2vec2-large-xls-r-300m-sinhala-test4-with-checkpoints-part3
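The pipeline tag for this entry is automatic-speech-recognition and its tags indicate a wav2vec2 checkpoint, so a minimal usage sketch for the repo id above would look like the following. Loading the checkpoint directly through the standard ASR pipeline, and the local file name `sample.wav`, are assumptions rather than details taken from the card.

```python
# Minimal sketch: assumes the checkpoint loads through the standard ASR pipeline;
# "sample.wav" is a hypothetical local audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="SpideyDLK/wav2vec2-large-xls-r-300m-sinhala-test4-with-checkpoints-part3",
)
result = asr("sample.wav")  # returns a dict containing the transcribed "text"
print(result["text"])
```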
[ "transformers", "tensorboard", "safetensors", "wav2vec2", "automatic-speech-recognition", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T19:41:35+00:00
[ "1910.09700" ]
[]
TAGS #transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID

## Model Details

### Model Description

This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.

- Developed by:
- Funded by [optional]:
- Shared by [optional]:
- Model type:
- Language(s) (NLP):
- License:
- Finetuned from model [optional]:

### Model Sources [optional]

- Repository:
- Paper [optional]:
- Demo [optional]:

## Uses

### Direct Use

### Downstream Use [optional]

### Out-of-Scope Use

## Bias, Risks, and Limitations

### Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

## Training Details

### Training Data

### Training Procedure

#### Preprocessing [optional]

#### Training Hyperparameters

- Training regime:

#### Speeds, Sizes, Times [optional]

## Evaluation

### Testing Data, Factors & Metrics

#### Testing Data

#### Factors

#### Metrics

### Results

#### Summary

## Model Examination [optional]

## Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

- Hardware Type:
- Hours used:
- Cloud Provider:
- Compute Region:
- Carbon Emitted:

## Technical Specifications [optional]

### Model Architecture and Objective

### Compute Infrastructure

#### Hardware

#### Software

[optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Model Card Authors [optional]

## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 51, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #wav2vec2 #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.06918960809707642, 0.13210147619247437, -0.0040207370184361935, 0.023134203627705574, 0.11738458275794983, 0.003100133500993252, 0.06489233672618866, 0.1062328964471817, -0.018454808741807938, 0.11934409290552139, 0.02399194799363613, 0.10645237565040588, 0.10633884370326996, 0.1783033311367035, -0.006676932331174612, -0.20753470063209534, 0.05159076303243637, -0.1328369528055191, -0.006210802122950554, 0.12206359207630157, 0.12859149277210236, -0.12210913747549057, 0.0661126896739006, -0.03582390025258064, -0.006673390511423349, -0.036393795162439346, -0.05692959576845169, -0.05386972799897194, 0.06668701767921448, 0.062350641936063766, 0.060644831508398056, 0.018570519983768463, 0.09333337843418121, -0.2811316251754761, 0.022820472717285156, 0.08144836127758026, 0.006955916993319988, 0.06573166698217392, 0.07077054679393768, -0.07532189786434174, 0.07820203900337219, -0.06897217780351639, 0.14961306750774384, 0.07984381169080734, -0.09012829512357712, -0.1924186497926712, -0.08871616423130035, 0.0939040556550026, 0.18705229461193085, 0.05621805414557457, -0.031164970248937607, 0.13427454233169556, -0.06805921345949173, 0.01878679171204567, 0.0681581050157547, -0.07620836049318314, -0.052718792110681534, 0.06274411827325821, 0.07032535970211029, 0.09331566095352173, -0.13174302875995636, -0.00581010989844799, 0.02852778322994709, 0.010611804202198982, 0.10465876758098602, 0.019570648670196533, 0.12078016996383667, 0.03880659490823746, -0.14213255047798157, -0.04347489774227142, 0.08553854376077652, 0.040430404245853424, -0.053023193031549454, -0.25508078932762146, -0.01728249341249466, -0.03535711020231247, -0.03508080542087555, -0.050225600600242615, 0.04455358535051346, -0.02446228265762329, 0.07527779787778854, -0.005772874690592289, -0.07288429886102676, -0.049391333013772964, 0.07827333360910416, 0.07604440301656723, 0.027127955108880997, -0.02607012540102005, 0.012724175117909908, 0.11518751829862595, 0.11186537146568298, -0.11170512437820435, -0.052296292036771774, -0.06167195737361908, -0.09369548410177231, -0.047245271503925323, 0.03096519224345684, 0.04075217619538307, 0.05507682263851166, 0.21005180478096008, 0.004100160673260689, 0.05025365948677063, 0.030208947136998177, 0.013425402343273163, 0.06431768089532852, 0.09155748784542084, -0.0652301162481308, -0.12225554138422012, -0.02715214155614376, 0.10966562479734421, 0.009606653824448586, -0.03282571956515312, -0.04075070470571518, 0.0665077269077301, 0.030208082869648933, 0.12366250902414322, 0.0723525807261467, 0.018685176968574524, -0.07855737954378128, -0.06267400830984116, 0.1677972972393036, -0.1649521440267563, 0.03285328298807144, 0.02912791818380356, -0.050073519349098206, -0.008440917357802391, 0.01682254858314991, 0.021022414788603783, -0.018704243004322052, 0.08882031589746475, -0.054653100669384, -0.03264474496245384, -0.11321555823087692, -0.05006399378180504, 0.028676055371761322, 0.006981914862990379, -0.03174450621008873, -0.04053306579589844, -0.10819326341152191, -0.07601769268512726, 0.07845603674650192, -0.06794282793998718, -0.04567456990480423, -0.03693155571818352, -0.077850341796875, 0.013987138867378235, -0.001372430007904768, 0.11866221576929092, -0.028359893709421158, 0.049781348556280136, -0.06040623039007187, 0.07331450283527374, 0.1427365392446518, 0.027582714334130287, -0.05536656826734543, 0.05209227278828621, -0.22961750626564026, 0.10650996118783951, -0.0820845440030098, 0.039568543434143066, -0.16523221135139465, -0.01437871903181076, 0.04151884838938713, 0.02703598327934742, 
-0.011580551974475384, 0.13367699086666107, -0.20120634138584137, -0.03629620373249054, 0.17902998626232147, -0.11463885754346848, -0.08275967836380005, 0.05660289525985718, -0.05534304678440094, 0.12154120951890945, 0.04968025162816048, -0.015457268804311752, 0.02872299961745739, -0.14586561918258667, -0.015341621823608875, -0.06385710090398788, -0.031775522977113724, 0.15648432075977325, 0.058627333492040634, -0.05283202603459358, 0.06168147549033165, 0.01965263858437538, -0.018219612538814545, -0.04959159716963768, -0.03271770104765892, -0.09723224490880966, 0.011255990713834763, -0.0728980302810669, 0.023943135514855385, -0.031872402876615524, -0.09092787653207779, -0.03651702031493187, -0.15960368514060974, 0.006672970950603485, 0.09574975073337555, -0.005800875835120678, -0.02275932766497135, -0.11338774859905243, -0.010310402140021324, 0.020829740911722183, -0.0006964936037547886, -0.14685183763504028, -0.05314113572239876, 0.017828308045864105, -0.16250769793987274, 0.031012238934636116, -0.03655901551246643, 0.04738416150212288, 0.03556562215089798, -0.03982981666922569, -0.03375418856739998, 0.019630931317806244, 0.022369354963302612, -0.010214408859610558, -0.2756194770336151, -0.015468244440853596, -0.043052829802036285, 0.16435527801513672, -0.2469322234392166, 0.04182727262377739, 0.07295827567577362, 0.1338571161031723, 0.015705497935414314, -0.03647774085402489, 0.028713135048747063, -0.06289805471897125, -0.030222538858652115, -0.06501726806163788, -0.007188703399151564, -0.039097823202610016, -0.04806915298104286, 0.04462466016411781, -0.16899824142456055, -0.033922191709280014, 0.1186266764998436, 0.04557104408740997, -0.15134701132774353, -0.04948775842785835, -0.04092395305633545, -0.056753676384687424, -0.06932670623064041, -0.0517798475921154, 0.10663432627916336, 0.05747092142701149, 0.05196038633584976, -0.05911761149764061, -0.06484735757112503, 0.00799498613923788, -0.01853559911251068, -0.023748042061924934, 0.07913291454315186, 0.06702018529176712, -0.11829525977373123, 0.09312599897384644, 0.08573136478662491, 0.07933273166418076, 0.10508506000041962, -0.0014733473071828485, -0.09117123484611511, -0.025300826877355576, 0.029316658154129982, 0.016105778515338898, 0.14908336102962494, -0.04350128397345543, 0.04314031824469566, 0.040114615112543106, -0.01687462255358696, 0.008028145879507065, -0.09918303042650223, 0.030367493629455566, 0.026081476360559464, -0.012195796705782413, 0.041467417031526566, -0.05302301421761513, 0.021834537386894226, 0.10195169597864151, 0.03181454911828041, 0.04113520681858063, 0.011278065852820873, -0.050533477216959, -0.11812540888786316, 0.17222443222999573, -0.10861039906740189, -0.2369978129863739, -0.12320686131715775, -0.01618431694805622, 0.02991701476275921, -0.015134924091398716, 0.01900940015912056, -0.06770696491003036, -0.11834623664617538, -0.09672471135854721, 0.04564153030514717, 0.06599046289920807, -0.08051323890686035, -0.055777665227651596, 0.06501153111457825, 0.048011794686317444, -0.13664643466472626, 0.02571168728172779, 0.03327706828713417, -0.08857693523168564, 0.00793769583106041, 0.08559047430753708, 0.06839455664157867, 0.18071474134922028, 0.01134483702480793, -0.023087946698069572, 0.017521869391202927, 0.19720622897148132, -0.14027054607868195, 0.10202740132808685, 0.13801661133766174, -0.07145930081605911, 0.07873693108558655, 0.2032429575920105, 0.039016321301460266, -0.10376140475273132, 0.039679598063230515, 0.036421533674001694, -0.025852223858237267, -0.24745285511016846, -0.08099643886089325, 
0.00836301501840353, -0.0664474293589592, 0.0802333801984787, 0.08307429403066635, 0.09203000366687775, 0.023238254711031914, -0.1043974831700325, -0.07363210618495941, 0.05418974906206131, 0.11036353558301926, -0.004034504294395447, -0.011317858472466469, 0.09753942489624023, -0.020273780450224876, 0.02676866576075554, 0.08875394612550735, 0.012205728329718113, 0.18836407363414764, 0.050518929958343506, 0.14771167933940887, 0.09208200126886368, 0.053752463310956955, 0.016467519104480743, 0.010000402107834816, 0.017887894064188004, 0.02435637265443802, -0.014350295066833496, -0.08589190989732742, -0.006933859083801508, 0.1298609972000122, 0.027646880596876144, 0.04127250239253044, 0.013248836621642113, -0.04125351831316948, 0.08765199780464172, 0.17516882717609406, 0.013442369177937508, -0.20506484806537628, -0.06488820165395737, 0.0686659887433052, -0.08813467621803284, -0.10374542325735092, -0.021716099232435226, 0.04023343697190285, -0.1762947142124176, 0.02770446240901947, -0.025082001462578773, 0.0983029454946518, -0.12493812292814255, -0.01920684240758419, 0.0476171039044857, 0.06939635425806046, -0.018209589645266533, 0.0625329241156578, -0.17832936346530914, 0.13725855946540833, 0.012600419111549854, 0.07603015750646591, -0.0920197069644928, 0.0829358845949173, 0.010243658907711506, -0.008985995315015316, 0.14880549907684326, -0.002428766805678606, -0.056611087173223495, -0.10275979340076447, -0.09291432052850723, -0.01180565357208252, 0.11795864999294281, -0.11873860657215118, 0.09995509684085846, -0.017298342660069466, -0.043639615178108215, 0.0016699014231562614, -0.12897762656211853, -0.1380222588777542, -0.17400150001049042, 0.041601065546274185, -0.12252611666917801, 0.04249255359172821, -0.10634490847587585, -0.05313412845134735, -0.058118730783462524, 0.19448153674602509, -0.2263878583908081, -0.07106572389602661, -0.1503530591726303, -0.06515897810459137, 0.11819497495889664, -0.042735762894153595, 0.08508200198411942, 0.017862383276224136, 0.19214710593223572, 0.010283242911100388, -0.013114631175994873, 0.10883224755525589, -0.10211063176393509, -0.21299202740192413, -0.10015871375799179, 0.13945214450359344, 0.13517092168331146, 0.038856618106365204, 0.002108179498463869, 0.030881604179739952, -0.006152692716568708, -0.11462404578924179, 0.028862472623586655, 0.18585458397865295, 0.10306477546691895, 0.03526908904314041, -0.03260820358991623, -0.14471980929374695, -0.08779244124889374, -0.045098960399627686, 0.017435450106859207, 0.19264571368694305, -0.07120641320943832, 0.17354503273963928, 0.15474873781204224, -0.053835928440093994, -0.20943360030651093, 0.03015606477856636, 0.036211419850587845, 0.0007652041967958212, 0.05587008595466614, -0.19489167630672455, 0.0909743532538414, 0.0033501458819955587, -0.057322751730680466, 0.12121490389108658, -0.17501963675022125, -0.15013514459133148, 0.07031099498271942, 0.07301220297813416, -0.17921873927116394, -0.12142012268304825, -0.09439031779766083, -0.04026462882757187, -0.11460573226213455, 0.07970702648162842, -0.016233494505286217, 0.010252374224364758, 0.032961323857307434, 0.018216567113995552, 0.010428756475448608, -0.04740371182560921, 0.1864585429430008, -0.003947122488170862, 0.04788469523191452, -0.07597782462835312, -0.06253167986869812, 0.045070283114910126, -0.06455249339342117, 0.0716865211725235, -0.00903246272355318, 0.006079745013266802, -0.1052967831492424, -0.06088602915406227, -0.03328738734126091, 0.02272024378180504, -0.07930614799261093, -0.09432698786258698, -0.03726235777139664, 
0.10006307810544968, 0.09058371931314468, -0.03892482817173004, -0.06462740153074265, -0.08978539705276489, 0.028800709173083305, 0.21877005696296692, 0.177296444773674, 0.05685123801231384, -0.066028892993927, -0.00540707865729928, -0.01588953658938408, 0.053271859884262085, -0.2026120126247406, 0.0566285103559494, 0.035300228744745255, 0.033545590937137604, 0.11711569130420685, -0.026464059948921204, -0.16407892107963562, -0.048686347901821136, 0.05304291099309921, -0.07358507066965103, -0.17289869487285614, 0.014132710173726082, 0.07088939845561981, -0.1477956771850586, -0.023786291480064392, 0.04775075986981392, -0.017420068383216858, -0.03159533068537712, 0.006238185800611973, 0.08124099671840668, 0.01671770215034485, 0.09224288910627365, 0.053469255566596985, 0.09704500436782837, -0.10683690756559372, 0.06699982285499573, 0.07745448499917984, -0.10474617779254913, 0.03967198729515076, 0.0603945255279541, -0.06895622611045837, -0.03619396686553955, 0.033563096076250076, 0.08692663908004761, 0.04178347438573837, -0.060071151703596115, 0.0073408023454248905, -0.10486608743667603, 0.06092875450849533, 0.1210157498717308, 0.04285310208797455, 0.0076990588568151, 0.036018576472997665, 0.04045969620347023, -0.09288305044174194, 0.12451037764549255, 0.04114879295229912, 0.028287222608923912, -0.05418051406741142, -0.028997255489230156, 0.03649618849158287, -0.03188192844390869, -0.01566455140709877, -0.04152749106287956, -0.06663620471954346, -0.010323094204068184, -0.16889281570911407, 0.006573607679456472, -0.05270812287926674, 0.008401375263929367, 0.021295055747032166, -0.03304858133196831, 0.005127503536641598, 0.019244063645601273, -0.07131489366292953, -0.052214257419109344, -0.006754601374268532, 0.10161449760198593, -0.17169132828712463, 0.014349433593451977, 0.0744767114520073, -0.12469461560249329, 0.08815638720989227, 0.018520260229706764, 0.0005999338463880122, 0.03465453162789345, -0.13307695090770721, 0.043367430567741394, -0.006723123602569103, 0.011691853404045105, 0.048354603350162506, -0.21661832928657532, -0.0025545719545334578, -0.04856108874082565, -0.055710889399051666, -0.006375120021402836, -0.02562650851905346, -0.11432337760925293, 0.10399775207042694, 0.010540200397372246, -0.0755159854888916, -0.02542583830654621, 0.037674929946660995, 0.0969945415854454, -0.03298725560307503, 0.16065140068531036, -0.01863807439804077, 0.06254526972770691, -0.1797095239162445, -0.018202031031250954, -0.01975269988179207, 0.023043567314743996, -0.03248249739408493, -0.008440588600933552, 0.05180126056075096, -0.023841936141252518, 0.20870842039585114, -0.022057142108678818, 0.033427316695451736, 0.06674833595752716, -0.021141132339835167, -0.02877473458647728, 0.1086326614022255, 0.054397158324718475, 0.012029323726892471, 0.03175004944205284, 0.006914193741977215, -0.04090225324034691, -0.004564614500850439, -0.1556052416563034, 0.07673801481723785, 0.17203287780284882, 0.0805397778749466, -0.00828546192497015, 0.06094660609960556, -0.11003988236188889, -0.11399497091770172, 0.10722645372152328, -0.05822233483195305, -0.014757114462554455, -0.05772337689995766, 0.14011409878730774, 0.15646083652973175, -0.19130073487758636, 0.06022409349679947, -0.06736859679222107, -0.04819837212562561, -0.10633485019207001, -0.17335662245750427, -0.061282314360141754, -0.0583864226937294, -0.01613355241715908, -0.05076048895716667, 0.06713438034057617, 0.08348768949508667, 0.02054755762219429, 0.016258614137768745, 0.0817527249455452, -0.02199946530163288, 0.007656866684556007, 
0.034995537251234055, 0.06331320106983185, 0.0073803807608783245, -0.04667557775974274, 0.009565448388457298, 0.0006085589993745089, 0.035281602293252945, 0.04957476258277893, 0.037472013384103775, -0.026353945955634117, 0.007689491845667362, -0.02916470356285572, -0.11019428819417953, 0.04115133360028267, -0.026625385507941246, -0.06341774761676788, 0.1439228504896164, 0.031860120594501495, -0.008713874034583569, -0.025656426325440407, 0.25211021304130554, -0.07529866695404053, -0.08892348408699036, -0.1387489140033722, 0.13557645678520203, -0.031552400439977646, 0.06481313705444336, 0.037692490965127945, -0.11259825527667999, 0.03179538995027542, 0.1362704634666443, 0.1458069533109665, -0.049145035445690155, 0.019655266776680946, 0.013711978681385517, 0.0032459446229040623, -0.04005579650402069, 0.04973040521144867, 0.06590425968170166, 0.12457112967967987, -0.05082963407039642, 0.08012272417545319, -0.0028764382004737854, -0.10040896385908127, -0.02852385863661766, 0.12230420112609863, -0.003029873361811042, 0.019506774842739105, -0.0761401429772377, 0.12728425860404968, -0.043905097991228104, -0.2665610611438751, 0.06613168120384216, -0.0650629922747612, -0.14912083745002747, -0.022557994350790977, 0.05126400291919708, -0.008650023490190506, 0.026705266907811165, 0.06785756349563599, -0.0670214518904686, 0.18420551717281342, 0.03873218223452568, -0.05507900193333626, -0.058854296803474426, 0.07306438684463501, -0.09833692759275436, 0.2929907441139221, 0.00751500902697444, 0.05993965268135071, 0.09920700639486313, -0.029096059501171112, -0.13847678899765015, 0.031734831631183624, 0.08172675222158432, -0.07410130649805069, 0.055994872003793716, 0.21827135980129242, -0.008840959519147873, 0.11804516613483429, 0.07454971224069595, -0.09561564773321152, 0.05016838759183884, -0.10613930225372314, -0.09673135727643967, -0.08329153805971146, 0.09532807767391205, -0.05763502046465874, 0.14755868911743164, 0.1186022087931633, -0.04606860503554344, 0.02281493879854679, -0.018614748492836952, 0.048749152570962906, 0.0023650694638490677, 0.12439922988414764, 0.020209291949868202, -0.19710010290145874, 0.026845410466194153, -0.008902255445718765, 0.10291280597448349, -0.2202581763267517, -0.09718955308198929, 0.04764820635318756, 0.0019112902227789164, -0.05895697697997093, 0.12370198965072632, 0.055919989943504333, 0.04170476272702217, -0.04714735969901085, -0.028212912380695343, -0.002841046778485179, 0.16146929562091827, -0.11127673834562302, 0.0008471902110613883 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# log_classifier

This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2781
- Accuracy: 0.9195

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 38   | 0.3269          | 0.8792   |
| No log        | 2.0   | 76   | 0.2781          | 0.9195   |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
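For reference, the hyperparameters listed in this card map onto the 🤗 Trainer API roughly as sketched below. This is only a reconstruction of the configuration, not the training script actually used for this model; the output directory and the surrounding dataset, tokenizer, and model setup are assumptions.

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="log_classifier",      # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    # The listed betas=(0.9, 0.999) and epsilon=1e-08 correspond to the Trainer's
    # default Adam optimizer settings, so they need no explicit arguments here.
)
```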
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "google-bert/bert-base-uncased", "model-index": [{"name": "log_classifier", "results": []}]}
text-classification
SzymonSt2808/log_classifier
[ "transformers", "tensorboard", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T19:44:20+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #bert #text-classification #generated_from_trainer #base_model-google-bert/bert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
log\_classifier
===============

This model is a fine-tuned version of google-bert/bert-base-uncased on the None dataset.
It achieves the following results on the evaluation set:

* Loss: 0.2781
* Accuracy: 0.9195

Model description
-----------------

More information needed

Intended uses & limitations
---------------------------

More information needed

Training and evaluation data
----------------------------

More information needed

Training procedure
------------------

### Training hyperparameters

The following hyperparameters were used during training:

* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2

### Training results

### Framework versions

* Transformers 4.35.2
* Pytorch 2.1.0+cu121
* Datasets 2.17.0
* Tokenizers 0.15.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #bert #text-classification #generated_from_trainer #base_model-google-bert/bert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ 72, 98, 4, 33 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #bert #text-classification #generated_from_trainer #base_model-google-bert/bert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ -0.09293235093355179, 0.11133982241153717, -0.0024531749077141285, 0.11330931633710861, 0.14639820158481598, 0.015315143391489983, 0.16406910121440887, 0.11230919510126114, -0.04447007179260254, 0.047885384410619736, 0.12979725003242493, 0.11156237870454788, 0.011531769298017025, 0.12557297945022583, -0.0651460513472557, -0.21441759169101715, 0.0041813901625573635, 0.018431808799505234, -0.06370221078395844, 0.11802481859922409, 0.08222868293523788, -0.12503397464752197, 0.09688206762075424, -0.014295254833996296, -0.15257102251052856, 0.009374433197081089, 0.02917054109275341, -0.048613764345645905, 0.1305297464132309, 0.032913658767938614, 0.12908005714416504, 0.030945057049393654, 0.08231308311223984, -0.19470861554145813, 0.01158109214156866, 0.06416472792625427, -0.007045847363770008, 0.08688025921583176, 0.036355629563331604, 0.0018166606314480305, 0.04355720803141594, -0.08005909621715546, 0.061783481389284134, 0.023780226707458496, -0.12699472904205322, -0.2214139699935913, -0.08097031712532043, 0.031153837218880653, 0.08482255786657333, 0.07752372324466705, -0.01391487754881382, 0.12393099814653397, -0.05672109127044678, 0.08978977799415588, 0.22017447650432587, -0.3191046118736267, -0.0633382499217987, 0.04675597324967384, 0.02754354476928711, 0.09116342663764954, -0.09508328884840012, -0.004826701246201992, 0.05600231885910034, 0.01877686381340027, 0.12203219532966614, -0.02243776246905327, -0.05641230568289757, 0.006658654659986496, -0.13397818803787231, -0.016258271411061287, 0.14959952235221863, 0.06061103194952011, -0.04303477331995964, -0.0387975350022316, -0.07422911375761032, -0.13084280490875244, -0.044127415865659714, -0.009912763722240925, 0.06224937364459038, -0.01907348446547985, -0.08181317150592804, -0.03021584078669548, -0.0983218252658844, -0.07619597017765045, -0.058339014649391174, 0.14168396592140198, 0.03205380588769913, 0.004672188311815262, -0.014413951896131039, 0.10289266705513, -0.02551320008933544, -0.14061753451824188, 0.022712284699082375, 0.030815955251455307, 0.006360352970659733, -0.04617268964648247, -0.04740002751350403, -0.06753737479448318, 0.02977384254336357, 0.1590358167886734, -0.04044228047132492, 0.057049885392189026, -0.004708202090114355, 0.04874659329652786, -0.1100507453083992, 0.18188244104385376, -0.047834612429142, -0.0357341393828392, 0.03351140022277832, 0.09073993563652039, 0.06030258908867836, -0.011097544804215431, -0.13124574720859528, 0.03170182555913925, 0.10764455050230026, 0.01968720555305481, -0.043543435633182526, 0.07765417546033859, -0.06376627087593079, -0.00801969226449728, 0.026640446856617928, -0.08578451722860336, 0.036418717354536057, 0.013593211770057678, -0.05800289660692215, -0.07187260687351227, 0.040461212396621704, 0.02327335812151432, 0.013185796327888966, 0.10208919644355774, -0.08640851825475693, 0.010629533790051937, -0.0803651213645935, -0.1189388856291771, 0.022285504266619682, -0.08442945033311844, 0.01949777826666832, -0.11368398368358612, -0.17060013115406036, -0.013346359133720398, 0.06498366594314575, -0.03301157057285309, -0.03450959175825119, -0.05734561011195183, -0.06688838452100754, 0.01291088480502367, -0.02027767337858677, 0.08080741763114929, -0.06640477478504181, 0.09998875856399536, 0.04073711112141609, 0.06832735985517502, -0.053313661366701126, 0.03940509632229805, -0.10436836630105972, 0.04162922129034996, -0.18686296045780182, 0.030276091769337654, -0.07269377261400223, 0.07742691040039062, -0.0894189402461052, -0.06265918910503387, -0.008041063323616982, 0.00009733180195325986, 
0.07846365123987198, 0.10478898882865906, -0.1628415286540985, -0.06984569132328033, 0.15773840248584747, -0.07556591182947159, -0.1515415459871292, 0.13907943665981293, -0.053309690207242966, 0.05331386625766754, 0.07059917598962784, 0.18695536255836487, 0.06256479024887085, -0.08883655071258545, 0.010993228293955326, -0.007206704467535019, 0.07200352102518082, -0.03431273251771927, 0.06773804128170013, -0.00949716567993164, -0.012905877083539963, 0.016024546697735786, -0.05850556865334511, 0.06181911751627922, -0.0801657885313034, -0.08643276989459991, -0.03874468058347702, -0.10565056651830673, 0.06303120404481888, 0.05185398831963539, 0.0697556659579277, -0.11092332005500793, -0.09360712021589279, 0.05296981334686279, 0.058814357966184616, -0.07097343355417252, 0.020921528339385986, -0.07009196281433105, 0.101033516228199, -0.06793174892663956, -0.008002504706382751, -0.14807195961475372, -0.06734414398670197, 0.02558062970638275, -0.012991000898182392, 0.021639689803123474, -0.01763765513896942, 0.07775869220495224, 0.07105796039104462, -0.07249193638563156, -0.028982345014810562, -0.009701958857476711, 0.011016881093382835, -0.12938816845417023, -0.19846108555793762, -0.0075053866021335125, -0.025845272466540337, 0.14157453179359436, -0.22871580719947815, 0.04776183143258095, 0.0006013872916810215, 0.09433753043413162, 0.04206518828868866, -0.014508333057165146, -0.03898623213171959, 0.06008807569742203, -0.039874885231256485, -0.06844004988670349, 0.06555216759443283, 0.006346642505377531, -0.10625771433115005, -0.031634371727705, -0.1362432986497879, 0.19473187625408173, 0.13873498141765594, -0.09667939692735672, -0.07055378705263138, 0.009739243425428867, -0.03314025700092316, -0.027775269001722336, -0.04170951992273331, -0.00043739553075283766, 0.14776037633419037, -0.002218567533418536, 0.15826115012168884, -0.08820371329784393, -0.031043313443660736, 0.023039115592837334, -0.04560324549674988, 0.009506570175290108, 0.11419478058815002, 0.0989275649189949, -0.10863763093948364, 0.14858859777450562, 0.1907540112733841, -0.0854497179389, 0.12858711183071136, -0.04071085527539253, -0.047931455075740814, -0.023961851373314857, 0.0048709227703511715, 0.009374501183629036, 0.11034272611141205, -0.13173852860927582, -0.004093734081834555, 0.00694170081987977, 0.011604084633290768, 0.011144289746880531, -0.21861562132835388, -0.02561865746974945, 0.03599900007247925, -0.05757780373096466, -0.0112252626568079, -0.029048403725028038, -0.013340294361114502, 0.09785860031843185, 0.010087438859045506, -0.07971011847257614, 0.048021312803030014, -0.0031062387861311436, -0.08317354321479797, 0.20089496672153473, -0.07960015535354614, -0.13943107426166534, -0.1375044733285904, -0.0662040188908577, -0.05040432885289192, 0.0324772484600544, 0.06170264631509781, -0.07656678557395935, -0.04158898442983627, -0.10531588643789291, 0.005531130824238062, 0.03975095599889755, 0.026128940284252167, 0.010930098593235016, -0.004882704000920057, 0.09548109024763107, -0.09538068622350693, -0.008813824504613876, -0.038571953773498535, -0.05863815173506737, 0.02641458436846733, 0.021418511867523193, 0.11183171719312668, 0.1336458921432495, -0.02332754246890545, -0.005321017932146788, -0.025044945999979973, 0.2308378964662552, -0.06375622749328613, -0.01366066001355648, 0.1202554777264595, -0.024095596745610237, 0.05036346614360809, 0.13938893377780914, 0.06618531793355942, -0.101066954433918, 0.022388974204659462, 0.04300948977470398, -0.03137674927711487, -0.221758633852005, -0.02508009411394596, 
-0.029625510796904564, 0.003010388230904937, 0.09770563989877701, 0.0349569246172905, 0.013850550167262554, 0.06641960889101028, 0.020905861631035805, 0.08848615735769272, -0.011211227625608444, 0.06423331797122955, 0.12180852144956589, 0.036893315613269806, 0.1261090785264969, -0.04282877966761589, -0.054979946464300156, 0.03899213299155235, -0.00458105094730854, 0.20159141719341278, 0.031022369861602783, 0.15605732798576355, 0.04421624541282654, 0.15700863301753998, 0.0002581025182735175, 0.05401136726140976, -0.0060158115811645985, -0.039147909730672836, -0.015562031418085098, -0.047671444714069366, -0.031814008951187134, 0.0339074470102787, -0.07035500556230545, 0.051590677350759506, -0.0959276556968689, -0.0014094163198024035, 0.06110822409391403, 0.2451987862586975, 0.04698392376303673, -0.32201266288757324, -0.08765741437673569, 0.029176075011491776, -0.025832481682300568, -0.01955893076956272, 0.032029055058956146, 0.12508152425289154, -0.04868847876787186, 0.03210573270916939, -0.07340489327907562, 0.08733313530683517, -0.04797414690256119, 0.04843149334192276, 0.06876246631145477, 0.07720188796520233, -0.010576630011200905, 0.06558694690465927, -0.2675679326057434, 0.27211013436317444, 0.017909908667206764, 0.05945521965622902, -0.04142458364367485, -0.006495608016848564, 0.03752744570374489, 0.08721166104078293, 0.06858138740062714, -0.015607373788952827, -0.029793091118335724, -0.2005442976951599, -0.07525350153446198, 0.03107847273349762, 0.09732275456190109, -0.05638549104332924, 0.09868154674768448, -0.03448449447751045, -0.006384760607033968, 0.08007968217134476, -0.0005743230576626956, -0.08452501147985458, -0.08547701686620712, -0.018102824687957764, 0.03803804889321327, -0.014330332167446613, -0.08167028427124023, -0.10372801870107651, -0.14169202744960785, 0.1508103460073471, -0.05327468365430832, -0.029116526246070862, -0.09890755265951157, 0.06519526243209839, 0.057162150740623474, -0.08068486303091049, 0.04322652891278267, 0.0013952285517007113, 0.0746755301952362, 0.03086341731250286, -0.06131754070520401, 0.1400238424539566, -0.06717930734157562, -0.18263880908489227, -0.08082186430692673, 0.10979630798101425, 0.013595105148851871, 0.046999115496873856, -0.0131917679682374, 0.013176938518881798, -0.016971619799733162, -0.07294272631406784, 0.03645056486129761, -0.00390725489705801, 0.047490958124399185, 0.022584691643714905, -0.059453271329402924, 0.015453034080564976, -0.05999162793159485, -0.03033154457807541, 0.14517658948898315, 0.28986799716949463, -0.08445749431848526, 0.005679506808519363, 0.0640253946185112, -0.060976866632699966, -0.21176959574222565, 0.03688117116689682, 0.029388029128313065, -0.0006537970621138811, 0.04615962505340576, -0.15573279559612274, 0.1047150120139122, 0.09753585606813431, -0.03709351271390915, 0.11968527734279633, -0.2672717571258545, -0.1429300457239151, 0.12191219627857208, 0.1528632640838623, 0.12713339924812317, -0.16604860126972198, -0.04961562901735306, -0.037987224757671356, -0.10012708604335785, 0.10960659384727478, -0.13136087357997894, 0.11113027483224869, -0.0024111371021717787, 0.05484393984079361, 0.007357792463153601, -0.054186414927244186, 0.13060595095157623, -0.021657882258296013, 0.110022634267807, -0.05956033617258072, -0.029919179156422615, 0.04390580952167511, -0.057008810341358185, 0.014130419120192528, -0.10275031626224518, 0.04259456321597099, -0.042383160442113876, -0.028497397899627686, -0.04798230156302452, 0.03181037679314613, -0.037187397480010986, -0.04963258281350136, -0.0378902293741703, 
0.02414815127849579, 0.03417655825614929, -0.004078260622918606, 0.17258024215698242, 0.008692942559719086, 0.14124901592731476, 0.13730356097221375, 0.08150292187929153, -0.0739896297454834, -0.01849963329732418, -0.005178878549486399, -0.04444199427962303, 0.06518767774105072, -0.16755102574825287, 0.046351127326488495, 0.11599975824356079, 0.008925158530473709, 0.14406709372997284, 0.06951858103275299, -0.037265896797180176, 0.013595487922430038, 0.06423178315162659, -0.16307464241981506, -0.12256176024675369, -0.014306627213954926, -0.058743804693222046, -0.12372718006372452, 0.057742465287446976, 0.12577755749225616, -0.06840634346008301, 0.006650812923908234, -0.003153209574520588, 0.011118707247078419, -0.02917655184864998, 0.17860504984855652, 0.07301107794046402, 0.04101411625742912, -0.08444638550281525, 0.09124813973903656, 0.05865972861647606, -0.06637660413980484, 0.010499073192477226, 0.039804857224226, -0.08409600704908371, -0.04811599478125572, 0.03669968619942665, 0.20850123465061188, -0.017151786014437675, -0.048477884382009506, -0.15580546855926514, -0.1076449528336525, 0.0515223927795887, 0.17106066644191742, 0.09572945535182953, 0.012796970084309578, -0.02613840624690056, 0.018673451617360115, -0.09028934687376022, 0.12051532417535782, 0.03367964178323746, 0.08583427220582962, -0.15850958228111267, 0.09971938282251358, 0.0029153572395443916, 0.008270032703876495, -0.029397590085864067, 0.05104171857237816, -0.12843742966651917, -0.00795762985944748, -0.14321114122867584, -0.005607332568615675, -0.007890219800174236, 0.00868042092770338, 0.0030924389138817787, -0.07063166052103043, -0.06646428257226944, 0.006666373461484909, -0.10399244725704193, -0.027999086305499077, 0.04075723886489868, 0.054667189717292786, -0.1250367909669876, -0.04452918842434883, 0.03577091544866562, -0.06864625215530396, 0.07582425326108932, 0.020735478028655052, 0.0255067590624094, 0.05098343268036842, -0.1987060308456421, 0.026575006544589996, 0.052623867988586426, 0.013335462659597397, 0.04518261179327965, -0.08613839000463486, -0.020294515416026115, -0.009577321819961071, 0.05172001197934151, 0.017502807080745697, 0.0979895144701004, -0.12859289348125458, 0.0038709600921720266, -0.0206014271825552, -0.06556100398302078, -0.04459334909915924, 0.024201463907957077, 0.07743774354457855, 0.007576090283691883, 0.20475009083747864, -0.10354882478713989, 0.015362917445600033, -0.21209010481834412, 0.01027145329862833, -0.000429615582106635, -0.11724939942359924, -0.12731029093265533, -0.059037480503320694, 0.05202597752213478, -0.06193327158689499, 0.14266380667686462, 0.019306281581521034, 0.040049802511930466, 0.03521720692515373, -0.02512066438794136, 0.04279288277029991, 0.02674049884080887, 0.2024928778409958, 0.03459681570529938, -0.036077968776226044, 0.02238302305340767, 0.03357730433344841, 0.11471500992774963, 0.06852211803197861, 0.15648327767848969, 0.15819670259952545, -0.04635777696967125, 0.09964225441217422, 0.05110817402601242, -0.05234059691429138, -0.1370069533586502, 0.06874901801347733, -0.05408759042620659, 0.10652691125869751, -0.025659047067165375, 0.20442937314510345, 0.08492489159107208, -0.1521936058998108, 0.013893760740756989, -0.05721563473343849, -0.07561366260051727, -0.11336801946163177, -0.06030626222491264, -0.10400118678808212, -0.1621764898300171, 0.001611848478205502, -0.101788729429245, 0.001460443832911551, 0.09947189688682556, -0.004980454221367836, -0.017731554806232452, 0.17145685851573944, 0.008841385133564472, 0.030551131814718246, 0.05948476120829582, 
0.004142665304243565, -0.04773255065083504, -0.08253877609968185, -0.10605611652135849, 0.006950345356017351, -0.008021825924515724, 0.016877854242920876, -0.04078791290521622, -0.035272981971502304, 0.04238798841834068, -0.00415963726118207, -0.11290303617715836, 0.005558297969400883, 0.025559695437550545, 0.044110581278800964, 0.03453141078352928, 0.006202393677085638, 0.015492054633796215, 0.002886135596781969, 0.22679948806762695, -0.07437188923358917, -0.04532374441623688, -0.09578630328178406, 0.2103717178106308, 0.011941956356167793, 0.012421537190675735, 0.003832993796095252, -0.0985540971159935, 0.0252844151109457, 0.23094794154167175, 0.1879887729883194, -0.09744708985090256, -0.0011813432211056352, -0.00474336976185441, -0.011239711195230484, -0.028422875329852104, 0.09374862909317017, 0.11149456351995468, 0.001109323580749333, -0.07238400727510452, -0.051673322916030884, -0.02495935931801796, -0.00830252468585968, -0.04598027467727661, 0.06403598934412003, 0.03056848980486393, 0.019970405846834183, -0.0649765133857727, 0.05463991314172745, -0.03099309653043747, -0.12146041542291641, 0.05956455320119858, -0.20914530754089355, -0.14922216534614563, -0.023757891729474068, 0.11621933430433273, -0.007612770888954401, 0.04934462532401085, -0.029035115614533424, 0.0038678545970469713, 0.05838087201118469, -0.0328606516122818, -0.07357533276081085, -0.07157047092914581, 0.05670910328626633, -0.10645976662635803, 0.24769370257854462, -0.035336896777153015, 0.038722481578588486, 0.12780070304870605, 0.041702255606651306, -0.07237932831048965, 0.08007445931434631, 0.045542579144239426, -0.07919419556856155, 0.029067890718579292, 0.06948234140872955, -0.03740346431732178, 0.11957899481058121, 0.05989620089530945, -0.11074211448431015, 0.01617325097322464, -0.04334954917430878, -0.09302804619073868, -0.06274577230215073, -0.04157090187072754, -0.06721461564302444, 0.13127672672271729, 0.1833391934633255, -0.038036592304706573, 0.00537380576133728, -0.04592671990394592, 0.019163770601153374, 0.06773164123296738, 0.03419819101691246, -0.03816339001059532, -0.22738036513328552, 0.02714507281780243, 0.07871446758508682, -0.006962907500565052, -0.2875145673751831, -0.07632528245449066, -0.010026550851762295, -0.036895524710416794, -0.09098750352859497, 0.08191515505313873, 0.11730039119720459, 0.05315031111240387, -0.061622168868780136, -0.11211970448493958, -0.06583951413631439, 0.1590355634689331, -0.13197702169418335, -0.09941110014915466 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.7.1
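The card above leaves its quick-start section as [More Information Needed]. A minimal sketch of loading the prefix-tuning adapter recorded for this entry (alitolga/627_bigscience_mt0-base_PrefixTuning on top of bigscience/mt0-base, PEFT 0.7.1) is shown below; the prompt is only a placeholder, since the card does not state the fine-tuning task.

```python
# Hedged sketch: attach the prefix-tuning adapter from this record to its stated base model.
# Assumes transformers and peft (>= 0.7.1) are installed; the prompt is a placeholder.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import PeftModel

base_id = "bigscience/mt0-base"                               # base model from this record's metadata
adapter_id = "alitolga/627_bigscience_mt0-base_PrefixTuning"  # adapter id from this record

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)     # loads the prefix-tuning weights

inputs = tokenizer("Translate to English: Je t'aime.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```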
{"library_name": "peft", "base_model": "bigscience/mt0-base"}
null
alitolga/627_bigscience_mt0-base_PrefixTuning
[ "peft", "safetensors", "arxiv:1910.09700", "base_model:bigscience/mt0-base", "region:us" ]
2024-02-13T19:45:04+00:00
[ "1910.09700" ]
[]
TAGS #peft #safetensors #arxiv-1910.09700 #base_model-bigscience/mt0-base #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.7.1
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ "TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-bigscience/mt0-base #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ 36, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-bigscience/mt0-base #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.7.1" ]
[ -0.09970706701278687, 0.1880176067352295, -0.0032796398736536503, 0.03252210095524788, 0.09286137670278549, 0.016964655369520187, 0.049636390060186386, 0.12481708079576492, -0.02919818088412285, 0.10980280488729477, 0.06970852613449097, 0.09925585240125656, 0.1010303944349289, 0.21307633817195892, 0.011123419739305973, -0.20605313777923584, 0.025960132479667664, -0.09653521329164505, -0.011023713275790215, 0.12187550216913223, 0.15173904597759247, -0.09947776049375534, 0.07772871106863022, -0.017447715625166893, -0.005513359326869249, -0.03462189808487892, -0.0721888393163681, -0.033401209861040115, 0.045596636831760406, 0.04616513475775719, 0.05465865135192871, -0.00023112654162105173, 0.08071602880954742, -0.26404404640197754, 0.019991695880889893, 0.04041823372244835, -0.0052948822267353535, 0.0882648378610611, 0.09872210025787354, -0.03973419964313507, 0.12463592737913132, -0.039647411555051804, 0.14007917046546936, 0.08253129571676254, -0.09257293492555618, -0.22359894216060638, -0.06762063503265381, 0.08625121414661407, 0.171598881483078, 0.07486061751842499, -0.042131584137678146, 0.12553191184997559, -0.10028247535228729, 0.015979789197444916, 0.04895217716693878, -0.08095894753932953, -0.06641662865877151, 0.056603025645017624, 0.10907548666000366, 0.06227802857756615, -0.13505354523658752, -0.031044308096170425, 0.017202287912368774, 0.03516962751746178, 0.08119188994169235, 0.018778959289193153, 0.14997868239879608, 0.030777228996157646, -0.1510625183582306, -0.034437116235494614, 0.1294451653957367, 0.02790471911430359, -0.03722541779279709, -0.23076161742210388, 0.006127171218395233, -0.08773526549339294, -0.029874002560973167, -0.05349263548851013, 0.041789207607507706, -0.0009533475385978818, 0.09749934077262878, -0.028171498328447342, -0.08983287960290909, -0.004314348567277193, 0.08679506182670593, 0.049287099391222, 0.020755866542458534, -0.020460370928049088, 0.0044783600606024265, 0.1177552342414856, 0.048643533140420914, -0.12921522557735443, -0.06436999887228012, -0.0714259147644043, -0.03986718878149986, -0.04758160933852196, 0.037729814648628235, 0.05245962366461754, 0.05917269363999367, 0.25165510177612305, -0.03204511106014252, 0.04940728843212128, 0.06266403198242188, 0.019339874386787415, 0.04669701308012009, 0.09156271815299988, -0.0659230649471283, -0.15488363802433014, -0.008509929291903973, 0.09450967609882355, -0.004104815423488617, -0.020473582670092583, -0.04975953698158264, 0.036872778087854385, 0.03614385426044464, 0.11193212866783142, 0.09667142480611801, -0.013516724109649658, -0.07571376860141754, -0.05177490413188934, 0.2104874849319458, -0.14497290551662445, 0.04119675233960152, 0.02409527637064457, -0.01760074496269226, -0.03966556116938591, 0.004767547827214003, 0.019899308681488037, -0.01727631501853466, 0.10168362408876419, -0.06942452490329742, -0.03699829429388046, -0.1164274662733078, -0.02336697466671467, 0.03610466420650482, 0.01842643693089485, -0.024465259164571762, -0.030313801020383835, -0.056777890771627426, -0.08982069045305252, 0.0980890542268753, -0.0669654980301857, -0.060290247201919556, -0.02539110742509365, -0.09521128237247467, 0.02267340198159218, 0.02842194400727749, 0.10341829061508179, -0.02499539777636528, 0.038472626358270645, -0.015944870188832283, 0.0672866702079773, 0.07934800535440445, 0.03252407908439636, -0.07139413058757782, 0.06129873916506767, -0.20449425280094147, 0.0835590809583664, -0.07718385756015778, 0.029029956087470055, -0.16438184678554535, -0.02120395377278328, 0.004833235405385494, 
0.019923696294426918, 0.03285932540893555, 0.1594921350479126, -0.19814074039459229, -0.03478777036070824, 0.15430989861488342, -0.09684470295906067, -0.12263526022434235, 0.04519303888082504, -0.05231219157576561, 0.16179333627223969, 0.01788218319416046, -0.014508314430713654, 0.08894367516040802, -0.14550119638442993, -0.03180202096700668, -0.02655736729502678, -0.0046742199920117855, 0.1016935184597969, 0.09388162195682526, -0.07686112821102142, 0.03318120166659355, 0.01163145899772644, -0.041199102997779846, -0.027252530679106712, -0.05239841714501381, -0.11079513281583786, 0.00495518371462822, -0.07698605209589005, 0.023910267278552055, -0.01666419953107834, -0.07663103193044662, -0.013300842605531216, -0.1625625044107437, -0.029319699853658676, 0.08247487246990204, 0.01958858221769333, -0.016533462330698967, -0.09397809207439423, 0.0380699522793293, -0.03377417102456093, -0.022870495915412903, -0.14875805377960205, -0.02553252875804901, 0.018693558871746063, -0.14017945528030396, 0.010527866892516613, -0.12255918979644775, 0.06775986403226852, 0.01183881051838398, -0.0682520717382431, -0.034627314656972885, -0.010590373538434505, 0.006513264961540699, -0.053371328860521317, -0.24164249002933502, -0.021326668560504913, -0.054683566093444824, 0.15501107275485992, -0.21785475313663483, 0.037210796028375626, 0.05195560306310654, 0.12184429913759232, 0.006029097363352776, -0.05840582028031349, 0.03032398223876953, -0.06969554722309113, -0.027428986504673958, -0.06902489811182022, -0.005837097764015198, -0.011450139805674553, -0.05085735023021698, 0.017257867380976677, -0.12066929787397385, -0.03587270900607109, 0.1008659303188324, 0.0716373547911644, -0.16089732944965363, -0.024322988465428352, -0.04435553774237633, -0.06145375221967697, -0.082353875041008, -0.061544351279735565, 0.10226962715387344, 0.05111978203058243, 0.03303086385130882, -0.07315544039011002, -0.06901729106903076, 0.006074552424252033, -0.020233429968357086, -0.025845319032669067, 0.11911454051733017, 0.07514772564172745, -0.12093621492385864, 0.10077179223299026, 0.08068756014108658, 0.03176617994904518, 0.07922183722257614, -0.02324594371020794, -0.10687272995710373, -0.03373356908559799, 0.037547994405031204, 0.015970105305314064, 0.15397107601165771, -0.06681742519140244, 0.058108072727918625, 0.04879069700837135, -0.036680400371551514, 0.046794891357421875, -0.08976394683122635, 0.010672517120838165, 0.008707747794687748, -0.011387593112885952, 0.018767094239592552, -0.019234998151659966, 0.011594166979193687, 0.09068144112825394, 0.05653935670852661, 0.03839952126145363, 0.03147292137145996, -0.028748605400323868, -0.12913621962070465, 0.18572215735912323, -0.10043260455131531, -0.2367268204689026, -0.15987688302993774, 0.056840699166059494, 0.05512991175055504, -0.01902272179722786, 0.023857977241277695, -0.056326158344745636, -0.10463690757751465, -0.08060819655656815, 0.007481089793145657, 0.03327825665473938, -0.05569979175925255, -0.07431279867887497, 0.05081402137875557, 0.03718314692378044, -0.11910944432020187, 0.036542925983667374, 0.05765248090028763, -0.016850918531417847, 0.005092555191367865, 0.0626993477344513, 0.08278017491102219, 0.17838738858699799, -0.00817374512553215, -0.009174015372991562, 0.0520445741713047, 0.2845333516597748, -0.15961404144763947, 0.11615908890962601, 0.13246183097362518, -0.06112437695264816, 0.07662465423345566, 0.19349145889282227, 0.03648621588945389, -0.09890075027942657, 0.035487640649080276, 0.024674123153090477, -0.028607355430722237, -0.26848942041397095, 
-0.05064081773161888, -0.01406821794807911, -0.09766728430986404, 0.07136162370443344, 0.09057067334651947, 0.08180148154497147, 0.03917955607175827, -0.06943166255950928, -0.09911816567182541, 0.03208243101835251, 0.10244447737932205, -0.02498186193406582, 0.005974580068141222, 0.08577533811330795, -0.03298202157020569, 0.006509842351078987, 0.09652547538280487, -0.01313635054975748, 0.1583964228630066, 0.05806413292884827, 0.11883989721536636, 0.07875194400548935, 0.08012039214372635, 0.0022733286023139954, 0.02758602611720562, 0.011410696431994438, 0.01766182668507099, 0.011394613422453403, -0.0865095853805542, 0.03632853180170059, 0.11424865573644638, 0.04415161535143852, 0.03625446557998657, 0.012537756003439426, -0.04105934128165245, 0.053247515112161636, 0.17606940865516663, 0.00974822323769331, -0.19790948927402496, -0.08119819313287735, 0.06786452233791351, -0.07751011848449707, -0.13282281160354614, -0.018676243722438812, 0.03150571882724762, -0.16511671245098114, 0.013914629817008972, -0.03782041743397713, 0.09853333234786987, -0.07141299545764923, -0.037269044667482376, 0.09035418927669525, 0.06934143602848053, -0.027897140011191368, 0.058432385325431824, -0.2014840692281723, 0.11928361654281616, 0.027061212807893753, 0.06554783880710602, -0.09223612397909164, 0.09859947860240936, 0.0015719735529273748, -0.004818320740014315, 0.17041219770908356, 0.0023446762934327126, -0.07583969086408615, -0.06176105886697769, -0.09908964484930038, -0.014705066569149494, 0.10163475573062897, -0.13617655634880066, 0.0688241645693779, -0.017113778740167618, -0.0342370830476284, 0.003153581405058503, -0.07139869034290314, -0.11806413531303406, -0.1738867461681366, 0.056909672915935516, -0.10204369574785233, 0.03234529867768288, -0.09336725622415543, -0.06456742435693741, 0.008215068839490414, 0.17791806161403656, -0.20305941998958588, -0.0944640040397644, -0.14957641065120697, -0.09332799166440964, 0.16588976979255676, -0.04430030658841133, 0.0879925861954689, 0.0031276741065084934, 0.1694037765264511, 0.013774528168141842, -0.006311940960586071, 0.10810340195894241, -0.09286068379878998, -0.1989690214395523, -0.057307954877614975, 0.17073944211006165, 0.13147994875907898, 0.04620497673749924, -0.01514276210218668, 0.030529487878084183, -0.05576961115002632, -0.11744025349617004, 0.023847274482250214, 0.1399061679840088, 0.07334389537572861, -0.01111523061990738, -0.03938404098153114, -0.09548226743936539, -0.05921252816915512, -0.052340082824230194, 0.017391262575984, 0.19878847897052765, -0.0765790268778801, 0.1691199392080307, 0.12209959328174591, -0.05275176092982292, -0.21092915534973145, 0.05418183282017708, 0.050803348422050476, 0.014480896294116974, 0.03509186953306198, -0.1941242516040802, 0.09392564743757248, 0.005682836286723614, -0.07394372671842575, 0.16195067763328552, -0.17092156410217285, -0.14549198746681213, 0.09063240885734558, 0.03200271725654602, -0.22057868540287018, -0.1453525424003601, -0.09783509373664856, -0.025363709777593613, -0.10497254878282547, 0.068727508187294, 0.005105928983539343, 0.013756698928773403, 0.027907414361834526, 0.018446555361151695, 0.026725972071290016, -0.05253703147172928, 0.20566385984420776, -0.028302378952503204, 0.006233870051801205, -0.0479409284889698, -0.09221111238002777, 0.035014808177948, -0.04718736186623573, 0.10203003883361816, 0.0034816921688616276, 0.02259339578449726, -0.14964738488197327, -0.04227222129702568, -0.05924706533551216, 0.03328561410307884, -0.0975399762392044, -0.09499592334032059, -0.05021630972623825, 
0.10013283789157867, 0.0970805361866951, -0.031561218202114105, 0.007356567308306694, -0.0903957262635231, 0.07417149841785431, 0.19712941348552704, 0.1975347250699997, 0.06694280356168747, -0.06674187630414963, 0.028953038156032562, -0.032983940094709396, 0.0426892526447773, -0.22792011499404907, 0.04470573365688324, 0.05855967476963997, 0.020750857889652252, 0.08617141842842102, -0.006590632721781731, -0.15108947455883026, -0.07295892387628555, 0.0839480608701706, -0.04679834097623825, -0.17205454409122467, -0.027722537517547607, 0.04949919879436493, -0.2140648365020752, -0.04549429938197136, 0.019403928890824318, -0.017095394432544708, -0.04326630383729935, 0.02244923636317253, 0.08535085618495941, -0.01732112281024456, 0.1078013926744461, 0.09293133020401001, 0.09631044417619705, -0.10146819055080414, 0.07121807336807251, 0.0750582218170166, -0.051436785608530045, 0.023571889847517014, 0.11597426235675812, -0.042804595082998276, -0.03614803031086922, 0.0889359712600708, 0.08116666972637177, 0.025469811633229256, -0.04420782998204231, 0.012107199989259243, -0.06195899099111557, 0.06240798532962799, 0.1172645166516304, 0.029475051909685135, -0.009586624801158905, 0.0562325194478035, 0.03230256214737892, -0.09919240325689316, 0.11044453829526901, 0.04451320320367813, 0.021234221756458282, -0.04070017859339714, -0.03218388929963112, -0.015125266276299953, -0.013397876173257828, -0.022268647328019142, -0.002246198942884803, -0.09602329134941101, -0.011229162104427814, -0.08831340074539185, 0.024286050349473953, -0.07294435054063797, 0.009284865111112595, 0.024356506764888763, -0.0521586537361145, 0.0029764524661004543, 0.002803004579618573, -0.07675144821405411, -0.047760576009750366, -0.01910528540611267, 0.08796778321266174, -0.12593959271907806, 0.03342410922050476, 0.07663857191801071, -0.10710355639457703, 0.07803954184055328, -0.004180698189884424, 0.010243389755487442, 0.004357768222689629, -0.16117820143699646, 0.055061519145965576, -0.019930213689804077, -0.011882545426487923, 0.017259584739804268, -0.21626858413219452, -0.00858068186789751, -0.04902093857526779, -0.05189770087599754, 0.012165626510977745, -0.0330248698592186, -0.12450410425662994, 0.1000291109085083, -0.008525132201611996, -0.06605640798807144, -0.019710754975676537, 0.03112913854420185, 0.09855234622955322, -0.027926286682486534, 0.13407890498638153, -0.02192608267068863, 0.07542283833026886, -0.17204199731349945, -0.0038187094032764435, -0.02039450779557228, 0.03397654742002487, -0.025934133678674698, -0.021816467866301537, 0.05856834724545479, -0.019021175801753998, 0.1830398589372635, -0.028004968538880348, 0.06819328665733337, 0.06116953492164612, 0.015185668133199215, 0.017295662313699722, 0.08777481317520142, 0.061120811849832535, 0.0033325161784887314, -0.006113908253610134, 0.03753207251429558, -0.00620339484885335, -0.037533070892095566, -0.15353138744831085, 0.07670847326517105, 0.16074706614017487, 0.049817513674497604, 0.01944517344236374, 0.030900437384843826, -0.11464765667915344, -0.07312017679214478, 0.11917223781347275, -0.006933846045285463, -0.04087993502616882, -0.07261873036623001, 0.18034902215003967, 0.13930116593837738, -0.19915615022182465, 0.07629747688770294, -0.05708786100149155, -0.04776613041758537, -0.1367461383342743, -0.15106093883514404, -0.06602363288402557, -0.0373213104903698, -0.02504006214439869, -0.06830063462257385, 0.04868239909410477, 0.057119689881801605, 0.009342510253190994, -0.016918299719691277, 0.10561450570821762, 0.00818807352334261, -0.021656176075339317, 
0.04883718863129616, 0.06574098020792007, 0.03123481571674347, -0.0977921411395073, 0.008795672096312046, -0.0023663691245019436, 0.015247374773025513, 0.06376027315855026, 0.020914841443300247, -0.05957821384072304, 0.008299834094941616, -0.022589311003684998, -0.12065114080905914, 0.038399964570999146, -0.017934689298272133, -0.03727000951766968, 0.14428788423538208, 0.025028608739376068, 0.005949738435447216, -0.021658016368746758, 0.2318955510854721, -0.07524336874485016, -0.07217191159725189, -0.14207594096660614, 0.07196655869483948, -0.06736919283866882, 0.0361652635037899, 0.02826823852956295, -0.1139536201953888, 0.012956143356859684, 0.15382476150989532, 0.13540691137313843, -0.015783274546265602, 0.013356857933104038, 0.05035088583827019, 0.0046878173016011715, -0.03196173533797264, 0.019984571263194084, 0.057641059160232544, 0.1496744304895401, -0.07462629675865173, 0.07510922104120255, -0.009107173420488834, -0.07638952136039734, -0.02115754410624504, 0.11306552588939667, -0.0007587762665934861, 0.004847508389502764, -0.06734060496091843, 0.13490094244480133, -0.09194768965244293, -0.22461692988872528, 0.05285736173391342, -0.07967444509267807, -0.15385973453521729, -0.05189809948205948, 0.0191314946860075, -0.016085827723145485, 0.01623571291565895, 0.08047546446323395, -0.051721613854169846, 0.15695683658123016, 0.048113640397787094, -0.05536772683262825, -0.0806153267621994, 0.06420004367828369, -0.12992587685585022, 0.28051382303237915, 0.02257552742958069, 0.043652553111314774, 0.10946892946958542, -0.021963907405734062, -0.1485951542854309, 0.00975669827312231, 0.10790731757879257, -0.07331472635269165, 0.06317361444234848, 0.17822052538394928, 0.002021088497713208, 0.1281358003616333, 0.05917123705148697, -0.0639241635799408, 0.03873462229967117, -0.09462197124958038, -0.05195291340351105, -0.10812472552061081, 0.08125423640012741, -0.08185729384422302, 0.1598132997751236, 0.12511084973812103, -0.07241670042276382, -0.008050436154007912, -0.0226003285497427, 0.08513206988573074, 0.013622744008898735, 0.11444628983736038, 0.010493848472833633, -0.19066646695137024, 0.03599109128117561, 0.005979688838124275, 0.10569039732217789, -0.21124032139778137, -0.0658564493060112, 0.047238606959581375, -0.012370409443974495, -0.0861196368932724, 0.11470377445220947, 0.04864535853266716, 0.03097468987107277, -0.039123356342315674, -0.052621666342020035, 0.004724403377622366, 0.14286325871944427, -0.11613064259290695, -0.005452694371342659 ]
null
null
stable-baselines3
# **PPO** Agent playing **LunarLander-v2** This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) TODO: Add your code ```python from stable_baselines3 import ... from huggingface_sb3 import load_from_hub ... ```
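The usage section above is left as a TODO. A minimal sketch of downloading and evaluating this checkpoint is shown below; the filename inside the repo is an assumption based on the usual stable-baselines3 naming convention, and the environment requires gymnasium with Box2D support.

```python
# Hedged sketch: pull the PPO checkpoint from this record's repo and evaluate it.
# The filename is assumed; verify it against the repository's file listing.
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

checkpoint = load_from_hub(
    repo_id="CarbonBlack8/ppo-LunarLander-v2",
    filename="ppo-LunarLander-v2.zip",  # assumed filename
)
model = PPO.load(checkpoint)

eval_env = gym.make("LunarLander-v2")  # requires gymnasium[box2d]
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```

The record reports a mean reward of 277.17 +/- 24.54, so an evaluation in that neighborhood suggests the checkpoint loaded correctly.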
{"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "277.17 +/- 24.54", "name": "mean_reward", "verified": false}]}]}]}
reinforcement-learning
CarbonBlack8/ppo-LunarLander-v2
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
2024-02-13T19:45:30+00:00
[]
[]
TAGS #stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# PPO Agent playing LunarLander-v2 This is a trained model of a PPO agent playing LunarLander-v2 using the stable-baselines3 library. ## Usage (with Stable-baselines3) TODO: Add your code
[ "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 39, 41, 17 ]
[ "passage: TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 0.03942384943366051, 0.04900386184453964, -0.005304091144353151, 0.026427261531352997, 0.107408307492733, -0.026511888951063156, 0.11188238859176636, 0.0814051404595375, 0.10722193866968155, 0.04762078449130058, 0.08338645845651627, 0.06030960753560066, 0.05080918222665787, 0.2571701407432556, 0.04754156619310379, -0.22987541556358337, 0.036159250885248184, -0.04869936779141426, 0.12395193427801132, 0.07178173214197159, -0.0038484656251966953, -0.06485428661108017, 0.020415637642145157, -0.013290755450725555, 0.05367108806967735, 0.04282612353563309, -0.01716216839849949, -0.08207534998655319, 0.07169748842716217, -0.06345846503973007, 0.06986866891384125, 0.07677983492612839, 0.13218913972377777, -0.17832116782665253, 0.029566360637545586, 0.02571309357881546, -0.07189024239778519, 0.01342033501714468, 0.008019951172173023, 0.05120139941573143, 0.17303818464279175, 0.019879888743162155, 0.07844575494527817, -0.0025605305563658476, -0.15412317216396332, -0.018950799480080605, 0.0436202734708786, 0.12546207010746002, 0.08808347582817078, 0.04605821147561073, 0.01970590092241764, 0.17503218352794647, -0.054352790117263794, -0.028833400458097458, 0.21759237349033356, -0.2881564497947693, -0.031460098922252655, 0.321048766374588, 0.06997483223676682, 0.09725230932235718, -0.07540661096572876, -0.03619609400629997, 0.007783263456076384, -0.013137873262166977, -0.028666524216532707, -0.07447073608636856, 0.17313385009765625, 0.05152064561843872, -0.05057951435446739, -0.09541505575180054, 0.16948209702968597, 0.006921638268977404, 0.0018855923553928733, -0.019282981753349304, 0.009060598909854889, 0.07402525842189789, -0.016097044572234154, -0.07255112379789352, 0.057438433170318604, 0.05330665782094002, 0.019649166613817215, -0.1435653269290924, -0.10762494057416916, -0.022740179672837257, -0.008012006990611553, 0.17786912620067596, -0.009255532175302505, 0.042902372777462006, 0.003065188182517886, 0.10384012013673782, -0.12480384111404419, -0.03354184702038765, -0.0454259067773819, -0.07565800100564957, -0.0223417766392231, -0.02058211714029312, -0.03580251708626747, 0.07184842973947525, 0.11971849203109741, 0.027368178591132164, 0.09350208193063736, 0.047715865075588226, -0.03206788748502731, 0.06343851238489151, 0.05555703118443489, 0.14222665131092072, 0.05807621404528618, 0.012854371219873428, 0.13179877400398254, 0.055213116109371185, 0.033023182302713394, -0.0613492950797081, -0.18252409994602203, 0.07489913702011108, -0.07031869143247604, 0.007941240444779396, 0.12051256000995636, -0.04480670019984245, -0.1183447614312172, -0.037500523030757904, -0.017392054200172424, -0.06224250793457031, -0.025395862758159637, 0.0547584593296051, -0.02883218228816986, -0.03973718360066414, 0.0011496668448671699, 0.09384800493717194, 0.00953749567270279, -0.1752052903175354, 0.03303423151373863, -0.025042934343218803, -0.10782608389854431, 0.009975161403417587, 0.0022444494534283876, 0.03394931182265282, 0.04408763721585274, -0.11822668462991714, -0.30899152159690857, -0.07652641832828522, 0.05490870401263237, -0.06516939401626587, -0.18425025045871735, -0.13193942606449127, 0.02454492449760437, -0.09037084132432938, -0.044885024428367615, -0.12759265303611755, -0.028549788519740105, 0.01743689924478531, 0.011519349180161953, 0.10758619755506516, -0.0106219332665205, -0.012188062071800232, -0.1571401208639145, 0.008273907005786896, -0.20951123535633087, 0.0890483483672142, -0.019150104373693466, 0.037884220480918884, -0.032381169497966766, -0.07404014468193054, 0.030707746744155884, 
0.052499737590551376, -0.01474119070917368, 0.13510210812091827, -0.15592676401138306, -0.03691192343831062, -0.007996266707777977, -0.13611900806427002, -0.04786273464560509, -0.10358831286430359, -0.04357128217816353, 0.13354332745075226, 0.018664736300706863, 0.15356586873531342, -0.08709818124771118, -0.0722038671374321, 0.20489206910133362, -0.010411538183689117, -0.12820468842983246, -0.076752208173275, 0.10165707021951675, 0.021510310471057892, -0.056606587022542953, -0.02523270808160305, -0.1839766949415207, -0.0152357779443264, -0.04550420492887497, -0.047039128839969635, 0.01796751655638218, -0.010888241231441498, 0.13837894797325134, 0.08494598418474197, 0.05018039792776108, -0.06086122244596481, -0.006730288732796907, 0.10779471695423126, 0.08823856711387634, 0.008680110797286034, 0.023406028747558594, -0.05774238705635071, 0.09552932530641556, -0.04003755748271942, -0.0142367510125041, -0.08283266425132751, -0.036246106028556824, -0.026256313547492027, 0.17507147789001465, 0.09440762549638748, 0.2257927656173706, 0.09567736834287643, 0.039160262793302536, 0.031270865350961685, -0.13181598484516144, -0.1425403207540512, -0.0017254541162401438, 0.09020978957414627, -0.14270411431789398, -0.04119925573468208, -0.08974775671958923, -0.17768175899982452, -0.12202505767345428, 0.0006432619411498308, -0.17960017919540405, 0.06390921026468277, 0.05408334732055664, -0.035177867859601974, 0.03272094577550888, 0.13032332062721252, -0.011533179320394993, -0.03967514634132385, 0.0831870287656784, 0.0379033200442791, -0.041234664618968964, -0.021742934361100197, 0.11885567009449005, 0.15673065185546875, 0.13124459981918335, -0.03511447086930275, 0.004914294462651014, 0.07076404243707657, -0.02309088408946991, 0.06539414077997208, 0.0558244064450264, 0.20973342657089233, 0.188301220536232, 0.038996949791908264, 0.008822928182780743, -0.07048165798187256, 0.0855446457862854, -0.0742373839020729, -0.14302679896354675, -0.05579735338687897, 0.08729292452335358, 0.016605578362941742, 0.023469142615795135, 0.08711627870798111, 0.024545932188630104, 0.09132762253284454, 0.15968108177185059, 0.01990218088030815, -0.09659269452095032, -0.050218869000673294, 0.01175848301500082, 0.027713103219866753, 0.04794301092624664, -0.04514073207974434, -0.00937939714640379, 0.017020760104060173, -0.10303554683923721, 0.031789086759090424, -0.1413339376449585, -0.1358717679977417, 0.044326696544885635, 0.003906996920704842, 0.010907664895057678, 0.02786896750330925, -0.0038291432429105043, 0.019039705395698547, 0.04351753741502762, -0.06975466758012772, 0.047416772693395615, -0.024745507165789604, -0.020031947642564774, 0.03340689837932587, -0.057257164269685745, -0.205775648355484, -0.17696654796600342, 0.00013708483311347663, -0.09910997003316879, 0.10194740444421768, 0.018308809027075768, -0.12373185902833939, 0.047737859189510345, -0.05822649225592613, 0.027574289590120316, -0.01875593699514866, -0.049130141735076904, 0.10507171601057053, 0.1525275856256485, -0.016146350651979446, 0.018018173053860664, -0.04865182936191559, -0.10157987475395203, -0.19632206857204437, 0.0691583976149559, 0.04680244252085686, 0.014610917307436466, 0.10669491440057755, 0.018072687089443207, 0.02367905154824257, -0.007674071006476879, -0.016521066427230835, -0.011659215204417706, -0.08781040459871292, 0.31909599900245667, 0.04510033503174782, -0.025173069909214973, 0.02041010931134224, -0.0043001663871109486, -0.028083480894565582, 0.03263787180185318, -0.0985708013176918, -0.07548979669809341, -0.08774089068174362, 
-0.04367410019040108, -0.09784720093011856, 0.053299110382795334, 0.05916472524404526, 0.003188040340319276, -0.07727594673633575, 0.04221395403146744, 0.11369874328374863, -0.0923808291554451, -0.07137343287467957, 0.07477962225675583, 0.0972946360707283, -0.07331304252147675, 0.00012658814375754446, 0.00874367356300354, 0.023951783776283264, 0.037102166563272476, 0.06778035312891006, -0.03966575115919113, 0.08589404821395874, -0.19917890429496765, 0.0372927263379097, 0.106058269739151, 0.023754918947815895, 0.0638108178973198, 0.07643651217222214, -0.1058402881026268, -0.008500572293996811, -0.032518330961465836, -0.21341575682163239, 0.1668180525302887, 0.1355515867471695, 0.06788124144077301, -0.025637222453951836, -0.00461410591378808, -0.0649740919470787, 0.05773647129535675, 0.02723747305572033, -0.14758841693401337, 0.004883295856416225, 0.06064270809292793, 0.026899009943008423, 0.01614922471344471, 0.07971042394638062, 0.014697225764393806, -0.1801026314496994, -0.014406266622245312, 0.10730406641960144, 0.002390873385593295, 0.0053148469887673855, -0.03175045922398567, -0.1755964607000351, 0.0751047357916832, 0.004285442177206278, 0.07233936339616776, -0.1676585078239441, 0.14297930896282196, -0.10089799761772156, 0.07726949453353882, -0.004285062663257122, -0.021311495453119278, 0.02507244050502777, -0.0541163794696331, 0.15163759887218475, 0.01058570109307766, -0.021810131147503853, -0.1200498715043068, -0.1717042326927185, -0.019227758049964905, -0.11788936704397202, -0.11679866164922714, 0.050424277782440186, 0.062185097485780716, 0.04923136904835701, -0.061147067695856094, 0.1518532931804657, -0.047422297298908234, 0.060713399201631546, -0.06893875449895859, -0.06755045056343079, 0.03764858841896057, -0.12588608264923096, -0.08176055550575256, 0.05573027580976486, 0.19166934490203857, 0.15833087265491486, -0.02816431224346161, -0.03472423925995827, -0.047419581562280655, -0.006212298292666674, -0.007802055217325687, 0.0275666993111372, 0.023223137483000755, 0.07315318286418915, -0.07681374251842499, -0.11649256944656372, 0.033787861466407776, -0.06713802367448807, -0.055589709430933, -0.015439179725944996, 0.1513158082962036, 0.04671623185276985, 0.07720734924077988, -0.018946662545204163, 0.03887668624520302, -0.001724981120787561, -0.056474871933460236, 0.16197094321250916, 0.03885216265916824, -0.05193585529923439, 0.06837689876556396, 0.053174007683992386, 0.043745119124650955, 0.03011113777756691, -0.026783017441630363, 0.206032395362854, 0.1980147808790207, 0.014206883497536182, 0.2175983190536499, 0.03177616000175476, -0.03772832080721855, -0.1300560086965561, -0.065880686044693, -0.006372632458806038, 0.03559038043022156, 0.08070417493581772, -0.18207235634326935, -0.015011128038167953, -0.05689644813537598, -0.034518610686063766, -0.15059494972229004, -0.28553900122642517, -0.05957856774330139, 0.20075850188732147, 0.14706264436244965, 0.27519428730010986, -0.10432573407888412, 0.035197313874959946, 0.02663275972008705, -0.04912831634283066, -0.006501141935586929, 0.00018665487004909664, 0.10268618166446686, -0.15421873331069946, 0.1176437959074974, 0.08486983180046082, -0.019002694636583328, 0.01058861706405878, -0.1619086116552353, 0.00936629343777895, -0.12191236019134521, 0.05354422330856323, 0.1400289237499237, -0.048128653317689896, -0.054873593151569366, 0.14033560454845428, -0.024562934413552284, -0.22685599327087402, -0.04648222774267197, -0.043600670993328094, -0.010640020482242107, 0.026607351377606392, -0.1013401448726654, 0.04101909324526787, 
0.1330099105834961, 0.009380043484270573, 0.1147187277674675, 0.11749245226383209, -0.052566803991794586, 0.10792597383260727, 0.2257719188928604, -0.018785694614052773, 0.04689010605216026, -0.12743118405342102, -0.0012336712097749114, -0.028270328417420387, 0.013657891191542149, -0.09504974633455276, -0.09938385337591171, 0.02366873063147068, 0.02872389927506447, 0.009118586778640747, 0.0921793207526207, -0.029922157526016235, 0.0759170651435852, 0.06817561388015747, -0.13014446198940277, -0.16288450360298157, 0.015828335657715797, -0.007344507612287998, 0.08354310691356659, 0.00027861111448146403, 0.08878035843372345, -0.11932205408811569, -0.018093237653374672, -0.03153328225016594, -0.03319635987281799, -0.130486860871315, -0.07138993591070175, 0.06156524643301964, 0.028095467016100883, -0.06602972000837326, 0.1398407518863678, 0.026440169662237167, 0.15942534804344177, 0.049197953194379807, 0.012499804608523846, 0.07227300107479095, -0.05345509201288223, 0.1283530443906784, 0.13818155229091644, -0.00868943240493536, -0.05460423603653908, -0.1013643890619278, -0.10236792266368866, 0.08925779908895493, -0.05773641914129257, 0.07476430386304855, -0.14885357022285461, -0.06675903499126434, 0.015772046521306038, 0.016141414642333984, -0.09562095999717712, 0.02571965754032135, -0.01625603251159191, -0.18119946122169495, 0.056570518761873245, -0.048285093158483505, 0.0440407395362854, -0.06347788125276566, -0.1110161691904068, -0.17226378619670868, 0.06091433763504028, 0.08593481779098511, -0.053876690566539764, -0.12229149043560028, 0.011023230850696564, -0.00012518465518951416, -0.06341652572154999, -0.05023367330431938, 0.09722746908664703, -0.11020902544260025, 0.031452205032110214, -0.012567701749503613, 0.08853451162576675, -0.03510405123233795, -0.011538895778357983, 0.044220831245183945, -0.08039166033267975, -0.009481523185968399, 0.03534642979502678, -0.026372017338871956, -0.04127239063382149, -0.2689029574394226, 0.0036654395516961813, 0.0341104120016098, 0.02497158572077751, 0.07856601476669312, 0.011906822212040424, 0.021174922585487366, 0.03993808850646019, -0.15396519005298615, -0.013395369984209538, 0.14574195444583893, -0.07689505815505981, -0.022186370566487312, 0.05703273415565491, -0.09054436534643173, 0.013882770203053951, -0.030287226662039757, 0.1345842480659485, 0.023923413828015327, 0.06404478847980499, -0.0851147472858429, 0.10106813907623291, -0.1451139897108078, -0.04998219385743141, -0.01244612317532301, 0.09761348366737366, 0.07019034773111343, -0.10272270441055298, 0.014697125181555748, 0.04210108891129494, 0.19416837394237518, 0.016384804621338844, -0.0356343574821949, -0.03396720811724663, 0.004015897400677204, 0.22076453268527985, 0.03044266067445278, 0.10457023978233337, 0.07281364500522614, -0.026583973318338394, 0.12624378502368927, 0.09929762035608292, 0.11280370503664017, -0.055645186454057693, 0.13904185593128204, 0.04667386785149574, 0.038641396909952164, 0.0614289753139019, 0.06836545467376709, 0.09098632633686066, -0.0008288522367365658, 0.1138714924454689, 0.013811973854899406, -0.02422109805047512, -0.021335409954190254, 0.17759373784065247, 0.10501719266176224, -0.14769648015499115, 0.029047364369034767, -0.01258957851678133, 0.039933037012815475, -0.014194529503583908, -0.15634691715240479, -0.07240267097949982, -0.3315149247646332, 0.1226184144616127, -0.07119352370500565, 0.019930170848965645, 0.007913772016763687, -0.037425633519887924, -0.03296699747443199, -0.04477746784687042, 0.13151589035987854, -0.013641550205647945, 
-0.006079165264964104, -0.04815853759646416, -0.015360191464424133, -0.11607866734266281, -0.11200575530529022, -0.013207737356424332, -0.13671602308750153, -0.010119039565324783, 0.05595948174595833, 0.003977729007601738, 0.01821410097181797, -0.03142618387937546, 0.0024383175186812878, 0.06541839241981506, -0.05751744285225868, 0.056182678788900375, 0.12097269296646118, 0.08766137808561325, -0.1058853268623352, 0.031048951670527458, 0.2011747509241104, 0.04359564557671547, -0.12483977526426315, 0.01449228823184967, 0.1819491684436798, 0.004885740112513304, 0.017068125307559967, -0.006097703706473112, -0.0540788508951664, -0.07554277032613754, 0.1251034289598465, 0.08296554535627365, -0.09985227137804031, 0.015833314508199692, -0.0726347416639328, -0.01594804972410202, -0.06374675035476685, 0.10130585730075836, 0.09538925439119339, 0.04440245032310486, -0.10621760785579681, -0.08487539738416672, -0.10891728103160858, 0.040588874369859695, -0.08629853278398514, -0.07311757653951645, 0.09629398584365845, -0.07057105004787445, -0.07029950618743896, 0.025521177798509598, -0.17978744208812714, -0.009467960335314274, 0.1711762249469757, -0.24654000997543335, -0.0916430801153183, -0.10857923328876495, 0.14477859437465668, 0.016497576609253883, 0.1013975441455841, -0.006207061931490898, -0.007889035157859325, -0.20577777922153473, 0.024890204891562462, -0.05293011665344238, -0.02073732763528824, 0.07814782857894897, -0.09476397186517715, 0.22629831731319427, -0.08276885002851486, 0.020940175279974937, 0.012659613974392414, 0.0870661810040474, -0.030675338581204414, 0.09283176809549332, -0.03660329803824425, -0.12576518952846527, -0.03620953485369682, 0.03001813031733036, 0.013904244638979435, 0.10071761906147003, 0.09772487729787827, -0.03414725139737129, 0.03389119729399681, 0.09747414290904999, 0.04172342270612717, -0.023843804374337196, 0.0360250361263752, -0.17077107727527618, 0.02182629331946373, -0.018498148769140244, -0.06935930997133255, 0.03687669709324837, -0.06603235751390457, 0.1639697551727295, 0.04022442549467087, 0.0670473501086235, -0.036152735352516174, 0.0073931049555540085, -0.014454689808189869, -0.013775371946394444, -0.026180334389209747, -0.17259705066680908, -0.10422050207853317, -0.1347656100988388, -0.012701659463346004, -0.034971047192811966, 0.04591470584273338, 0.023234914988279343, -0.0003200018545612693, -0.014577031135559082, -0.12090865522623062, 0.04360328987240791, 0.11146783083677292, -0.04631396010518074, -0.026193076744675636 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.7.1
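The card's "How to Get Started with the Model" section is still a placeholder. Below is a minimal sketch, assuming the standard PEFT and Transformers APIs, of how the prefix-tuning adapter recorded in this row (alitolga/627_bigscience_bloomz-560m_PrefixTuning, base model bigscience/bloomz-560m) could be loaded; the prompt and decoding parameters are illustrative assumptions, not part of the original card.

```python
# Sketch only (assumptions noted in the lead-in): load the bloomz-560m base
# model and attach the prefix-tuning adapter from this record with PEFT.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "bigscience/bloomz-560m"
adapter_id = "alitolga/627_bigscience_bloomz-560m_PrefixTuning"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)

# The prefix-tuning weights are applied on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Illustrative prompt; generation settings are assumptions.
inputs = tokenizer("Translate to English: Je t'aime.", return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```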
{"library_name": "peft", "base_model": "bigscience/bloomz-560m"}
null
alitolga/627_bigscience_bloomz-560m_PrefixTuning
[ "peft", "safetensors", "arxiv:1910.09700", "base_model:bigscience/bloomz-560m", "region:us" ]
2024-02-13T19:50:42+00:00
[ "1910.09700" ]
[]
TAGS #peft #safetensors #arxiv-1910.09700 #base_model-bigscience/bloomz-560m #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.7.1
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ "TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-bigscience/bloomz-560m #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ 37, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-bigscience/bloomz-560m #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.7.1" ]
[ -0.10523585230112076, 0.19241665303707123, -0.0031757974065840244, 0.03358880802989006, 0.0900687724351883, 0.01673889346420765, 0.04906756430864334, 0.12329605966806412, -0.024018019437789917, 0.11028973013162613, 0.07299679517745972, 0.09383043646812439, 0.10637103021144867, 0.22537995874881744, 0.008622542023658752, -0.20443634688854218, 0.024877045303583145, -0.0932924896478653, -0.009766984730958939, 0.12322424352169037, 0.14876991510391235, -0.09954889863729477, 0.07843997329473495, -0.015178902074694633, -0.003745972877368331, -0.03304009139537811, -0.07042612135410309, -0.0327649861574173, 0.045167192816734314, 0.04655758664011955, 0.05341289937496185, -0.0019919644109904766, 0.08180045336484909, -0.26137927174568176, 0.020476674661040306, 0.04073581472039223, -0.004726028069853783, 0.09095046669244766, 0.099962018430233, -0.042016252875328064, 0.12309090048074722, -0.03883061185479164, 0.1374281346797943, 0.08074498176574707, -0.08964534848928452, -0.23090983927249908, -0.06757918745279312, 0.09116121381521225, 0.17203669250011444, 0.07189728319644928, -0.04134487360715866, 0.12558194994926453, -0.09293898940086365, 0.017597032710909843, 0.05018754303455353, -0.08980989456176758, -0.06720324605703354, 0.05257353186607361, 0.10646983981132507, 0.06132340803742409, -0.1324053555727005, -0.030917949974536896, 0.02312537096440792, 0.03182590752840042, 0.08363530784845352, 0.015494183637201786, 0.1560298353433609, 0.026450002565979958, -0.14756639301776886, -0.037660982459783554, 0.1320207417011261, 0.028916556388139725, -0.03864383324980736, -0.23116697371006012, 0.0018687128322198987, -0.09145312756299973, -0.029878266155719757, -0.05110380798578262, 0.0448804646730423, -0.0013019503094255924, 0.10466146469116211, -0.025618847459554672, -0.08962686359882355, -0.006923926528543234, 0.08776397258043289, 0.053570445626974106, 0.022060947492718697, -0.015906712040305138, 0.004183816257864237, 0.11972536146640778, 0.053863272070884705, -0.13024011254310608, -0.06565803289413452, -0.07039516419172287, -0.04220045730471611, -0.04391878470778465, 0.041882168501615524, 0.05979185923933983, 0.06167091801762581, 0.25591909885406494, -0.0388454906642437, 0.05329905077815056, 0.06787082552909851, 0.018552949652075768, 0.049181658774614334, 0.09450682252645493, -0.06521414965391159, -0.1592588722705841, -0.004605477675795555, 0.09376439452171326, 0.0005042164120823145, -0.020761460065841675, -0.048107679933309555, 0.04182417318224907, 0.03397706523537636, 0.11214511841535568, 0.09487387537956238, -0.01732439175248146, -0.07825370132923126, -0.0515257753431797, 0.2095567286014557, -0.1478605568408966, 0.042528774589300156, 0.02573632076382637, -0.015134245157241821, -0.03461923077702522, 0.005265118554234505, 0.016257334500551224, -0.018507052212953568, 0.09934923052787781, -0.06981006264686584, -0.03895556554198265, -0.11441123485565186, -0.02478533796966076, 0.03222881257534027, 0.012491274625062943, -0.024422524496912956, -0.031752001494169235, -0.060548532754182816, -0.08966064453125, 0.09685845673084259, -0.06548820436000824, -0.06121059134602547, -0.022050341591238976, -0.09153665602207184, 0.0206922497600317, 0.028336403891444206, 0.0978098213672638, -0.026348093524575233, 0.03964997082948685, -0.013343138620257378, 0.06881300359964371, 0.07847799360752106, 0.02967420034110546, -0.07408347725868225, 0.06545528769493103, -0.19890423119068146, 0.08136327564716339, -0.08019448071718216, 0.02992628887295723, -0.1646977812051773, -0.022773409262299538, 0.006634039804339409, 0.01913393847644329, 
0.03309927135705948, 0.16120508313179016, -0.20137013494968414, -0.03666067123413086, 0.1523764282464981, -0.09916771948337555, -0.12111477553844452, 0.04339341074228287, -0.04744869843125343, 0.1536482721567154, 0.019986404106020927, -0.00846268329769373, 0.09423771500587463, -0.14243358373641968, -0.03035847283899784, -0.022000709548592567, -0.0032380225602537394, 0.09989037364721298, 0.09331901371479034, -0.07820313423871994, 0.023442070931196213, 0.01275382749736309, -0.051289163529872894, -0.025495029985904694, -0.048937492072582245, -0.1085507795214653, 0.00540738133713603, -0.0783955454826355, 0.024134604260325432, -0.015365035273134708, -0.07900307327508926, -0.012836701236665249, -0.1599644273519516, -0.039853956550359726, 0.08463632315397263, 0.01326426025480032, -0.01923493668437004, -0.09663697332143784, 0.04598912224173546, -0.03421254828572273, -0.020727185532450676, -0.14417283236980438, -0.01997527666389942, 0.019145773723721504, -0.14016810059547424, 0.0017134839436039329, -0.12022976577281952, 0.06801678240299225, 0.01270816195756197, -0.06125415116548538, -0.03753328323364258, -0.008746624924242496, 0.0037420655135065317, -0.05242781713604927, -0.23992861807346344, -0.023524079471826553, -0.05287518724799156, 0.1581195592880249, -0.2198179066181183, 0.03588075190782547, 0.051607877016067505, 0.12248361855745316, 0.005630691070109606, -0.06131529435515404, 0.031266096979379654, -0.06989415735006332, -0.028048785403370857, -0.07290240377187729, -0.007925247773528099, -0.011931350454688072, -0.05169440805912018, 0.013562477193772793, -0.11873611807823181, -0.027231303974986076, 0.09856617450714111, 0.08095627278089523, -0.15867891907691956, -0.018997758626937866, -0.044047631323337555, -0.05815788730978966, -0.08182612806558609, -0.058146338909864426, 0.1059526801109314, 0.05191829800605774, 0.03586176037788391, -0.0753692016005516, -0.06907816976308823, 0.0077302237041294575, -0.021509986370801926, -0.021006150171160698, 0.12078336626291275, 0.0744558721780777, -0.11393174529075623, 0.09912583231925964, 0.07906536012887955, 0.0348358117043972, 0.0799865797162056, -0.025675294920802116, -0.10501261055469513, -0.03237595781683922, 0.04122878238558769, 0.014908581040799618, 0.16092635691165924, -0.05853202939033508, 0.05902481824159622, 0.04949740692973137, -0.04292506352066994, 0.04296985641121864, -0.09149949997663498, 0.010470147244632244, 0.010410319082438946, -0.014122202061116695, 0.01694360561668873, -0.02154412493109703, 0.009827692992985249, 0.08865023404359818, 0.05420755594968796, 0.03771458566188812, 0.02608821913599968, -0.026974573731422424, -0.12966448068618774, 0.19052043557167053, -0.09586833417415619, -0.24120600521564484, -0.1577075868844986, 0.06692901253700256, 0.0580194890499115, -0.018901454284787178, 0.019308896735310555, -0.055879008024930954, -0.10764938592910767, -0.08203180879354477, 0.004297949839383364, 0.03418346866965294, -0.05228374898433685, -0.06812629848718643, 0.04358237609267235, 0.04303496330976486, -0.11903213709592819, 0.03620203584432602, 0.054576002061367035, -0.01700592413544655, 0.004285135772079229, 0.06318338215351105, 0.08366677165031433, 0.176926389336586, -0.009351622313261032, -0.00701331440359354, 0.05170128494501114, 0.2819819748401642, -0.15988358855247498, 0.11772751808166504, 0.13128675520420074, -0.0625220388174057, 0.07675972580909729, 0.19027280807495117, 0.03665157034993172, -0.09958901256322861, 0.030265305191278458, 0.021816782653331757, -0.030062833800911903, -0.26616063714027405, -0.05090520530939102, 
-0.013881643302738667, -0.08828865736722946, 0.07519932836294174, 0.09267780929803848, 0.07125388085842133, 0.041134532541036606, -0.07206064462661743, -0.09301645308732986, 0.026642965152859688, 0.09961370378732681, -0.02702098898589611, 0.0074653951451182365, 0.0856492817401886, -0.03550904616713524, 0.008434479124844074, 0.09810104221105576, -0.010403451509773731, 0.15947885811328888, 0.05701972544193268, 0.11865700781345367, 0.07600173354148865, 0.08502639085054398, 0.0015560240717604756, 0.03203246369957924, 0.011299531906843185, 0.018066775053739548, 0.013962003402411938, -0.08596988022327423, 0.03359764814376831, 0.1123242974281311, 0.04192127659916878, 0.03988516330718994, 0.015966808423399925, -0.038672350347042084, 0.051353514194488525, 0.17383505403995514, 0.009224682115018368, -0.20165102183818817, -0.08086571097373962, 0.0678367018699646, -0.07777981460094452, -0.13405178487300873, -0.01706111803650856, 0.03466441482305527, -0.1654392033815384, 0.022367950528860092, -0.03812387213110924, 0.09710913896560669, -0.08105769753456116, -0.0383942611515522, 0.09093685448169708, 0.06708948314189911, -0.025771886110305786, 0.05627673491835594, -0.19453325867652893, 0.11929468810558319, 0.025498932227492332, 0.06639986485242844, -0.08882134407758713, 0.10080153495073318, 0.0015830716583877802, -0.002827994292601943, 0.17064854502677917, 0.00396798737347126, -0.0647670105099678, -0.06802137941122055, -0.10151977092027664, -0.012294994667172432, 0.0955352932214737, -0.13954830169677734, 0.0693773552775383, -0.017436619848012924, -0.032949939370155334, -0.00379717699252069, -0.08365845680236816, -0.12080235034227371, -0.16962020099163055, 0.05526595562696457, -0.10065776854753494, 0.02963905967772007, -0.09450186789035797, -0.06402581185102463, 0.0016447230009362102, 0.1758534163236618, -0.221049502491951, -0.099439837038517, -0.15045523643493652, -0.09203857183456421, 0.1627749800682068, -0.04604145884513855, 0.09013167768716812, 0.002967356238514185, 0.16763457655906677, 0.013427999801933765, -0.01146218553185463, 0.1064327210187912, -0.09382712095975876, -0.2002546489238739, -0.05447737127542496, 0.1717241257429123, 0.12985308468341827, 0.04311872273683548, -0.013666221871972084, 0.031310997903347015, -0.05757446959614754, -0.12240282446146011, 0.024062639102339745, 0.14237621426582336, 0.061671651899814606, -0.012238296680152416, -0.028690282255411148, -0.10236315429210663, -0.054911743849515915, -0.045961882919073105, 0.01290530152618885, 0.19624820351600647, -0.07879310846328735, 0.16853630542755127, 0.1116180568933487, -0.051553890109062195, -0.2098962813615799, 0.05223813280463219, 0.05341963469982147, 0.018939435482025146, 0.04021522030234337, -0.19576120376586914, 0.09381387382745743, 0.006120661273598671, -0.07449401915073395, 0.16733454167842865, -0.1706794649362564, -0.1431451290845871, 0.09323684126138687, 0.03286112844944, -0.21452036499977112, -0.14260877668857574, -0.10008791089057922, -0.02469412423670292, -0.1082526296377182, 0.06333810836076736, 0.0031522398348897696, 0.012239672243595123, 0.02257772907614708, 0.014246078208088875, 0.026076704263687134, -0.050657421350479126, 0.20578831434249878, -0.03135096654295921, 0.007632825523614883, -0.04551493376493454, -0.0879964530467987, 0.03318847715854645, -0.04471033439040184, 0.10509884357452393, -0.005405787844210863, 0.024511296302080154, -0.15388575196266174, -0.0401189960539341, -0.05525004863739014, 0.03608360514044762, -0.09478496760129929, -0.0900716707110405, -0.05145189166069031, 0.09415885806083679, 
0.09304048120975494, -0.02845882624387741, 0.004157790914177895, -0.08647133409976959, 0.06840072572231293, 0.1937006562948227, 0.1954476237297058, 0.06905819475650787, -0.07046634703874588, 0.02594001777470112, -0.03211519122123718, 0.045530736446380615, -0.2342580407857895, 0.04159945249557495, 0.05917493626475334, 0.023545416072010994, 0.08149808645248413, -0.009017148986458778, -0.15184861421585083, -0.06937459856271744, 0.08419203013181686, -0.05064030736684799, -0.17817333340644836, -0.027072446420788765, 0.03763740882277489, -0.21101301908493042, -0.03896739333868027, 0.021290738135576248, -0.020186137408018112, -0.0426216647028923, 0.021235913038253784, 0.08351550251245499, -0.016517294570803642, 0.1104542687535286, 0.09197460114955902, 0.09579703211784363, -0.10553829371929169, 0.06775181740522385, 0.07408850640058517, -0.04314756393432617, 0.026339082047343254, 0.11338170617818832, -0.04394453391432762, -0.03332458809018135, 0.08504169434309006, 0.08312982320785522, 0.02883407101035118, -0.04837585240602493, 0.00947035662829876, -0.06586188077926636, 0.05955604463815689, 0.11171329021453857, 0.029615500941872597, -0.005859312601387501, 0.056268736720085144, 0.029634524136781693, -0.09356749802827835, 0.11066773533821106, 0.049894604831933975, 0.021461177617311478, -0.0453205406665802, -0.03595864400267601, -0.012104660272598267, -0.013802928850054741, -0.022205475717782974, -0.002463884651660919, -0.093277707695961, -0.012235240079462528, -0.08797411620616913, 0.024712510406970978, -0.07128290832042694, 0.011230859905481339, 0.025415560230612755, -0.050730958580970764, 0.0016498506302013993, 0.00703368429094553, -0.07419323921203613, -0.0479341484606266, -0.0171243604272604, 0.0845983475446701, -0.12876904010772705, 0.03626353666186333, 0.07491008192300797, -0.10523191094398499, 0.0799737498164177, -0.007761073298752308, 0.011362619698047638, 0.0009681414812803268, -0.15830397605895996, 0.055595822632312775, -0.01710372604429722, -0.012087428011000156, 0.016550347208976746, -0.21253953874111176, -0.005875639151781797, -0.04672057181596756, -0.05788135156035423, 0.010339708998799324, -0.025199927389621735, -0.12678846716880798, 0.09565634280443192, -0.007822254672646523, -0.061709191650152206, -0.018793558701872826, 0.032944243401288986, 0.0961802750825882, -0.026637839153409004, 0.13656403124332428, -0.02090754173696041, 0.07221148908138275, -0.17452722787857056, -0.007870275527238846, -0.02083730883896351, 0.033059969544410706, -0.03470858931541443, -0.019441574811935425, 0.05895300582051277, -0.01840386539697647, 0.18348339200019836, -0.02386590838432312, 0.06815049052238464, 0.05841715633869171, 0.022102653980255127, 0.020089847967028618, 0.08656177669763565, 0.06457642465829849, 0.0010756644187495112, -0.005756031721830368, 0.03298773244023323, -0.00698737520724535, -0.035285964608192444, -0.15905046463012695, 0.07067011296749115, 0.1529911458492279, 0.04265623912215233, 0.0208640955388546, 0.027203455567359924, -0.11200641095638275, -0.07752584666013718, 0.11473242193460464, -0.015071135014295578, -0.038491759449243546, -0.0664471983909607, 0.17551042139530182, 0.14022375643253326, -0.19934086501598358, 0.07161565124988556, -0.050131965428590775, -0.047581031918525696, -0.1361912041902542, -0.156560480594635, -0.06512465327978134, -0.04133807122707367, -0.023492639884352684, -0.06827943027019501, 0.04574837163090706, 0.05852936953306198, 0.007651002146303654, -0.01660618558526039, 0.11054229736328125, 0.011537956073880196, -0.024892907589673996, 0.053094282746315, 
0.06877226382493973, 0.03142937645316124, -0.09274056553840637, 0.006562992464751005, -0.002574546728283167, 0.011637810617685318, 0.0662066861987114, 0.020060386508703232, -0.057023800909519196, 0.013940774835646152, -0.021744590252637863, -0.12022101879119873, 0.03750484809279442, -0.0167926587164402, -0.03901829198002815, 0.14159266650676727, 0.0287017859518528, 0.007514316588640213, -0.02378738299012184, 0.22852195799350739, -0.07505472749471664, -0.065031036734581, -0.1404382884502411, 0.07582728564739227, -0.06629271060228348, 0.03596862778067589, 0.030712660402059555, -0.11467297375202179, 0.01477861125022173, 0.15465913712978363, 0.13047051429748535, -0.013332086615264416, 0.013724202290177345, 0.04304558411240578, 0.004834852647036314, -0.03137184679508209, 0.022832293063402176, 0.05349003151059151, 0.15041334927082062, -0.07183294743299484, 0.07061224430799484, -0.008850648067891598, -0.0803975835442543, -0.024627355858683586, 0.10974038392305374, -0.0020760318730026484, 0.001918512280099094, -0.06927435845136642, 0.13579262793064117, -0.08620598912239075, -0.22067508101463318, 0.06020862236618996, -0.07802305370569229, -0.15387435257434845, -0.05075832083821297, 0.023893559351563454, -0.014367419295012951, 0.012081216089427471, 0.07923830300569534, -0.05044680833816528, 0.16309694945812225, 0.04737137630581856, -0.05732227861881256, -0.08187534660100937, 0.059840165078639984, -0.14279983937740326, 0.27845558524131775, 0.022909710183739662, 0.0441056489944458, 0.10610579699277878, -0.021039210259914398, -0.1487336903810501, 0.009871214628219604, 0.10736533999443054, -0.07100019603967667, 0.06379495561122894, 0.17384633421897888, 0.004322149325162172, 0.1251736730337143, 0.05971107631921768, -0.05656478926539421, 0.03428587689995766, -0.09498327970504761, -0.048072509467601776, -0.10915200412273407, 0.08333861827850342, -0.08273552358150482, 0.15997040271759033, 0.12181850522756577, -0.07399049401283264, -0.008166846819221973, -0.02298089489340782, 0.0860132947564125, 0.015783848240971565, 0.11300382763147354, 0.016389554366469383, -0.18897756934165955, 0.034580983221530914, 0.002739573363214731, 0.10515239834785461, -0.20032383501529694, -0.0600222572684288, 0.03838657587766647, -0.011450165882706642, -0.08480072021484375, 0.11553619801998138, 0.04811837151646614, 0.03262196481227875, -0.038944851607084274, -0.04684358835220337, 0.007074012886732817, 0.1403563916683197, -0.11253374069929123, -0.005194281227886677 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # he This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1370 - Precision: 0.0 - Recall: 0.0 - F1: 0.0 - Precision Median: 0.0 - Recall Median: 0.0 - F1 Median: 0.0 - Precision Max: 0 - Recall Max: 0 - F1 Max: 0 - Precision Min: 0 - Recall Min: 0 - F1 Min: 0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 8000 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Precision Median | Recall Median | F1 Median | Precision Max | Recall Max | F1 Max | Precision Min | Recall Min | F1 Min | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:---:|:----------------:|:-------------:|:---------:|:-------------:|:----------:|:------:|:-------------:|:----------:|:------:| | 0.0675 | 0.4 | 1000 | 0.1219 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | | 0.0313 | 0.79 | 2000 | 0.1256 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | | 0.0262 | 1.19 | 3000 | 0.1282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | | 0.0063 | 1.58 | 4000 | 0.1273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | | 0.0073 | 1.98 | 5000 | 0.1299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | | 0.0024 | 2.37 | 6000 | 0.1337 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | | 0.0014 | 2.77 | 7000 | 0.1375 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | | 0.0002 | 3.17 | 8000 | 0.1370 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | ### Framework versions - Transformers 4.36.2 - Pytorch 1.13.1+cu117 - Datasets 2.16.1 - Tokenizers 0.15.0
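The card above documents the fine-tuning setup but leaves the usage sections empty. A minimal sketch, assuming the standard Transformers ASR pipeline, of how this checkpoint could be used for Hebrew transcription follows; the audio path, chunk length, and generation options are illustrative assumptions rather than details from the original card.

```python
# Sketch only (assumptions noted in the lead-in): transcribe Hebrew audio with
# the fine-tuned Whisper checkpoint via the Transformers ASR pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="cantillation/whisper-medium-he-teamim-aviv-bavly-without-nikud-8000-steps-lr-1e-5",
    chunk_length_s=30,  # assumption: split long recordings into 30 s chunks
)

# "example.wav" is a placeholder path, not part of the original card.
result = asr("example.wav", generate_kwargs={"language": "he", "task": "transcribe"})
print(result["text"])
```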
{"language": ["he"], "license": "apache-2.0", "tags": ["hf-asr-leaderboard", "generated_from_trainer"], "metrics": ["precision", "recall", "f1"], "base_model": "openai/whisper-medium", "model-index": [{"name": "he", "results": []}]}
automatic-speech-recognition
cantillation/whisper-medium-he-teamim-aviv-bavly-without-nikud-8000-steps-lr-1e-5
[ "transformers", "tensorboard", "safetensors", "whisper", "automatic-speech-recognition", "hf-asr-leaderboard", "generated_from_trainer", "he", "base_model:openai/whisper-medium", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2024-02-13T19:51:55+00:00
[]
[ "he" ]
TAGS #transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #he #base_model-openai/whisper-medium #license-apache-2.0 #endpoints_compatible #region-us
he == This model is a fine-tuned version of openai/whisper-medium on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.1370 * Precision: 0.0 * Recall: 0.0 * F1: 0.0 * Precision Median: 0.0 * Recall Median: 0.0 * F1 Median: 0.0 * Precision Max: 0 * Recall Max: 0 * F1 Max: 0 * Precision Min: 0 * Recall Min: 0 * F1 Min: 0 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 1e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * training\_steps: 8000 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.36.2 * Pytorch 1.13.1+cu117 * Datasets 2.16.1 * Tokenizers 0.15.0
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 8000\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.36.2\n* Pytorch 1.13.1+cu117\n* Datasets 2.16.1\n* Tokenizers 0.15.0" ]
[ "TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #he #base_model-openai/whisper-medium #license-apache-2.0 #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 8000\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.36.2\n* Pytorch 1.13.1+cu117\n* Datasets 2.16.1\n* Tokenizers 0.15.0" ]
[ 81, 112, 4, 33 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #hf-asr-leaderboard #generated_from_trainer #he #base_model-openai/whisper-medium #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* training\\_steps: 8000\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.36.2\n* Pytorch 1.13.1+cu117\n* Datasets 2.16.1\n* Tokenizers 0.15.0" ]
[ -0.13952678442001343, 0.13760671019554138, -0.0009907667990773916, 0.0731511041522026, 0.07706434279680252, -0.022045891731977463, 0.17154338955879211, 0.1411016881465912, -0.023418527096509933, 0.08025364577770233, 0.11429449915885925, 0.07894987612962723, 0.04571417719125748, 0.19021855294704437, -0.0724325031042099, -0.20112581551074982, 0.0648873969912529, 0.014276285655796528, 0.01712890900671482, 0.10055114328861237, 0.08754560351371765, -0.12008161842823029, 0.06399913132190704, 0.01927725039422512, -0.12630683183670044, -0.024693476036190987, -0.00118794955778867, -0.0755893662571907, 0.10557924956083298, 0.004912843927741051, 0.05579911544919014, 0.060139305889606476, 0.039720699191093445, -0.18639256060123444, 0.012972129508852959, 0.03654402121901512, 0.024205895140767097, 0.07217274606227875, 0.022682450711727142, -0.027045251801609993, 0.018946774303913116, -0.05972093716263771, 0.06868983060121536, 0.026841260492801666, -0.10287246108055115, -0.28050535917282104, -0.09663904458284378, 0.046937115490436554, 0.07949692755937576, 0.06466169655323029, -0.0203043594956398, 0.16817793250083923, -0.01628543995320797, 0.09072087705135345, 0.22755467891693115, -0.28468215465545654, -0.031776703894138336, -0.023549186065793037, 0.010189351625740528, 0.08779580146074295, -0.09294401109218597, -0.030389506369829178, 0.03891710564494133, 0.03215733543038368, 0.11770915985107422, -0.007067703176289797, -0.02558491751551628, -0.03147807717323303, -0.13320457935333252, -0.042327314615249634, 0.18268534541130066, 0.06001894921064377, -0.0418139211833477, -0.10970889031887054, -0.05916851386427879, -0.1422257274389267, -0.053465183824300766, 0.00783709529787302, 0.01714334264397621, -0.04224567115306854, -0.06981661170721054, -0.0019230357138440013, -0.07530824840068817, -0.06802692264318466, 0.006617770995944738, 0.17419444024562836, 0.02822243608534336, 0.006909553427249193, -0.012679963372647762, 0.05437130481004715, -0.02239426225423813, -0.16795431077480316, -0.029268383979797363, 0.02272201143205166, -0.013073282316327095, -0.020118432119488716, -0.034041255712509155, -0.09409584105014801, 0.050375208258628845, 0.12834513187408447, -0.06968899071216583, 0.08941254764795303, -0.037597957998514175, 0.03497729450464249, -0.08893536776304245, 0.179300457239151, -0.02209293283522129, 0.0032493313774466515, 0.028836525976657867, 0.13841572403907776, 0.08438237756490707, -0.04030782729387283, -0.10689365118741989, 0.05641775205731392, 0.13812528550624847, 0.015039708465337753, -0.041532158851623535, 0.06400702148675919, -0.05551290512084961, -0.014205406419932842, 0.041402075439691544, -0.13188843429088593, 0.009645515121519566, 0.011061979457736015, -0.03469259291887283, -0.08862080425024033, 0.003439254593104124, 0.02847924456000328, -0.020423486828804016, 0.05579133331775665, -0.06915009766817093, -0.0025668847374618053, -0.04885515943169594, -0.100181445479393, 0.01614198461174965, -0.07382695376873016, 0.012615492567420006, -0.10052677989006042, -0.1294967234134674, -0.010314902290701866, 0.03608638420701027, -0.027782682329416275, -0.03062490001320839, -0.08732552826404572, -0.07889549434185028, 0.04023495689034462, -0.022246452048420906, 0.019667863845825195, -0.07449056953191757, 0.07786805182695389, 0.06526114046573639, 0.07905144989490509, -0.04640629142522812, 0.039044953882694244, -0.087006114423275, 0.05081067234277725, -0.1926354020833969, 0.09074555337429047, -0.10400845110416412, 0.08984377980232239, -0.10454744100570679, -0.06916479766368866, 0.01490875892341137, 
-0.023379774764180183, 0.08863601833581924, 0.10926618427038193, -0.18347099423408508, -0.07480361312627792, 0.22343622148036957, -0.12846027314662933, -0.1595882773399353, 0.1490848809480667, -0.03279385343194008, 0.02964216284453869, 0.0630139708518982, 0.28662222623825073, 0.06001934036612511, -0.11092141270637512, -0.025489632040262222, -0.023300452157855034, 0.08798319846391678, -0.03929382190108299, 0.0860239565372467, -0.0307818204164505, 0.040558282285928726, 0.015564420260488987, -0.008822024799883366, 0.026614176109433174, -0.05765838176012039, -0.09495244175195694, -0.03586816415190697, -0.09195293486118317, 0.014058771543204784, 0.03174616023898125, 0.05286972224712372, -0.12222225964069366, -0.09023800492286682, 0.0018819324905052781, 0.10002937912940979, -0.10864338278770447, 0.03565613552927971, -0.1368420273065567, 0.12036441266536713, -0.074301578104496, -0.01329475361853838, -0.15412472188472748, 0.02960028499364853, 0.0373089462518692, -0.014117766171693802, 0.0010936218313872814, -0.07170562446117401, 0.07991264760494232, 0.05462566018104553, -0.03818214684724808, -0.061103854328393936, 0.004349499940872192, 0.02890978753566742, -0.08710318803787231, -0.21379980444908142, -0.0400642529129982, -0.059801697731018066, 0.17757809162139893, -0.1734239012002945, 0.03144999220967293, 0.08930457383394241, 0.10888177901506424, 0.05095699429512024, -0.035207681357860565, 0.014924664050340652, 0.057349152863025665, -0.014458558522164822, -0.07834315299987793, 0.04642454534769058, 0.04416881501674652, -0.1359025090932846, 0.016599703580141068, -0.1949283331632614, 0.1385640799999237, 0.13898994028568268, 0.04813786968588829, -0.03269350528717041, -0.016416041180491447, -0.03351717069745064, -0.0428471714258194, -0.005029583815485239, -0.009645259007811546, 0.15527039766311646, 0.02649957314133644, 0.14066742360591888, -0.10725679248571396, -0.03373376652598381, 0.03921515867114067, -0.0407106950879097, -0.014339433051645756, 0.10785772651433945, -0.047277580946683884, -0.132393941283226, 0.1259441375732422, 0.11730287969112396, -0.06180701032280922, 0.13737724721431732, -0.07550913095474243, -0.05782076716423035, -0.014859952963888645, 0.033482447266578674, 0.03721017763018608, 0.13284920156002045, -0.13213294744491577, -0.024915073066949844, 0.01738641783595085, 0.011608945205807686, 0.023564757779240608, -0.18829461932182312, 0.004559609107673168, 0.019910352304577827, -0.057519443333148956, -0.03229069337248802, 0.007612329907715321, -0.016086114570498466, 0.08095963299274445, 0.0031280757393687963, -0.09673266112804413, 0.0335075780749321, -0.024667827412486076, -0.08361590653657913, 0.18478013575077057, -0.10253606736660004, -0.1724604070186615, -0.09979367256164551, -0.07909002155065536, -0.05355162173509598, 0.013824192807078362, 0.06999900937080383, -0.08301566541194916, -0.03849232196807861, -0.1311292052268982, -0.053875114768743515, 0.035378530621528625, 0.014415501616895199, 0.07987746596336365, -0.012146681547164917, 0.09308107197284698, -0.1078517884016037, -0.021903080865740776, -0.029622290283441544, 0.022843776270747185, 0.05162031203508377, 0.0004946519038639963, 0.08472432941198349, 0.1417517364025116, 0.007321772165596485, 0.049713049083948135, -0.03294040262699127, 0.22037537395954132, -0.0685766339302063, -0.04922356829047203, 0.10206390172243118, -0.035044148564338684, 0.07619155198335648, 0.15660342574119568, 0.040895696729421616, -0.1011463850736618, -0.011618933640420437, -0.020370330661535263, -0.03803703561425209, -0.17504626512527466, 
-0.04744196683168411, -0.04563859477639198, -0.012982247397303581, 0.09720916301012039, 0.03132845461368561, 0.02753303386271, 0.0323278047144413, 0.006423890125006437, 0.015224621631205082, 0.0018121659522876143, 0.08569394052028656, 0.08655977994203568, 0.037783537060022354, 0.10520908981561661, -0.039392124861478806, -0.03541753068566322, 0.014944486320018768, 0.04177634045481682, 0.19254761934280396, -0.0008386825793422759, 0.20633086562156677, 0.036244604736566544, 0.15625877678394318, 0.028071323409676552, 0.05782323703169823, -0.02554476074874401, -0.018940959125757217, 0.002473631640896201, -0.07180411368608475, -0.06879820674657822, 0.048791106790304184, -0.029697395861148834, 0.04685012251138687, -0.08001735061407089, 0.050428230315446854, 0.05386698991060257, 0.3132568895816803, 0.07936623692512512, -0.34533125162124634, -0.09218253195285797, 0.021174196153879166, -0.035398341715335846, -0.026906849816441536, 0.016544535756111145, 0.147795170545578, -0.037619225680828094, 0.06854726374149323, -0.06303583085536957, 0.07304415851831436, -0.07455359399318695, 0.03186802566051483, 0.0014981778804212809, 0.06567404419183731, -0.010920550674200058, 0.04257208853960037, -0.24520844221115112, 0.2914424240589142, 0.026327654719352722, 0.0871846005320549, -0.05868685990571976, -0.006491821724921465, 0.02831323631107807, 0.029263511300086975, 0.09139426052570343, 0.0013169600861147046, -0.1287093162536621, -0.18151646852493286, -0.12736305594444275, 0.02790444903075695, 0.08577512204647064, 0.015392295084893703, 0.09314057976007462, -0.009068428538739681, -0.006695159245282412, 0.04199163243174553, -0.05941822752356529, -0.04966486617922783, -0.10599589347839355, 0.03105395846068859, 0.1097242534160614, 0.013497880659997463, -0.09728604555130005, -0.09354797005653381, -0.04867929592728615, 0.11048930883407593, -0.03032960742712021, -0.056073617190122604, -0.1036311537027359, 0.019731516018509865, 0.07743531465530396, -0.08009391278028488, 0.015340824611485004, 0.01355274673551321, 0.12542477250099182, -0.0015819519758224487, -0.05475936084985733, 0.10154446959495544, -0.05626315250992775, -0.15706577897071838, -0.033907417207956314, 0.1657419502735138, 0.014555132016539574, 0.04535266011953354, 0.012320934794843197, 0.031170940026640892, -0.012120737694203854, -0.06435918807983398, 0.0658680647611618, 0.018835190683603287, 0.02507408708333969, -0.003217795630916953, 0.018829308450222015, -0.013658439740538597, -0.0854177474975586, -0.0248419176787138, 0.17979297041893005, 0.2741365432739258, -0.07260720431804657, 0.06396795809268951, 0.09374501556158066, -0.02898370660841465, -0.17090082168579102, -0.0044535500928759575, 0.057127926498651505, 0.005476508289575577, -0.020859582349658012, -0.13927561044692993, 0.04013827443122864, 0.04457934573292732, -0.04107222333550453, 0.060816291719675064, -0.25094643235206604, -0.12892790138721466, 0.14319847524166107, 0.11143214255571365, 0.09231234341859818, -0.1407514214515686, -0.06730140745639801, -0.04323867708444595, -0.10737331956624985, 0.06762491166591644, -0.1253475397825241, 0.10789571702480316, -0.000917725614272058, 0.07326857000589371, 0.018295206129550934, -0.0589795745909214, 0.12151063978672028, 0.005122160539031029, 0.07136988639831543, -0.044394031167030334, 0.023533230647444725, 0.069204181432724, -0.0849379375576973, 0.05348295345902443, -0.10203447937965393, 0.045528966933488846, -0.07183389365673065, -0.010949411429464817, -0.05864160507917404, 0.007944165728986263, -0.0030316761694848537, -0.022552739828824997, 
-0.0007312982343137264, 0.03316262736916542, 0.07386661320924759, -0.0021686870604753494, 0.1614617258310318, -0.026289092376828194, 0.13835783302783966, 0.14880427718162537, 0.12064467370510101, -0.1468285471200943, -0.02314603142440319, 0.004483194090425968, -0.03312119469046593, 0.05832451209425926, -0.12782999873161316, 0.0651787668466568, 0.09732037037611008, 0.026690365746617317, 0.1419166922569275, 0.055366016924381256, -0.09106750786304474, 0.039437685161828995, 0.06767778843641281, -0.16316545009613037, -0.1653350293636322, -0.007606088649481535, 0.08649016916751862, -0.1238330751657486, 0.08083775639533997, 0.11859474331140518, -0.051391396671533585, -0.004457693547010422, -0.018413100391626358, 0.028104174882173538, -0.03241231292486191, 0.17186608910560608, 0.04778032749891281, 0.07109935581684113, -0.11447151750326157, 0.07386885583400726, 0.03504394739866257, -0.09499535709619522, 0.0744684487581253, 0.05127740651369095, -0.10268817842006683, -0.029224537312984467, -0.01123687345534563, 0.14173001050949097, 0.04390224441885948, -0.07855525612831116, -0.14006291329860687, -0.11288508772850037, 0.04256857559084892, 0.18568961322307587, 0.06725504249334335, 0.021585842594504356, -0.01086790394037962, -0.01787094958126545, -0.09837163239717484, 0.09473828971385956, 0.022000515833497047, 0.06130092218518257, -0.1499934047460556, 0.06993622332811356, -0.011192400008440018, 0.025801515206694603, -0.013540349900722504, -0.009987715631723404, -0.11548613011837006, 0.01847647875547409, -0.11473560333251953, 0.04102512449026108, -0.058336373418569565, 0.017535876482725143, 0.0022530837450176477, -0.04265516623854637, -0.05170251056551933, 0.03366095945239067, -0.11137335002422333, -0.01873284950852394, 0.0238687414675951, 0.027061615139245987, -0.11010918766260147, -0.025221722200512886, 0.012613009661436081, -0.10025765001773834, 0.09700404852628708, 0.05194196477532387, -0.0284629724919796, 0.0359262079000473, -0.09125072509050369, -0.024013075977563858, 0.06453672796487808, 0.009561005048453808, 0.06120120733976364, -0.13431687653064728, -0.0463961623609066, 0.021345246583223343, 0.019788216799497604, 0.022361766546964645, 0.1500340849161148, -0.09785444289445877, 0.007376182824373245, -0.031098319217562675, -0.03051861748099327, -0.07203586399555206, 0.05828973278403282, 0.10208345204591751, 0.0373634435236454, 0.17041431367397308, -0.09380073845386505, 0.008811553940176964, -0.17912623286247253, -0.003556671552360058, 0.0034365549217909575, -0.12335660308599472, -0.08699744939804077, -0.006721549201756716, 0.0733736976981163, -0.07280978560447693, 0.13470712304115295, -0.045589473098516464, 0.02128145657479763, 0.029710613191127777, -0.05916701629757881, -0.007507008500397205, 0.033561475574970245, 0.2017853707075119, 0.01558584626764059, -0.03136829286813736, 0.09086005389690399, -0.007929814979434013, 0.0693659633398056, 0.14398543536663055, 0.1380322277545929, 0.15778329968452454, 0.06342986226081848, 0.09603700041770935, 0.06900295615196228, -0.024371188133955002, -0.1570601910352707, 0.0743936151266098, -0.03841729462146759, 0.11335005611181259, -0.0037189426366239786, 0.20141440629959106, 0.13495440781116486, -0.1375390589237213, 0.03955613076686859, -0.06035585701465607, -0.08669664710760117, -0.10519300401210785, -0.10567913949489594, -0.10087175667285919, -0.16226288676261902, 0.00846761092543602, -0.10945020616054535, 0.0116187809035182, 0.08215869218111038, 0.016232384368777275, 0.012161771766841412, 0.14202728867530823, 0.012147095985710621, 0.03928237408399582, 
0.07253586500883102, -0.018151864409446716, -0.065265953540802, -0.03780866041779518, -0.08725323528051376, 0.07422500848770142, 0.01203945092856884, 0.05191102623939514, -0.018970537930727005, -0.03722473606467247, 0.05648024380207062, -0.03521604463458061, -0.12602633237838745, 0.01785377413034439, 0.012790442444384098, 0.061440665274858475, 0.024692518636584282, 0.04656894505023956, -0.02105541154742241, -0.004960860591381788, 0.2022348791360855, -0.08375901728868484, -0.08730907738208771, -0.13542121648788452, 0.1978864073753357, -0.010117840021848679, -0.02181730978190899, 0.014685289934277534, -0.10064903646707535, 0.008276062086224556, 0.1616653949022293, 0.1625422090291977, -0.03863252326846123, 0.01609131693840027, -0.058260589838027954, -0.0037538218311965466, -0.07501925528049469, 0.0681939646601677, 0.10508722811937332, 0.009566280990839005, -0.05151880159974098, -0.04011636599898338, -0.045718759298324585, -0.042590998113155365, -0.044953469187021255, 0.05301603302359581, 0.010764438658952713, 0.007052600383758545, -0.059060707688331604, 0.058849018067121506, -0.04279002174735069, -0.0945587158203125, -0.019621266052126884, -0.20441994071006775, -0.14971590042114258, -0.007214282639324665, 0.062041617929935455, 0.011230170726776123, 0.026565302163362503, -0.006494682282209396, 0.011321411468088627, 0.047168903052806854, -0.016233956441283226, -0.04816127568483353, -0.06583292037248611, 0.0779670923948288, -0.14518049359321594, 0.18744049966335297, -0.02696164697408676, 0.057091400027275085, 0.12782254815101624, 0.06180205196142197, -0.09806308895349503, 0.09131740033626556, 0.039458245038986206, -0.0702233761548996, 0.013744531199336052, 0.15673649311065674, -0.03724237158894539, 0.13343793153762817, 0.048342593014240265, -0.10944563150405884, -0.005292977672070265, -0.06908832490444183, -0.03489036113023758, -0.03952891007065773, -0.0329834446310997, -0.04485289379954338, 0.13533951342105865, 0.15117937326431274, -0.07289766520261765, -0.015959937125444412, -0.032919663935899734, 0.009103664197027683, 0.045004796236753464, -0.013906300067901611, -0.053070612251758575, -0.2679717540740967, 0.007777462247759104, 0.04157641902565956, -0.005221229046583176, -0.2835926413536072, -0.0790155753493309, -0.007521207444369793, -0.04189062491059303, -0.05326061323285103, 0.09724845737218857, 0.11498071253299713, 0.04840371385216713, -0.07876396924257278, -0.03308158367872238, -0.042580537497997284, 0.1574714630842209, -0.13668611645698547, -0.08251844346523285 ]
null
null
diffusers
# SDXL LoRA DreamBooth - tonyassi/gucci-ss18-fashion-dreambooth by [Tony Assi](https://www.tonyassi.com/) Dreambooth style based on the [Gucci SS18](https://www.vogue.com/fashion-shows/spring-2018-ready-to-wear/gucci) collection. ![](https://cdn.discordapp.com/attachments/1120417968032063538/1207054482253545482/gucci-spring-2018-ready-to-wear-1.png?ex=65de4016&is=65cbcb16&hm=ae523fa3240cc3d1ef1ed70e6695d2556db0c3480550994889084435016758ce&) ## Trigger words Use **Gucci SS18 style** in the prompt to trigger the style. ## How to use ```bash pip install diffusers accelerate ``` ```python import torch from diffusers import DiffusionPipeline, AutoencoderKL # Load the pipeline vae = AutoencoderKL.from_pretrained("madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16) pipe = DiffusionPipeline.from_pretrained( "stabilityai/stable-diffusion-xl-base-1.0", vae=vae, torch_dtype=torch.float16, variant="fp16", use_safetensors=True ) pipe.load_lora_weights("tonyassi/gucci-ss18-fashion-dreambooth") pipe.to("cuda") # Generate image prompt = "Gucci SS18 style, megan fox wearing a gold mesh dress with crystals" image = pipe(prompt=prompt, height=1024, width=1024, num_inference_steps=50, negative_prompt="ugly, deformed face, deformed body").images[0] image ``` ## Examples ![](https://cdn.discordapp.com/attachments/1120417968032063538/1207065510416490506/Gucci_SS18_style_marilyn_monroe_wearing_elegant_dress_made_of_crystals.png?ex=65de4a5b&is=65cbd55b&hm=4ad902b026b9dd0d77965022920d2626773e157bdb02f077e754a6cfd2335078&) **Gucci SS18 style, marilyn monroe wearing elegant dress made of crystals** ![](https://cdn.discordapp.com/attachments/1120417968032063538/1207065510802358272/Gucci_SS18_style_young_nicole_kidman_wearing_a_pink_sequins_leotard_70s_glam_rock.png?ex=65de4a5b&is=65cbd55b&hm=9063fad06244703dc875197c396ddc972ac660f30bb0434ab9026a01d7adcf25&) **Gucci SS18 style, young nicole kidman wearing a pink sequins leotard, 70s, glam rock** ![](https://cdn.discordapp.com/attachments/1120417968032063538/1207065509506584627/Gucci_SS18_style_david_bowie_suit_70s_glam_rock.png?ex=65de4a5b&is=65cbd55b&hm=b27d4d0c937633cb7939dfe358b0f4219a72127f20b522303310c4dcb735d506&) **Gucci SS18 style, david bowie, suit, 70s, glam rock** ![](https://cdn.discordapp.com/attachments/1120417968032063538/1207065511209345064/Gucci_SS18_style_young_nicole_kidman_wearing_a_pink_sequins_leotard_stocking_70s_glam_rock.png?ex=65de4a5c&is=65cbd55c&hm=44517e4c123285b4a64d65583668f5d28885c2c571959decf4719ecd71d45867&) **Gucci SS18 style, young nicole kidman wearing a pink sequins leotard, stocking, 70s, glam rock** ![](https://cdn.discordapp.com/attachments/1120417968032063538/1207065510047645748/Gucci_SS18_style_david_bowie.png?ex=65de4a5b&is=65cbd55b&hm=6adee4b4af51a43d9ece17f989790b4b1c712c0093885fc224d45e21bd651cee&) **Gucci SS18 style, david bowie** ## Model description These are tonyassi/gucci-ss18-fashion-dreambooth LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained using [DreamBooth](https://dreambooth.github.io/). LoRA for the text encoder was enabled: False. Special VAE used for training: madebyollin/sdxl-vae-fp16-fix. ## Download model Weights for this model are available in Safetensors format. [Download](https://huggingface.co/tonyassi/gucci-ss18-fashion-dreambooth/tree/main) them in the Files & versions tab.
{"license": "openrail++", "tags": ["stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "diffusers", "lora", "template:sd-lora"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "Gucci SS18 style"}
text-to-image
tonyassi/gucci-ss18-fashion-dreambooth
[ "diffusers", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "lora", "template:sd-lora", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "license:openrail++", "has_space", "region:us" ]
2024-02-13T19:57:10+00:00
[]
[]
TAGS #diffusers #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail++ #has_space #region-us
# SDXL LoRA DreamBooth - tonyassi/gucci-ss18-fashion-dreambooth by Tony Assi Dreambooth style based on the Gucci SS18 collection. ![](URL ## Trigger words Use Gucci SS18 style in the prompt to trigger the style. ## How to use ## Examples ![](URL Gucci SS18 style, marilyn monroe wearing elegant dress made of crystals ![](URL Gucci SS18 style, young nicole kidman wearing a pink sequins leotard, 70s, glam rock ![](URL Gucci SS18 style, david bowie, suit, 70s, glam rock ![](URL Gucci SS18 style, young nicole kidman wearing a pink sequins leotard, stocking, 70s, glam rock ![](URL Gucci SS18 style, david bowie ## Model description These are tonyassi/gucci-ss18-fashion-dreambooth LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained using DreamBooth. LoRA for the text encoder was enabled: False. Special VAE used for training: madebyollin/sdxl-vae-fp16-fix. ## Download model Weights for this model are available in Safetensors format. Download them in the Files & versions tab.
[ "# SDXL LoRA DreamBooth - tonyassi/gucci-ss18-fashion-dreambooth\n\nby Tony Assi\n\nDreambooth style based on the Gucci SS18 collection.\n\n![](URL", "## Trigger words\nUse Gucci SS18 style in the prompt to trigger the style.", "## How to use", "## Examples\n![](URL\nGucci SS18 style, marilyn monroe wearing elegant dress made of crystals\n\n![](URL\nGucci SS18 style, young nicole kidman wearing a pink sequins leotard, 70s, glam rock\n\n![](URL \nGucci SS18 style, david bowie, suit, 70s, glam rock\n\n![](URL\nGucci SS18 style, young nicole kidman wearing a pink sequins leotard, stocking, 70s, glam rock\n\n![](URL\nGucci SS18 style, david bowie", "## Model description\nThese are tonyassi/gucci-ss18-fashion-dreambooth LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.\n\nThe weights were trained using DreamBooth.\n\nLoRA for the text encoder was enabled: False.\n\nSpecial VAE used for training: madebyollin/sdxl-vae-fp16-fix.", "## Download model\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab." ]
[ "TAGS\n#diffusers #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail++ #has_space #region-us \n", "# SDXL LoRA DreamBooth - tonyassi/gucci-ss18-fashion-dreambooth\n\nby Tony Assi\n\nDreambooth style based on the Gucci SS18 collection.\n\n![](URL", "## Trigger words\nUse Gucci SS18 style in the prompt to trigger the style.", "## How to use", "## Examples\n![](URL\nGucci SS18 style, marilyn monroe wearing elegant dress made of crystals\n\n![](URL\nGucci SS18 style, young nicole kidman wearing a pink sequins leotard, 70s, glam rock\n\n![](URL \nGucci SS18 style, david bowie, suit, 70s, glam rock\n\n![](URL\nGucci SS18 style, young nicole kidman wearing a pink sequins leotard, stocking, 70s, glam rock\n\n![](URL\nGucci SS18 style, david bowie", "## Model description\nThese are tonyassi/gucci-ss18-fashion-dreambooth LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.\n\nThe weights were trained using DreamBooth.\n\nLoRA for the text encoder was enabled: False.\n\nSpecial VAE used for training: madebyollin/sdxl-vae-fp16-fix.", "## Download model\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab." ]
[ 82, 47, 18, 4, 129, 95, 28 ]
[ "passage: TAGS\n#diffusers #stable-diffusion-xl #stable-diffusion-xl-diffusers #text-to-image #lora #template-sd-lora #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail++ #has_space #region-us \n# SDXL LoRA DreamBooth - tonyassi/gucci-ss18-fashion-dreambooth\n\nby Tony Assi\n\nDreambooth style based on the Gucci SS18 collection.\n\n![](URL## Trigger words\nUse Gucci SS18 style in the prompt to trigger the style.## How to use## Examples\n![](URL\nGucci SS18 style, marilyn monroe wearing elegant dress made of crystals\n\n![](URL\nGucci SS18 style, young nicole kidman wearing a pink sequins leotard, 70s, glam rock\n\n![](URL \nGucci SS18 style, david bowie, suit, 70s, glam rock\n\n![](URL\nGucci SS18 style, young nicole kidman wearing a pink sequins leotard, stocking, 70s, glam rock\n\n![](URL\nGucci SS18 style, david bowie## Model description\nThese are tonyassi/gucci-ss18-fashion-dreambooth LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.\n\nThe weights were trained using DreamBooth.\n\nLoRA for the text encoder was enabled: False.\n\nSpecial VAE used for training: madebyollin/sdxl-vae-fp16-fix.## Download model\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab." ]
[ -0.019707415252923965, 0.22241799533367157, -0.004406807944178581, 0.07900799810886383, 0.12193217128515244, -0.01721348613500595, 0.06812304258346558, 0.12098677456378937, 0.13506869971752167, 0.10365070402622223, -0.05160728842020035, 0.03908849507570267, 0.09851735085248947, 0.05375472456216812, 0.09543228149414062, -0.20411045849323273, -0.02282683365046978, -0.12696492671966553, -0.026373127475380898, 0.05938360095024109, 0.051364872604608536, -0.04011444002389908, 0.07836581766605377, -0.04217975214123726, 0.03499554097652435, 0.019771965220570564, -0.013698955066502094, 0.024083392694592476, -0.027988187968730927, 0.06584027409553528, 0.04204309731721878, 0.047152601182460785, 0.06987784057855606, -0.3750828802585602, 0.024609172716736794, 0.10135456174612045, -0.020463241264224052, -0.059610217809677124, 0.025028521195054054, -0.11085367947816849, -0.0032128957100212574, -0.06904133409261703, 0.0625365823507309, 0.06926532834768295, -0.058851417154073715, -0.14995786547660828, -0.08625417202711105, 0.11508800089359283, 0.046972598880529404, 0.07709019631147385, -0.012785316444933414, -0.0586920790374279, -0.04038408398628235, 0.00855195987969637, 0.2211693674325943, -0.02393897995352745, -0.03874745965003967, -0.008365398272871971, 0.11727690696716309, 0.11127449572086334, -0.10015128552913666, 0.043518174439668655, -0.0908244177699089, 0.0036027145106345415, 0.04055629298090935, -0.018261954188346863, 0.09289199113845825, -0.0011297166347503662, -0.06976912915706635, 0.007786016911268234, 0.0860602855682373, -0.0220638457685709, -0.10718242079019547, -0.1129506453871727, 0.018614519387483597, 0.17138619720935822, -0.0985189825296402, -0.06836596876382828, 0.037619609385728836, 0.05088483542203903, 0.007136689964681864, -0.11275818198919296, -0.06049777567386627, 0.00887623243033886, 0.1323179006576538, 0.15114669501781464, -0.0211319662630558, -0.03109334036707878, 0.0370931439101696, -0.006250858306884766, -0.010771626606583595, -0.09133262187242508, -0.002847513183951378, -0.09409569203853607, -0.03304697200655937, 0.0652545839548111, 0.007138813380151987, -0.037017129361629486, 0.09900681674480438, 0.12650206685066223, 0.08801209926605225, 0.0437009371817112, 0.03696750849485397, -0.02138163335621357, -0.0370476208627224, 0.12957318127155304, -0.026645395904779434, -0.1328786313533783, -0.018582647666335106, 0.04742598161101341, 0.0671864002943039, -0.014599179849028587, -0.01473622489720583, 0.02409493736922741, -0.126723051071167, 0.015435987152159214, 0.11691289395093918, -0.05322026461362839, -0.07393215596675873, -0.010571025311946869, 0.15101633965969086, -0.0805220901966095, 0.08947594463825226, -0.004750877618789673, 0.0569683276116848, -0.06346480548381805, -0.003313949331641197, 0.053461953997612, -0.0039509874768555164, 0.047903988510370255, -0.08023467659950256, -0.06145835667848587, -0.049558140337467194, -0.0044363634660840034, 0.023875417187809944, -0.0852731317281723, -0.07376464456319809, -0.021623268723487854, -0.11215575039386749, -0.0805085077881813, 0.016857698559761047, -0.04056943580508232, 0.031433962285518646, -0.03598187118768692, -0.08697269856929779, 0.015889331698417664, 0.10980931669473648, -0.036333877593278885, -0.0020607830956578255, 0.09300053119659424, -0.08865660429000854, 0.09060276299715042, 0.11799488216638565, 0.04273586347699165, -0.07030967622995377, 0.003843537298962474, -0.1306522935628891, 0.15288375318050385, -0.14259257912635803, 0.07738639414310455, -0.08313994109630585, -0.025236690416932106, -0.0013960669748485088, 
0.04912435635924339, 0.009246856905519962, 0.1340055614709854, -0.1977408081293106, -0.10604721307754517, 0.2077186107635498, -0.12485893070697784, 0.03494783863425255, 0.077080637216568, -0.020239612087607384, 0.043104566633701324, 0.09665201604366302, 0.10137283802032471, 0.19667382538318634, -0.045318882912397385, -0.06858699768781662, -0.10118161141872406, 0.028630252927541733, 0.08940033614635468, -0.016072476282715797, -0.08076221495866776, 0.0709129199385643, 0.0311544518917799, -0.09858234226703644, -0.051387231796979904, -0.022403644397854805, -0.0494348369538784, 0.02394765242934227, -0.048987582325935364, -0.010393192060291767, 0.041449323296546936, -0.1270328015089035, -0.031747713685035706, -0.14097845554351807, -0.10444324463605881, -0.005732344929128885, -0.019099542871117592, -0.012546893209218979, -0.0695963203907013, 0.01022020261734724, 0.1438647210597992, -0.03571489453315735, -0.025234375149011612, -0.12111334502696991, 0.06452783197164536, 0.016352718695998192, 0.057853467762470245, -0.1424788385629654, 0.04764239117503166, 0.004658228252083063, -0.04457244277000427, -0.02230243571102619, -0.029392899945378304, -0.03416893631219864, -0.018541237339377403, -0.15124563872814178, 0.02714477851986885, -0.04541579633951187, 0.11959998309612274, -0.09035022556781769, 0.019150113686919212, 0.06373394280672073, 0.1283244788646698, 0.07795429974794388, -0.1086290031671524, 0.012362667359411716, 0.003242933889850974, 0.011822815984487534, -0.058543067425489426, 0.01354182232171297, -0.005421188194304705, 0.027383966371417046, 0.009873936884105206, -0.06555873900651932, -0.14680208265781403, 0.03683508187532425, 0.05576588213443756, -0.07175369560718536, 0.03402404114603996, -0.016591189429163933, 0.0034762085415422916, -0.05554857477545738, -0.03648523986339569, 0.12030045688152313, 0.007176365237683058, -0.02045370079576969, -0.032413966953754425, -0.006789017468690872, 0.025840934365987778, 0.003104341449216008, -0.053353361785411835, 0.02642972767353058, 0.04550427198410034, 0.046055227518081665, 0.08586956560611725, -0.06583885848522186, 0.021261679008603096, 0.22952842712402344, 0.017013508826494217, -0.0088157644495368, 0.029063841328024864, -0.020727073773741722, 0.03961152583360672, 0.07498915493488312, -0.027763662859797478, -0.005646875593811274, -0.007822948507964611, -0.08427845686674118, -0.058893583714962006, -0.1163533553481102, -0.04646429792046547, 0.04091683775186539, -0.044696252793073654, 0.08351685851812363, -0.021442918106913567, -0.030393211171030998, 0.012828717939555645, 0.057558171451091766, 0.036699943244457245, -0.07629133015871048, -0.03818991035223007, -0.10591716319322586, 0.06463710963726044, -0.0980387032032013, -0.17024414241313934, -0.09209221601486206, -0.01806865818798542, 0.039014190435409546, 0.034930936992168427, 0.04769105836749077, -0.1264139860868454, -0.04637801647186279, -0.07571559399366379, 0.07817063480615616, 0.03992075100541115, -0.010857652872800827, -0.05105844512581825, 0.0008820911752991378, -0.033422790467739105, -0.039592232555150986, -0.0299507025629282, -0.0068738702684640884, -0.09629154950380325, -0.039055708795785904, 0.0009671238367445767, 0.1226188987493515, 0.022538015618920326, 0.04448774456977844, -0.016975538805127144, -0.0022749437484890223, 0.1872999519109726, -0.09615439176559448, 0.1405172348022461, 0.12195201963186264, 0.09411201626062393, 0.1129828542470932, 0.2312065064907074, 0.005972873419523239, -0.013074742630124092, 0.006606015842407942, 0.08705083280801773, -0.054807618260383606, 
-0.1242581307888031, -0.057000696659088135, -0.0008892503683455288, -0.10365904867649078, -0.005325858946889639, 0.07531678676605225, 0.09419398009777069, 0.058104004710912704, -0.09405026584863663, -0.05371195450425148, 0.045220207422971725, 0.07193306088447571, 0.07070137560367584, 0.03446489945054054, 0.058189503848552704, -0.022758787497878075, -0.050122570246458054, 0.09415611624717712, -0.051193542778491974, 0.16357125341892242, 0.029853390529751778, 0.07439947873353958, 0.028930747881531715, -0.008485029451549053, 0.005700353998690844, -0.1409391164779663, 0.02086826227605343, 0.03566761314868927, -0.00868996698409319, -0.08156357705593109, 0.04464712366461754, 0.12977296113967896, 0.07482243329286575, -0.14989474415779114, 0.0272847767919302, -0.060236599296331406, 0.06340895593166351, 0.23030531406402588, -0.0030197633896023035, -0.08881980180740356, -0.037702471017837524, 0.06898468732833862, -0.02482987567782402, -0.015314810909330845, -0.024793069809675217, 0.07270237058401108, -0.07887154817581177, 0.10902515798807144, 0.008749254047870636, 0.04252311959862709, -0.0808195248246193, -0.031612493097782135, 0.2386854588985443, 0.12653937935829163, -0.013880491256713867, 0.027304133400321007, -0.03609493002295494, 0.012169851921498775, 0.03885604813694954, 0.07358148694038391, -0.0007834747084416449, 0.03547315299510956, 0.027735549956560135, -0.09386633336544037, 0.13139769434928894, 0.05151886120438576, -0.05740215256810188, -0.05650752782821655, 0.04104403033852577, -0.002615702338516712, 0.16337043046951294, -0.17046071588993073, 0.11520833522081375, -0.03576174005866051, -0.09247958660125732, -0.053085993975400925, -0.02219892293214798, -0.1362762302160263, -0.12609155476093292, 0.04488939046859741, -0.0034199205692857504, 0.013599671423435211, -0.05265164002776146, -0.028705013915896416, -0.02061612904071808, 0.0987139567732811, -0.10483673214912415, -0.16455355286598206, -0.0961708202958107, -0.09096306562423706, 0.07503020018339157, -0.03859347850084305, 0.07883001863956451, 0.020676087588071823, 0.14147156476974487, -0.02512165904045105, -0.031031891703605652, 0.0006869258941151202, -0.0508570522069931, -0.16584442555904388, -0.12362395972013474, 0.16929757595062256, 0.06589486449956894, -0.012403803877532482, -0.0011546224122866988, 0.04499480500817299, 0.01810425892472267, -0.10728719085454941, 0.13184210658073425, 0.111654132604599, -0.07998085021972656, 0.06713824719190598, -0.05483575537800789, -0.039035212248563766, -0.1101549044251442, -0.0009559981990605593, 0.009227496571838856, 0.15830370783805847, -0.08810997754335403, 0.14843574166297913, 0.02248568646609783, -0.0759531781077385, -0.20167851448059082, -0.05056062713265419, 0.03988518938422203, -0.030152704566717148, 0.0058147902600467205, -0.3054209351539612, 0.0000907651410670951, -0.00873168371617794, -0.019361479207873344, 0.17345698177814484, -0.1575845181941986, -0.05500009283423424, -0.12041234225034714, 0.08684051036834717, -0.08719313144683838, -0.16616331040859222, -0.11397753655910492, -0.033604174852371216, -0.007390150334686041, 0.10410869866609573, 0.21633696556091309, 0.04207785055041313, 0.03899698331952095, -0.008976088836789131, 0.04467945173382759, -0.008443161845207214, 0.0795583575963974, -0.02273123525083065, -0.0037875170819461346, -0.0550956167280674, -0.021025093272328377, 0.04306374862790108, -0.007900035940110683, -0.02279658429324627, -0.03284715488553047, -0.009372805245220661, 0.00506774103268981, -0.05040116608142853, -0.08594507724046707, -0.009264723397791386, 
-0.05769752711057663, -0.05440036579966545, -0.08179222047328949, 0.061730291694402695, 0.07862921804189682, -0.032130297273397446, -0.17978700995445251, -0.10293847322463989, 0.059788405895233154, -0.0034497217275202274, 0.05674155801534653, 0.13692404329776764, -0.11110926419496536, -0.05492323637008667, -0.03532031551003456, 0.016574744135141373, -0.10044770687818527, 0.050058331340551376, 0.03117728978395462, 0.05098940059542656, 0.11454367637634277, -0.00988265872001648, -0.11525087803602219, -0.0355415977537632, 0.12165073305368423, -0.04100971668958664, -0.2138199359178543, -0.06930083781480789, 0.012119178660213947, -0.11005263030529022, -0.1754128485918045, 0.0368570014834404, 0.0022919997572898865, -0.07968045026063919, -0.01464089099317789, 0.10982611775398254, 0.08882369846105576, -0.00023925564892124385, 0.09937457740306854, -0.0075509073212742805, -0.014866919256746769, 0.06281367689371109, 0.04880372807383537, 0.06194715574383736, -0.03319154679775238, 0.20405425131320953, -0.040146928280591965, 0.0017471736064180732, 0.03229697048664093, 0.029546521604061127, -0.08755753934383392, 0.014425043947994709, -0.01667041704058647, -0.06183260306715965, 0.004592985846102238, 0.08677153289318085, -0.010966905392706394, -0.03798777237534523, 0.09061577916145325, 0.04702533036470413, -0.02472330816090107, 0.12198729813098907, -0.005345407407730818, -0.006961033679544926, -0.13099198043346405, -0.10456517338752747, 0.027019403874874115, -0.043608080595731735, 0.007365758065134287, -0.018449541181325912, -0.06635058671236038, -0.014630828984081745, 0.04318452253937721, 0.026574162766337395, -0.05202139541506767, -0.07208272069692612, -0.03061874583363533, -0.01334713026881218, -0.015573164448142052, -0.005775699391961098, -0.014966161921620369, -0.10787191987037659, -0.01631242036819458, 0.08239677548408508, -0.16175477206707, -0.011701189912855625, 0.11349134892225266, -0.07304906100034714, 0.0628354549407959, 0.0026785354129970074, -0.03857247531414032, 0.003591222455725074, -0.16825905442237854, 0.041071731597185135, -0.0040409257635474205, 0.0028150014113634825, -0.03381812572479248, -0.01670941710472107, 0.007094438653439283, -0.1072983667254448, -0.09368041157722473, -0.035161785781383514, 0.006813875865191221, -0.0685674324631691, 0.09241130203008652, 0.02323717251420021, -0.04597879573702812, -0.02173932082951069, 0.021119488403201103, 0.181386336684227, 0.017200272530317307, 0.006864007096737623, -0.07724417746067047, 0.06012174114584923, -0.11706619709730148, -0.027994763106107712, -0.007405432872474194, 0.03169053792953491, -0.07563263922929764, -0.018516426905989647, 0.027887478470802307, 0.04304710030555725, 0.02350117452442646, 0.058428820222616196, 0.05521635338664055, -0.021480681374669075, -0.014264700934290886, 0.12073031067848206, 0.02518814243376255, 0.040688060224056244, -0.004133300390094519, -0.060770194977521896, 0.06402571499347687, -0.0625176876783371, -0.016786575317382812, 0.0022725663147866726, 0.013755835592746735, 0.11694226413965225, 0.23919400572776794, 0.03460170328617096, 0.08557922393083572, -0.09892280399799347, -0.11141547560691833, 0.07930685579776764, 0.01629459112882614, -0.011675897054374218, -0.0992109552025795, 0.05480056256055832, 0.10482386499643326, -0.17659059166908264, 0.1325886994600296, 0.008119486272335052, -0.05069144815206528, -0.024558624252676964, -0.1596628874540329, -0.07401759922504425, -0.036825425922870636, -0.0016848184168338776, -0.08380180597305298, 0.019461829215288162, 0.04855072498321533, -0.05689241737127304, 
-0.0164425540715456, -0.0063298605382442474, -0.04087720438838005, -0.08080195635557175, 0.030451510101556778, 0.019781028851866722, 0.031873591244220734, 0.07658976316452026, 0.0078007918782532215, 0.03451964631676674, 0.024376356974244118, 0.011122733354568481, 0.062485698610544205, -0.022315075621008873, 0.0634244903922081, -0.11316860467195511, -0.0789947435259819, 0.05247049406170845, -0.04567554593086243, -0.009715781547129154, 0.07614243775606155, 0.07817742228507996, 0.010458349250257015, 0.031359974294900894, 0.22796134650707245, -0.014453601092100143, 0.08754190057516098, -0.07310420274734497, 0.02977263182401657, -0.0747804120182991, 0.028418004512786865, -0.01639651693403721, -0.13109487295150757, -0.020061766728758812, 0.13105560839176178, 0.0226539708673954, -0.037304919213056564, 0.014327162876725197, 0.04528845474123955, 0.016984717920422554, 0.024914169684052467, 0.008524861186742783, -0.0007223541615530849, 0.28889134526252747, -0.03501500189304352, 0.0430278405547142, 0.0013101274380460382, -0.08032604306936264, -0.08541759848594666, 0.09664914011955261, -0.0014981135027483106, 0.010545676574110985, -0.09093578159809113, 0.1337568461894989, -0.03004913032054901, -0.17440412938594818, 0.20811773836612701, -0.11189216375350952, -0.08613089472055435, 0.024325836449861526, -0.05962032079696655, 0.033682290464639664, 0.05129802227020264, 0.060313913971185684, -0.03366370126605034, 0.08637261390686035, 0.058138661086559296, -0.07816962152719498, -0.02878871001303196, -0.02677706815302372, -0.06941132247447968, 0.2134372740983963, -0.04903985559940338, -0.019821712747216225, 0.04198097065091133, 0.055025018751621246, -0.052601832896471024, -0.004276474006474018, 0.06135975569486618, -0.11473855376243591, -0.0916338786482811, 0.2062360942363739, -0.0548359751701355, 0.05708589777350426, 0.08277056366205215, 0.039825063198804855, 0.07334751635789871, 0.013108190149068832, -0.09590591490268707, -0.04004548862576485, 0.07670260220766068, -0.21494536101818085, 0.06332488358020782, 0.09390266239643097, -0.023295097053050995, -0.017096582800149918, -0.03267566114664078, -0.03782045468688011, 0.03316326439380646, 0.18608932197093964, -0.03300051391124725, -0.043651554733514786, 0.00730477599427104, 0.1342359334230423, 0.0847463309764862, -0.041061483323574066, -0.07342750579118729, -0.01118212565779686, 0.033362988382577896, -0.03908267244696617, 0.11844128370285034, 0.10649380087852478, -0.01971311680972576, -0.04344910383224487, -0.14670273661613464, 0.030980531126260757, 0.14116071164608002, -0.07973425835371017, 0.016027729958295822 ]
null
null
null
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1). ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{}
null
samtsp/mental_health
[ "arxiv:1910.09700", "region:us" ]
2024-02-13T19:58:04+00:00
[ "1910.09700" ]
[]
TAGS #arxiv-1910.09700 #region-us
# Model Card for Model ID This modelcard aims to be a base template for new models. It has been generated using this raw template. ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#arxiv-1910.09700 #region-us \n", "# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 15, 29, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#arxiv-1910.09700 #region-us \n# Model Card for Model ID\n\n\n\nThis modelcard aims to be a base template for new models. It has been generated using this raw template.## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.1066984087228775, 0.19898438453674316, -0.002620849059894681, 0.027911467477679253, 0.09412756562232971, 0.02142420969903469, 0.05197415128350258, 0.12995286285877228, -0.022686492651700974, 0.09772004932165146, 0.07303693890571594, 0.09985987842082977, 0.11060800403356552, 0.19985371828079224, 0.022886212915182114, -0.19676423072814941, 0.0380873903632164, -0.07859895378351212, -0.0053507364355027676, 0.12146519124507904, 0.14281919598579407, -0.09727081656455994, 0.09723988175392151, -0.0014166681794449687, -0.036095861345529556, -0.032103247940540314, -0.07407337427139282, -0.015863366425037384, 0.04475326091051102, 0.04351950064301491, 0.06786411255598068, -0.005140793044120073, 0.08499236404895782, -0.25888389348983765, 0.01854773797094822, 0.04429004341363907, -0.010532835498452187, 0.08963978290557861, 0.08659035712480545, -0.0503561794757843, 0.133544921875, -0.022494465112686157, 0.13020861148834229, 0.09404259920120239, -0.09533175081014633, -0.22292618453502655, -0.06270451098680496, 0.08238474279642105, 0.17174527049064636, 0.08238210529088974, -0.04212580993771553, 0.11285244673490524, -0.08852370828390121, 0.009028825908899307, 0.026999270543456078, -0.06745808571577072, -0.0541369654238224, 0.06993507593870163, 0.10711865872144699, 0.058814842253923416, -0.11843946576118469, -0.02349478006362915, 0.02842799760401249, 0.03442508354783058, 0.06341962516307831, 0.009891163557767868, 0.16722379624843597, 0.02796054631471634, -0.14623764157295227, -0.045774128288030624, 0.14920903742313385, 0.03012506291270256, -0.04215821996331215, -0.20709463953971863, -0.006747975014150143, -0.08887778222560883, -0.02198212593793869, -0.04837455600500107, 0.049169208854436874, 0.017894232645630836, 0.1088389903306961, -0.04266509786248207, -0.09933976829051971, -0.01135216560214758, 0.09378619492053986, 0.0346422903239727, 0.014278990216553211, -0.007528449408710003, -0.0004577430372592062, 0.1285746544599533, 0.05469144508242607, -0.12714460492134094, -0.06107940524816513, -0.07115962356328964, -0.038405947387218475, -0.0369131825864315, 0.02906898967921734, 0.04100806638598442, 0.043637074530124664, 0.256840318441391, -0.004247268196195364, 0.055150821805000305, 0.07816781103610992, 0.029096456244587898, 0.05928764492273331, 0.1027042493224144, -0.05899541452527046, -0.16195039451122284, -0.010378924198448658, 0.08282309025526047, -0.001994097838178277, -0.03426366299390793, -0.07769469916820526, 0.04409316927194595, 0.03231460228562355, 0.10194675624370575, 0.10213643312454224, -0.00840142834931612, -0.06900389492511749, -0.06323889642953873, 0.2089644968509674, -0.140780970454216, 0.04491811990737915, 0.014242668636143208, -0.02261270396411419, -0.03249955177307129, 0.01572984829545021, 0.02586018666625023, -0.03010505624115467, 0.0799839049577713, -0.07839398831129074, -0.03527021035552025, -0.12673307955265045, -0.027471506968140602, 0.022427299991250038, -0.003538058837875724, -0.020284080877900124, -0.028788061812520027, -0.07818468660116196, -0.09368987381458282, 0.11955387890338898, -0.06332498788833618, -0.04999423027038574, -0.03507404401898384, -0.08191268146038055, 0.029741330072283745, 0.037989094853401184, 0.09118469059467316, -0.02362010069191456, 0.030886046588420868, -0.011452100239694118, 0.06288884580135345, 0.05513912811875343, 0.03946930542588234, -0.08444610983133316, 0.06066261604428291, -0.20766597986221313, 0.09049645811319351, -0.061090268194675446, 0.035395506769418716, -0.16141952574253082, -0.007028630934655666, 0.010053620673716068, 0.03627076745033264, 
0.029977615922689438, 0.15956375002861023, -0.21718010306358337, -0.033392831683158875, 0.13398465514183044, -0.10535142570734024, -0.10920390486717224, 0.03313661739230156, -0.05438750982284546, 0.1846301406621933, 0.021011749282479286, -0.0004254789964761585, 0.07448633760213852, -0.12248582392930984, -0.023236358538269997, -0.017138930037617683, -0.02830614522099495, 0.0713268369436264, 0.081262968480587, -0.09127437323331833, 0.018645839765667915, 0.012333624064922333, -0.04975755140185356, -0.027676379308104515, -0.039810311049222946, -0.1081002727150917, -0.0050798640586435795, -0.07177256047725677, 0.0032630744390189648, -0.016572296619415283, -0.08035643398761749, 0.001617362373508513, -0.16802436113357544, -0.02455195039510727, 0.07920140773057938, 0.00814704317599535, -0.014599336311221123, -0.09017311781644821, 0.05735816806554794, -0.0606040433049202, -0.027315329760313034, -0.14675214886665344, 0.0052270544692873955, 0.013336344622075558, -0.14710111916065216, 0.020299475640058517, -0.10115809738636017, 0.06349264085292816, 0.011477353982627392, -0.04672861844301224, -0.04369935020804405, 0.00010367027425672859, 0.003724793205037713, -0.053453478962183, -0.23140744864940643, -0.03374785929918289, -0.044856321066617966, 0.15386763215065002, -0.21619822084903717, 0.036581382155418396, 0.04023589566349983, 0.11894483119249344, -0.0035392972640693188, -0.05845631659030914, 0.02484537661075592, -0.07670248299837112, -0.039557602256536484, -0.07007710635662079, 0.001572409993968904, -0.0014461677055805922, -0.04872008040547371, 0.016424696892499924, -0.12396717816591263, -0.06818132847547531, 0.11014950275421143, 0.04142379015684128, -0.15492349863052368, -0.0041915783658623695, -0.030916975811123848, -0.06000775843858719, -0.05342598259449005, -0.05972793325781822, 0.11303042620420456, 0.04413822293281555, 0.03973376750946045, -0.07595038414001465, -0.05902018025517464, 0.010918207466602325, -0.029565786942839622, -0.016297759488224983, 0.093429334461689, 0.0999373197555542, -0.12134627997875214, 0.0989208072423935, 0.07267444580793381, 0.03332529962062836, 0.08463598042726517, -0.010384823195636272, -0.10775253921747208, -0.031286682933568954, 0.028272075578570366, 0.002783637959510088, 0.16402263939380646, -0.07852847874164581, 0.05485396459698677, 0.04151233285665512, -0.02716772072017193, 0.05665498599410057, -0.0957925021648407, 0.01761704683303833, 0.021760543808341026, -0.005722802598029375, 0.007382235489785671, -0.030695531517267227, -0.00876180361956358, 0.07580762356519699, 0.06536837667226791, 0.03976568952202797, 0.033472709357738495, -0.029367417097091675, -0.13732430338859558, 0.1905110627412796, -0.10305890440940857, -0.22935202717781067, -0.1717958003282547, 0.05177067220211029, 0.05311822518706322, -0.006569376215338707, 0.025543777272105217, -0.06189529225230217, -0.10580533742904663, -0.08138279616832733, 0.018486447632312775, 0.006763557903468609, -0.06118743494153023, -0.09133443981409073, 0.039310213178396225, 0.03854845091700554, -0.130089670419693, 0.03713662177324295, 0.05565035715699196, -0.013944907113909721, -0.01169645506888628, 0.04666148126125336, 0.09532339870929718, 0.19625136256217957, -0.007600754965096712, -0.008414373733103275, 0.06695323437452316, 0.291735976934433, -0.15341155230998993, 0.12901893258094788, 0.12389722466468811, -0.07051796466112137, 0.08391714096069336, 0.18495109677314758, 0.03306480497121811, -0.09837296605110168, 0.020791195333003998, 0.02582281269133091, -0.026949184015393257, -0.2523637115955353, -0.05266418308019638, 
-0.006489538121968508, -0.11327216774225235, 0.0706535056233406, 0.08739753067493439, 0.09179038554430008, 0.052998367697000504, -0.06112697720527649, -0.09166146069765091, -0.0003765109577216208, 0.11145967245101929, -0.04029975086450577, 0.002805144991725683, 0.07836277037858963, -0.040786001831293106, 0.013674819841980934, 0.09837955981492996, 0.005413474980741739, 0.16164012253284454, 0.06552805751562119, 0.13401569426059723, 0.08540262281894684, 0.07983675599098206, 0.011559084989130497, 0.0339510552585125, 0.006372304633259773, 0.017902348190546036, 0.009317909367382526, -0.07689813524484634, 0.023254239931702614, 0.11834190040826797, 0.040373314172029495, 0.045214686542749405, 0.011671608313918114, -0.039621613919734955, 0.03956446796655655, 0.1767490804195404, 0.016947781667113304, -0.2155885547399521, -0.0772833302617073, 0.06542522460222244, -0.058402981609106064, -0.1495116949081421, -0.025624670088291168, 0.02235453948378563, -0.1576213240623474, 0.0005415278719738126, -0.028421247377991676, 0.10273677110671997, -0.09609103202819824, -0.04047273099422455, 0.08817856013774872, 0.0699915885925293, -0.028443265706300735, 0.062181491404771805, -0.17871366441249847, 0.12371177971363068, 0.03400380536913872, 0.07102521508932114, -0.09190616011619568, 0.09926209598779678, -0.005890274420380592, 0.013706923462450504, 0.1655038744211197, 0.015541122294962406, -0.09426835179328918, -0.0710771307349205, -0.08823360502719879, -0.013685347512364388, 0.09967034310102463, -0.13288703560829163, 0.06852756440639496, -0.019996264949440956, -0.027522722259163857, 0.006831855047494173, -0.08690175414085388, -0.13149002194404602, -0.18112291395664215, 0.05532965436577797, -0.10246208310127258, 0.024886637926101685, -0.07404468208551407, -0.04842938482761383, 0.040009692311286926, 0.19972540438175201, -0.21909521520137787, -0.10020548850297928, -0.15154099464416504, -0.11518421769142151, 0.16108576953411102, -0.04635784775018692, 0.09108468145132065, -0.01019457820802927, 0.1620863974094391, 0.010745878331363201, -0.02071799710392952, 0.1160653606057167, -0.0854450985789299, -0.1714930683374405, -0.05915606766939163, 0.14893034100532532, 0.14446774125099182, 0.035233963280916214, -0.01287093386054039, 0.031883079558610916, -0.07141809165477753, -0.11891574412584305, 0.03534413129091263, 0.13700434565544128, 0.06963939964771271, -0.01262744888663292, -0.03579355776309967, -0.09220988303422928, -0.0504169799387455, -0.03974350169301033, 0.008700315840542316, 0.18124674260616302, -0.07448253035545349, 0.15189899504184723, 0.13152532279491425, -0.0723627433180809, -0.20356547832489014, 0.06049336493015289, 0.0346653088927269, 0.02138940617442131, 0.01630406267940998, -0.21584440767765045, 0.08776868134737015, -0.006339477840811014, -0.06874293833971024, 0.18010607361793518, -0.17902927100658417, -0.13895957171916962, 0.0988360270857811, 0.03516041859984398, -0.1823381632566452, -0.13705183565616608, -0.09613028168678284, -0.03228107467293739, -0.1230158656835556, 0.05866828188300133, 0.026329705491662025, 0.015535218641161919, 0.021184591576457024, 0.029537182301282883, 0.019990645349025726, -0.050314560532569885, 0.2066401094198227, -0.012754418887197971, 0.013829488307237625, -0.06200092285871506, -0.10324833542108536, 0.04607655853033066, -0.05281443893909454, 0.11618918925523758, 0.0008675124263390899, 0.0222539734095335, -0.1703120321035385, -0.034940678626298904, -0.05094180256128311, 0.03240950033068657, -0.0940355733036995, -0.09862573444843292, -0.04792311042547226, 0.0863310769200325, 
0.09178805351257324, -0.02642832137644291, -0.0012948049698024988, -0.10240978002548218, 0.04736471548676491, 0.19468940794467926, 0.19447913765907288, 0.056417278945446014, -0.06639297306537628, 0.028046250343322754, -0.03318989276885986, 0.0474521666765213, -0.24178913235664368, 0.03477860614657402, 0.05343414843082428, 0.011909419670701027, 0.08445286750793457, -0.003811764298006892, -0.16544894874095917, -0.0645582303404808, 0.08673491328954697, -0.044566020369529724, -0.1641440987586975, -0.032721146941185, 0.022641237825155258, -0.20684140920639038, -0.04179441183805466, 0.011281585320830345, -0.019901549443602562, -0.0412454716861248, 0.019307231530547142, 0.07510565966367722, -0.03287685289978981, 0.08019816875457764, 0.09813148528337479, 0.08825678378343582, -0.10000404715538025, 0.08111211657524109, 0.06777224689722061, -0.04150259494781494, 0.033621978014707565, 0.10420102626085281, -0.04986701160669327, -0.04245395585894585, 0.08457721024751663, 0.12530498206615448, -0.023088322952389717, -0.05413966253399849, 0.01212761178612709, -0.04834878444671631, 0.054270725697278976, 0.10672589391469955, 0.03587748110294342, -0.0011736709857359529, 0.050750378519296646, 0.028017738834023476, -0.10256616771221161, 0.08914750814437866, 0.03725229576230049, 0.01791483536362648, -0.03840089589357376, -0.04189951717853546, 0.004631043411791325, -0.01516848523169756, -0.018755726516246796, -0.0170601699501276, -0.08432288467884064, -0.012585177086293697, -0.11483073979616165, 0.008729316294193268, -0.06474046409130096, 0.0068718683905899525, 0.030621705576777458, -0.048198994249105453, 0.002455118577927351, 0.0015593849821016192, -0.0763937458395958, -0.051290228962898254, -0.013947847299277782, 0.06659863144159317, -0.12318508327007294, 0.042245566844940186, 0.06755290925502777, -0.0967436209321022, 0.06653253734111786, -0.007241120561957359, 0.011410431936383247, 0.0035017048940062523, -0.15551216900348663, 0.04931795224547386, -0.02801262028515339, -0.024408893659710884, 0.02252740040421486, -0.1943521499633789, -0.0076536573469638824, -0.04313570633530617, -0.0573619082570076, -0.004662544000893831, -0.010509601794183254, -0.11749584227800369, 0.10912971943616867, 0.007869033142924309, -0.06068027764558792, -0.027412936091423035, 0.04882120341062546, 0.10086818784475327, -0.02643461339175701, 0.13437911868095398, -0.007259611040353775, 0.07193886488676071, -0.16531120240688324, -0.004601365886628628, -0.012241186574101448, 0.0436379611492157, -0.026195699349045753, -0.0405074842274189, 0.046567026525735855, -0.02435867115855217, 0.19325846433639526, -0.022945057600736618, 0.07084392011165619, 0.04857128486037254, 0.032133232802152634, 0.015501349233090878, 0.0795411467552185, 0.07082084566354752, -0.005705251824110746, 0.0012581591727212071, 0.03978053480386734, 0.017982542514801025, -0.03728210926055908, -0.1555383801460266, 0.06970943510532379, 0.13411745429039001, 0.06166819855570793, 0.04408809542655945, 0.016431381925940514, -0.10990120470523834, -0.0851396843791008, 0.11883285641670227, -0.007235993165522814, -0.03617050126194954, -0.06722866743803024, 0.173916757106781, 0.14665542542934418, -0.1881304681301117, 0.07379920780658722, -0.04021890461444855, -0.047917887568473816, -0.1390228271484375, -0.19778694212436676, -0.05708994343876839, -0.04697444662451744, -0.031041637063026428, -0.06054393947124481, 0.0458466075360775, 0.05282822251319885, -0.0030727433040738106, -0.022938158363103867, 0.09914897382259369, 0.015943726524710655, -0.02539999410510063, 0.02896830625832081, 
0.05823741853237152, 0.03165817633271217, -0.08659189939498901, 0.015606595203280449, 0.005570207256823778, 0.012911485508084297, 0.06870248168706894, 0.02457418665289879, -0.05173683166503906, 0.027958450838923454, -0.022081928327679634, -0.11900684982538223, 0.030582444742321968, -0.008381795138120651, -0.040503326803445816, 0.13942766189575195, 0.027071574702858925, 0.004188410937786102, -0.01948714442551136, 0.21969179809093475, -0.07356120645999908, -0.06005077436566353, -0.13661882281303406, 0.08630871772766113, -0.0646464005112648, 0.0389556810259819, 0.019435638561844826, -0.1268240362405777, 0.018831565976142883, 0.177330881357193, 0.1440054327249527, -0.01947030983865261, 0.0009323913836851716, 0.04481814056634903, 0.005301912315189838, -0.0307242963463068, 0.02539828233420849, 0.04486451670527458, 0.15326927602291107, -0.08694307506084442, 0.06786760687828064, -0.017319831997156143, -0.0827098861336708, -0.012140425853431225, 0.11545486748218536, -0.006142809521406889, 0.0005785097600892186, -0.06454599648714066, 0.1289747804403305, -0.09516481310129166, -0.20264828205108643, 0.06106545776128769, -0.06001608073711395, -0.13382460176944733, -0.04726420342922211, 0.03044959343969822, -0.012021848000586033, 0.015416436828672886, 0.06878575682640076, -0.056695982813835144, 0.18439672887325287, 0.04437119513750076, -0.07817023992538452, -0.10054308921098709, 0.05510885640978813, -0.15851499140262604, 0.2766340374946594, 0.028239939361810684, 0.029272811487317085, 0.10791875422000885, -0.003789537586271763, -0.14966876804828644, 0.014645657502114773, 0.09151527285575867, -0.055869899690151215, 0.05673375353217125, 0.17499114573001862, 0.0023246139753609896, 0.11875680834054947, 0.04801327362656593, -0.05847916379570961, 0.053358294069767, -0.10834988206624985, -0.05009477213025093, -0.10316772758960724, 0.06604080647230148, -0.08997134119272232, 0.1677444577217102, 0.12227477133274078, -0.0657745748758316, -0.012364364229142666, -0.022101426497101784, 0.08321168273687363, 0.01727793924510479, 0.10294997692108154, 0.009740419685840607, -0.16769815981388092, 0.037040822207927704, 0.015676027163863182, 0.09718561172485352, -0.194975346326828, -0.05438392236828804, 0.04106029495596886, -0.019167637452483177, -0.07238626480102539, 0.11234915256500244, 0.04907934367656708, 0.053642772138118744, -0.04926076903939247, -0.025769544765353203, 0.009724765084683895, 0.1444830447435379, -0.1114528551697731, -0.024292191490530968 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.7.1
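The card above leaves its "How to Get Started with the Model" section empty. A minimal sketch of one way to load this adapter, assuming it is the prefix-tuning adapter named later in this record (`alitolga/627_bigscience_mt0-large_PrefixTuning`) trained on top of `bigscience/mt0-large` with PEFT 0.7.1; the prompt and generation settings are illustrative only, since the card does not document the intended task.

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import PeftModel

base_id = "bigscience/mt0-large"  # base model named in the card metadata
adapter_id = "alitolga/627_bigscience_mt0-large_PrefixTuning"  # this record's repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(base_id)

# Attach the prefix-tuning weights on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Illustrative prompt; adjust to whatever task the adapter was trained for.
inputs = tokenizer("Translate to English: Je t'aime.", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```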
{"library_name": "peft", "base_model": "bigscience/mt0-large"}
null
alitolga/627_bigscience_mt0-large_PrefixTuning
[ "peft", "safetensors", "arxiv:1910.09700", "base_model:bigscience/mt0-large", "region:us" ]
2024-02-13T19:58:29+00:00
[ "1910.09700" ]
[]
TAGS #peft #safetensors #arxiv-1910.09700 #base_model-bigscience/mt0-large #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.7.1
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ "TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-bigscience/mt0-large #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ 37, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-bigscience/mt0-large #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.7.1" ]
[ -0.10329285264015198, 0.19123567640781403, -0.003162568900734186, 0.03152334317564964, 0.0924748107790947, 0.0173750389367342, 0.04939376562833786, 0.12474949657917023, -0.024995610117912292, 0.10992436110973358, 0.0718991681933403, 0.0943777933716774, 0.10566803812980652, 0.22291399538516998, 0.008972898125648499, -0.20435582101345062, 0.025665005668997765, -0.0948035940527916, -0.010869896970689297, 0.12268577516078949, 0.14814266562461853, -0.0994880348443985, 0.07724972069263458, -0.016673453152179718, -0.0016740146093070507, -0.033501140773296356, -0.07083374261856079, -0.03281131386756897, 0.04394635558128357, 0.04642690345644951, 0.05276493355631828, -0.0009831934003159404, 0.0810641273856163, -0.2617826461791992, 0.020550739020109177, 0.04297945648431778, -0.0038836889434605837, 0.08932307362556458, 0.09905099868774414, -0.04220724105834961, 0.12389163672924042, -0.03591815009713173, 0.13903215527534485, 0.08097527921199799, -0.08995258808135986, -0.23319008946418762, -0.06633374840021133, 0.08905189484357834, 0.17240816354751587, 0.07316629588603973, -0.04154044762253761, 0.12679213285446167, -0.09532308578491211, 0.016483545303344727, 0.048802152276039124, -0.08728253096342087, -0.06705479323863983, 0.054959505796432495, 0.10704414546489716, 0.06351327896118164, -0.13179384171962738, -0.031036879867315292, 0.021785154938697815, 0.0333254411816597, 0.08302492648363113, 0.015946531668305397, 0.15390197932720184, 0.027723072096705437, -0.14742814004421234, -0.03778916224837303, 0.13501006364822388, 0.025608541443943977, -0.03673375025391579, -0.23260630667209625, 0.003333127126097679, -0.08851760625839233, -0.028692761436104774, -0.05119727924466133, 0.043331827968358994, -0.000740863848477602, 0.10336242616176605, -0.026807235553860664, -0.08851850032806396, -0.007064414210617542, 0.08608829975128174, 0.05188068747520447, 0.022070245817303658, -0.01719503477215767, 0.004584740847349167, 0.12058630585670471, 0.05327777564525604, -0.13065242767333984, -0.06517702341079712, -0.07196366041898727, -0.0416409969329834, -0.045075397938489914, 0.04142850637435913, 0.05693614482879639, 0.06157223507761955, 0.25293421745300293, -0.03912102058529854, 0.05253172293305397, 0.06702540069818497, 0.018896453082561493, 0.0485229454934597, 0.09428489208221436, -0.06721611320972443, -0.15685619413852692, -0.00619259849190712, 0.09641793370246887, -0.0025779407005757093, -0.021180076524615288, -0.04781441390514374, 0.04340546205639839, 0.03160572424530983, 0.1133960485458374, 0.0966489240527153, -0.015827428549528122, -0.07624366879463196, -0.052186548709869385, 0.20806875824928284, -0.1484907865524292, 0.04191198572516441, 0.026409698650240898, -0.015660520642995834, -0.03476943448185921, 0.005969157442450523, 0.0157073512673378, -0.019164269790053368, 0.09952778369188309, -0.06925448030233383, -0.03852417692542076, -0.11569683998823166, -0.026189247146248817, 0.03301352635025978, 0.015910545364022255, -0.026449183002114296, -0.03150055557489395, -0.05952472239732742, -0.09123411029577255, 0.0976516604423523, -0.06562686711549759, -0.05946263298392296, -0.022328583523631096, -0.09221343696117401, 0.01988605037331581, 0.027851710096001625, 0.09646357595920563, -0.026870284229516983, 0.04023125395178795, -0.015643609687685966, 0.06863865256309509, 0.08116479963064194, 0.03138970583677292, -0.07393033802509308, 0.06444593518972397, -0.2009269744157791, 0.08226597309112549, -0.07939589768648148, 0.03082207217812538, -0.16275838017463684, -0.022842567414045334, 0.004513426683843136, 
0.020158814266324043, 0.030431460589170456, 0.16075176000595093, -0.20106585323810577, -0.037768084555864334, 0.15388384461402893, -0.09931781142950058, -0.12145674973726273, 0.04332636669278145, -0.0468338243663311, 0.15674950182437897, 0.019569236785173416, -0.011844165623188019, 0.09679717570543289, -0.1417139768600464, -0.02993275225162506, -0.021346628665924072, -0.003395706182345748, 0.10055196285247803, 0.09460904449224472, -0.07784125208854675, 0.02614038810133934, 0.012287450954318047, -0.04935002326965332, -0.025283770635724068, -0.049486011266708374, -0.10894729942083359, 0.0072288732044398785, -0.0779338851571083, 0.0214002076536417, -0.013361554592847824, -0.07962801307439804, -0.013444783166050911, -0.16057062149047852, -0.037594955414533615, 0.08435806632041931, 0.012650652788579464, -0.019764922559261322, -0.09689339250326157, 0.044444818049669266, -0.03472353518009186, -0.019170338287949562, -0.14464524388313293, -0.0220845527946949, 0.018219662830233574, -0.14053164422512054, 0.0036211542319506407, -0.11873974651098251, 0.06785600632429123, 0.011641447432339191, -0.06279727071523666, -0.03666863217949867, -0.00690578343346715, 0.004549688659608364, -0.05114910751581192, -0.24103595316410065, -0.02154134213924408, -0.05334816873073578, 0.15754938125610352, -0.21925939619541168, 0.03663923218846321, 0.050563521683216095, 0.12279795855283737, 0.006220904178917408, -0.06057696044445038, 0.030807014554739, -0.06841524690389633, -0.027903854846954346, -0.07186471670866013, -0.007455829996615648, -0.011831543408334255, -0.04991558566689491, 0.017393117770552635, -0.1221260130405426, -0.026314344257116318, 0.10037737339735031, 0.07744330912828445, -0.1585528552532196, -0.018361002206802368, -0.04346153512597084, -0.05847829580307007, -0.0829392597079277, -0.05934637412428856, 0.10632814466953278, 0.05135788768529892, 0.0357702374458313, -0.07457895576953888, -0.07046908140182495, 0.006622327025979757, -0.021371537819504738, -0.022426126524806023, 0.11993269622325897, 0.07525009661912918, -0.11319450289011002, 0.09933678060770035, 0.07988030463457108, 0.03387005999684334, 0.08079297840595245, -0.02531038038432598, -0.10585328191518784, -0.03095429763197899, 0.04050600901246071, 0.016221147030591965, 0.15668347477912903, -0.05978095158934593, 0.05817355960607529, 0.04962288960814476, -0.0413992740213871, 0.04441274702548981, -0.09123548120260239, 0.010045765899121761, 0.010691746138036251, -0.013650196604430676, 0.016043806448578835, -0.020155012607574463, 0.011640893295407295, 0.08962464332580566, 0.05344416946172714, 0.039399318397045135, 0.025997621938586235, -0.028812438249588013, -0.13007062673568726, 0.1890290081501007, -0.09712988883256912, -0.24157030880451202, -0.15948361158370972, 0.0647711232304573, 0.05632292106747627, -0.018972525373101234, 0.020004648715257645, -0.05699409916996956, -0.10744290798902512, -0.08284313976764679, 0.006095566786825657, 0.03522997722029686, -0.05380091816186905, -0.0691305622458458, 0.04623221233487129, 0.04366683214902878, -0.11934655159711838, 0.03653166443109512, 0.05668237432837486, -0.020069297403097153, 0.00462671834975481, 0.06268492341041565, 0.08291184157133102, 0.1787724494934082, -0.00836150348186493, -0.006405137944966555, 0.05111353471875191, 0.2805814743041992, -0.16012856364250183, 0.11492379754781723, 0.13326586782932281, -0.061822742223739624, 0.07563114911317825, 0.19151988625526428, 0.037609122693538666, -0.09887012094259262, 0.030284054577350616, 0.022553792223334312, -0.03109055943787098, -0.2657819092273712, 
-0.051277827471494675, -0.014939128421247005, -0.0905604362487793, 0.07462584972381592, 0.09213842451572418, 0.07263633608818054, 0.039653290063142776, -0.07312314212322235, -0.09420494735240936, 0.030250001698732376, 0.09984512627124786, -0.02523212507367134, 0.006709114648401737, 0.08644577860832214, -0.03583397716283798, 0.00835019163787365, 0.09818749874830246, -0.01091671735048294, 0.1597106158733368, 0.05475523695349693, 0.11404413729906082, 0.07818804681301117, 0.08121585100889206, 0.0003469842777121812, 0.031167274340987206, 0.009262911044061184, 0.018342528492212296, 0.01380919013172388, -0.08667711913585663, 0.033802688121795654, 0.11291820555925369, 0.04370145499706268, 0.03874006122350693, 0.015739278867840767, -0.04007212072610855, 0.05329105257987976, 0.1722172349691391, 0.008664121851325035, -0.20237377285957336, -0.07977880537509918, 0.06799615919589996, -0.07822200655937195, -0.1322741061449051, -0.016140814870595932, 0.03755449503660202, -0.16433092951774597, 0.022041503340005875, -0.0372559055685997, 0.09795094281435013, -0.07969769835472107, -0.03824940696358681, 0.09482347965240479, 0.06847827881574631, -0.025954347103834152, 0.05668753385543823, -0.19388732314109802, 0.119425468146801, 0.025718411430716515, 0.06639116257429123, -0.08979388326406479, 0.10060109198093414, 0.004093972500413656, -0.0010652854107320309, 0.16943924129009247, 0.002678467193618417, -0.06241421401500702, -0.06853936612606049, -0.09978486597537994, -0.01194701623171568, 0.0955599993467331, -0.13977214694023132, 0.06978337466716766, -0.0187965277582407, -0.032168835401535034, -0.0034486979711800814, -0.08083139359951019, -0.11684127897024155, -0.17023377120494843, 0.055609993636608124, -0.10013565421104431, 0.028597652912139893, -0.09473206102848053, -0.06509171426296234, -0.00029184893355704844, 0.17570136487483978, -0.2164364904165268, -0.09815792739391327, -0.15020427107810974, -0.09365438669919968, 0.16314084827899933, -0.04566332325339317, 0.0864475667476654, 0.003142255824059248, 0.1677333116531372, 0.0143357552587986, -0.01120893843472004, 0.10649465024471283, -0.09397505968809128, -0.19935908913612366, -0.05486159771680832, 0.1707821637392044, 0.1289655864238739, 0.044051237404346466, -0.015770558267831802, 0.031472839415073395, -0.05595370754599571, -0.12267628312110901, 0.0231629628688097, 0.14523689448833466, 0.06387076526880264, -0.009526419453322887, -0.030579552054405212, -0.09992799162864685, -0.055414795875549316, -0.045570407062768936, 0.010895907878875732, 0.19427426159381866, -0.07799384742975235, 0.16801120340824127, 0.11430723965167999, -0.0514538548886776, -0.21321521699428558, 0.05246661975979805, 0.05068788677453995, 0.016898388043045998, 0.039830826222896576, -0.19446967542171478, 0.09313203394412994, 0.005513141397386789, -0.07344579696655273, 0.1645168513059616, -0.1722051352262497, -0.1433822363615036, 0.092499740421772, 0.03195139020681381, -0.21749933063983917, -0.14298178255558014, -0.09912918508052826, -0.023858370259404182, -0.10835202038288116, 0.0606689490377903, 0.0013193793129175901, 0.012641048058867455, 0.02466445416212082, 0.012098548002541065, 0.025666894391179085, -0.050968848168849945, 0.2066328525543213, -0.033085718750953674, 0.006638026796281338, -0.04621833562850952, -0.08586680889129639, 0.03352314233779907, -0.0451563224196434, 0.10474327951669693, -0.003698543179780245, 0.02362469956278801, -0.1532212644815445, -0.04175867140293121, -0.05579805746674538, 0.034263331443071365, -0.09351543337106705, -0.09062200784683228, -0.05098685994744301, 
0.09586400538682938, 0.09512672573328018, -0.02948460541665554, 0.0017215092666447163, -0.09044519811868668, 0.07170174270868301, 0.19392390549182892, 0.19922317564487457, 0.06539072096347809, -0.06956130266189575, 0.027563992887735367, -0.03170780465006828, 0.045680832117795944, -0.23162133991718292, 0.04353845864534378, 0.058711372315883636, 0.02072266675531864, 0.0833677425980568, -0.008013524115085602, -0.1513132005929947, -0.06991203129291534, 0.08525049686431885, -0.04931584373116493, -0.17649732530117035, -0.02774152345955372, 0.038840003311634064, -0.2116115242242813, -0.041097693145275116, 0.020767847076058388, -0.01961369253695011, -0.04129144176840782, 0.021330440416932106, 0.08434978127479553, -0.017004217952489853, 0.11027301847934723, 0.08885522931814194, 0.09579738229513168, -0.10612418502569199, 0.06746753305196762, 0.07504786550998688, -0.04678388684988022, 0.025543808937072754, 0.1156453862786293, -0.04324830695986748, -0.03416917100548744, 0.08394785970449448, 0.08315800130367279, 0.03034902922809124, -0.0485706552863121, 0.009489355608820915, -0.06692440807819366, 0.06081061810255051, 0.11077925562858582, 0.029387149959802628, -0.006287150084972382, 0.05633477494120598, 0.029980262741446495, -0.09473933279514313, 0.11201741546392441, 0.04883471503853798, 0.01986096240580082, -0.04579858109354973, -0.03456662595272064, -0.011788072995841503, -0.014496367424726486, -0.022648174315690994, -0.002828391268849373, -0.09335190057754517, -0.010312242433428764, -0.08821985870599747, 0.02238784357905388, -0.07282277196645737, 0.010636577382683754, 0.026351111009716988, -0.050851378589868546, -0.00008970432827482, 0.005786297842860222, -0.07524571567773819, -0.04716046527028084, -0.017953502014279366, 0.08606957644224167, -0.128200501203537, 0.03666289523243904, 0.07454752922058105, -0.105265311896801, 0.07908564060926437, -0.008390432223677635, 0.009151401929557323, 0.0008598154527135193, -0.1572188138961792, 0.05440807715058327, -0.017491450533270836, -0.010192995890974998, 0.01764194294810295, -0.212607741355896, -0.004333374090492725, -0.04600708559155464, -0.05970161780714989, 0.01198236271739006, -0.028636030852794647, -0.12473251670598984, 0.09781620651483536, -0.007831308990716934, -0.06281718611717224, -0.018399598076939583, 0.03234229236841202, 0.09768076241016388, -0.02687213197350502, 0.1373647004365921, -0.02291807159781456, 0.07300806790590286, -0.17528606951236725, -0.008754742331802845, -0.019648898392915726, 0.03352578356862068, -0.030504798516631126, -0.019556112587451935, 0.06022823229432106, -0.019273892045021057, 0.1864510178565979, -0.021587226539850235, 0.06571242958307266, 0.06062941253185272, 0.020169571042060852, 0.02051384374499321, 0.08688796311616898, 0.0616736076772213, 0.002860825276002288, -0.005856692790985107, 0.03146934509277344, -0.0075307609513401985, -0.035897210240364075, -0.160085067152977, 0.07285258173942566, 0.1537626087665558, 0.04876144975423813, 0.018000034615397453, 0.029836608096957207, -0.11258742958307266, -0.07727393507957458, 0.11578049510717392, -0.014780272729694843, -0.0378817617893219, -0.06773675978183746, 0.17114806175231934, 0.14214204251766205, -0.1994670182466507, 0.07312390953302383, -0.05344349518418312, -0.04621947184205055, -0.13792240619659424, -0.15746335685253143, -0.06456737220287323, -0.043465979397296906, -0.024492327123880386, -0.06893821060657501, 0.04602259024977684, 0.05821891129016876, 0.00861385464668274, -0.015541004948318005, 0.11019778996706009, 0.0064995125867426395, -0.024231037124991417, 
0.050324615091085434, 0.06758232414722443, 0.033990487456321716, -0.0935843214392662, 0.007859748788177967, -0.0014255908317863941, 0.012044519186019897, 0.06669924408197403, 0.02114819921553135, -0.058334749191999435, 0.012882139533758163, -0.02308126911520958, -0.11961127072572708, 0.03759530186653137, -0.01713300496339798, -0.040078893303871155, 0.14323174953460693, 0.028526263311505318, 0.006717626005411148, -0.022945359349250793, 0.2272522747516632, -0.07660269737243652, -0.06764759868383408, -0.14339855313301086, 0.07687097042798996, -0.06497088819742203, 0.035887233912944794, 0.030441859737038612, -0.11419662833213806, 0.01174079068005085, 0.15573279559612274, 0.13504844903945923, -0.014281627722084522, 0.013466904871165752, 0.044830821454524994, 0.005399035289883614, -0.03320683538913727, 0.023556727916002274, 0.055826377123594284, 0.1531984657049179, -0.07093484699726105, 0.0707358792424202, -0.008959333412349224, -0.07919628173112869, -0.022268911823630333, 0.10886799544095993, -0.0007035445305518806, 0.002428618259727955, -0.06872285157442093, 0.13600866496562958, -0.08552168309688568, -0.21952393651008606, 0.060666222125291824, -0.07861257344484329, -0.15234939754009247, -0.051699552685022354, 0.024057278409600258, -0.016179176047444344, 0.011553998105227947, 0.07921662926673889, -0.05220044404268265, 0.16214822232723236, 0.0468282476067543, -0.055433716624975204, -0.08245920389890671, 0.061441950500011444, -0.13941282033920288, 0.2790188789367676, 0.02279778942465782, 0.04124056547880173, 0.10642540454864502, -0.02265266701579094, -0.1470445692539215, 0.011291798204183578, 0.10797706246376038, -0.07300738990306854, 0.06282566487789154, 0.17456604540348053, 0.003469933522865176, 0.127353698015213, 0.060589443892240524, -0.057661525905132294, 0.035669177770614624, -0.0981585830450058, -0.04988042265176773, -0.10798922181129456, 0.08247920125722885, -0.08255324512720108, 0.16015028953552246, 0.12420449405908585, -0.07234503328800201, -0.007244202308356762, -0.022373033687472343, 0.08544815331697464, 0.014390097931027412, 0.11505712568759918, 0.015465245582163334, -0.18908089399337769, 0.03432828187942505, 0.00007827609806554392, 0.10610438138246536, -0.20322813093662262, -0.060417622327804565, 0.04128408432006836, -0.01247307751327753, -0.08510533720254898, 0.11681237071752548, 0.04724228382110596, 0.03329182416200638, -0.03933549299836159, -0.049435805529356, 0.007557680830359459, 0.14068087935447693, -0.11401131749153137, -0.004372063558548689 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.7.1
{"library_name": "peft", "base_model": "bigscience/bloomz-1b1"}
null
alitolga/627_bigscience_bloomz-1b1_PrefixTuning
[ "peft", "safetensors", "arxiv:1910.09700", "base_model:bigscience/bloomz-1b1", "region:us" ]
2024-02-13T19:58:36+00:00
[ "1910.09700" ]
[]
TAGS #peft #safetensors #arxiv-1910.09700 #base_model-bigscience/bloomz-1b1 #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.7.1
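As with the previous record, the "How to Get Started with the Model" section of this card is empty. Below is a minimal sketch under the same assumptions (standard `transformers` and `peft` usage, PEFT 0.7.1, ids taken from this record). Since `bigscience/bloomz-1b1` is a causal language model, the causal-LM auto class is used here; this is illustrative, not the author's published usage.

```python
# Minimal sketch (assumption: standard transformers + peft usage; not the author's own code).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "bigscience/bloomz-1b1"                           # base model from this record's metadata
adapter_id = "alitolga/627_bigscience_bloomz-1b1_PrefixTuning"    # repository id of this record

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(base_model_id)
model = PeftModel.from_pretrained(base_model, adapter_id)         # attach the prefix-tuning adapter
model.eval()

prompt = "Explain prefix tuning in one sentence:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```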
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ "TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-bigscience/bloomz-1b1 #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ 37, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-bigscience/bloomz-1b1 #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.7.1" ]
[ -0.10406560450792313, 0.19425687193870544, -0.003238717094063759, 0.03301034867763519, 0.08961185067892075, 0.01655193604528904, 0.050940368324518204, 0.12380604445934296, -0.022474464029073715, 0.11066809296607971, 0.07313181459903717, 0.09400776028633118, 0.10651304572820663, 0.22505055367946625, 0.00889458879828453, -0.20354348421096802, 0.02590738609433174, -0.09394055604934692, -0.00952990259975195, 0.1227763369679451, 0.14901705086231232, -0.09868566691875458, 0.07731540501117706, -0.014637304469943047, -0.0037805160973221064, -0.033878475427627563, -0.06984070688486099, -0.03323674574494362, 0.04494154825806618, 0.046289630234241486, 0.053854696452617645, -0.0010066755348816514, 0.08155188709497452, -0.26326748728752136, 0.020673003047704697, 0.04122786223888397, -0.005029963329434395, 0.09061834216117859, 0.10046501457691193, -0.04354410991072655, 0.12216510623693466, -0.038755424320697784, 0.13881593942642212, 0.08122878521680832, -0.09042450040578842, -0.2320789098739624, -0.06715157628059387, 0.09322907030582428, 0.17039798200130463, 0.07160341739654541, -0.04120100662112236, 0.12372875213623047, -0.0925740897655487, 0.017817987129092216, 0.05132638290524483, -0.08864545822143555, -0.06731913238763809, 0.0529620386660099, 0.1077355444431305, 0.061172932386398315, -0.1327805072069168, -0.0316508486866951, 0.022350046783685684, 0.03147114813327789, 0.08248620480298996, 0.015565922483801842, 0.15597201883792877, 0.025918254628777504, -0.14699219167232513, -0.03873882070183754, 0.1311718076467514, 0.02729072980582714, -0.03840427100658417, -0.23110629618167877, 0.0021448738407343626, -0.09108421951532364, -0.0294029638171196, -0.0511138029396534, 0.04421953856945038, -0.0006817530374974012, 0.10421200096607208, -0.025791248306632042, -0.08969767391681671, -0.0061360253021121025, 0.08962085843086243, 0.05479322746396065, 0.02203916385769844, -0.01552613265812397, 0.00425034249201417, 0.12032906711101532, 0.05734625086188316, -0.13100387156009674, -0.06504301726818085, -0.07009242475032806, -0.04239249974489212, -0.04364583641290665, 0.04169044643640518, 0.06077960506081581, 0.061975348740816116, 0.25779908895492554, -0.03955047205090523, 0.05344853922724724, 0.06658788025379181, 0.018999725580215454, 0.04951322823762894, 0.09419935941696167, -0.06430859863758087, -0.1580950915813446, -0.004516693763434887, 0.09342572838068008, 0.00009963570482796058, -0.02110627107322216, -0.048200104385614395, 0.0426858626306057, 0.03539404273033142, 0.1128576248884201, 0.09537528455257416, -0.018052833154797554, -0.07795663177967072, -0.05144510045647621, 0.20972341299057007, -0.14740920066833496, 0.04240169748663902, 0.026401281356811523, -0.014552208594977856, -0.035616930574178696, 0.005585568957030773, 0.0163466464728117, -0.01813776232302189, 0.09903790801763535, -0.0696907788515091, -0.038441359996795654, -0.11253383755683899, -0.023854490369558334, 0.0327116884291172, 0.012990976683795452, -0.024149613454937935, -0.031007470563054085, -0.05883125960826874, -0.09039817005395889, 0.09780257940292358, -0.06563583761453629, -0.06186206638813019, -0.022346336394548416, -0.09193801134824753, 0.02077268250286579, 0.028606677427887917, 0.09649024158716202, -0.026551490649580956, 0.039635542780160904, -0.015217144973576069, 0.0680195763707161, 0.07902587950229645, 0.029322072863578796, -0.07453081756830215, 0.06541144102811813, -0.1995486468076706, 0.08111470937728882, -0.08114521950483322, 0.02861085534095764, -0.1644931137561798, -0.022525928914546967, 0.008435513824224472, 0.018864309415221214, 
0.03193633630871773, 0.15998873114585876, -0.20060290396213531, -0.03651338443160057, 0.15429189801216125, -0.10018467903137207, -0.1206154003739357, 0.04421364888548851, -0.04833550006151199, 0.15473836660385132, 0.019306639209389687, -0.008471737615764141, 0.09458731859922409, -0.1417921483516693, -0.03106781654059887, -0.022954395040869713, -0.0029385709203779697, 0.1022193655371666, 0.09306058287620544, -0.07741675525903702, 0.0247439444065094, 0.012726753018796444, -0.051170870661735535, -0.025869417935609818, -0.04910421743988991, -0.10817420482635498, 0.005589903797954321, -0.07791081815958023, 0.02337084896862507, -0.01565944030880928, -0.0795171856880188, -0.013295579701662064, -0.1606183499097824, -0.04099130630493164, 0.08494576811790466, 0.013109841383993626, -0.018315693363547325, -0.0962948277592659, 0.04542579501867294, -0.03398802503943443, -0.020706741139292717, -0.14391443133354187, -0.02024153806269169, 0.01967526040971279, -0.13945336639881134, 0.002363586565479636, -0.11915373057126999, 0.06723814457654953, 0.012051272206008434, -0.061525147408246994, -0.037755683064460754, -0.008385688997805119, 0.004197619389742613, -0.05161309614777565, -0.24041348695755005, -0.02327118255198002, -0.05385667458176613, 0.15952004492282867, -0.21970213949680328, 0.035686153918504715, 0.05116155371069908, 0.12281429022550583, 0.0058323899284005165, -0.062048040330410004, 0.031119024381041527, -0.07079688459634781, -0.02868548221886158, -0.07308699935674667, -0.008548860438168049, -0.011128493584692478, -0.05105677619576454, 0.014586290344595909, -0.11948337405920029, -0.028760341927409172, 0.09782686829566956, 0.08079870790243149, -0.15900041162967682, -0.01838335581123829, -0.04457588866353035, -0.058055099099874496, -0.08181348443031311, -0.05762618035078049, 0.10763504356145859, 0.0522831492125988, 0.03521823137998581, -0.07474814355373383, -0.06930321455001831, 0.007979443296790123, -0.022167829796671867, -0.020592184737324715, 0.11996089667081833, 0.07364754378795624, -0.11526631563901901, 0.09891755878925323, 0.07740189880132675, 0.03520644083619118, 0.0794937014579773, -0.025363067165017128, -0.10414546728134155, -0.0324794203042984, 0.04124823212623596, 0.015109611675143242, 0.16082409024238586, -0.0591130293905735, 0.05874853953719139, 0.04883284494280815, -0.04315223544836044, 0.042915623635053635, -0.09158989787101746, 0.010298642329871655, 0.009998408146202564, -0.014685506001114845, 0.017869118601083755, -0.022301286458969116, 0.010185171850025654, 0.08878825604915619, 0.05413031578063965, 0.0365273654460907, 0.026137543842196465, -0.02748091332614422, -0.12970149517059326, 0.18932859599590302, -0.0958501324057579, -0.24128073453903198, -0.1583695113658905, 0.06541340798139572, 0.05796598270535469, -0.01853444240987301, 0.02014877460896969, -0.05505094677209854, -0.10753604769706726, -0.08213486522436142, 0.004325510002672672, 0.03375200554728508, -0.05373181402683258, -0.06774283945560455, 0.043908603489398956, 0.042783573269844055, -0.11934267729520798, 0.035932037979364395, 0.0550248809158802, -0.016940534114837646, 0.004711983259767294, 0.06330413371324539, 0.08221766352653503, 0.17783722281455994, -0.008219553157687187, -0.006982862483710051, 0.05108547955751419, 0.28129714727401733, -0.16006554663181305, 0.11773225665092468, 0.1302797496318817, -0.06430964171886444, 0.07700087875127792, 0.19033437967300415, 0.036392658948898315, -0.09885182231664658, 0.02972271293401718, 0.021699048578739166, -0.030490748584270477, -0.2659197151660919, -0.050654876977205276, 
-0.012858626432716846, -0.08782025426626205, 0.07596482336521149, 0.09268046170473099, 0.07296761125326157, 0.04120808094739914, -0.072663314640522, -0.09333258122205734, 0.028723033145070076, 0.10016927868127823, -0.028132345527410507, 0.007085226476192474, 0.08554641902446747, -0.03505152836441994, 0.008965250104665756, 0.09760136902332306, -0.010265059769153595, 0.15786132216453552, 0.0563528835773468, 0.11822042614221573, 0.07630486786365509, 0.08377037942409515, 0.0013032614951953292, 0.031733959913253784, 0.012045771814882755, 0.01873369701206684, 0.013756494037806988, -0.08524487167596817, 0.0336591936647892, 0.11285161972045898, 0.04125966876745224, 0.039208635687828064, 0.016106795519590378, -0.038712192326784134, 0.05106797814369202, 0.17400994896888733, 0.009314230643212795, -0.1999632567167282, -0.08185054361820221, 0.06815139949321747, -0.07799976319074631, -0.1343800276517868, -0.016458390280604362, 0.03465923294425011, -0.16607269644737244, 0.021530840545892715, -0.03863292932510376, 0.09701912105083466, -0.08059898763895035, -0.03841926157474518, 0.09028011560440063, 0.06648259609937668, -0.025881588459014893, 0.056431714445352554, -0.1933746188879013, 0.11785569041967392, 0.024661798030138016, 0.06567437201738358, -0.08980309218168259, 0.10088306665420532, 0.0019360508304089308, -0.0038737759459763765, 0.17148074507713318, 0.0037553103175014257, -0.06482551991939545, -0.06820035725831985, -0.10179886966943741, -0.012656962499022484, 0.09577079117298126, -0.13989180326461792, 0.06961677968502045, -0.01789531670510769, -0.03298730030655861, -0.0035419713240116835, -0.0824039876461029, -0.11943204700946808, -0.1699933409690857, 0.05574832111597061, -0.10087523609399796, 0.030129434540867805, -0.09503402560949326, -0.06414077430963516, 0.001412665587849915, 0.17678557336330414, -0.2182084023952484, -0.09905234724283218, -0.14949069917201996, -0.09211678802967072, 0.16428136825561523, -0.04663864150643349, 0.08989666402339935, 0.0023726746439933777, 0.16759055852890015, 0.01238885149359703, -0.010928932577371597, 0.10492663085460663, -0.09387652575969696, -0.20205704867839813, -0.055570557713508606, 0.17192725837230682, 0.12939608097076416, 0.04328153654932976, -0.013339740224182606, 0.03173208236694336, -0.057698722928762436, -0.12187308073043823, 0.024158166721463203, 0.14188764989376068, 0.06044969707727432, -0.012154904194176197, -0.02801620587706566, -0.10102991759777069, -0.05645272508263588, -0.046045027673244476, 0.012435296550393105, 0.1965608447790146, -0.07935840636491776, 0.16968244314193726, 0.11105112731456757, -0.05177795886993408, -0.20894072949886322, 0.05042435601353645, 0.05339532345533371, 0.01713847927749157, 0.0395476259291172, -0.19520145654678345, 0.0942840576171875, 0.005137374624609947, -0.07435052841901779, 0.16722841560840607, -0.16977578401565552, -0.1426618993282318, 0.09399678558111191, 0.03244039788842201, -0.21454477310180664, -0.14309503138065338, -0.10017349570989609, -0.024004323408007622, -0.10718628764152527, 0.0641440600156784, 0.004548975732177496, 0.011910926550626755, 0.023365236818790436, 0.01282413862645626, 0.02623802423477173, -0.05102181434631348, 0.20545105636119843, -0.031047295778989792, 0.007080481853336096, -0.04595755413174629, -0.0893433541059494, 0.0332806296646595, -0.044925570487976074, 0.10425300151109695, -0.004054264165461063, 0.02473088912665844, -0.15175117552280426, -0.04013407602906227, -0.05605629086494446, 0.035079218447208405, -0.09549117833375931, -0.08968551456928253, -0.051722753793001175, 
0.09446664154529572, 0.09416043013334274, -0.028505908325314522, 0.005546026397496462, -0.08631966263055801, 0.06964221596717834, 0.19561974704265594, 0.1938764601945877, 0.06888297945261002, -0.07028936594724655, 0.025424398481845856, -0.032784875482320786, 0.044457487761974335, -0.23259377479553223, 0.042447175830602646, 0.05841358006000519, 0.02288421243429184, 0.08250246196985245, -0.008707687258720398, -0.15242883563041687, -0.06940528005361557, 0.08374598622322083, -0.05106255039572716, -0.17730672657489777, -0.02704254537820816, 0.037875570356845856, -0.2109445184469223, -0.04025065153837204, 0.022366005927324295, -0.01962103135883808, -0.042938169091939926, 0.021318914368748665, 0.08449866622686386, -0.016066471114754677, 0.10918346047401428, 0.09167086333036423, 0.09602566063404083, -0.10522374510765076, 0.06774057447910309, 0.07573504000902176, -0.043481238186359406, 0.026006201282143593, 0.1135452538728714, -0.043121617287397385, -0.03328714519739151, 0.08435914665460587, 0.08309377729892731, 0.028665006160736084, -0.047383394092321396, 0.007977182976901531, -0.06637255847454071, 0.05933302268385887, 0.11359561234712601, 0.029262302443385124, -0.005358004476875067, 0.056098759174346924, 0.03005960024893284, -0.09383027255535126, 0.11128173768520355, 0.050975266844034195, 0.021087955683469772, -0.045405495911836624, -0.034989047795534134, -0.012276321649551392, -0.013540562242269516, -0.021908482536673546, -0.003135211765766144, -0.09382878988981247, -0.011662014760077, -0.09003644436597824, 0.026235533878207207, -0.07193634659051895, 0.011357861571013927, 0.025818241760134697, -0.05035759136080742, 0.0029852839652448893, 0.00616007624194026, -0.07435796409845352, -0.04793865606188774, -0.017582014203071594, 0.08500213176012039, -0.129414364695549, 0.03575414419174194, 0.07502112537622452, -0.10593675076961517, 0.08041650801897049, -0.007449007127434015, 0.01082664169371128, 0.00023644360771868378, -0.15958306193351746, 0.0554952435195446, -0.016748573631048203, -0.012522924691438675, 0.01570514217019081, -0.21300143003463745, -0.006445453502237797, -0.04732141271233559, -0.05873408168554306, 0.010213776491582394, -0.02562352642416954, -0.12598779797554016, 0.09538064152002335, -0.006752215791493654, -0.06279680132865906, -0.018243294209241867, 0.032727181911468506, 0.09617746621370316, -0.02699391171336174, 0.13562992215156555, -0.021053282544016838, 0.07383079826831818, -0.17434577643871307, -0.00796822365373373, -0.020778752863407135, 0.03368622437119484, -0.03336729109287262, -0.01956181786954403, 0.05829671025276184, -0.01775907166302204, 0.1836550533771515, -0.02413504756987095, 0.06776360422372818, 0.057737067341804504, 0.021459100767970085, 0.019439008086919785, 0.08707094937562943, 0.06282966583967209, 0.0014440420782193542, -0.006464559584856033, 0.03191197291016579, -0.008555687963962555, -0.03505712375044823, -0.16003556549549103, 0.07100331038236618, 0.15331357717514038, 0.04495752602815628, 0.019953755661845207, 0.026906171813607216, -0.11299748718738556, -0.07522197812795639, 0.11431015282869339, -0.014664338901638985, -0.03957637771964073, -0.0661516860127449, 0.1743321716785431, 0.13897420465946198, -0.19911989569664001, 0.07176198065280914, -0.05042212828993797, -0.04699085280299187, -0.13562434911727905, -0.15653498470783234, -0.06501156836748123, -0.04042598605155945, -0.023638101294636726, -0.06869101524353027, 0.04654425382614136, 0.05740462988615036, 0.007625872734934092, -0.01603016071021557, 0.10851871222257614, 0.01126676145941019, -0.02527027204632759, 
0.05267367884516716, 0.06931975483894348, 0.031459737569093704, -0.09350360184907913, 0.007396499160677195, -0.0031836391426622868, 0.013630462810397148, 0.0666925236582756, 0.021088790148496628, -0.05584340542554855, 0.013449893333017826, -0.02202310971915722, -0.12064781039953232, 0.03790435194969177, -0.016918759793043137, -0.038650721311569214, 0.14298753440380096, 0.02859502099454403, 0.00869451742619276, -0.02334393933415413, 0.2285020649433136, -0.07459686696529388, -0.06595925986766815, -0.14037080109119415, 0.0768558531999588, -0.06703399866819382, 0.03535596281290054, 0.03056887723505497, -0.11464963853359222, 0.014412084594368935, 0.15441854298114777, 0.13138209283351898, -0.012973802164196968, 0.014233420602977276, 0.04382805526256561, 0.004823282826691866, -0.0312243290245533, 0.02264155074954033, 0.05275898799300194, 0.15186989307403564, -0.07263489067554474, 0.07095479220151901, -0.008129306137561798, -0.0795561894774437, -0.024569250643253326, 0.10924279689788818, -0.003069473896175623, 0.0022186830174177885, -0.06973854452371597, 0.13541647791862488, -0.08811133354902267, -0.2218925803899765, 0.06029706820845604, -0.07826652377843857, -0.15293918550014496, -0.051031675189733505, 0.02396201901137829, -0.01457368116825819, 0.012024321593344212, 0.07993827760219574, -0.05135862156748772, 0.16418829560279846, 0.04744509980082512, -0.056464921683073044, -0.0804126188158989, 0.06003319099545479, -0.14362242817878723, 0.27951258420944214, 0.02301441878080368, 0.04363001137971878, 0.10592808574438095, -0.020526202395558357, -0.14913330972194672, 0.00866720825433731, 0.10723792016506195, -0.0708908811211586, 0.06280865520238876, 0.17452995479106903, 0.004209337290376425, 0.12536828219890594, 0.060263048857450485, -0.056566279381513596, 0.03430161252617836, -0.09297394752502441, -0.0485515370965004, -0.10863914340734482, 0.08223908394575119, -0.08303339034318924, 0.1598685383796692, 0.12213613837957382, -0.07417436689138412, -0.008303163573145866, -0.0226339939981699, 0.08515344560146332, 0.015766233205795288, 0.11314819008111954, 0.0159926638007164, -0.1889730542898178, 0.03524153679609299, 0.0024523865431547165, 0.10564562678337097, -0.20049771666526794, -0.06069818511605263, 0.03977077454328537, -0.011889920569956303, -0.08461174368858337, 0.1154099553823471, 0.04654447361826897, 0.033156219869852066, -0.038938164710998535, -0.04548432677984238, 0.006955538876354694, 0.14036913216114044, -0.1122618019580841, -0.005090599413961172 ]
null
null
null
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64c14f6b02e1f8f67c73bd05/pf4d6FA7DriRtVq5HCkxd.png) ![image/png](https://cdn-uploads.huggingface.co/production/uploads/64c14f6b02e1f8f67c73bd05/e4u8VYfDBh11u60rFYJHF.png) This model is a finetune of jondurbin's excellent [bagel](https://huggingface.co/jondurbin/bagel-34b-v0.2) model. It has been trained with new datasets and a new technique, which we will share with the community soon. This model has not utilised any form of merging. ### Evaluation Results | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | | --- | --- | --- | --- | --- | --- | --- | | 77.29 | 74.23 | 86.76 | 76.66 | 70.22 | 83.66 | 72.18 | ### Contamination Results With reference model jondurbin/bagel-34b-v0.2: | ARC | TruthfulQA | GSM8K | | --- | --- | --- | | 0.08 | 0.38 | 0.88 | *** Vanilla Quantization by [nold](https://huggingface.co/nold), Original Model [abacusai/Smaug-34B-v0.1](https://huggingface.co/abacusai/Smaug-34B-v0.1). Created using [llm-quantizer](https://github.com/Nold360/llm-quantizer) Pipeline - 465d7970507dcaac4cb50221157a68c840965774
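The card above describes GGUF quantizations of abacusai/Smaug-34B-v0.1 but does not show how to run them. Below is a minimal sketch using `llama-cpp-python`; the `.gguf` filename is a placeholder (an assumption, not taken from this record) and must be replaced with an actual filename from the repository's file listing. As a small sanity check on the table above, the Average column is the mean of the six benchmark scores: (74.23 + 86.76 + 76.66 + 70.22 + 83.66 + 72.18) / 6 ≈ 77.29.

```python
# Minimal sketch (assumptions: llama-cpp-python is installed, and the .gguf filename below
# is a placeholder; check the repository for the real quantization filenames).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

repo_id = "nold/Smaug-34B-v0.1-GGUF"
gguf_file = "Smaug-34B-v0.1_Q4_K_M.gguf"   # hypothetical filename; adjust to the actual file

# Download the quantized weights from the Hugging Face Hub, then load them with llama.cpp.
model_path = hf_hub_download(repo_id=repo_id, filename=gguf_file)
llm = Llama(model_path=model_path, n_ctx=4096)

out = llm("Question: What does prefix tuning change in a transformer?\nAnswer:", max_tokens=64)
print(out["choices"][0]["text"])
```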
{"license": "other", "license_name": "yi-license", "license_link": "https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE", "base_model": "jondurbin/bagel-34b-v0.2"}
null
nold/Smaug-34B-v0.1-GGUF
[ "gguf", "base_model:jondurbin/bagel-34b-v0.2", "license:other", "region:us" ]
2024-02-13T19:59:16+00:00
[]
[]
TAGS #gguf #base_model-jondurbin/bagel-34b-v0.2 #license-other #region-us
!image/png !image/png This model is a finetune of jondurbin's excellent bagel model. It has been trained with new datasets and a new technique, which we will share with the community soon. This model has not utilised any form of merging. ### Evaluation Results ### Contamination Results With reference model jondurbin/bagel-34b-v0.2: ARC: 0.08, TruthfulQA: 0.38, GSM8K: 0.88 * Vanilla Quantization by nold, Original Model abacusai/Smaug-34B-v0.1. Created using llm-quantizer Pipeline - 465d7970507dcaac4cb50221157a68c840965774
[ "### Evaluation Results", "### Contamination Results\n\n\nWith reference model jondurbin/bagel-34b-v0.2:\n\n\nARC: 0.08, TruthfulQA: 0.38, GSM8K: 0.88\n\n\n* \n\n\nVanilla Quantization by nold, Original Model abacusai/Smaug-34B-v0.1. Created using llm-quantizer Pipeline - 465d7970507dcaac4cb50221157a68c840965774" ]
[ "TAGS\n#gguf #base_model-jondurbin/bagel-34b-v0.2 #license-other #region-us \n", "### Evaluation Results", "### Contamination Results\n\n\nWith reference model jondurbin/bagel-34b-v0.2:\n\n\nARC: 0.08, TruthfulQA: 0.38, GSM8K: 0.88\n\n\n* \n\n\nVanilla Quantization by nold, Original Model abacusai/Smaug-34B-v0.1. Created using llm-quantizer Pipeline - 465d7970507dcaac4cb50221157a68c840965774" ]
[ 30, 5, 98 ]
[ "passage: TAGS\n#gguf #base_model-jondurbin/bagel-34b-v0.2 #license-other #region-us \n### Evaluation Results### Contamination Results\n\n\nWith reference model jondurbin/bagel-34b-v0.2:\n\n\nARC: 0.08, TruthfulQA: 0.38, GSM8K: 0.88\n\n\n* \n\n\nVanilla Quantization by nold, Original Model abacusai/Smaug-34B-v0.1. Created using llm-quantizer Pipeline - 465d7970507dcaac4cb50221157a68c840965774" ]
[ -0.0888485535979271, 0.24211232364177704, -0.003611702471971512, 0.03261280432343483, -0.029944289475679398, -0.009591741487383842, 0.07693994045257568, 0.03790891543030739, 0.09494385123252869, 0.06909965723752975, 0.14705783128738403, 0.07292553782463074, -0.05660180374979973, 0.10319512337446213, -0.02675187774002552, 0.008515519089996815, 0.05565009266138077, -0.017243986949324608, -0.004082720726728439, 0.11752737313508987, 0.029513653367757797, 0.0036557819694280624, 0.10524348169565201, 0.04753255099058151, -0.09265627712011337, 0.10846391320228577, 0.030483592301607132, -0.02283792942762375, 0.023213213309645653, 0.019822722300887108, 0.04567146673798561, 0.05270516127347946, 0.017515260726213455, -0.0851215124130249, 0.043769337236881256, -0.08539492636919022, -0.05434618517756462, 0.10143213719129562, 0.024462927132844925, -0.03313833475112915, 0.08856592327356339, 0.09077185392379761, -0.04258766770362854, 0.12547698616981506, -0.1187194213271141, -0.030908480286598206, -0.07792342454195023, 0.0541175901889801, 0.02464902214705944, 0.034352902323007584, -0.016084030270576477, 0.13050051033496857, -0.08819983154535294, 0.04525698348879814, 0.16372311115264893, -0.19049274921417236, 0.007898852229118347, 0.18351787328720093, -0.02142554335296154, -0.05781031399965286, -0.03324899077415466, 0.0960160493850708, 0.07108546793460846, -0.052892766892910004, -0.14022235572338104, -0.028852934017777443, -0.04718252643942833, 0.034709829837083817, -0.006157125812023878, -0.0013080679345875978, 0.18955901265144348, 0.10300039499998093, -0.11924233287572861, 0.1166684627532959, -0.07941803336143494, -0.12712408602237701, -0.0036343673709779978, 0.021101213991642, 0.006154481787234545, 0.07121264189481735, 0.012417512945830822, 0.042279914021492004, -0.06997231394052505, -0.10453294217586517, -0.08504866063594818, 0.20719362795352936, -0.04844778776168823, 0.0729505643248558, -0.050487272441387177, 0.07202059030532837, -0.2297777682542801, -0.12882429361343384, -0.07552332431077957, -0.026564989238977432, 0.08626670390367508, 0.03126305714249611, 0.09143783152103424, 0.14127416908740997, 0.17627179622650146, 0.16371291875839233, -0.0764671117067337, 0.033041443675756454, -0.023400118574500084, 0.024444984272122383, -0.03499279171228409, 0.027597906067967415, -0.08641791343688965, 0.01734805852174759, 0.08282089233398438, -0.05219406262040138, 0.05397224798798561, 0.002355861710384488, -0.06346184015274048, -0.07543433457612991, -0.03867267072200775, 0.0865631178021431, 0.005074936896562576, -0.05578358843922615, -0.0865921676158905, 0.020757362246513367, 0.18087655305862427, 0.02611209824681282, -0.049026478081941605, 0.07029610872268677, 0.0031065645162016153, 0.018283197656273842, 0.06768990308046341, 0.020538445562124252, 0.10760790854692459, -0.04310455545783043, -0.17133867740631104, 0.003419556189328432, -0.03079438954591751, -0.017581038177013397, 0.0018056283006444573, 0.016106493771076202, 0.06505647301673889, -0.14228737354278564, -0.1115138903260231, 0.02281857468187809, 0.005250939168035984, -0.05903179571032524, 0.03588453307747841, 0.08483122289180756, -0.02258710004389286, 0.0019674506038427353, 0.008258198387920856, -0.045938100665807724, -0.10584122687578201, 0.052501387894153595, 0.040538251399993896, 0.022199656814336777, -0.13736078143119812, -0.05879758298397064, -0.08214493840932846, 0.1049509048461914, -0.03108988143503666, -0.08861774206161499, -0.08308161795139313, 0.10849593579769135, -0.13563726842403412, -0.02732890099287033, -0.14058126509189606, 
-0.11807821691036224, 0.06413370370864868, 0.1709180623292923, -0.0765911415219307, -0.016386235132813454, -0.024486184120178223, -0.027507388964295387, 0.001977680018171668, -0.015056796371936798, 0.020741064101457596, 0.0493888221681118, 0.014675494283437729, 0.16389091312885284, -0.02423517405986786, -0.16829049587249756, -0.034340571612119675, 0.02204236574470997, -0.0020701561588793993, -0.03500230237841606, 0.12593410909175873, -0.11365696042776108, -0.153418630361557, 0.05534965172410011, 0.07210244983434677, 0.10301444679498672, -0.04255702346563339, -0.09061263501644135, -0.05519824102520943, -0.08380115777254105, 0.044091638177633286, -0.08632108569145203, 0.05611313134431839, -0.03897857293486595, -0.042421843856573105, -0.11611437797546387, 0.1044669821858406, -0.00559696601703763, -0.013725413009524345, -0.089662104845047, 0.2386833131313324, -0.1647554636001587, -0.06234725937247276, -0.04724124073982239, 0.03757144510746002, -0.0071012661792337894, -0.09450620412826538, -0.00677844975143671, -0.153812974691391, 0.021100521087646484, 0.09206753969192505, 0.04852672666311264, 0.04187877103686333, -0.04149812087416649, 0.037914469838142395, 0.06424439698457718, -0.08466000109910965, 0.024030640721321106, 0.00898200087249279, 0.11502639949321747, -0.005595039576292038, 0.02408452332019806, 0.04476805403828621, 0.027375390753149986, -0.009015732444822788, -0.06046547740697861, 0.14255759119987488, -0.07059026509523392, -0.023621516302227974, -0.030359670519828796, 0.04022274166345596, -0.03178069740533829, -0.11603771895170212, -0.020153947174549103, 0.0763988345861435, 0.28416478633880615, 0.09182639420032501, 0.18050718307495117, 0.016006166115403175, 0.021677706390619278, -0.04489864408969879, 0.02932688593864441, 0.06859420984983444, 0.10279443860054016, 0.030432160943746567, -0.057301115244627, 0.043096765875816345, -0.07459575682878494, 0.0703345537185669, 0.02452082559466362, -0.07800838351249695, -0.05992353707551956, 0.1480633169412613, 0.10666977614164352, -0.20602047443389893, 0.07527239620685577, 0.15061381459236145, 0.008189122192561626, 0.080904521048069, 0.022743627429008484, -0.0663597360253334, -0.12101580202579498, -0.02821097895503044, -0.02716638147830963, 0.19157181680202484, -0.07399901002645493, 0.16241027414798737, 0.028705088421702385, -0.06689818948507309, 0.0777694508433342, -0.047130536288022995, -0.07900029420852661, -0.025801951065659523, -0.02007351815700531, -0.14045295119285583, 0.12224974483251572, -0.07010086625814438, 0.08828635513782501, -0.012348676100373268, -0.025884106755256653, -0.03931402042508125, 0.04644100368022919, -0.1061781495809555, 0.23244476318359375, -0.06239652633666992, -0.1281440556049347, -0.13634741306304932, -0.003947558347135782, -0.04388159513473511, 0.012610397301614285, 0.055659957230091095, -0.10394887626171112, -0.08901840448379517, -0.08918982744216919, -0.026730135083198547, -0.03399323672056198, 0.020231099799275398, 0.10092449188232422, -0.023161008954048157, 0.050732050091028214, -0.09513231366872787, 0.02407560497522354, -0.01299701351672411, 0.008872205391526222, 0.06247330456972122, -0.06779711693525314, 0.15055686235427856, 0.1291196346282959, -0.06652193516492844, 0.0210429597645998, 0.09099829941987991, 0.3467591106891632, 0.016184762120246887, 0.007369303610175848, 0.21470661461353302, 0.000331791874486953, 0.020968319848179817, 0.10594914853572845, 0.07214456796646118, -0.08310549706220627, -0.04529653862118721, -0.06916071474552155, -0.09483477473258972, -0.21136794984340668, -0.027990389615297318, 
0.006266967859119177, -0.017446525394916534, -0.04538159444928169, 0.0760636106133461, 0.050664763897657394, 0.08656566590070724, 0.04539315029978752, -0.07702326774597168, -0.07555590569972992, 0.05811609700322151, 0.030962582677602768, -0.028030935674905777, 0.019770704209804535, -0.06793401390314102, 0.020175768062472343, 0.07342680543661118, 0.019337711855769157, 0.02630883827805519, 0.15553860366344452, 0.06565134972333908, 0.13240407407283783, 0.11763836443424225, 0.14991220831871033, 0.026477351784706116, -0.010464421473443508, -0.05474507063627243, -0.023248180747032166, -0.09121403098106384, 0.059141840785741806, 0.027226954698562622, -0.09579437971115112, 0.04916788265109062, 0.12148351967334747, -0.042070817202329636, 0.04110288619995117, -0.15987622737884521, 0.14396731555461884, -0.20197738707065582, -0.07991863787174225, 0.015583711676299572, 0.10707337409257889, -0.08498100191354752, -0.013644474558532238, -0.06475045531988144, 0.01670096442103386, -0.07073988765478134, 0.0610121414065361, 0.03858070448040962, 0.16719365119934082, 0.07358484715223312, -0.01912057399749756, -0.10453075170516968, -0.059995636343955994, 0.041712481528520584, -0.15641438961029053, 0.1784118413925171, 0.05254727602005005, -0.0729709267616272, 0.0652884766459465, -0.0518295019865036, 0.02938993275165558, 0.10205977410078049, 0.1353207677602768, 0.03237070143222809, -0.021623972803354263, -0.08959263563156128, -0.254584401845932, 0.07636371999979019, -0.08313853293657303, 0.01717040129005909, -0.018330799415707588, 0.08287044614553452, -0.027301771566271782, -0.060280393809080124, 0.12679913640022278, -0.3733402192592621, 0.012760325334966183, 0.07430987060070038, 0.06342361122369766, 0.008554788306355476, -0.04521898552775383, -0.02948387898504734, 0.05926446244120598, 0.20391422510147095, -0.09380566328763962, -0.07595986872911453, -0.0796876922249794, 0.008570755831897259, 0.18716387450695038, -0.08496538549661636, 0.021931767463684082, -0.09864610433578491, -0.06511303037405014, 0.04247822239995003, -0.10469867289066315, 0.08659813553094864, 0.018593404442071915, -0.06136813759803772, -0.05842209607362747, 0.14625269174575806, -0.013997813686728477, 0.05055360123515129, 0.014328385703265667, -0.008303500711917877, -0.033989068120718, -0.12661564350128174, 0.024408074095845222, 0.0024097596760839224, 0.04103711619973183, 0.036692626774311066, -0.046714454889297485, -0.1507672816514969, -0.05153810605406761, -0.016661254689097404, 0.03551010414958, 0.2798968553543091, -0.07760517299175262, 0.05586005747318268, 0.16252464056015015, -0.02994900569319725, -0.05593809485435486, -0.02123691700398922, -0.08902030438184738, 0.0902651771903038, -0.013228053227066994, -0.09932045638561249, 0.079123854637146, 0.2026415914297104, -0.08631613105535507, 0.268485426902771, -0.27936652302742004, -0.11193949729204178, 0.19843026995658875, 0.04183667525649071, 0.30441370606422424, -0.0725908875465393, -0.06939363479614258, -0.028152765706181526, -0.3717678189277649, 0.007244472857564688, -0.005710078403353691, 0.04636946693062782, -0.08815445005893707, 0.04368020221590996, 0.047458715736866, -0.07491231709718704, 0.26358991861343384, 0.05077424645423889, 0.044126804918050766, 0.03744485601782799, -0.17213471233844757, 0.05855923891067505, 0.01501441840082407, 0.1178453117609024, 0.07486163824796677, 0.10550965368747711, 0.04792094603180885, -0.015184960328042507, -0.032503750175237656, 0.12217909097671509, -0.0527644045650959, -0.07441645860671997, -0.11557269841432571, -0.029865896329283714, -0.1231197640299797, 
0.005528011824935675, 0.07172629237174988, -0.06643295288085938, 0.09312692284584045, 0.13565759360790253, -0.1038135215640068, -0.10477221757173538, 0.010974264703691006, 0.07757852226495743, -0.043168652802705765, 0.034514375030994415, -0.2421116828918457, 0.04208825156092644, 0.12900863587856293, 0.06215452402830124, 0.010810707695782185, -0.0015377029776573181, -0.07009511440992355, 0.05506506562232971, 0.11738894879817963, -0.19111867249011993, 0.19077624380588531, -0.07362867891788483, -0.03923869878053665, 0.0116015849635005, 0.02391701377928257, 0.09976941347122192, 0.006551078055053949, -0.03819964453577995, 0.015019748359918594, 0.004077650140970945, -0.07849933952093124, 0.11346417665481567, 0.06454473733901978, 0.022011911496520042, -0.03749872371554375, 0.04809702932834625, 0.019390873610973358, 0.13179746270179749, -0.11496340483427048, -0.08253229409456253, -0.07678988575935364, -0.05765203759074211, -0.136680006980896, 0.06815881282091141, -0.06519585847854614, 0.0050246575847268105, -0.10360638052225113, 0.016908161342144012, -0.017799347639083862, 0.12454800307750702, 0.05386529490351677, 0.1080382838845253, -0.03459599241614342, -0.17401456832885742, -0.006274666637182236, 0.011450116522610188, -0.02265438251197338, -0.0064772325567901134, -0.08107490837574005, -0.04695113003253937, -0.04933107644319534, 0.05954790860414505, -0.02939378283917904, -0.0044317469000816345, -0.12337857484817505, -0.031135380268096924, -0.13119663298130035, 0.05380585789680481, -0.061938244849443436, -0.012187030166387558, -0.019452152773737907, -0.04537183791399002, -0.11538673937320709, 0.059216808527708054, -0.06250368058681488, -0.001412191428244114, 0.016434388235211372, 0.10356654971837997, -0.03708568215370178, -0.011527853086590767, 0.12050995975732803, 0.021400192752480507, 0.05586203187704086, 0.11576665937900543, 0.03537335246801376, 0.043618686497211456, -0.16272778809070587, 0.011131636798381805, 0.09814798086881638, -0.03940419480204582, -0.0059091150760650635, -0.11048980057239532, 0.05507316440343857, 0.020566221326589584, 0.00011036457726731896, 0.009315017610788345, 0.09922944009304047, -0.11533552408218384, -0.18662209808826447, -0.0687423050403595, -0.06479988247156143, 0.039553530514240265, -0.08313864469528198, 0.12580370903015137, 0.054520606994628906, -0.02375066466629505, 0.04627993702888489, -0.0175679549574852, -0.09808246046304703, -0.007910745218396187, -0.03842936083674431, -0.03489202260971069, -0.1722373068332672, -0.0058657038025557995, -0.06209026277065277, -0.011087140068411827, 0.189277783036232, -0.05111806094646454, -0.046113092452287674, -0.032723572105169296, 0.09265304356813431, 0.1829356998205185, -0.03184524178504944, 0.2710716128349304, -0.010759797878563404, 0.06864137202501297, 0.04951529577374458, 0.061860091984272, -0.04007010534405708, -0.05633777752518654, 0.014992302283644676, 0.08175414800643921, -0.00773460092023015, 0.059849198907613754, -0.06214617192745209, -0.13952282071113586, 0.07852087169885635, 0.019827287644147873, 0.01944335363805294, -0.043107129633426666, 0.05191498249769211, -0.03456195071339607, 0.1877807378768921, -0.07917676866054535, -0.023248745128512383, 0.04040032997727394, -0.02186916396021843, -0.16050073504447937, -0.052288126200437546, -0.08224663883447647, -0.04161147028207779, 0.0979602038860321, -0.043645016849040985, -0.04895232990384102, 0.08550235629081726, -0.031966954469680786, -0.005635630339384079, 0.09124261140823364, -0.034946247935295105, -0.10787708312273026, -0.011496025137603283, 0.02735695242881775, 
-0.04661578685045242, -0.060675472021102905, 0.05400250852108002, 0.0827828049659729, -0.045863647013902664, 0.011664417572319508, -0.0214616097509861, 0.08578838407993317, -0.028735538944602013, -0.0784284844994545, -0.012223251163959503, -0.12886643409729004, 0.040641285479068756, 0.010527650825679302, 0.09585222601890564, -0.014815614558756351, -0.044274549931287766, -0.003010884393006563, 0.08951768279075623, 0.017146404832601547, 0.08292409777641296, -0.013523407280445099, 0.0802147313952446, -0.03480024263262749, 0.048642564564943314, -0.012475097551941872, -0.04311474785208702, 0.06892114877700806, 0.08900996297597885, 0.07539749145507812, -0.03985882177948952, -0.004457035567611456, -0.03390015661716461, 0.03557392209768295, 0.00832129642367363, -0.09277016669511795, 0.021737396717071533, 0.048301707953214645, -0.07083237916231155, -0.112114816904068, -0.021411918103694916, 0.028156176209449768, -0.12674130499362946, 0.0028247307054698467, -0.03495616465806961, -0.064759761095047, -0.1474991738796234, 0.056812237948179245, -0.12466185539960861, 0.12379393726587296, 0.065910704433918, -0.09448208659887314, -0.10691579431295395, 0.0028994649183005095, -0.06697697192430496, 0.013051280751824379, -0.005626704078167677, -0.165455162525177, 0.002241369104012847, -0.003912799991667271, 0.0015842816792428493, -0.20234644412994385, -0.0628020316362381, 0.06605376303195953, 0.12389783561229706, 0.18615208566188812, 0.050071924924850464, 0.09594660997390747, 0.08868127316236496, -0.012180508114397526, -0.14769995212554932, 0.10862192511558533, -0.04195346683263779, -0.09296493977308273, -0.02668476104736328, -0.09363939613103867, 0.08274232596158981, 0.037035517394542694, 0.008513189852237701, -0.020869407802820206, 0.0182056687772274, 0.05682773515582085, -0.009627741761505604, -0.04410116374492645, 0.03965151309967041, -0.10264153778553009, 0.07441967725753784, -0.008966692723333836, -0.11129417270421982, -0.06022876873612404, -0.03883825242519379, 0.1659434288740158, 0.10968877375125885, -0.13165594637393951, -0.013990044593811035, -0.04452759027481079, 0.054872892796993256, -0.0669812336564064, -0.07615059614181519, -0.1675313413143158, -0.02790328674018383, -0.04629985988140106, 0.008344789035618305, -0.027739981189370155, 0.03859967365860939, 0.04460214823484421, 0.024844344705343246, 0.01953916810452938, -0.2719627916812897, 0.07076551765203476, -0.001477640587836504, -0.10358249396085739, -0.12157901376485825 ]
null
null
null
*** **Note**: For compatibility with current llama.cpp, please download the files published on 2/15/2024. The files originally published here will fail to load. *** <br/> # nomic-embed-text-v1 - GGUF Original model: [nomic-embed-text-v1](https://huggingface.co/nomic-ai/nomic-embed-text-v1) ## Description This repo contains llama.cpp-compatible files for [nomic-embed-text-v1](https://huggingface.co/nomic-ai/nomic-embed-text-v1) in GGUF format. llama.cpp will default to 2048 tokens of context with these files. To use the full 8192 tokens that Nomic Embed is benchmarked on, you will have to choose a context extension method. The original model uses Dynamic NTK-Aware RoPE scaling, but that is not currently available in llama.cpp. A combination of YaRN and linear scaling is an acceptable substitute. These files were converted and quantized with llama.cpp [PR 5500](https://github.com/ggerganov/llama.cpp/pull/5500), commit [34aa045de](https://github.com/ggerganov/llama.cpp/pull/5500/commits/34aa045de44271ff7ad42858c75739303b8dc6eb). ## Example `llama.cpp` Command Compute a single embedding: ```shell ./embedding -ngl 99 -m nomic-embed-text-v1.f16.gguf -c 8192 -b 8192 --rope-scaling yarn --rope-freq-scale .75 -p 'search_query: What is TSNE?' ``` You can also submit a batch of texts to embed, as long as the total number of tokens does not exceed the context length. Only the first three embeddings are shown by the `embedding` example. texts.txt: ``` search_query: What is TSNE? search_query: Who is Laurens Van der Maaten? ``` Compute multiple embeddings: ```shell ./embedding -ngl 99 -m nomic-embed-text-v1.f16.gguf -c 8192 -b 8192 --rope-scaling yarn --rope-freq-scale .75 -f texts.txt ``` ## Compatibility These files are compatible with llama.cpp as of commit [4524290e8](https://github.com/ggerganov/llama.cpp/commit/4524290e87b8e107cc2b56e1251751546f4b9051) from 2/15/2024. ## Provided Files The below table shows the mean squared error of the embeddings produced by these quantizations of Nomic Embed relative to the Sentence Transformers implementation. 
Name | Quant | Size | MSE -----|-------|------|----- [nomic-embed-text-v1.Q2\_K.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q2_K.gguf) | Q2\_K | 48 MiB | 2.36e-03 [nomic-embed-text-v1.Q3\_K\_S.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q3_K_S.gguf) | Q3\_K\_S | 57 MiB | 1.31e-03 [nomic-embed-text-v1.Q3\_K\_M.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q3_K_M.gguf) | Q3\_K\_M | 65 MiB | 8.73e-04 [nomic-embed-text-v1.Q3\_K\_L.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q3_K_L.gguf) | Q3\_K\_L | 69 MiB | 8.68e-04 [nomic-embed-text-v1.Q4\_0.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q4_0.gguf) | Q4\_0 | 75 MiB | 6.87e-04 [nomic-embed-text-v1.Q4\_K\_S.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q4_K_S.gguf) | Q4\_K\_S | 75 MiB | 6.81e-04 [nomic-embed-text-v1.Q4\_K\_M.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q4_K_M.gguf) | Q4\_K\_M | 81 MiB | 3.12e-04 [nomic-embed-text-v1.Q5\_0.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q5_0.gguf) | Q5\_0 | 91 MiB | 2.79e-04 [nomic-embed-text-v1.Q5\_K\_S.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q5_K_S.gguf) | Q5\_K\_S | 91 MiB | 2.61e-04 [nomic-embed-text-v1.Q5\_K\_M.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q5_K_M.gguf) | Q5\_K\_M | 95 MiB | 7.34e-05 [nomic-embed-text-v1.Q6\_K.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q6_K.gguf) | Q6\_K | 108 MiB | 6.29e-05 [nomic-embed-text-v1.Q8\_0.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.Q8_0.gguf) | Q8\_0 | 140 MiB | 6.34e-06 [nomic-embed-text-v1.f16.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.f16.gguf) | F16 | 262 MiB | 5.62e-10 [nomic-embed-text-v1.f32.gguf](https://huggingface.co/nomic-ai/nomic-embed-text-v1-GGUF/blob/main/nomic-embed-text-v1.f32.gguf) | F32 | 262 MiB | 9.34e-11
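As a rough illustration of how figures like the ones in the table above could be reproduced, the sketch below embeds the same query with a quantized GGUF file (through the llama-cpp-python bindings) and with the reference Sentence Transformers model, then compares the two vectors. This is a minimal sketch under stated assumptions, not the script behind the table: it assumes llama-cpp-python is built against a llama.cpp recent enough to load these files, that its constructor arguments mirror the CLI flags shown earlier, and that both vectors are L2-normalized before comparison.

```python
# Minimal sketch (not the script behind the table above): embed one query
# with a quantized GGUF file and with the reference Sentence Transformers
# model, then report the mean squared error between the two vectors.
# Assumptions: llama-cpp-python can load these files, its constructor
# arguments mirror the CLI flags used above, and both embeddings are
# L2-normalized before the comparison.
import numpy as np
from llama_cpp import Llama
from sentence_transformers import SentenceTransformer

text = "search_query: What is TSNE?"

# GGUF side: embedding mode with the extended 8192-token context and the
# same RoPE frequency scale as the --rope-freq-scale .75 flag above.
llm = Llama(
    model_path="nomic-embed-text-v1.Q4_K_M.gguf",
    embedding=True,
    n_ctx=8192,
    rope_freq_scale=0.75,
)
gguf_vec = np.asarray(llm.embed(text), dtype=np.float32)

# Reference side: the original model through Sentence Transformers.
reference = SentenceTransformer("nomic-ai/nomic-embed-text-v1", trust_remote_code=True)
ref_vec = np.asarray(reference.encode(text), dtype=np.float32)

# Normalize both embeddings, then compare.
gguf_vec /= np.linalg.norm(gguf_vec)
ref_vec /= np.linalg.norm(ref_vec)
print(f"MSE vs. reference: {np.mean((gguf_vec - ref_vec) ** 2):.2e}")
```

Because tokenization, pooling, and normalization details can differ between the two stacks, treat the printed value as being on the same order of magnitude as the corresponding table row rather than an exact reproduction.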
{"language": ["en"], "license": "apache-2.0", "tags": ["feature-extraction", "sentence-similarity"], "model_name": "nomic-embed-text-v1", "base_model": "nomic-ai/nomic-embed-text-v1", "inference": false, "model_creator": "Nomic", "model_type": "bert", "pipeline_tag": "sentence-similarity", "quantized_by": "Nomic"}
sentence-similarity
nomic-ai/nomic-embed-text-v1-GGUF
[ "gguf", "feature-extraction", "sentence-similarity", "en", "base_model:nomic-ai/nomic-embed-text-v1", "license:apache-2.0", "region:us" ]
2024-02-13T19:59:17+00:00
[]
[ "en" ]
TAGS #gguf #feature-extraction #sentence-similarity #en #base_model-nomic-ai/nomic-embed-text-v1 #license-apache-2.0 #region-us
* Note: For compatibility with current URL, please download the files published on 2/15/2024. The files originally published here will fail to load. \* nomic-embed-text-v1 - GGUF ========================== Original model: nomic-embed-text-v1 Description ----------- This repo contains URL-compatible files for nomic-embed-text-v1 in GGUF format. URL will default to 2048 tokens of context with these files. To use the full 8192 tokens that Nomic Embed is benchmarked on, you will have to choose a context extension method. The original model uses Dynamic NTK-Aware RoPE scaling, but that is not currently available in URL. A combination of YaRN and linear scaling is an acceptable substitute. These files were converted and quantized with URL PR 5500, commit 34aa045de. Example 'URL' Command --------------------- Compute a single embedding: You can also submit a batch of texts to embed, as long as the total number of tokens does not exceed the context length. Only the first three embeddings are shown by the 'embedding' example. URL: Compute multiple embeddings: Compatibility ------------- These files are compatible with URL as of commit 4524290e8 from 2/15/2024. Provided Files -------------- The below table shows the mean squared error of the embeddings produced by these quantizations of Nomic Embed relative to the Sentence Transformers implementation.
[]
[ "TAGS\n#gguf #feature-extraction #sentence-similarity #en #base_model-nomic-ai/nomic-embed-text-v1 #license-apache-2.0 #region-us \n" ]
[ 52 ]
[ "passage: TAGS\n#gguf #feature-extraction #sentence-similarity #en #base_model-nomic-ai/nomic-embed-text-v1 #license-apache-2.0 #region-us \n" ]
[ -0.02096908912062645, 0.06948933005332947, -0.005991148296743631, 0.03957642987370491, -0.022857028990983963, 0.059271540492773056, 0.1486535370349884, 0.146012082695961, 0.051656324416399, -0.03371026739478111, 0.13810200989246368, 0.04146862030029297, 0.06321176141500473, -0.02019762620329857, -0.0025338067207485437, -0.15426400303840637, 0.1275337040424347, 0.010112812742590904, 0.020630786195397377, 0.015260053798556328, 0.12857744097709656, 0.0659121423959732, 0.026051726192235947, 0.0273270420730114, -0.05806528776884079, 0.0479404553771019, 0.022609131410717964, -0.0021426458843052387, 0.061756305396556854, 0.04868964105844498, -0.03874320909380913, 0.01648641563951969, -0.05548962205648422, -0.24536259472370148, 0.02183324657380581, -0.023946810513734818, -0.09205017983913422, 0.02592610940337181, 0.016329016536474228, -0.079132579267025, 0.07753722369670868, 0.15594586730003357, -0.08448605984449387, 0.0672040581703186, -0.14302614331245422, -0.1929873377084732, -0.0696844831109047, 0.12057311087846756, 0.030652612447738647, 0.031341880559921265, -0.02966882288455963, -0.01834023930132389, -0.13143302500247955, 0.0038515604101121426, 0.10714326798915863, -0.28740960359573364, 0.009525594301521778, 0.18217383325099945, 0.0201262254267931, 0.011197850108146667, -0.03342340141534805, 0.0837639570236206, 0.05463551729917526, 0.007168922573328018, -0.14553998410701752, -0.017356425523757935, -0.07121236622333527, 0.14196962118148804, -0.04631878808140755, -0.02069062367081642, 0.3301082253456116, 0.08813871443271637, 0.033645566552877426, 0.052173417061567307, -0.03912520408630371, 0.10305222123861313, -0.07394612580537796, 0.058690305799245834, 0.024383146315813065, 0.22831495106220245, 0.18199658393859863, -0.06663468480110168, -0.10926578938961029, -0.09673777967691422, -0.1478843241930008, 0.0263211689889431, 0.01761702634394169, 0.08598889410495758, -0.09827107191085815, 0.014825304038822651, -0.240458682179451, -0.1479041874408722, -0.04494250565767288, -0.07019392400979996, 0.11458281427621841, 0.12053652107715607, -0.1238604262471199, 0.02974272333085537, 0.23564577102661133, 0.1584223359823227, 0.04616047814488411, 0.01390629168599844, -0.09330130368471146, 0.16755442321300507, -0.042462680488824844, 0.0950876846909523, -0.06825379282236099, 0.01994800567626953, 0.06718403100967407, -0.027006937190890312, 0.037745196372270584, -0.03227950632572174, -0.18283121287822723, 0.04812827333807945, -0.03937869518995285, 0.056439612060785294, 0.11807799339294434, 0.006789061240851879, -0.0955624207854271, 0.016003889963030815, 0.07626743614673615, -0.03995217755436897, -0.01598259247839451, -0.004187602549791336, 0.04591574892401695, -0.0012804897269234061, 0.009009276516735554, 0.08792433887720108, -0.04582640528678894, -0.105216845870018, -0.0887860506772995, 0.013874130323529243, 0.01452465821057558, 0.025714758783578873, 0.04038378223776817, -0.02879433147609234, 0.01255648024380207, -0.08449587970972061, -0.18107189238071442, 0.03327196091413498, 0.10504315048456192, -0.0459265299141407, -0.07088620960712433, -0.026877673342823982, -0.009058117866516113, 0.03861672058701515, -0.012185759842395782, -0.07818438857793808, -0.08743767440319061, 0.02702789194881916, -0.05860000476241112, 0.01802428998053074, -0.18191616237163544, 0.043365828692913055, -0.17451359331607819, 0.02435709349811077, 0.0010435006115585566, 0.027300089597702026, -0.1098637729883194, 0.09284370392560959, -0.0632360503077507, 0.08258315175771713, -0.11745166033506393, 0.0002416313800495118, 
-0.06939803808927536, 0.18091343343257904, -0.1617676168680191, -0.08389829844236374, 0.1679898351430893, -0.08745712786912918, -0.1418837308883667, 0.09422672539949417, 0.05742965266108513, -0.05895662680268288, 0.05356249585747719, 0.3987289071083069, -0.06456496566534042, 0.04189472272992134, 0.11728615313768387, 0.17491514980793, -0.017878687009215355, 0.047968123108148575, 0.1072489395737648, -0.1634984165430069, -0.1474844217300415, 0.07127924263477325, -0.08871882408857346, 0.07239352911710739, 0.018728207796812057, -0.12196505814790726, -0.08752255141735077, -0.028504082933068275, -0.009480898268520832, -0.04335692897439003, -0.009530001319944859, -0.04423714429140091, -0.040398549288511276, -0.17375898361206055, 0.07639756053686142, 0.023156747221946716, 0.021033866330981255, -0.0167493037879467, 0.16661526262760162, -0.05662720650434494, 0.07985948026180267, -0.045689575374126434, 0.0218434426933527, 0.01823159120976925, 0.0110839968547225, 0.11117029190063477, 0.07558165490627289, 0.005969819147139788, -0.030756020918488503, -0.03106689639389515, 0.010044252499938011, 0.009146144613623619, 0.05118774622678757, 0.02234656922519207, -0.20577673614025116, 0.07220225036144257, 0.017227767035365105, 0.11510380357503891, 0.014825536869466305, 0.011231599375605583, -0.021626921370625496, 0.04478553310036659, -0.08123262226581573, 0.032972726970911026, 0.026867616921663284, -0.06352541595697403, -0.0037334593944251537, 0.010552318766713142, 0.06868833303451538, 0.009896798059344292, -0.12590891122817993, 0.24729342758655548, -0.014917188324034214, 0.09581813961267471, 0.16094131767749786, -0.08928730338811874, 0.11320513486862183, 0.022815221920609474, -0.008409606292843819, 0.009203287772834301, 0.0696272999048233, -0.032163072377443314, 0.1069965735077858, -0.05866614729166031, 0.08198743313550949, -0.08917111158370972, 0.00507002416998148, -0.05418562889099121, -0.08447959274053574, -0.023821786046028137, 0.036527279764413834, 0.12580223381519318, -0.21320092678070068, 0.17819316685199738, 0.28774234652519226, -0.010823376476764679, 0.13008719682693481, -0.11740276962518692, -0.050360072404146194, -0.02723507769405842, -0.015184717252850533, -0.057458601891994476, 0.13177187740802765, -0.1862030327320099, 0.047337617725133896, 0.10096629709005356, 0.0701460987329483, 0.12524192035198212, -0.13654746115207672, -0.0740286260843277, -0.01010444201529026, -0.11868943274021149, -0.06718245893716812, 0.006130272056907415, -0.06035662442445755, 0.06105826422572136, -0.006292624864727259, -0.08499733358621597, 0.06952721625566483, -0.005028788466006517, -0.1226242259144783, 0.11695601046085358, -0.15772785246372223, -0.1367696225643158, -0.10610605776309967, 0.014920512214303017, -0.1466837376356125, 0.0582982562482357, 0.13863034546375275, -0.07602972537279129, -0.006326849106699228, -0.0415831133723259, -0.0375325046479702, -0.0656188577413559, -0.04552927985787392, -0.07117573916912079, 0.007422108668833971, 0.023594748228788376, -0.14783023297786713, -0.06724776327610016, -0.012198664247989655, -0.004469339270144701, 0.017789732664823532, -0.11232107877731323, 0.07922903448343277, 0.13645708560943604, 0.08563005179166794, 0.06587965786457062, 0.01275499165058136, 0.2093610018491745, -0.024195674806833267, 0.0027756027411669493, 0.1600840985774994, 0.030579084530472755, 0.058389537036418915, 0.1578405648469925, 0.038744110614061356, -0.058277130126953125, -0.060790810734033585, -0.009308257140219212, -0.014270939864218235, -0.16807949542999268, -0.04699217900633812, -0.10406424105167389, 
0.07286698371171951, -0.02184532955288887, 0.1096738949418068, 0.17521843314170837, 0.04994117468595505, -0.06751537322998047, 0.008792482316493988, 0.045282695442438126, 0.02230069972574711, 0.16416947543621063, 0.004370950162410736, 0.08797293156385422, -0.03735588118433952, -0.02442716620862484, 0.10947100818157196, 0.11986468732357025, 0.09617538750171661, 0.16235633194446564, 0.061818379908800125, 0.13354536890983582, 0.08180147409439087, 0.07950063794851303, -0.012782973237335682, -0.01695089042186737, -0.02528415620326996, -0.08717716485261917, -0.06708960235118866, 0.049831025302410126, 0.10935808718204498, 0.03641393408179283, -0.11660497635602951, 0.03517421334981918, -0.1888120025396347, 0.1378718763589859, 0.14594602584838867, 0.1275693029165268, -0.04427588731050491, 0.026946285739541054, 0.07630623877048492, 0.02981206402182579, -0.01099107600748539, 0.09563808143138885, -0.14114080369472504, -0.042767226696014404, 0.07333467155694962, 0.07553120702505112, 0.1282336413860321, 0.10324915498495102, 0.05097200721502304, -0.1053069606423378, -0.08612599223852158, 0.04666605591773987, 0.12147096544504166, -0.18361590802669525, 0.21500028669834137, 0.04020813852548599, -0.05068770796060562, -0.001135886530391872, 0.03769965097308159, 0.12464151531457901, 0.18874500691890717, 0.09766378253698349, 0.03734969720244408, -0.11305861920118332, 0.046283043920993805, -0.07778849452733994, 0.12312280386686325, 0.016179973259568214, -0.1637098789215088, -0.049226973205804825, -0.072544164955616, 0.01876649260520935, 0.013926674611866474, 0.2425839751958847, -0.17353375256061554, -0.1368909776210785, 0.05633341521024704, 0.08619602769613266, 0.03477737680077553, -0.10162259638309479, 0.07141800224781036, -0.041643429547548294, 0.193955659866333, -0.090115986764431, -0.0380437858402729, -0.05576682835817337, -0.03865646943449974, 0.04946504160761833, 0.007796179037541151, -0.043859414756298065, -0.10852242261171341, 0.013067943975329399, -0.06836940348148346, -0.19634227454662323, 0.09768407791852951, -0.09534430503845215, 0.02669418789446354, -0.061186522245407104, 0.11539452522993088, -0.1043953150510788, 0.061326999217271805, 0.058823876082897186, 0.05943707749247551, -0.11651277542114258, -0.11140557378530502, 0.06573520600795746, 0.009088370017707348, -0.071134552359581, 0.04507697746157646, -0.10581158101558685, 0.015700729563832283, -0.05017196387052536, -0.08276254683732986, 0.14542663097381592, 0.26552337408065796, -0.05505308136343956, 0.16806398332118988, 0.266918808221817, -0.09705954045057297, -0.2519494295120239, -0.19430044293403625, -0.1705743968486786, -0.04492763802409172, -0.026040153577923775, -0.1572439968585968, 0.09671640396118164, 0.09820657223463058, -0.1061006411910057, 0.09826669842004776, -0.20097216963768005, -0.056097887456417084, 0.2073875516653061, -0.10331524908542633, 0.25562819838523865, -0.15964734554290771, -0.11942410469055176, -0.07094848901033401, -0.17844852805137634, 0.06201896443963051, -0.08552195876836777, 0.10150625556707382, 0.0025171914603561163, -0.12585701048374176, -0.0266881100833416, 0.04250285401940346, 0.2628607451915741, -0.002122467616572976, 0.08266916126012802, -0.02454831637442112, 0.07932167500257492, 0.1039205864071846, 0.026956820860505104, 0.08668148517608643, -0.2266818732023239, 0.04859157279133797, -0.03151850029826164, 0.00008842987881507725, -0.051926057785749435, 0.06568462401628494, 0.008946509100496769, -0.07572036981582642, -0.11291781812906265, -0.03586563095450401, 0.023602142930030823, 0.02430971898138523, 
0.08205877989530563, -0.08664820343255997, 0.10194019973278046, 0.08022059500217438, -0.06563667207956314, -0.2558751404285431, -0.05877077579498291, -0.09369169175624847, -0.08655669540166855, 0.10626505315303802, -0.18523798882961273, 0.05178103223443031, 0.02982325293123722, -0.04644692316651344, 0.11166409403085709, 0.08881448209285736, -0.03094821237027645, -0.0686851367354393, 0.12354981154203415, -0.08914762735366821, -0.09727956354618073, -0.10565590113401413, -0.039031945168972015, 0.08308715373277664, -0.05536765977740288, 0.1158587783575058, 0.022427592426538467, -0.02124698832631111, 0.034213148057460785, 0.03030400350689888, -0.1583544760942459, -0.010322125628590584, 0.07559017837047577, 0.02304258942604065, -0.13789358735084534, 0.14861765503883362, 0.04151648283004761, 0.012826947495341301, 0.016000626608729362, 0.02949843741953373, -0.06147369369864464, -0.10728784650564194, -0.11647016555070877, 0.18626829981803894, -0.07412706315517426, -0.10127250105142593, -0.09242037683725357, -0.12988154590129852, 0.017598535865545273, -0.03192831203341484, 0.08515556901693344, 0.10675201565027237, 0.04375627264380455, -0.04738494008779526, 0.09514803439378738, 0.009977753274142742, -0.0636424645781517, 0.02729315683245659, -0.03165663033723831, -0.19392287731170654, -0.01748167723417282, 0.06792256981134415, -0.02968948520720005, -0.010770811699330807, -0.053260039538145065, -0.011434505693614483, -0.1477198302745819, -0.0012767064617946744, -0.04723842069506645, -0.01923418417572975, 0.08309634774923325, -0.05912110581994057, -0.02710280939936638, 0.030599579215049744, -0.11277883499860764, -0.09220045804977417, -0.06821227818727493, 0.10002598911523819, -0.08364786207675934, -0.017142267897725105, 0.13777795433998108, -0.01978854276239872, 0.08195557445287704, 0.12600158154964447, -0.03764057531952858, 0.08532458543777466, -0.3360244929790497, -0.09701570123434067, 0.010603484697639942, 0.032239239662885666, -0.07598394900560379, -0.03814699128270149, 0.012766288593411446, 0.0441289022564888, -0.0679791048169136, 0.018752681091427803, 0.012001612223684788, -0.14940646290779114, -0.22266356647014618, -0.09049958735704422, -0.1157698780298233, -0.01517570298165083, -0.13348133862018585, 0.14870692789554596, 0.09689883887767792, 0.07613970339298248, -0.03129439800977707, 0.001977781532332301, -0.09168035537004471, 0.04131738096475601, -0.027139799669384956, -0.08183907717466354, -0.09855196624994278, -0.10119391977787018, -0.06861282140016556, -0.03156033903360367, 0.28611326217651367, 0.014188169501721859, -0.09932678192853928, 0.03660690784454346, 0.11195121705532074, 0.21097347140312195, 0.019087690860033035, 0.2832537889480591, 0.1209610179066658, -0.005581413395702839, -0.14698044955730438, 0.039050254970788956, -0.008142444305121899, -0.054572101682424545, -0.043525636196136475, 0.055052995681762695, 0.06225045397877693, 0.09892463684082031, 0.07835371047258377, -0.02136777527630329, 0.05198492482304573, 0.10606605559587479, 0.0843500941991806, 0.0717507153749466, 0.05493521690368652, 0.06836847215890884, 0.278533399105072, -0.07002688944339752, 0.08023928105831146, -0.038268886506557465, -0.008410433307290077, -0.11412264406681061, -0.15991520881652832, -0.078264020383358, -0.2404506951570511, 0.017542488873004913, -0.09006426483392715, 0.02811441756784916, 0.1411086767911911, 0.009716988541185856, -0.039250705391168594, -0.004654961172491312, -0.02222641371190548, -0.10263126343488693, 0.03776056319475174, -0.09449192136526108, -0.05735509470105171, 0.0052188122645020485, 
-0.08509045094251633, 0.06737085431814194, -0.0360548235476017, -0.008051881566643715, 0.03262952342629433, 0.057929474860429764, 0.03835364803671837, -0.16013717651367188, -0.07918328046798706, -0.02544800378382206, -0.07556939870119095, -0.03734919801354408, 0.15056875348091125, 0.057070255279541016, 0.00986015796661377, 0.14763061702251434, 0.1715817004442215, -0.021796705201268196, -0.185011625289917, -0.1285804808139801, 0.09399141371250153, -0.030810905620455742, 0.06395673006772995, -0.039930857717990875, -0.06480851024389267, -0.04696911945939064, 0.24418674409389496, 0.22029510140419006, -0.12103431671857834, -0.021930169314146042, 0.046231333166360855, 0.0037992247380316257, 0.03156576678156853, -0.02389366924762726, 0.05000622943043709, 0.09070645272731781, -0.10265003889799118, 0.0019825659692287445, -0.04588949680328369, -0.02494640462100506, -0.19910545647144318, 0.10094248503446579, 0.08750297874212265, -0.0965033769607544, -0.030997904017567635, 0.1050925999879837, -0.07769124954938889, 0.19794276356697083, -0.05035627633333206, -0.07250111550092697, -0.032204531133174896, -0.04456931725144386, 0.07140084356069565, 0.04582029581069946, 0.058193713426589966, -0.08593133836984634, -0.040824927389621735, 0.04836326465010643, -0.032006215304136276, -0.2054407298564911, -0.10264977067708969, 0.03827852010726929, -0.010933318175375462, 0.14890599250793457, 0.020165940746665, 0.02865247055888176, 0.07225234061479568, 0.042234938591718674, -0.011678962036967278, 0.17059873044490814, 0.00939285196363926, 0.035359643399715424, -0.031847547739744186, -0.0736413449048996, 0.03330383077263832, -0.07844769209623337, 0.09603815525770187, -0.029557108879089355, 0.06246678903698921, 0.12669327855110168, -0.12380411475896835, -0.00856688804924488, -0.008353942073881626, -0.1398332417011261, 0.021701762452721596, 0.0004480055358726531, -0.03405037149786949, -0.08938993513584137, -0.07816119492053986, 0.04651622846722603, 0.0688806101679802, -0.15491996705532074, -0.022711249068379402, 0.06222228333353996, 0.023536117747426033, 0.15282045304775238, -0.002857398707419634, -0.08878827095031738, -0.07780718803405762, -0.03164283558726311, 0.08411817252635956, -0.04275514930486679, 0.0811433345079422, 0.168192058801651, -0.010975069366395473, -0.012868029996752739, -0.3006122410297394, 0.09137114137411118, -0.03469720855355263, -0.006066289264708757, -0.11198955029249191 ]
null
null
transformers
![image/png](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/yAUStIqYEyDeS6sUUkkCL.png) ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/Lzr0NPMlJ2PwZcj-uxVqb.jpeg) GGUF Quants (thanks to konz00): https://huggingface.co/konz00/Echidna-7b-128k-GGUF ### Models Merged The following models were included in the merge: * [Test157t/Hex-Macaroniac-7b](https://huggingface.co/Test157t/Hex-Macaroniac-7b) * [Test157t/Cetus-Sea-7b-128k](https://huggingface.co/Test157t/Cetus-Sea-7b-128k) ### Configuration The following YAML configuration was used to produce this model: ```yaml slices: - sources: - model: Test157t/Hex-Macaroniac-7b layer_range: [0, 32] - model: Test157t/Cetus-Sea-7b-128k layer_range: [0, 32] merge_method: slerp base_model: Test157t/Hex-Macaroniac-7b parameters: t: - filter: self_attn value: [0, 0.5, 0.3, 0.7, 1] - filter: mlp value: [1, 0.5, 0.7, 0.3, 0] - value: 0.5 dtype: bfloat16 ``` @misc{open-llm-leaderboard, author = {Edward Beeching and Clémentine Fourrier and Nathan Habib and Sheon Han and Nathan Lambert and Nazneen Rajani and Omar Sanseviero and Lewis Tunstall and Thomas Wolf}, title = {Open LLM Leaderboard}, year = {2023}, publisher = {Hugging Face}, howpublished = "\url{https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard}" } @software{eval-harness, author = {Gao, Leo and Tow, Jonathan and Biderman, Stella and Black, Sid and DiPofi, Anthony and Foster, Charles and Golding, Laurence and Hsu, Jeffrey and McDonell, Kyle and Muennighoff, Niklas and Phang, Jason and Reynolds, Laria and Tang, Eric and Thite, Anish and Wang, Ben and Wang, Kevin and Zou, Andy}, title = {A framework for few-shot language model evaluation}, month = sep, year = 2021, publisher = {Zenodo}, version = {v0.0.1}, doi = {10.5281/zenodo.5371628}, url = {https://doi.org/10.5281/zenodo.5371628} } @misc{clark2018think, title={Think you have Solved Question Answering? 
Try ARC, the AI2 Reasoning Challenge}, author={Peter Clark and Isaac Cowhey and Oren Etzioni and Tushar Khot and Ashish Sabharwal and Carissa Schoenick and Oyvind Tafjord}, year={2018}, eprint={1803.05457}, archivePrefix={arXiv}, primaryClass={cs.AI} } @misc{zellers2019hellaswag, title={HellaSwag: Can a Machine Really Finish Your Sentence?}, author={Rowan Zellers and Ari Holtzman and Yonatan Bisk and Ali Farhadi and Yejin Choi}, year={2019}, eprint={1905.07830}, archivePrefix={arXiv}, primaryClass={cs.CL} } @misc{hendrycks2021measuring, title={Measuring Massive Multitask Language Understanding}, author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt}, year={2021}, eprint={2009.03300}, archivePrefix={arXiv}, primaryClass={cs.CY} } @misc{lin2022truthfulqa, title={TruthfulQA: Measuring How Models Mimic Human Falsehoods}, author={Stephanie Lin and Jacob Hilton and Owain Evans}, year={2022}, eprint={2109.07958}, archivePrefix={arXiv}, primaryClass={cs.CL} } @misc{DBLP:journals/corr/abs-1907-10641, title={{WINOGRANDE:} An Adversarial Winograd Schema Challenge at Scale}, author={Keisuke Sakaguchi and Ronan Le Bras and Chandra Bhagavatula and Yejin Choi}, year={2019}, eprint={1907.10641}, archivePrefix={arXiv}, primaryClass={cs.CL} } @misc{DBLP:journals/corr/abs-2110-14168, title={Training Verifiers to Solve Math Word Problems}, author={Karl Cobbe and Vineet Kosaraju and Mohammad Bavarian and Mark Chen and Heewoo Jun and Lukasz Kaiser and Matthias Plappert and Jerry Tworek and Jacob Hilton and Reiichiro Nakano and Christopher Hesse and John Schulman}, year={2021}, eprint={2110.14168}, archivePrefix={arXiv}, primaryClass={cs.CL} }
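To make the merge recipe above concrete, here is a minimal sketch that writes the same YAML configuration to a file and drives it through mergekit's `mergekit-yaml` command-line entry point. It is an illustration under stated assumptions rather than the author's exact procedure: the config file name and output directory are hypothetical, optional flags (GPU use, sharding) vary across mergekit versions, and merging two 7B models requires substantial disk space and memory.

```python
# Minimal sketch: reproduce a SLERP merge like the one above by writing the
# YAML config to disk and invoking mergekit's command-line entry point.
# The config file name and output directory are illustrative only.
import subprocess
from pathlib import Path

config = """\
slices:
  - sources:
      - model: Test157t/Hex-Macaroniac-7b
        layer_range: [0, 32]
      - model: Test157t/Cetus-Sea-7b-128k
        layer_range: [0, 32]
merge_method: slerp
base_model: Test157t/Hex-Macaroniac-7b
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
"""

config_path = Path("echidna-merge.yml")
config_path.write_text(config)

# mergekit-yaml <config> <output_dir>; the source models are pulled from
# the Hugging Face Hub if they are not already cached locally.
subprocess.run(
    ["mergekit-yaml", str(config_path), "./Echidna-7b-128k-merge"],
    check=True,
)
```

Roughly speaking, the `t` schedule controls the SLERP interpolation per layer: values near 0 keep the Hex-Macaroniac-7b weights, values near 1 keep the Cetus-Sea-7b-128k weights, with separate curves for the self-attention and MLP blocks and a flat 0.5 for everything else.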
{"license": "other", "library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": ["Test157t/Hex-Macaroniac-7b", "Test157t/Cetus-Sea-7b-128k"]}
text-generation
Test157t/Echidna-7b-128k
[ "transformers", "safetensors", "mistral", "text-generation", "mergekit", "merge", "arxiv:1803.05457", "arxiv:1905.07830", "arxiv:2009.03300", "arxiv:2109.07958", "arxiv:1907.10641", "arxiv:2110.14168", "base_model:Test157t/Hex-Macaroniac-7b", "base_model:Test157t/Cetus-Sea-7b-128k", "license:other", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T20:11:55+00:00
[ "1803.05457", "1905.07830", "2009.03300", "2109.07958", "1907.10641", "2110.14168" ]
[]
TAGS #transformers #safetensors #mistral #text-generation #mergekit #merge #arxiv-1803.05457 #arxiv-1905.07830 #arxiv-2009.03300 #arxiv-2109.07958 #arxiv-1907.10641 #arxiv-2110.14168 #base_model-Test157t/Hex-Macaroniac-7b #base_model-Test157t/Cetus-Sea-7b-128k #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
!image/png !image/jpeg GGUF Quants (thanks to konz00): URL ### Models Merged The following models were included in the merge: * Test157t/Hex-Macaroniac-7b * Test157t/Cetus-Sea-7b-128k ### Configuration The following YAML configuration was used to produce this model: @misc{open-llm-leaderboard, author = {Edward Beeching and Clémentine Fourrier and Nathan Habib and Sheon Han and Nathan Lambert and Nazneen Rajani and Omar Sanseviero and Lewis Tunstall and Thomas Wolf}, title = {Open LLM Leaderboard}, year = {2023}, publisher = {Hugging Face}, howpublished = "\url{URL } @software{eval-harness, author = {Gao, Leo and Tow, Jonathan and Biderman, Stella and Black, Sid and DiPofi, Anthony and Foster, Charles and Golding, Laurence and Hsu, Jeffrey and McDonell, Kyle and Muennighoff, Niklas and Phang, Jason and Reynolds, Laria and Tang, Eric and Thite, Anish and Wang, Ben and Wang, Kevin and Zou, Andy}, title = {A framework for few-shot language model evaluation}, month = sep, year = 2021, publisher = {Zenodo}, version = {v0.0.1}, doi = {10.5281/zenodo.5371628}, url = {URL } @misc{clark2018think, title={Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge}, author={Peter Clark and Isaac Cowhey and Oren Etzioni and Tushar Khot and Ashish Sabharwal and Carissa Schoenick and Oyvind Tafjord}, year={2018}, eprint={1803.05457}, archivePrefix={arXiv}, primaryClass={cs.AI} } @misc{zellers2019hellaswag, title={HellaSwag: Can a Machine Really Finish Your Sentence?}, author={Rowan Zellers and Ari Holtzman and Yonatan Bisk and Ali Farhadi and Yejin Choi}, year={2019}, eprint={1905.07830}, archivePrefix={arXiv}, primaryClass={cs.CL} } @misc{hendrycks2021measuring, title={Measuring Massive Multitask Language Understanding}, author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt}, year={2021}, eprint={2009.03300}, archivePrefix={arXiv}, primaryClass={cs.CY} } @misc{lin2022truthfulqa, title={TruthfulQA: Measuring How Models Mimic Human Falsehoods}, author={Stephanie Lin and Jacob Hilton and Owain Evans}, year={2022}, eprint={2109.07958}, archivePrefix={arXiv}, primaryClass={cs.CL} } @misc{DBLP:journals/corr/abs-1907-10641, title={{WINOGRANDE:} An Adversarial Winograd Schema Challenge at Scale}, author={Keisuke Sakaguchi and Ronan Le Bras and Chandra Bhagavatula and Yejin Choi}, year={2019}, eprint={1907.10641}, archivePrefix={arXiv}, primaryClass={cs.CL} } @misc{DBLP:journals/corr/abs-2110-14168, title={Training Verifiers to Solve Math Word Problems}, author={Karl Cobbe and Vineet Kosaraju and Mohammad Bavarian and Mark Chen and Heewoo Jun and Lukasz Kaiser and Matthias Plappert and Jerry Tworek and Jacob Hilton and Reiichiro Nakano and Christopher Hesse and John Schulman}, year={2021}, eprint={2110.14168}, archivePrefix={arXiv}, primaryClass={cs.CL} }
[ "### Models Merged\n\nThe following models were included in the merge:\n* Test157t/Hex-Macaroniac-7b\n* Test157t/Cetus-Sea-7b-128k", "### Configuration\n\nThe following YAML configuration was used to produce this model:\n\n\n\n@misc{open-llm-leaderboard,\n author = {Edward Beeching and Clémentine Fourrier and Nathan Habib and Sheon Han and Nathan Lambert and Nazneen Rajani and Omar Sanseviero and Lewis Tunstall and Thomas Wolf},\n title = {Open LLM Leaderboard},\n year = {2023},\n publisher = {Hugging Face},\n howpublished = \"\\url{URL\n}\n@software{eval-harness,\n author = {Gao, Leo and\n Tow, Jonathan and\n Biderman, Stella and\n Black, Sid and\n DiPofi, Anthony and\n Foster, Charles and\n Golding, Laurence and\n Hsu, Jeffrey and\n McDonell, Kyle and\n Muennighoff, Niklas and\n Phang, Jason and\n Reynolds, Laria and\n Tang, Eric and\n Thite, Anish and\n Wang, Ben and\n Wang, Kevin and\n Zou, Andy},\n title = {A framework for few-shot language model evaluation},\n month = sep,\n year = 2021,\n publisher = {Zenodo},\n version = {v0.0.1},\n doi = {10.5281/zenodo.5371628},\n url = {URL\n}\n@misc{clark2018think,\n title={Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge},\n author={Peter Clark and Isaac Cowhey and Oren Etzioni and Tushar Khot and Ashish Sabharwal and Carissa Schoenick and Oyvind Tafjord},\n year={2018},\n eprint={1803.05457},\n archivePrefix={arXiv},\n primaryClass={cs.AI}\n}\n@misc{zellers2019hellaswag,\n title={HellaSwag: Can a Machine Really Finish Your Sentence?},\n author={Rowan Zellers and Ari Holtzman and Yonatan Bisk and Ali Farhadi and Yejin Choi},\n year={2019},\n eprint={1905.07830},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n@misc{hendrycks2021measuring,\n title={Measuring Massive Multitask Language Understanding},\n author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt},\n year={2021},\n eprint={2009.03300},\n archivePrefix={arXiv},\n primaryClass={cs.CY}\n}\n@misc{lin2022truthfulqa,\n title={TruthfulQA: Measuring How Models Mimic Human Falsehoods},\n author={Stephanie Lin and Jacob Hilton and Owain Evans},\n year={2022},\n eprint={2109.07958},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n@misc{DBLP:journals/corr/abs-1907-10641,\n title={{WINOGRANDE:} An Adversarial Winograd Schema Challenge at Scale},\n author={Keisuke Sakaguchi and Ronan Le Bras and Chandra Bhagavatula and Yejin Choi},\n year={2019},\n eprint={1907.10641},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n@misc{DBLP:journals/corr/abs-2110-14168,\n title={Training Verifiers to Solve Math Word Problems},\n author={Karl Cobbe and\n Vineet Kosaraju and\n Mohammad Bavarian and\n Mark Chen and\n Heewoo Jun and\n Lukasz Kaiser and\n Matthias Plappert and\n Jerry Tworek and\n Jacob Hilton and\n Reiichiro Nakano and\n Christopher Hesse and\n John Schulman},\n year={2021},\n eprint={2110.14168},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #arxiv-1803.05457 #arxiv-1905.07830 #arxiv-2009.03300 #arxiv-2109.07958 #arxiv-1907.10641 #arxiv-2110.14168 #base_model-Test157t/Hex-Macaroniac-7b #base_model-Test157t/Cetus-Sea-7b-128k #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Models Merged\n\nThe following models were included in the merge:\n* Test157t/Hex-Macaroniac-7b\n* Test157t/Cetus-Sea-7b-128k", "### Configuration\n\nThe following YAML configuration was used to produce this model:\n\n\n\n@misc{open-llm-leaderboard,\n author = {Edward Beeching and Clémentine Fourrier and Nathan Habib and Sheon Han and Nathan Lambert and Nazneen Rajani and Omar Sanseviero and Lewis Tunstall and Thomas Wolf},\n title = {Open LLM Leaderboard},\n year = {2023},\n publisher = {Hugging Face},\n howpublished = \"\\url{URL\n}\n@software{eval-harness,\n author = {Gao, Leo and\n Tow, Jonathan and\n Biderman, Stella and\n Black, Sid and\n DiPofi, Anthony and\n Foster, Charles and\n Golding, Laurence and\n Hsu, Jeffrey and\n McDonell, Kyle and\n Muennighoff, Niklas and\n Phang, Jason and\n Reynolds, Laria and\n Tang, Eric and\n Thite, Anish and\n Wang, Ben and\n Wang, Kevin and\n Zou, Andy},\n title = {A framework for few-shot language model evaluation},\n month = sep,\n year = 2021,\n publisher = {Zenodo},\n version = {v0.0.1},\n doi = {10.5281/zenodo.5371628},\n url = {URL\n}\n@misc{clark2018think,\n title={Think you have Solved Question Answering? Try ARC, the AI2 Reasoning Challenge},\n author={Peter Clark and Isaac Cowhey and Oren Etzioni and Tushar Khot and Ashish Sabharwal and Carissa Schoenick and Oyvind Tafjord},\n year={2018},\n eprint={1803.05457},\n archivePrefix={arXiv},\n primaryClass={cs.AI}\n}\n@misc{zellers2019hellaswag,\n title={HellaSwag: Can a Machine Really Finish Your Sentence?},\n author={Rowan Zellers and Ari Holtzman and Yonatan Bisk and Ali Farhadi and Yejin Choi},\n year={2019},\n eprint={1905.07830},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n@misc{hendrycks2021measuring,\n title={Measuring Massive Multitask Language Understanding},\n author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt},\n year={2021},\n eprint={2009.03300},\n archivePrefix={arXiv},\n primaryClass={cs.CY}\n}\n@misc{lin2022truthfulqa,\n title={TruthfulQA: Measuring How Models Mimic Human Falsehoods},\n author={Stephanie Lin and Jacob Hilton and Owain Evans},\n year={2022},\n eprint={2109.07958},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n@misc{DBLP:journals/corr/abs-1907-10641,\n title={{WINOGRANDE:} An Adversarial Winograd Schema Challenge at Scale},\n author={Keisuke Sakaguchi and Ronan Le Bras and Chandra Bhagavatula and Yejin Choi},\n year={2019},\n eprint={1907.10641},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n@misc{DBLP:journals/corr/abs-2110-14168,\n title={Training Verifiers to Solve Math Word Problems},\n author={Karl Cobbe and\n Vineet Kosaraju and\n Mohammad Bavarian and\n Mark Chen and\n Heewoo Jun and\n Lukasz Kaiser and\n Matthias Plappert and\n Jerry Tworek and\n Jacob Hilton and\n Reiichiro Nakano and\n Christopher Hesse and\n John Schulman},\n year={2021},\n eprint={2110.14168},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}" ]
[ 147, 43, 911 ]
[ "passage: TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #arxiv-1803.05457 #arxiv-1905.07830 #arxiv-2009.03300 #arxiv-2109.07958 #arxiv-1907.10641 #arxiv-2110.14168 #base_model-Test157t/Hex-Macaroniac-7b #base_model-Test157t/Cetus-Sea-7b-128k #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Models Merged\n\nThe following models were included in the merge:\n* Test157t/Hex-Macaroniac-7b\n* Test157t/Cetus-Sea-7b-128k" ]
[ -0.14485667645931244, -0.004415238741785288, -0.0008950494811870158, -0.019280973821878433, 0.04529620334506035, 0.06546206027269363, 0.2541586756706238, 0.07581619918346405, 0.13586777448654175, 0.048605334013700485, 0.06117168068885803, 0.04557614028453827, 0.0459042564034462, 0.1324179470539093, -0.10633807629346848, -0.1504538506269455, 0.09544326364994049, 0.005628635175526142, -0.11975196748971939, 0.1249287873506546, 0.10470741242170334, -0.06726241111755371, 0.11816433072090149, 0.013626043684780598, -0.05585860088467598, 0.042852506041526794, 0.02011641301214695, -0.035413552075624466, 0.12318933755159378, 0.09989610314369202, 0.12291093915700912, 0.08640378713607788, 0.013611946254968643, -0.14089712500572205, 0.04747025668621063, 0.02098197117447853, 0.00501948082819581, 0.09975273907184601, 0.07118243724107742, -0.012981167994439602, 0.10451144725084305, 0.014676244929432869, 0.050739023834466934, 0.056517232209444046, -0.09860225766897202, -0.12159430980682373, -0.12385639548301697, 0.05698036029934883, 0.1107015311717987, 0.04446982964873314, -0.013707146048545837, 0.15806205570697784, 0.04625216871500015, 0.06895677745342255, 0.16931301355361938, -0.26611843705177307, -0.01316991075873375, 0.1799701601266861, 0.116401307284832, -0.013820970430970192, 0.03788095340132713, 0.06611964851617813, 0.07642101496458054, -0.03598899021744728, -0.02868184633553028, -0.03846238926053047, 0.07068999111652374, -0.01874529756605625, -0.13095121085643768, 0.00715097738429904, 0.15321744978427887, 0.04477645456790924, -0.028062282130122185, -0.08120249956846237, -0.10186663269996643, 0.027462664991617203, -0.010539563372731209, -0.07062578946352005, 0.01932675391435623, -0.030562546104192734, 0.08982474356889725, -0.031237253919243813, -0.05751175805926323, -0.08486711978912354, -0.11448373645544052, 0.17611940205097198, 0.002179926261305809, 0.033831071108579636, -0.10277606546878815, 0.05085938423871994, -0.14723704755306244, -0.12576280534267426, -0.023449338972568512, -0.0770079642534256, -0.034930095076560974, -0.03194373846054077, -0.05562419444322586, -0.16494879126548767, 0.08348873257637024, 0.16255460679531097, -0.09542512148618698, 0.044669777154922485, 0.03232027217745781, 0.06017501652240753, -0.022528205066919327, 0.04427588731050491, -0.11972251534461975, -0.10781563073396683, 0.006240209564566612, 0.08328715711832047, 0.054436516016721725, -0.012328704819083214, -0.10283315181732178, -0.028981110081076622, -0.016294466331601143, 0.01061702799052, 0.04430977255105972, 0.07682937383651733, -0.04793919250369072, -0.043807800859212875, 0.18500113487243652, -0.049004387110471725, -0.02589619718492031, 0.027418682351708412, -0.02265407331287861, 0.013814100064337254, 0.09515445679426193, 0.06319726258516312, 0.061622217297554016, 0.0543992780148983, -0.08074571937322617, -0.020865637809038162, -0.0343816876411438, -0.14969399571418762, 0.013699020259082317, 0.009942552074790001, -0.0420340821146965, -0.10423079133033752, -0.21487155556678772, -0.02543330192565918, -0.0002585465554147959, -0.03821960836648941, 0.05384274572134018, -0.027833154425024986, 0.041260626167058945, -0.022493183612823486, -0.0027493101079016924, 0.0023515010252594948, -0.02457909658551216, -0.02386777475476265, 0.03115399368107319, 0.07874611020088196, -0.1298319399356842, 0.019721029326319695, -0.08416062593460083, 0.10187182575464249, -0.18590503931045532, 0.056695081293582916, -0.06313561648130417, 0.04240354895591736, -0.12614457309246063, -0.021250922232866287, -0.028576813638210297, 
0.0483211986720562, 0.08646970242261887, 0.19203922152519226, -0.17759490013122559, -0.07425329834222794, 0.09814656525850296, -0.14461500942707062, -0.18789967894554138, 0.08041495084762573, -0.012758312746882439, 0.09917280822992325, 0.053430698812007904, 0.16492639482021332, 0.09114699810743332, -0.04168766736984253, 0.014062201604247093, -0.02117130160331726, 0.03271634131669998, 0.017062470316886902, 0.0946272686123848, -0.06470716744661331, -0.16062703728675842, 0.022961419075727463, -0.06524970382452011, 0.04832465201616287, -0.07295077294111252, -0.04140288010239601, -0.039031896740198135, -0.053520433604717255, 0.11410731822252274, -0.02604147605597973, 0.02960846573114395, -0.08909895271062851, -0.05930127203464508, 0.04136744141578674, 0.10549760609865189, -0.07902488857507706, 0.0013068177504464984, -0.07884865999221802, 0.15914589166641235, -0.11423081159591675, 0.0173584446310997, -0.1385040134191513, -0.0974012091755867, -0.01429513655602932, -0.040015432983636856, -0.014876418747007847, 0.010392996482551098, 0.0862913653254509, 0.03673293814063072, -0.04027177765965462, -0.021562859416007996, 0.11698649823665619, 0.06060011684894562, -0.06319187581539154, -0.19339817762374878, -0.08836302161216736, -0.06311888247728348, 0.20904140174388885, -0.18583495914936066, 0.07124660164117813, -0.0382000207901001, 0.1382531076669693, -0.0375581830739975, 0.024100003764033318, 0.047256436198949814, -0.0033519112039357424, -0.05888771638274193, -0.023834191262722015, 0.057869669049978256, -0.03182263299822807, -0.2071225643157959, 0.12475400418043137, -0.16082577407360077, 0.0967302918434143, 0.08073121309280396, 0.021792421117424965, -0.048378512263298035, -0.08974634110927582, -0.014190981164574623, -0.03892180323600769, 0.02577287144958973, -0.05237213149666786, 0.06724175065755844, 0.03644179180264473, 0.11504700034856796, -0.09473791718482971, -0.011859886348247528, 0.029774248600006104, -0.012127035297453403, -0.04260912537574768, 0.11736199259757996, -0.024294666945934296, -0.21629075706005096, 0.08846145123243332, 0.24375508725643158, 0.04004431888461113, 0.060001030564308167, 0.02408943884074688, -0.03320440277457237, -0.050908274948596954, -0.009709762409329414, -0.0030298654455691576, 0.054415713995695114, -0.06784074008464813, 0.07151370495557785, 0.06831479072570801, -0.0014747055247426033, 0.05946431681513786, -0.10226024687290192, -0.008348838426172733, 0.053062956780195236, -0.018700623884797096, -0.03657510504126549, 0.10604028403759003, 0.016890523955225945, 0.106266550719738, -0.007264900021255016, -0.074925996363163, -0.002961825579404831, -0.022774450480937958, -0.1464376002550125, 0.25137969851493835, -0.07410254329442978, -0.18908455967903137, -0.16348358988761902, 0.005975397769361734, -0.07034474611282349, -0.016148865222930908, 0.02519410476088524, -0.05961233377456665, -0.056225426495075226, -0.1335700899362564, 0.09936722368001938, 0.07977989315986633, -0.015187141485512257, 0.04216904565691948, -0.02560250088572502, 0.025028277188539505, -0.11671234667301178, -0.022677598521113396, -0.009133324027061462, 0.028312021866440773, 0.03556949645280838, -0.07852302491664886, 0.06903322786092758, 0.15422865748405457, -0.04249589890241623, -0.0002785016258712858, -0.025383973494172096, 0.16574904322624207, -0.03525985777378082, 0.04873806983232498, 0.2067747861146927, -0.08624784648418427, 0.03868616744875908, 0.15937387943267822, 0.028637319803237915, -0.05892025679349899, -0.023250777274370193, -0.056793462485075, -0.041888393461704254, -0.25640439987182617, 
-0.09846809506416321, -0.012239706702530384, 0.06318312883377075, -0.025346361100673676, 0.016475258395075798, 0.08559399098157883, 0.0797584280371666, -0.047788139432668686, -0.08457864075899124, 0.035377562046051025, 0.07199085503816605, 0.1699790507555008, -0.007909429259598255, 0.12499988079071045, -0.06312236189842224, 0.020622413605451584, 0.06068972125649452, -0.001453225500881672, 0.0903899073600769, 0.09481541067361832, 0.09607457369565964, 0.06627772003412247, 0.01935778744518757, 0.05123686417937279, 0.0904175192117691, -0.018908407539129257, -0.04713926091790199, -0.03114168718457222, -0.11643733084201813, 0.0322006531059742, 0.06043211370706558, -0.10163998603820801, 0.07601984590291977, -0.08151165395975113, 0.025152698159217834, 0.01579020731151104, 0.09649932384490967, 0.1271490752696991, -0.34660881757736206, -0.12668123841285706, 0.03879966959357262, 0.06300171464681625, -0.031028706580400467, -0.02723788283765316, 0.06483698636293411, -0.05853091925382614, 0.15688486397266388, -0.04561358690261841, 0.09345750510692596, 0.06919180601835251, 0.018817191943526268, 0.01222173124551773, 0.0900348648428917, -0.03344343230128288, 0.04366433620452881, -0.14694367349147797, 0.2441173791885376, 0.024688830599188805, -0.06452673673629761, 0.020355191081762314, 0.018000483512878418, 0.02024706080555916, 0.2155844271183014, 0.07236755639314651, -0.009026669897139072, -0.07892389595508575, -0.08302249759435654, -0.10286537557840347, 0.020534897223114967, -0.002503703348338604, -0.051984187215566635, 0.09008071571588516, -0.02077123522758484, -0.049667730927467346, 0.042005058377981186, 0.07709959149360657, -0.15821444988250732, -0.07758958637714386, 0.06539003551006317, 0.0499659962952137, 0.007252264302223921, -0.08973252773284912, -0.08816450834274292, -0.14126600325107574, 0.17680197954177856, -0.017438020557165146, -0.06217337027192116, -0.08592833578586578, 0.006177338305860758, 0.16786828637123108, -0.062212955206632614, 0.03348571062088013, -0.02417621575295925, 0.06524878740310669, -0.057932622730731964, -0.14331185817718506, 0.12443462014198303, -0.10885453224182129, -0.1389264315366745, -0.05248822271823883, 0.12665154039859772, -0.0528857484459877, 0.02903997339308262, 0.017043469473719597, 0.019272569566965103, -0.03225499019026756, -0.04464675486087799, -0.023733098059892654, 0.15295875072479248, 0.01740352436900139, 0.09419957548379898, -0.001838318887166679, -0.14128240942955017, -0.03166653588414192, -0.029458699747920036, 0.1111232340335846, 0.3057571351528168, -0.03658422455191612, 0.02840319275856018, 0.13987740874290466, -0.060434434562921524, -0.13629724085330963, 0.06823867559432983, -0.05152932554483414, 0.033241309225559235, 0.05429837480187416, 0.014457478187978268, 0.021437734365463257, 0.09729976207017899, -0.003766777692362666, 0.08756757527589798, -0.3129940927028656, -0.16489025950431824, 0.14424766600131989, 0.04748368635773659, 0.15045329928398132, -0.13068781793117523, -0.07476980984210968, -0.1093364879488945, -0.2117069810628891, 0.043893154710531235, -0.12136612832546234, 0.019010083749890327, -0.0282013937830925, -0.012368325144052505, 0.03274022787809372, -0.05680703744292259, 0.1695701628923416, -0.046871911734342575, 0.05053505301475525, -0.04259730875492096, -0.02689957432448864, 0.08746900409460068, -0.044049475342035294, 0.09605740755796432, -0.06142771989107132, 0.05254439637064934, -0.03272048383951187, -0.03476797416806221, -0.023985963314771652, 0.05326206237077713, -0.045140840113162994, -0.026933109387755394, -0.05071612447500229, 
0.035020358860492706, -0.04423115402460098, -0.019564107060432434, 0.1403738260269165, -0.06704746186733246, 0.17769132554531097, 0.12782441079616547, 0.16151702404022217, -0.032355181872844696, 0.024343423545360565, 0.04846079647541046, -0.05103951692581177, 0.07396242022514343, -0.15528063476085663, -0.02421138621866703, 0.10032450407743454, -0.012006494216620922, 0.07261255383491516, 0.0395284965634346, -0.07220777869224548, -0.02116897888481617, 0.06484413146972656, -0.16084638237953186, -0.19953523576259613, -0.03661816567182541, 0.03081434778869152, -0.04912053048610687, 0.10926184058189392, 0.16752374172210693, -0.05138448253273964, -0.005233775824308395, -0.003995595499873161, -0.005727833602577448, -0.061912089586257935, 0.14452950656414032, 0.009037695825099945, 0.05558238923549652, -0.09221277385950089, 0.035452574491500854, 0.010119966231286526, -0.09350352734327316, -0.022682681679725647, 0.05638713389635086, -0.0961984172463417, -0.0703551322221756, -0.10488804429769516, 0.23955297470092773, -0.10971306264400482, -0.07677137851715088, -0.14736565947532654, -0.11326737701892853, -0.004818086512386799, 0.24385882914066315, 0.08269254118204117, 0.01239565759897232, 0.03858846426010132, -0.08642589300870895, 0.005450474098324776, 0.049588385969400406, 0.014196625910699368, 0.10089671611785889, -0.11245491355657578, 0.047329071909189224, -0.044042062014341354, 0.016485633328557014, -0.03935205191373825, 0.011918970383703709, -0.12883655726909637, -0.02801603078842163, -0.21548229455947876, 0.0032151390332728624, -0.11145085096359253, -0.02538725547492504, -0.01900305040180683, -0.044338393956422806, -0.025088820606470108, -0.007333352230489254, -0.030041858553886414, 0.00299182441085577, -0.007181383669376373, 0.041302554309368134, -0.09223196655511856, 0.007148549892008305, 0.012328491546213627, -0.053456492722034454, 0.06280636787414551, -0.00893240887671709, -0.0071487450040876865, -0.012709921225905418, -0.19450229406356812, 0.002345168963074684, 0.036399491131305695, 0.001535738236270845, 0.04267258942127228, -0.13975517451763153, -0.01088685356080532, 0.013141660019755363, -0.0025280313566327095, 0.011953342705965042, 0.06233510747551918, -0.04190124571323395, -0.016314128413796425, -0.026496602222323418, -0.013422860763967037, -0.012771910056471825, -0.04317312315106392, 0.06395412981510162, 0.05683617666363716, 0.12095797061920166, -0.060969531536102295, 0.05960730463266373, -0.21843752264976501, -0.015018155798316002, -0.0006696630734950304, -0.11994397640228271, -0.051421426236629486, -0.05582369863986969, 0.043101709336042404, -0.03729502856731415, 0.15593300759792328, -0.029112933203577995, 0.027544153854250908, 0.037049129605293274, 0.024752426892518997, 0.07104434072971344, 0.06151691451668739, 0.1880805939435959, 0.04615733399987221, 0.01910625770688057, -0.07645497471094131, 0.06077345088124275, 0.0266299806535244, -0.021498531103134155, 0.09799824655056, 0.1478779911994934, -0.06694503873586655, 0.1161426529288292, 0.10540741682052612, 0.004582405090332031, 0.03878293186426163, -0.1277027279138565, -0.03530028089880943, 0.029955020174384117, 0.01137976162135601, 0.1505478471517563, 0.13096094131469727, -0.13211630284786224, 0.048033714294433594, -0.044723331928253174, -0.02366826683282852, -0.13887450098991394, -0.06548909842967987, -0.09433107078075409, -0.14302685856819153, -0.032540179789066315, -0.08116292208433151, -0.009383869357407093, 0.04740695282816887, 0.028345637023448944, -0.021242793649435043, 0.14849402010440826, -0.020454468205571175, 
-0.02160932682454586, 0.02484966814517975, 0.01335762720555067, 0.012902545742690563, -0.00807702075690031, -0.05278446897864342, 0.1175023689866066, -0.011152324266731739, -0.026136761531233788, 0.0020567637402564287, 0.019953811541199684, 0.0522598922252655, -0.012422068975865841, -0.12884683907032013, -0.024220310151576996, 0.02892453223466873, 0.036752764135599136, -0.026921970769762993, 0.04966471716761589, 0.02495107240974903, -0.004570310935378075, 0.07690782099962234, -0.02783874422311783, -0.06630469113588333, -0.07812703400850296, 0.18997856974601746, -0.0809936672449112, 0.08057554066181183, 0.016218699514865875, -0.05157555639743805, 0.01518883928656578, 0.15379329025745392, 0.2914124131202698, -0.016225015744566917, -0.0205474104732275, 0.02512415125966072, 0.009030756540596485, -0.01101128850132227, 0.048433125019073486, 0.010625839233398438, 0.07095625251531601, -0.04026569053530693, 0.040440287441015244, -0.010941771790385246, -0.05646725371479988, -0.017092537134885788, 0.04642526060342789, 0.03509340062737465, -0.023714782670140266, 0.025034595280885696, 0.10989271104335785, 0.01710083894431591, -0.05508830025792122, -0.005053823348134756, -0.1833030730485916, -0.11008157581090927, -0.10379442572593689, 0.11559610068798065, -0.029958879575133324, 0.053228959441185, -0.053630877286195755, -0.04628381505608559, 0.10807733237743378, -0.026901884004473686, -0.08283308148384094, -0.14535000920295715, 0.0430532842874527, -0.10909727215766907, 0.03850005939602852, -0.01384952012449503, 0.10547395050525665, 0.10473208874464035, -0.01858808659017086, -0.0917506143450737, 0.02905973233282566, 0.04949623718857765, 0.020283199846744537, 0.05628354847431183, 0.025756556540727615, -0.002740794327110052, 0.08136462420225143, 0.047132328152656555, -0.2047082781791687, 0.0632270947098732, -0.06763873994350433, -0.09742497652769089, -0.04793442413210869, 0.04886234551668167, -0.014351832680404186, 0.13252124190330505, 0.13000203669071198, -0.06521633267402649, 0.016314536333084106, -0.013725178316235542, 0.034143611788749695, 0.06652597337961197, 0.124567411839962, 0.007404156494885683, -0.188437819480896, -0.0012580753536894917, 0.0007737537962384522, 0.03373990207910538, -0.33845964074134827, -0.06640857458114624, -0.08890920877456665, -0.03835165500640869, -0.007069773506373167, 0.0757669061422348, 0.17087920010089874, 0.062448471784591675, -0.027733778581023216, -0.16481997072696686, 0.011142047122120857, 0.12107190489768982, -0.08029229193925858, -0.1010870561003685 ]
null
null
sample-factory
An **APPO** model trained on the **doom_health_gathering_supreme** environment. This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory. Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/ ## Downloading the model After installing Sample-Factory, download the model with: ``` python -m sample_factory.huggingface.load_from_hub -r vpepe2003/rl_course_vizdoom_health_gathering_supreme ``` ## Using the model To run the model after download, use the `enjoy` script corresponding to this environment (for the VizDoom environments this is the `sf_examples.vizdoom.enjoy_vizdoom` script shipped with Sample-Factory): ``` python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme ``` You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag. See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details ## Training with this model To continue training with this model, use the `train` script corresponding to this environment (`sf_examples.vizdoom.train_vizdoom`): ``` python -m sf_examples.vizdoom.train_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --restart_behavior=resume --train_for_env_steps=10000000000 ``` Note, you may have to adjust `--train_for_env_steps` to a suitably high number as the experiment will resume at the number of steps it concluded at.
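If you prefer to stay in Python, the same download can be done with the `huggingface_hub` API. The sketch below is a minimal example under the assumption that the files should land in the `train_dir/experiment` layout used by the commands above; the exact target directory is an assumption, not something this card specifies.

```python
from huggingface_hub import snapshot_download

# Assumed target directory: mirrors the --train_dir/--experiment layout that the
# enjoy/train scripts expect; adjust to your own setup.
local_dir = "./train_dir/rl_course_vizdoom_health_gathering_supreme"

# Download every file in the model repository (checkpoints, config, TensorBoard logs).
snapshot_download(
    repo_id="vpepe2003/rl_course_vizdoom_health_gathering_supreme",
    local_dir=local_dir,
)
print(f"Checkpoint downloaded to {local_dir}")
```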
{"library_name": "sample-factory", "tags": ["deep-reinforcement-learning", "reinforcement-learning", "sample-factory"], "model-index": [{"name": "APPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "doom_health_gathering_supreme", "type": "doom_health_gathering_supreme"}, "metrics": [{"type": "mean_reward", "value": "9.29 +/- 3.65", "name": "mean_reward", "verified": false}]}]}]}
reinforcement-learning
vpepe2003/rl_course_vizdoom_health_gathering_supreme
[ "sample-factory", "tensorboard", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
2024-02-13T20:16:57+00:00
[]
[]
TAGS #sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
An APPO model trained on the doom_health_gathering_supreme environment. This model was trained using Sample-Factory 2.0: URL Documentation for how to use Sample-Factory can be found at URL ## Downloading the model After installing Sample-Factory, download the model with: ## Using the model To run the model after download, use the 'enjoy' script corresponding to this environment: You can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag. See URL for more details ## Training with this model To continue training with this model, use the 'train' script corresponding to this environment: Note, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at.
[ "## Downloading the model\n\nAfter installing Sample-Factory, download the model with:", "## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details", "## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at." ]
[ "TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "## Downloading the model\n\nAfter installing Sample-Factory, download the model with:", "## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details", "## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at." ]
[ 34, 19, 59, 67 ]
[ "passage: TAGS\n#sample-factory #tensorboard #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n## Downloading the model\n\nAfter installing Sample-Factory, download the model with:## Using the model\n\nTo run the model after download, use the 'enjoy' script corresponding to this environment:\n\n\n\nYou can also upload models to the Hugging Face Hub using the same script with the '--push_to_hub' flag.\nSee URL for more details## Training with this model\n\nTo continue training with this model, use the 'train' script corresponding to this environment:\n\n\nNote, you may have to adjust '--train_for_env_steps' to a suitably high number as the experiment will resume at the number of steps it concluded at." ]
[ -0.162887305021286, -0.07949446886777878, 0.0013769814977422357, 0.0244897473603487, 0.13643795251846313, 0.08826540410518646, 0.13243556022644043, 0.07938782125711441, 0.19449298083782196, 0.07451266050338745, 0.12160012871026993, 0.06742649525403976, 0.02505551464855671, 0.31084391474723816, 0.08655242621898651, -0.18235880136489868, 0.031082456931471825, -0.06436605006456375, -0.02882574498653412, 0.05590416118502617, 0.050910040736198425, -0.06422623991966248, 0.11641133576631546, -0.05714287608861923, -0.15497641265392303, 0.08288847655057907, 0.008126083761453629, 0.03596968948841095, 0.12199652194976807, -0.007729834411293268, 0.06358569860458374, 0.02508161962032318, 0.09885215014219284, -0.08979995548725128, 0.05817115306854248, 0.037268251180648804, -0.005583701189607382, 0.0697544738650322, -0.02916712686419487, 0.01197513286024332, 0.20552261173725128, 0.051445573568344116, -0.014811687171459198, 0.0707944929599762, -0.04854035750031471, 0.005004523321986198, 0.024828260764479637, 0.08118943125009537, 0.1108563020825386, -0.013300174847245216, -0.015604399144649506, 0.2098497599363327, -0.045419543981552124, 0.030687451362609863, 0.1803472340106964, -0.13901305198669434, -0.00587898213416338, 0.3598267436027527, 0.13591337203979492, 0.07389762997627258, -0.05572221428155899, 0.065569669008255, 0.12957775592803955, -0.013377981260418892, -0.022062024101614952, -0.037468962371349335, 0.01014290377497673, 0.02470328100025654, -0.08271043002605438, -0.03898613899946213, 0.18779566884040833, 0.027798498049378395, -0.0647122785449028, -0.11388745903968811, -0.08383605629205704, -0.01143614575266838, -0.08729266375303268, -0.06047317758202553, 0.061255209147930145, 0.06450130045413971, -0.05541218817234039, -0.16354843974113464, -0.08759765326976776, -0.14808951318264008, 0.09711641818284988, -0.018818290904164314, 0.020023507997393608, 0.039053402841091156, -0.13240769505500793, 0.13932685554027557, -0.12239529192447662, -0.005040881223976612, -0.00391974626109004, -0.10012788325548172, -0.0298643596470356, -0.02757178619503975, -0.06954579800367355, -0.08072661608457565, 0.06621979922056198, 0.1397300660610199, 0.1075919046998024, 0.04457515478134155, -0.016096504405140877, 0.0929836705327034, 0.0659836158156395, 0.015487046912312508, -0.046446919441223145, -0.03190334141254425, 0.06750229746103287, 0.09463070333003998, -0.0025161339435726404, -0.04405781999230385, -0.12502750754356384, 0.004669501446187496, -0.05889439582824707, 0.07438734918832779, -0.01944235898554325, 0.09347380697727203, 0.0012449703644961119, -0.0658751055598259, 0.09675891697406769, -0.056166794151067734, -0.015024078078567982, 0.05717969685792923, -0.09829384088516235, -0.044000294059515, 0.02636338584125042, -0.018662840127944946, 0.02191256918013096, -0.08697114139795303, -0.1281215101480484, -0.0406981036067009, -0.15496762096881866, -0.0733695924282074, 0.020342092961072922, -0.10162562131881714, 0.040819648653268814, -0.08701786398887634, -0.27291807532310486, -0.016108427196741104, 0.05915366858243942, 0.0003154690202791244, 0.03663148358464241, -0.06209208071231842, 0.0267410296946764, -0.030988745391368866, -0.013702943921089172, 0.12538094818592072, -0.04706621542572975, 0.005733184050768614, 0.02853262610733509, 0.09092917293310165, 0.029396481812000275, -0.011824010871350765, -0.09237373620271683, 0.03002769686281681, -0.1866937130689621, 0.0038047281559556723, -0.051012441515922546, 0.14028684794902802, -0.07785230129957199, -0.0034444157499819994, -0.07691079378128052, 0.06912831217050552, 
0.052552226930856705, 0.21963854134082794, -0.22059281170368195, -0.09743031859397888, 0.1902308464050293, -0.09678838402032852, -0.1949385702610016, 0.06732125580310822, -0.03079940192401409, 0.20069970190525055, 0.02597416751086712, 0.1891578733921051, 0.00020795770979020745, -0.25584760308265686, 0.035303130745887756, 0.07686726003885269, -0.2078019231557846, -0.11653494834899902, 0.00783967413008213, 0.04216665402054787, -0.050144799053668976, 0.023388857021927834, -0.07392873615026474, 0.1217033788561821, -0.023950038477778435, -0.021695949137210846, -0.009935722686350346, -0.06940963864326477, -0.039610356092453, 0.012346661649644375, 0.06086154654622078, -0.02202412113547325, -0.025860905647277832, -0.05173748731613159, 0.16720648109912872, -0.0795547217130661, 0.011736705899238586, -0.11241740733385086, 0.1497063785791397, 0.007124151568859816, 0.025635361671447754, -0.0980280190706253, -0.014672551304101944, 0.044151511043310165, 0.08621654659509659, 0.011970171704888344, 0.1326037049293518, 0.06774137914180756, 0.01454958226531744, 0.042493220418691635, -0.004039871972054243, -0.0012205307139083743, -0.10230473428964615, -0.05593033879995346, -0.11311958730220795, -0.11286478489637375, -0.09429361671209335, 0.08868816494941711, -0.20066434144973755, 0.05826579034328461, -0.15120604634284973, 0.047645486891269684, 0.038803353905677795, -0.07772190868854523, 0.05121537670493126, -0.08661998063325882, -0.021283775568008423, -0.08784573525190353, 0.0805407464504242, -0.014386715367436409, -0.08415807038545609, 0.006313080433756113, -0.09094364196062088, -0.08295580744743347, 0.09175937622785568, 0.013830476440489292, 0.0026490744203329086, -0.1170414388179779, -0.04695970565080643, 0.001149212708696723, 0.03873389959335327, -0.0591595321893692, 0.08649469166994095, 0.06776818633079529, 0.09646541625261307, -0.09070473909378052, 0.03797374665737152, -0.020416714251041412, -0.06236580014228821, -0.045745182782411575, 0.014070805162191391, 0.1767948418855667, -0.022993814200162888, -0.01734299771487713, -0.005982444155961275, -0.048861317336559296, 0.20095843076705933, -0.018403954803943634, -0.11935548484325409, 0.0030399553943425417, -0.01395543571561575, -0.017944620922207832, 0.11660698801279068, -0.13726668059825897, -0.05182260647416115, 0.030854813754558563, -0.06529976427555084, 0.10216285288333893, -0.08242622762918472, -0.0392029769718647, -0.05685178562998772, -0.043409593403339386, 0.046979792416095734, 0.12330524623394012, -0.07290767133235931, -0.009151018224656582, -0.047789376229047775, -0.03510203957557678, -0.025379952043294907, -0.05724980682134628, -0.11478709429502487, 0.1582695096731186, 0.002751561114564538, -0.09990474581718445, -0.17415542900562286, -0.08029486984014511, -0.03834356367588043, 0.05337152257561684, -0.034037429839372635, -0.04430336132645607, -0.01500723510980606, -0.07299388945102692, 0.1465158462524414, 0.063304103910923, -0.0472191721200943, -0.01852818764746189, 0.08560720086097717, 0.04456184431910515, -0.15394946932792664, 0.007078593596816063, -0.08948076516389847, -0.08794131129980087, 0.03091353550553322, -0.08061819523572922, 0.012820594012737274, 0.11341627687215805, 0.03525753691792488, 0.02826494723558426, 0.01035099383443594, 0.23537762463092804, -0.0369284451007843, -0.01093987375497818, 0.19019025564193726, 0.0682438537478447, 0.020443644374608994, 0.055847786366939545, 0.027420951053500175, -0.15370461344718933, 0.10424364358186722, 0.012530675157904625, -0.044538769870996475, -0.10689681768417358, -0.04666181653738022, 
-0.03360101953148842, 0.09803235530853271, 0.12185155600309372, 0.03158954530954361, 0.025155838578939438, 0.096546471118927, 0.02187134325504303, -0.0098390718922019, -0.11183010786771774, 0.05996714532375336, -0.1770814210176468, -0.043808963149785995, 0.00898060668259859, -0.028755301609635353, 0.00010461114288773388, 0.0659034252166748, 0.026660064235329628, 0.12833580374717712, 0.0295290257781744, 0.06181740015745163, 0.0663255974650383, 0.10200989991426468, 0.01538698747754097, 0.1999037265777588, -0.06215142831206322, -0.1075027585029602, -0.03758005052804947, -0.04118350148200989, -0.11916319280862808, 0.12439136207103729, 0.1381523460149765, -0.030515994876623154, -0.06625506281852722, 0.07200724631547928, 0.014589293859899044, 0.08729344606399536, 0.08250882476568222, -0.29115065932273865, -0.034177567809820175, 0.031450141221284866, 0.01114452164620161, -0.04308335855603218, 0.010566305369138718, 0.10542299598455429, -0.07616783678531647, -0.09982791543006897, -0.03972722589969635, 0.1055394783616066, 0.08046542853116989, 0.03702867403626442, -0.10841067880392075, 0.20128826797008514, -0.01744360849261284, 0.07004447281360626, -0.07662706822156906, 0.1728198230266571, 0.018701205030083656, 0.05943213775753975, -0.07497778534889221, -0.009592941962182522, 0.1228223443031311, 0.03374773636460304, 0.09092900156974792, -0.0056656887754797935, -0.09995020180940628, -0.13336431980133057, -0.1216202825307846, 0.024986369535326958, -0.000090524394181557, -0.08169890940189362, 0.03341596573591232, -0.016717763617634773, 0.017487963661551476, -0.0027857583481818438, 0.23440547287464142, -0.18267135322093964, 0.012482558377087116, -0.054521817713975906, 0.02707577496767044, -0.04300008341670036, -0.0709642544388771, -0.027162717655301094, 0.060507629066705704, 0.09744840115308762, 0.07921962440013885, 0.030401866883039474, -0.07419665157794952, 0.1431404948234558, 0.06514685600996017, -0.058246973901987076, -0.01524845976382494, 0.01951364241540432, 0.1256532073020935, -0.07438289374113083, -0.10393836349248886, 0.10585980117321014, -0.11736445128917694, 0.008749126456677914, -0.05019083246588707, 0.04299405962228775, 0.02305823378264904, 0.011290842667222023, 0.007447924464941025, -0.04279239848256111, 0.0015383695717900991, -0.06904047727584839, 0.0778660774230957, 0.020559091120958328, -0.0047941361553967, -0.0006717707728967071, -0.16239388287067413, 0.08390985429286957, -0.04138755425810814, 0.052877847105264664, 0.1489589661359787, 0.27864590287208557, -0.02386910282075405, 0.030926240608096123, 0.1617380678653717, -0.01897917501628399, -0.2491649091243744, 0.04654841497540474, 0.014908025041222572, 0.10310175269842148, 0.04640066251158714, -0.19236695766448975, 0.11111847311258316, 0.009474517777562141, -0.02225719392299652, 0.009804603643715382, -0.24880149960517883, -0.13740544021129608, 0.17525193095207214, 0.06902051717042923, 0.15983323752880096, -0.03665107116103172, -0.013587141409516335, -0.061109546571969986, -0.03419603407382965, -0.026354335248470306, -0.12708203494548798, 0.12749767303466797, -0.017607107758522034, 0.047745801508426666, 0.027817612513899803, -0.07676684111356735, 0.12058744579553604, -0.017944786697626114, 0.13344953954219818, -0.017018258571624756, -0.031023232266306877, 0.042466819286346436, -0.09033756703138351, 0.1662607043981552, -0.10233280807733536, 0.057950668036937714, -0.11091876775026321, -0.03109682910144329, -0.015322481282055378, 0.15654151141643524, 0.005544521380215883, -0.0855189636349678, -0.041066281497478485, 0.04975702613592148, 
-0.05784251168370247, 0.05022609233856201, -0.0021613158751279116, -0.03506873920559883, 0.022246064618229866, 0.08415499329566956, 0.040208954364061356, -0.10403558611869812, -0.011038471013307571, 0.03089289739727974, 0.01896476000547409, 0.09993185102939606, -0.20835483074188232, -0.020152123644948006, 0.019231827929615974, -0.015702085569500923, 0.13085414469242096, 0.04400704801082611, -0.08080117404460907, 0.027568496763706207, 0.13726983964443207, -0.061186157166957855, -0.030986590310931206, -0.04847807064652443, -0.016679393127560616, -0.12794725596904755, -0.01594163477420807, 0.057148490101099014, -0.04251079633831978, 0.02512725070118904, -0.03424951806664467, 0.0004248716577421874, -0.10717252641916275, 0.07036283612251282, 0.06859682500362396, 0.0642281174659729, -0.07167360186576843, 0.09394960850477219, -0.07811970263719559, 0.014289900660514832, 0.03734226152300835, 0.045441556721925735, -0.06931920349597931, -0.06820165365934372, -0.05322124809026718, 0.27575042843818665, -0.024388493970036507, -0.02025510184466839, -0.06021025776863098, 0.11942195147275925, -0.057836465537548065, -0.06673881411552429, 0.08716115355491638, -0.007450808770954609, -0.059019722044467926, 0.022327717393636703, -0.0734894648194313, -0.014457973651587963, 0.04693116992712021, 0.016375891864299774, -0.11610891669988632, 0.1136312261223793, 0.031648989766836166, 0.02891513518989086, -0.09186926484107971, -0.0486464723944664, -0.12123195827007294, 0.0032020595390349627, -0.025323880836367607, -0.06051601842045784, -0.07913094758987427, -0.0425749197602272, 0.049642790108919144, 0.018434861674904823, -0.08444267511367798, -0.0022111251018941402, -0.12617166340351105, 0.006370943505316973, 0.006689207162708044, 0.10316617041826248, -0.06351965665817261, 0.04670397937297821, 0.10049878805875778, -0.07692139595746994, 0.09893755614757538, 0.0846271738409996, -0.00729260453954339, 0.08929292112588882, -0.20261284708976746, -0.02319980226457119, 0.047821637243032455, 0.055264540016651154, 0.03154374286532402, 0.06104309484362602, 0.013487739488482475, -0.05460033565759659, 0.04538526386022568, -0.03539090231060982, 0.0028435050044208765, -0.09104080498218536, 0.09713591635227203, 0.009731475263834, -0.009716489352285862, -0.060456521809101105, -0.01384128537029028, 0.01817488856613636, 0.10404353588819504, 0.09692291915416718, -0.07237115502357483, -0.0035003575030714273, -0.11786255985498428, 0.024597108364105225, 0.02565017342567444, 0.010576808825135231, 0.03638135641813278, -0.11692339926958084, 0.03729743883013725, -0.05475534871220589, 0.19700418412685394, 0.019796879962086678, -0.10531783103942871, -0.008661900646984577, 0.07250577956438065, 0.17378750443458557, -0.006129021290689707, 0.21011123061180115, 0.05919691175222397, 0.09556611627340317, 0.0324610099196434, 0.11373614519834518, 0.11542147397994995, 0.004254546947777271, 0.10733281821012497, 0.0500684529542923, -0.04822303727269173, 0.14306919276714325, 0.032827045768499374, -0.017670227214694023, 0.0304852481931448, 0.04704435542225838, -0.03187015652656555, 0.02075354754924774, -0.06440161913633347, 0.11196915805339813, 0.13514995574951172, -0.08471442013978958, -0.0081911850720644, 0.04797748476266861, -0.0438203290104866, -0.1532401293516159, -0.08671712130308151, -0.024648865684866905, -0.2236001342535019, 0.08533021807670593, -0.06946314871311188, -0.13578248023986816, 0.019155733287334442, 0.013867083936929703, -0.028145823627710342, 0.11776147037744522, -0.07801362872123718, -0.03346126526594162, 0.020983682945370674, 
-0.039618294686079025, -0.09754771739244461, -0.09402462840080261, -0.07874704152345657, 0.03500581532716751, -0.04535633698105812, 0.025271590799093246, -0.05421067774295807, 0.015182215720415115, 0.10334893316030502, -0.04038224741816521, -0.041323766112327576, -0.0359976626932621, -0.035855069756507874, -0.11793428659439087, 0.025968458503484726, 0.044103916734457016, -0.03597194701433182, -0.05585090070962906, 0.17637495696544647, -0.04257858544588089, -0.01666315644979477, -0.1211012676358223, 0.14332374930381775, -0.04330325871706009, 0.03261799365282059, -0.10366860777139664, -0.08559805154800415, -0.10071583092212677, 0.27439257502555847, 0.2784624397754669, -0.14349330961704254, -0.009759977459907532, 0.02939503826200962, 0.004204166121780872, -0.14250165224075317, 0.14376720786094666, 0.01570971868932247, -0.024460898712277412, -0.027595078572630882, 0.026391539722681046, -0.007621914613991976, -0.0827714279294014, -0.03114704228937626, -0.05752136558294296, -0.006779014132916927, -0.05148708075284958, -0.034257955849170685, 0.06298708915710449, -0.12136059254407883, -0.09091135859489441, -0.05560125410556793, -0.0083417734131217, -0.03344108536839485, -0.07473809272050858, -0.019548200070858, 0.07662302255630493, 0.14781777560710907, -0.05502733215689659, 0.06005467101931572, -0.004367031157016754, -0.04969286173582077, -0.13970479369163513, -0.13660922646522522, 0.05449144169688225, -0.129489928483963, 0.26909253001213074, -0.050524767488241196, -0.05207161232829094, 0.041712693870067596, -0.03221052139997482, -0.05838879942893982, 0.020522039383649826, 0.009778409264981747, -0.05078497156500816, -0.029240628704428673, 0.09255361557006836, -0.033305004239082336, 0.009149706922471523, -0.022496739402413368, -0.22135144472122192, 0.0034119023475795984, -0.05107501149177551, 0.028507398441433907, -0.12569822371006012, 0.06501629203557968, -0.09348012506961823, 0.12403472512960434, 0.07595156878232956, -0.01166640967130661, -0.036088403314352036, -0.04733064025640488, 0.1257045865058899, 0.08392459154129028, -0.02910126931965351, -0.0870935395359993, -0.16758979856967926, -0.004611360374838114, -0.0011314527364447713, -0.08687946200370789, -0.23090760409832, -0.008421163074672222, -0.031696807593107224, 0.0109195401892066, -0.00838692206889391, 0.12826944887638092, 0.14749252796173096, 0.05249129980802536, 0.016358694061636925, -0.12719306349754333, 0.041898638010025024, 0.08496948331594467, -0.15762199461460114, -0.1707899123430252 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
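The quick-start section above is still a placeholder. A hedged sketch of what feature extraction with this checkpoint might look like, using only the standard `transformers` Auto classes, is shown below; the mean-pooling step is an assumption, since the card does not document how embeddings are meant to be produced.

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "furrutiav/bert_qa_extractor_cockatiel_2022_nllf_z_value_it_810"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Example sentence to embed.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token embeddings into one sentence vector (assumed pooling strategy).
sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, hidden_size])
```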
{"library_name": "transformers", "tags": []}
feature-extraction
furrutiav/bert_qa_extractor_cockatiel_2022_nllf_z_value_it_810
[ "transformers", "safetensors", "bert", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T20:17:36+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 39, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.052746038883924484, 0.20255789160728455, -0.0045078229159116745, 0.0248473659157753, 0.10497838258743286, 0.00675728265196085, 0.06521498411893845, 0.11486967653036118, -0.0023755673319101334, 0.12028469145298004, 0.027631845325231552, 0.08119397610425949, 0.12110675126314163, 0.15393014252185822, 0.005160121712833643, -0.24253977835178375, 0.05344875901937485, -0.09366832673549652, 0.004077504388988018, 0.11452110856771469, 0.1343945860862732, -0.10780399292707443, 0.08976872265338898, -0.00683097867295146, -0.01712046191096306, -0.015751034021377563, -0.07134060561656952, -0.06668227165937424, 0.05541034787893295, 0.07649129629135132, 0.0725555345416069, 0.010986946523189545, 0.07830587029457092, -0.2806258797645569, 0.014425364322960377, 0.08005264401435852, 0.0010765197221189737, 0.06795802712440491, 0.08151742070913315, -0.06789936870336533, 0.1251654475927353, -0.0605485662817955, 0.14059753715991974, 0.07639917731285095, -0.08928128331899643, -0.19590547680854797, -0.06669555604457855, 0.07481247186660767, 0.129872128367424, 0.05026249960064888, -0.02990107797086239, 0.1371748298406601, -0.09688840061426163, 0.00786701962351799, 0.12302009761333466, -0.07360870391130447, -0.05524582043290138, 0.031063849106431007, 0.10805318504571915, 0.09297362715005875, -0.11762315034866333, -0.008467874489724636, 0.029582185670733452, 0.022175652906298637, 0.08627551048994064, 0.015828849747776985, 0.1525639444589615, 0.041341137140989304, -0.14141254127025604, -0.0526716373860836, 0.09056255221366882, 0.03701045364141464, -0.050960201770067215, -0.23367193341255188, -0.026245610788464546, -0.012442239560186863, -0.03079850971698761, -0.04234880208969116, 0.053594592958688736, -0.03630254790186882, 0.07596245408058167, -0.007196845952421427, -0.07732249796390533, -0.031211229041218758, 0.05230424553155899, 0.06785056740045547, 0.018615471199154854, -0.006994647905230522, 0.019442738965153694, 0.11387838423252106, 0.07708574831485748, -0.13029205799102783, -0.07214002311229706, -0.0739525631070137, -0.09558356553316116, -0.04332297295331955, 0.03707554563879967, 0.07106684148311615, 0.04390906170010567, 0.20283061265945435, -0.017690327018499374, 0.046562306582927704, 0.0476159006357193, 0.005842953454703093, 0.07147589325904846, 0.10925443470478058, -0.06689215451478958, -0.14432233572006226, -0.06022803485393524, 0.08875485509634018, -0.009834992699325085, -0.03670760244131088, -0.049119677394628525, 0.04676154628396034, 0.03209913894534111, 0.11318106204271317, 0.08643888682126999, -0.003593706525862217, -0.0628826767206192, -0.042073074728250504, 0.22331053018569946, -0.14625342190265656, 0.043256524950265884, 0.007445589639246464, -0.0429743155837059, -0.0076383077539503574, 0.005870272871106863, 0.014089803211390972, -0.03238216042518616, 0.10351061820983887, -0.0778173878788948, -0.035906463861465454, -0.1116463914513588, -0.06868703663349152, 0.024910317733883858, 0.0025890374090522528, -0.018393149599432945, -0.04424213990569115, -0.11253650486469269, -0.051282741129398346, 0.0724339634180069, -0.07579848170280457, -0.05524555593729019, 0.009976830333471298, -0.04834962263703346, 0.0031978494953364134, 0.00010397454752819613, 0.11258035898208618, -0.03314845636487007, 0.025259260088205338, -0.04850656911730766, 0.06803499162197113, 0.10959596186876297, 0.038730688393116, -0.0804535374045372, 0.07286878675222397, -0.22788093984127045, 0.10223092138767242, -0.09346398711204529, 0.025767935439944267, -0.14578653872013092, -0.04199126362800598, 0.02854149229824543, 0.02887420728802681, 
-0.010361229069530964, 0.1268649846315384, -0.1982942521572113, -0.035082314163446426, 0.15190726518630981, -0.11336656659841537, -0.09347330778837204, 0.065653957426548, -0.05610617995262146, 0.11296144872903824, 0.04835578054189682, -0.019556574523448944, 0.06953749805688858, -0.1281629204750061, -0.04506009817123413, -0.021473335102200508, -0.008493004366755486, 0.14857245981693268, 0.06750676780939102, -0.05737153813242912, 0.07104712724685669, 0.02051553688943386, -0.037109848111867905, -0.03301886469125748, -0.03470754995942116, -0.09331934154033661, 0.009520708583295345, -0.07244295626878738, 0.03737799823284149, -0.02224314957857132, -0.08870045095682144, -0.030656753107905388, -0.17619828879833221, 0.043274905532598495, 0.08050142228603363, 0.008233942091464996, -0.021131468936800957, -0.09287237375974655, 0.02556683123111725, -0.009385489858686924, -0.021018607541918755, -0.1641797423362732, -0.044834475964307785, 0.04416196420788765, -0.1971662938594818, 0.023802341893315315, -0.03283040598034859, 0.05093098804354668, 0.03247829154133797, -0.04019762575626373, -0.005096070934087038, 0.0028117431793361902, 0.01809627003967762, -0.026984719559550285, -0.200385183095932, -0.031109308823943138, -0.029154371470212936, 0.1362139731645584, -0.22226740419864655, 0.028292208909988403, 0.07483648508787155, 0.13521188497543335, 0.0009690870065242052, -0.04426588490605354, 0.010693409480154514, -0.05366935580968857, -0.053671274334192276, -0.06512755900621414, -0.007102466654032469, -0.03287021815776825, -0.04422381520271301, 0.06460095942020416, -0.19425635039806366, -0.03641216829419136, 0.10608077049255371, 0.10164625942707062, -0.14719000458717346, -0.028969714418053627, -0.04096706584095955, -0.06081128865480423, -0.09094393998384476, -0.0630471333861351, 0.14371246099472046, 0.04861542955040932, 0.048413511365652084, -0.08624191582202911, -0.0630124881863594, 0.00895135197788477, 0.0006565740332007408, -0.03649118170142174, 0.08907787501811981, 0.08782777935266495, -0.10737399011850357, 0.08881597965955734, 0.08605224639177322, 0.06605713814496994, 0.10539878904819489, 0.001256609451957047, -0.10750970244407654, -0.029154706746339798, 0.005644100718200207, 0.01547710970044136, 0.14092515408992767, -0.044270921498537064, 0.04743899777531624, 0.05656488984823227, -0.027443327009677887, 0.01715722121298313, -0.10313762724399567, 0.02984124980866909, 0.046840768307447433, -0.010507673025131226, 0.012429861351847649, -0.03895113617181778, 0.025837475433945656, 0.08796556293964386, 0.03584056720137596, 0.027896199375391006, 0.0029043578542768955, -0.03437814116477966, -0.10392027348279953, 0.17429527640342712, -0.0878753736615181, -0.28357240557670593, -0.1356295943260193, -0.00747122336179018, 0.05167245492339134, -0.022715993225574493, 0.013256389647722244, -0.04903135821223259, -0.11467588692903519, -0.10348290205001831, 0.008818334899842739, 0.0437844917178154, -0.07700283080339432, -0.07256268709897995, 0.046553414314985275, 0.033613573759794235, -0.14174877107143402, 0.022300107404589653, 0.048012908548116684, -0.03855963796377182, -0.015413837507367134, 0.07170835882425308, 0.10258439928293228, 0.17387451231479645, -0.004228805657476187, -0.01945391111075878, 0.023280048742890358, 0.24459126591682434, -0.14296141266822815, 0.10647262632846832, 0.15432609617710114, -0.06630013138055801, 0.1025824174284935, 0.19176462292671204, 0.02610800787806511, -0.07571171224117279, 0.03370760753750801, 0.03715203329920769, -0.053104497492313385, -0.23274335265159607, -0.060641512274742126, 
0.0011178229469805956, -0.06850682199001312, 0.09104112535715103, 0.08915619552135468, 0.11183936148881912, 0.0454646460711956, -0.08415863662958145, -0.06847929954528809, 0.019614145159721375, 0.10642454773187637, -0.03275766968727112, 0.007264797575771809, 0.09054313600063324, -0.04184457287192345, -0.005177726969122887, 0.10835286974906921, 0.007426192983984947, 0.1962665617465973, 0.031048519536852837, 0.15333782136440277, 0.07211130857467651, 0.0342402458190918, 0.026680786162614822, 0.025636766105890274, 0.023090654984116554, 0.009547512046992779, -0.01598707027733326, -0.08795502036809921, 0.027014199644327164, 0.13500221073627472, 0.07871367782354355, 0.029795078560709953, 0.020392734557390213, -0.0429922379553318, 0.062152985483407974, 0.15964233875274658, 0.006258485373109579, -0.2136749029159546, -0.03950631618499756, 0.08867984265089035, -0.0793125256896019, -0.1237078458070755, -0.02518491819500923, 0.03823186457157135, -0.1809074580669403, 0.04127289727330208, -0.01795332506299019, 0.11453432589769363, -0.11700457334518433, -0.028958700597286224, 0.039744846522808075, 0.08327627927064896, -0.03253408893942833, 0.07922478020191193, -0.1647184044122696, 0.1165376752614975, 0.012328862212598324, 0.05802180990576744, -0.11617794632911682, 0.09878876805305481, 0.012594180181622505, -0.009003117680549622, 0.16720694303512573, -0.0008162438753060997, -0.07339610159397125, -0.06517832726240158, -0.07867198437452316, -0.022016214206814766, 0.09116258472204208, -0.11647430807352066, 0.08271238952875137, -0.012302344664931297, -0.03819865360856056, 0.002976413816213608, -0.1073245257139206, -0.12343364208936691, -0.191313698887825, 0.05862122401595116, -0.11746024340391159, 0.00024363139527849853, -0.10003595799207687, -0.05551697313785553, -0.04721582680940628, 0.19990667700767517, -0.14306047558784485, -0.09675363451242447, -0.1526252180337906, -0.09468596428632736, 0.1679719239473343, -0.04768168181180954, 0.08716544508934021, -0.00014324963558465242, 0.22273695468902588, 0.00589721417054534, -0.010143720544874668, 0.07824880629777908, -0.08608578145503998, -0.17828822135925293, -0.07740302383899689, 0.12055730819702148, 0.12802201509475708, 0.05279289186000824, -0.012038013897836208, 0.020934196189045906, -0.036648161709308624, -0.11678951978683472, 0.003050430677831173, 0.1217387318611145, 0.05949230119585991, 0.039503831416368484, -0.002558275358751416, -0.10200468450784683, -0.07551230490207672, -0.0352395698428154, 0.02261841483414173, 0.18903005123138428, -0.08441178500652313, 0.15781226754188538, 0.13112787902355194, -0.05333179607987404, -0.21253353357315063, 0.030583804473280907, 0.043237145990133286, 0.004318034742027521, 0.0612679123878479, -0.17720702290534973, 0.08167627453804016, 0.025727098807692528, -0.05116020143032074, 0.15224720537662506, -0.16569727659225464, -0.15514664351940155, 0.0824643224477768, 0.05010354146361351, -0.22108957171440125, -0.12386278063058853, -0.0879128947854042, -0.06589758396148682, -0.1396872103214264, 0.08584427833557129, 0.014041651971638203, -0.0018043812597170472, 0.05013851076364517, 0.033740755170583725, 0.018914686515927315, -0.048698488622903824, 0.21615906059741974, -0.0022440196480602026, 0.03326340764760971, -0.07553089410066605, -0.10180798172950745, 0.06950566172599792, -0.05141735449433327, 0.08518881350755692, -0.03099823370575905, 0.005753061734139919, -0.08320630341768265, -0.057475052773952484, -0.05255331099033356, 0.03318103775382042, -0.08139406144618988, -0.10520965605974197, -0.06759276986122131, 0.09429939836263657, 
0.09139011800289154, -0.03298058733344078, -0.04032526910305023, -0.08896728605031967, 0.039150089025497437, 0.20617929100990295, 0.17360219359397888, 0.05333937704563141, -0.10111589729785919, 0.002542630536481738, -0.01915728859603405, 0.040264517068862915, -0.21200114488601685, 0.04798245429992676, 0.04617756977677345, 0.024147402495145798, 0.12109645456075668, -0.0176423080265522, -0.1646004468202591, -0.047221194952726364, 0.0562983863055706, -0.03494611009955406, -0.20504815876483917, -0.01314060389995575, 0.04864202439785004, -0.18736153841018677, -0.06957933306694031, 0.016700902953743935, -0.014444489032030106, -0.027432914823293686, 0.013032985851168633, 0.06286440044641495, 0.025481918826699257, 0.10238313674926758, 0.05989401787519455, 0.1000840812921524, -0.112981878221035, 0.0795830711722374, 0.09043775498867035, -0.08344172686338425, 0.009394102729856968, 0.06964189559221268, -0.05280066654086113, -0.02294989861547947, 0.022772129625082016, 0.06757686287164688, -0.003049787599593401, -0.057536181062459946, -0.02079189568758011, -0.10809285193681717, 0.06586270034313202, 0.1269281655550003, 0.0400845967233181, -0.006831571459770203, 0.04905473813414574, 0.02419281378388405, -0.07880669087171555, 0.11321208626031876, 0.03362756222486496, 0.03722309693694115, -0.05989459529519081, -0.01674187369644642, 0.04316421225667, 0.005734616424888372, -0.02047782577574253, -0.025104478001594543, -0.05658029392361641, -0.013948953710496426, -0.18932224810123444, 0.014544147998094559, -0.07588981091976166, 0.005138450767844915, 0.014814606867730618, -0.040141742676496506, -0.018671197816729546, 0.012856033630669117, -0.08163223415613174, -0.05027473345398903, -0.0038707295898348093, 0.09766460955142975, -0.1400173306465149, 0.008230311796069145, 0.09175591170787811, -0.11852382868528366, 0.06848865002393723, -0.019968708977103233, -0.014717686921358109, 0.0038272906094789505, -0.1270400881767273, 0.04572216048836708, -0.004586559720337391, 0.02062096633017063, 0.04444560408592224, -0.17065683007240295, 0.004877567756921053, -0.0423397533595562, -0.0478336401283741, -0.015323328785598278, -0.08405033499002457, -0.11406292766332626, 0.10921793431043625, 0.002206311793997884, -0.08430022746324539, -0.010287429206073284, 0.04696008190512657, 0.10919637978076935, -0.03898061811923981, 0.124757781624794, 0.0047785635106265545, 0.06639395654201508, -0.18268363177776337, -0.024298490956425667, -0.014514438807964325, 0.007352736312896013, 0.027192458510398865, -0.016180848702788353, 0.04238643869757652, -0.01372526679188013, 0.2601816952228546, -0.021822240203619003, 0.07231466472148895, 0.0637383759021759, 0.042024899274110794, 0.016651110723614693, 0.08318763226270676, 0.06755662709474564, 0.016758481040596962, 0.004258559085428715, 0.02265608124434948, -0.03241465613245964, -0.016654497012495995, -0.15768693387508392, 0.07677853107452393, 0.14623822271823883, 0.08591317385435104, 0.007676990237087011, 0.06586159020662308, -0.10330242663621902, -0.10554943233728409, 0.08015866577625275, -0.03888537734746933, -0.0009790018666535616, -0.058588381856679916, 0.15355949103832245, 0.14971502125263214, -0.17422176897525787, 0.08231138437986374, -0.03791337087750435, -0.04883022606372833, -0.11436772346496582, -0.15839459002017975, -0.06608819216489792, -0.029153592884540558, -0.0041826991364359856, -0.05528274551033974, 0.06748054921627045, 0.10802645981311798, -0.0021057529374957085, -0.00038325722562149167, 0.09545762091875076, -0.026331622153520584, -0.01757199876010418, 0.03465426340699196, 
0.04817976430058479, 0.033562518656253815, -0.04831063002347946, 0.020485511049628258, 0.004976877011358738, 0.03976510092616081, 0.05864322930574417, 0.023703020066022873, -0.03892989084124565, 0.014479226432740688, -0.01092575490474701, -0.1049860492348671, 0.022427968680858612, -0.029776830226182938, -0.07360642403364182, 0.13104131817817688, 0.029177764430642128, 0.019099419936537743, -0.03228067234158516, 0.20109383761882782, -0.07107947021722794, -0.06925153732299805, -0.14109766483306885, 0.10889512300491333, -0.03372858464717865, 0.06323269009590149, 0.058447178453207016, -0.1133023053407669, -0.002398417331278324, 0.1314154714345932, 0.133079394698143, -0.033533163368701935, 0.005780258681625128, 0.03008044883608818, 0.00756559893488884, -0.0482633113861084, 0.045497048646211624, 0.031092669814825058, 0.15440985560417175, -0.06949599832296371, 0.07780899107456207, 0.00008295764564536512, -0.08774317800998688, -0.036128852516412735, 0.1405542492866516, 0.006535779219120741, 0.03079606406390667, -0.06559351831674576, 0.10371401906013489, -0.07252706587314606, -0.23936228454113007, 0.045033879578113556, -0.07753164321184158, -0.15683837234973907, -0.013978141359984875, 0.02726292423903942, -0.009009851142764091, 0.02702206000685692, 0.0654432401061058, -0.06469112634658813, 0.161378413438797, 0.03472336754202843, -0.08781957626342773, -0.05673113837838173, 0.07957270741462708, -0.09192227572202682, 0.2958409786224365, 0.013188840821385384, 0.029593972489237785, 0.10327941924333572, -0.019989576190710068, -0.13285429775714874, 0.030561091378331184, 0.10066051781177521, -0.09982595592737198, 0.06684590131044388, 0.18159176409244537, -0.009470577351748943, 0.10021016746759415, 0.07437440752983093, -0.061603669077157974, 0.05807222053408623, -0.0826035663485527, -0.06770919263362885, -0.09389114379882812, 0.05970105528831482, -0.06468918174505234, 0.14543601870536804, 0.1228262409567833, -0.04243761673569679, -0.004415105562657118, -0.02816380001604557, 0.043726447969675064, 0.012194468639791012, 0.12871193885803223, 0.008576037362217903, -0.1618158370256424, 0.026840461418032646, 0.0030557403806596994, 0.10387714207172394, -0.21997274458408356, -0.08367477357387543, 0.04838619381189346, -0.029553698375821114, -0.05334814265370369, 0.10579082369804382, 0.06295353919267654, 0.0504634715616703, -0.04548325017094612, -0.05543007701635361, -0.008723298087716103, 0.14979462325572968, -0.1187625601887703, -0.006005466915667057 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # my_asr_mind_model This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the minds14 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 1000 - mixed_precision_training: Native AMP ### Framework versions - Transformers 4.37.2 - Pytorch 2.2.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.2
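The card above only lists raw hyperparameter values. As a minimal, hedged sketch (not part of the original card), the snippet below shows how those values would map onto `transformers.TrainingArguments`; the Trainer wiring is left as comments because the card does not show the preprocessing, processor, or data collator, so those names are assumptions.

```python
# Minimal sketch: the hyperparameters from the card expressed as TrainingArguments.
# Only the argument values shown in the card are taken from it; everything in the
# commented Trainer block is an assumption for illustration.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_asr_mind_model",
    learning_rate=1e-5,                  # learning_rate: 1e-05
    per_device_train_batch_size=8,       # train_batch_size: 8
    per_device_eval_batch_size=8,        # eval_batch_size: 8
    gradient_accumulation_steps=2,       # gives total_train_batch_size: 16
    warmup_steps=500,                    # lr_scheduler_warmup_steps: 500
    max_steps=1000,                      # training_steps: 1000
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                           # mixed_precision_training: Native AMP (needs a CUDA device)
)
# Adam betas (0.9, 0.999) and epsilon 1e-08 match the TrainingArguments defaults.

# trainer = Trainer(
#     model=model,                       # a Wav2Vec2 CTC checkpoint (assumed)
#     args=training_args,
#     train_dataset=train_split,         # preprocessed MInDS-14 splits (assumed)
#     eval_dataset=eval_split,
#     tokenizer=processor,               # Wav2Vec2 processor (assumed)
#     data_collator=data_collator,       # CTC padding collator (assumed)
# )
# trainer.train()
```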
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["minds14"], "base_model": "facebook/wav2vec2-base", "model-index": [{"name": "my_asr_mind_model", "results": []}]}
automatic-speech-recognition
alekoe/my_asr_mind_model
[ "transformers", "safetensors", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "dataset:minds14", "base_model:facebook/wav2vec2-base", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2024-02-13T20:21:01+00:00
[]
[]
TAGS #transformers #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-minds14 #base_model-facebook/wav2vec2-base #license-apache-2.0 #endpoints_compatible #region-us
# my_asr_mind_model This model is a fine-tuned version of facebook/wav2vec2-base on the minds14 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 1000 - mixed_precision_training: Native AMP ### Framework versions - Transformers 4.37.2 - Pytorch 2.2.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.2
[ "# my_asr_mind_model\n\nThis model is a fine-tuned version of facebook/wav2vec2-base on the minds14 dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- training_steps: 1000\n- mixed_precision_training: Native AMP", "### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-minds14 #base_model-facebook/wav2vec2-base #license-apache-2.0 #endpoints_compatible #region-us \n", "# my_asr_mind_model\n\nThis model is a fine-tuned version of facebook/wav2vec2-base on the minds14 dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- training_steps: 1000\n- mixed_precision_training: Native AMP", "### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
[ 73, 35, 6, 12, 8, 3, 140, 33 ]
[ "passage: TAGS\n#transformers #safetensors #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-minds14 #base_model-facebook/wav2vec2-base #license-apache-2.0 #endpoints_compatible #region-us \n# my_asr_mind_model\n\nThis model is a fine-tuned version of facebook/wav2vec2-base on the minds14 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- training_steps: 1000\n- mixed_precision_training: Native AMP### Framework versions\n\n- Transformers 4.37.2\n- Pytorch 2.2.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
[ -0.10153123736381531, 0.11222664266824722, -0.002501080045476556, 0.05009397491812706, 0.13725221157073975, 0.017530135810375214, 0.09575158357620239, 0.10975238680839539, -0.05604841187596321, 0.05028744414448738, 0.0665304884314537, 0.004536345135420561, 0.06914141774177551, 0.14030911028385162, -0.026962684467434883, -0.21434994041919708, 0.029891271144151688, -0.014356092549860477, -0.08495652675628662, 0.10051532089710236, 0.11465497314929962, -0.08265680074691772, 0.045189954340457916, 0.02353067323565483, -0.12136883288621902, 0.022036602720618248, -0.01905324123799801, -0.05850406736135483, 0.10991205275058746, 0.026405632495880127, 0.06953521817922592, 0.024338949471712112, 0.10983589291572571, -0.23697234690189362, 0.006634528283029795, 0.08489014953374863, 0.03628633916378021, 0.07016348093748093, 0.06978633999824524, 0.018093982711434364, 0.05953545495867729, -0.13260583579540253, 0.10857577621936798, 0.037685610353946686, -0.09448028355836868, -0.23448413610458374, -0.09340716153383255, 0.11646967381238937, 0.12991993129253387, 0.10220525413751602, -0.01090648677200079, 0.10150425136089325, -0.06404095143079758, 0.05842401832342148, 0.14106978476047516, -0.2523714005947113, -0.06873677670955658, 0.004875440150499344, 0.04581647366285324, 0.04088225215673447, -0.08414885401725769, -0.011721048504114151, 0.031923871487379074, 0.03559071570634842, 0.12557432055473328, -0.005583061370998621, -0.05453794077038765, -0.02362874709069729, -0.12184495478868484, -0.04381207749247551, 0.1291845589876175, 0.08614357560873032, -0.04662878438830376, -0.13157057762145996, -0.033010225743055344, -0.0597367063164711, -0.023129111155867577, -0.047198764979839325, 0.015266564674675465, -0.020765576511621475, -0.028937801718711853, -0.06983261555433273, -0.08233889192342758, -0.05036395043134689, 0.03765666484832764, 0.11346369236707687, 0.009526427835226059, 0.00012625085946638137, -0.025585345923900604, 0.10599157214164734, -0.02088789828121662, -0.1300746649503708, 0.011131125502288342, -0.0032915109768509865, -0.1215873658657074, -0.06267137080430984, -0.03088255412876606, -0.041496392339468, -0.01095854863524437, 0.11741919815540314, -0.005816782359033823, 0.0763767808675766, -0.0059519498609006405, 0.008479354903101921, -0.03681456297636032, 0.14011268317699432, -0.05820007622241974, -0.057326771318912506, -0.0056847818195819855, 0.1476026475429535, -0.000873618118930608, -0.010362539440393448, -0.07146476209163666, -0.03199029713869095, 0.09191660583019257, 0.05111587420105934, -0.03212280571460724, 0.016750294715166092, -0.057214777916669846, -0.03437038138508797, 0.049210742115974426, -0.12318998575210571, 0.04053730145096779, 0.004605062305927277, -0.05513152480125427, -0.03858540207147598, 0.014930102974176407, 0.03743050992488861, -0.012199169024825096, 0.06404399126768112, -0.057060059159994125, -0.027729632332921028, -0.07649188488721848, -0.04453234001994133, 0.01908162049949169, -0.004988935776054859, 0.003502665087580681, -0.0793880969285965, -0.19943703711032867, -0.04854293167591095, 0.06253253668546677, -0.050378866493701935, -0.07327874004840851, -0.007663127966225147, -0.04292025789618492, 0.02061040885746479, -0.03288274258375168, 0.18985719978809357, -0.04587378352880478, 0.05728231370449066, -0.0036328970454633236, 0.017376096919178963, 0.039528537541627884, 0.043411485850811005, -0.061802104115486145, 0.05423201248049736, -0.1233832910656929, 0.0854322612285614, -0.07798567414283752, 0.024119310081005096, -0.13436587154865265, -0.07793088257312775, 
-0.003913477063179016, -0.02869071252644062, 0.07130568474531174, 0.10623799264431, -0.18364755809307098, -0.043853312730789185, 0.1285434365272522, -0.08256014436483383, -0.0925658643245697, 0.1137641966342926, -0.03442755714058876, 0.055842865258455276, 0.055491216480731964, 0.13472764194011688, 0.11695312708616257, -0.11368466913700104, 0.01491705235093832, -0.017801037058234215, 0.06822159886360168, 0.039654724299907684, 0.07122867554426193, -0.007050159387290478, 0.003990150056779385, 0.0032160519622266293, -0.06476569175720215, 0.04132162779569626, -0.05767862871289253, -0.08506307005882263, -0.042464081197977066, -0.10199049115180969, 0.040715985000133514, 0.02042851597070694, 0.003259865567088127, -0.06750302016735077, -0.11310534179210663, 0.07286692410707474, 0.12310100346803665, -0.05485975369811058, -0.0013522705994546413, -0.07506784051656723, 0.0025352349039167166, -0.01852436363697052, -0.014276514761149883, -0.18647781014442444, -0.07215820997953415, 0.026170510798692703, -0.09104368835687637, 0.01497235894203186, 0.018650829792022705, 0.07519552856683731, 0.04046877473592758, -0.07113919407129288, -0.04710009694099426, -0.10951080918312073, 0.023111576214432716, -0.08054197579622269, -0.17977629601955414, -0.0754292905330658, -0.04397432878613472, 0.24353735148906708, -0.23326627910137177, 0.015250409953296185, 0.0006303659756667912, 0.15372776985168457, 0.023640088737010956, -0.049099262803792953, 0.00043686130084097385, 0.031293563544750214, 0.0333339124917984, -0.09581780433654785, 0.039985474199056625, 0.016380675137043, -0.1090707927942276, -0.027020949870347977, -0.1230449303984642, 0.054549381136894226, 0.07298404723405838, 0.10940168797969818, -0.08456403762102127, -0.05046819895505905, -0.056394897401332855, -0.054612159729003906, -0.06467515975236893, -0.019585056230425835, 0.23594503104686737, 0.039171211421489716, 0.11504001915454865, -0.05712559074163437, -0.050685424357652664, 0.012669730931520462, 0.014235151000320911, -0.020333297550678253, 0.058115698397159576, 0.012877390719950199, -0.1640966236591339, 0.07553445547819138, 0.12844644486904144, -0.03821350634098053, 0.12183625251054764, -0.04818340390920639, -0.07789228856563568, -0.022718753665685654, 0.0031619910150766373, -0.00873805582523346, 0.11540582031011581, -0.1040535718202591, 0.018694337457418442, 0.03466663509607315, 0.0027024790178984404, 0.029762201011180878, -0.16394396126270294, -0.001434710226021707, 0.03960849717259407, -0.030888840556144714, -0.004311616066843271, -0.022262563928961754, 0.034167271107435226, 0.06766228377819061, 0.028112510219216347, -0.036762695759534836, 0.035348281264305115, -0.02335350587964058, -0.09156794100999832, 0.16025181114673615, -0.10502076148986816, -0.1886812150478363, -0.11083764582872391, 0.06153193488717079, -0.0380965918302536, -0.034177668392658234, 0.0041993954218924046, -0.08581259846687317, -0.055541761219501495, -0.08630839735269547, -0.029130497947335243, -0.009992030449211597, -0.017472388222813606, 0.0800408124923706, 0.014736848883330822, 0.10909684002399445, -0.10297653079032898, 0.005875144619494677, 0.013309501111507416, -0.04307492822408676, -0.014369433745741844, 0.06087568774819374, 0.053791385143995285, 0.1272977888584137, -0.002600166480988264, 0.016383159905672073, -0.029117463156580925, 0.24124103784561157, -0.10359331965446472, 0.014119276776909828, 0.16625307500362396, -0.00748947449028492, 0.040966231375932693, 0.13975387811660767, 0.01765589416027069, -0.08682011067867279, 0.03574672341346741, 0.046007074415683746, 
-0.001272448804229498, -0.21938104927539825, -0.04046985134482384, -0.030780203640460968, -0.10654041171073914, 0.12703511118888855, 0.03557126224040985, 0.0027063044253736734, 0.041546761989593506, -0.022491522133350372, 0.006445479579269886, 0.028430212289094925, 0.072175994515419, 0.08600815385580063, 0.033019352704286575, 0.09753689914941788, -0.006042217370122671, -0.015484082512557507, 0.05179847031831741, 0.026101619005203247, 0.20645597577095032, 0.004277194384485483, 0.0719689205288887, 0.0215410515666008, 0.13158881664276123, -0.01620316505432129, 0.04472941905260086, 0.015611371025443077, -0.0205058716237545, -0.004721882287412882, -0.05090019851922989, -0.04104524478316307, 0.045905858278274536, 0.018214568495750427, 0.03205689415335655, -0.08933545649051666, 0.0017648135544732213, 0.004568613599985838, 0.3302087187767029, 0.06041587144136429, -0.274534672498703, -0.08397726714611053, 0.022699467837810516, -0.06497632712125778, -0.08488459885120392, -0.006995182950049639, 0.12512393295764923, -0.13687212765216827, 0.07542815804481506, -0.059498775750398636, 0.08733445405960083, -0.020827684551477432, 0.009881041012704372, 0.04946853592991829, 0.07816408574581146, 0.010705836117267609, 0.07907108217477798, -0.19119635224342346, 0.22127008438110352, 0.011339209042489529, 0.11212935298681259, -0.06007710099220276, 0.04114939272403717, 0.02325967140495777, 0.06847849488258362, 0.07920171320438385, -0.004493060056120157, -0.059989556670188904, -0.14437136054039001, -0.07458723336458206, 0.04037114232778549, 0.07234660536050797, -0.04151734337210655, 0.07133806496858597, -0.04424293339252472, -0.00023464679543394595, 0.04784953594207764, -0.0010970589937642217, -0.19322186708450317, -0.13943180441856384, 0.019485294818878174, 0.04105411469936371, 0.06389840692281723, -0.1185145303606987, -0.1040235385298729, -0.035065989941358566, 0.1843218356370926, 0.029167059808969498, -0.02939806692302227, -0.1454876959323883, 0.05948520079255104, 0.13337785005569458, -0.03711078688502312, 0.02523079328238964, 0.029217086732387543, 0.1679820716381073, -0.002192473504692316, -0.06028201803565025, 0.060379114001989365, -0.08693566918373108, -0.1824951469898224, -0.04435192048549652, 0.15470774471759796, 0.0527031384408474, 0.050947848707437515, 0.02604834921658039, 0.020323827862739563, 0.018822146579623222, -0.06988554447889328, 0.045375432819128036, 0.06351103633642197, 0.046416815370321274, 0.035242918878793716, -0.026559168472886086, -0.03308308124542236, -0.05187797546386719, -0.037769660353660583, 0.11617088317871094, 0.22642198204994202, -0.06907612085342407, 0.06481075286865234, 0.11190342158079147, -0.05849149078130722, -0.13542021811008453, 0.04659857228398323, 0.1310320496559143, 0.029948774725198746, 0.07155925780534744, -0.17478437721729279, 0.07251101732254028, 0.10961537063121796, -0.03514678403735161, 0.0007362976903095841, -0.2797590494155884, -0.1336805671453476, 0.10128803551197052, 0.08755016326904297, -0.0030716245528310537, -0.10927382856607437, -0.033725980669260025, -0.06169437617063522, -0.13020674884319305, 0.10149601846933365, -0.12732148170471191, 0.08739522844552994, 0.015818197280168533, 0.08111102879047394, 0.02798759564757347, -0.03933155536651611, 0.1734718382358551, -0.0028198184445500374, 0.07363441586494446, -0.02566094882786274, 0.06836962699890137, 0.07396600395441055, -0.07273256033658981, 0.040027420967817307, -0.040616221725940704, 0.05233483389019966, -0.13035304844379425, -0.03354498744010925, -0.09627416729927063, 0.051207318902015686, 
-0.038147345185279846, -0.055010125041007996, -0.012237487360835075, 0.07567555457353592, 0.05924030393362045, -0.03560132905840874, 0.019557742401957512, -0.01762809418141842, 0.11295408010482788, 0.15385153889656067, 0.12568894028663635, -0.0321807786822319, -0.09905390441417694, -0.004112372640520334, -0.02889730967581272, 0.06902231276035309, -0.08808786422014236, 0.04559462517499924, 0.10060141235589981, 0.021721752360463142, 0.1720108836889267, 0.009531444869935513, -0.08286429941654205, -0.015375405550003052, 0.03284447640180588, -0.09228025376796722, -0.1823308765888214, -0.02422717772424221, 0.062177278101444244, -0.1540842205286026, -0.03661045432090759, 0.12817396223545074, -0.0578535832464695, -0.021841147914528847, -0.010772687382996082, 0.02232644148170948, -0.04361503943800926, 0.17142823338508606, 0.01735515147447586, 0.08549663424491882, -0.07083717733621597, 0.1011447161436081, 0.07754819840192795, -0.14587640762329102, 0.05679056793451309, 0.03795309364795685, -0.08129210025072098, -0.029743749648332596, 0.053725019097328186, 0.11254316568374634, 0.03541523963212967, -0.06098613142967224, -0.057288508862257004, -0.13379408419132233, 0.040172919631004333, 0.07259664684534073, 0.021393099799752235, -0.017705611884593964, -0.025381583720445633, 0.0011647441424429417, -0.10705062001943588, 0.08302953839302063, 0.05572326108813286, 0.05261056497693062, -0.14195145666599274, 0.0926101878285408, 0.011743408627808094, 0.013280624523758888, -0.008004991337656975, 0.008555696345865726, -0.06742741912603378, -0.002692430978640914, -0.16660656034946442, -0.036446355283260345, -0.02946939878165722, 0.006108570843935013, -0.01116505078971386, -0.044258471578359604, -0.033771365880966187, 0.03560604155063629, -0.07487393915653229, -0.05631906911730766, 0.008071254007518291, 0.060898829251527786, -0.12037325650453568, -0.026913661509752274, 0.037837687879800797, -0.10389044135808945, 0.0797487422823906, 0.038201767951250076, 0.02368483692407608, 0.023937147110700607, -0.09357212483882904, -0.02376321516931057, 0.032469842582941055, 0.026677316054701805, 0.051021188497543335, -0.16374604403972626, -0.016688771545886993, -0.01775520294904709, 0.018780292943120003, 0.019970159977674484, 0.06358854472637177, -0.10017484426498413, -0.0464990958571434, -0.011928760446608067, -0.05068054795265198, -0.05539116635918617, 0.030391128733754158, 0.10334468632936478, 0.04967918619513512, 0.17270566523075104, -0.07863882184028625, 0.041298408061265945, -0.2079974263906479, -0.025471195578575134, -0.02992328815162182, -0.010635930113494396, -0.025714704766869545, -0.026865581050515175, 0.07744719088077545, -0.05122878775000572, 0.10260071605443954, -0.037027571350336075, 0.10337099432945251, 0.04673849046230316, -0.10744960606098175, -0.015209484845399857, 0.02958700619637966, 0.16038204729557037, 0.06714365631341934, -0.005376941990107298, 0.09684685617685318, -0.028655990958213806, 0.037827927619218826, 0.1054697260260582, 0.14753395318984985, 0.159526988863945, 0.014621194452047348, 0.05872397497296333, 0.06926853954792023, -0.11489803344011307, -0.1203533262014389, 0.09262308478355408, -0.04317164421081543, 0.12161363661289215, -0.06592338532209396, 0.180277481675148, 0.10753346979618073, -0.18379269540309906, 0.06553950160741806, -0.07402745634317398, -0.0916186198592186, -0.07733095437288284, -0.08076336234807968, -0.06762821972370148, -0.12020258605480194, 0.012727644294500351, -0.09643128514289856, 0.02035077102482319, 0.06058970466256142, 0.0023115556687116623, 0.005791886709630489, 
0.12840552628040314, -0.0114994365721941, -0.01067266147583723, 0.09278698265552521, 0.009181794710457325, -0.002767933765426278, -0.07353461533784866, -0.060162484645843506, 0.05542623624205589, 0.020865874364972115, 0.06114686280488968, -0.04082398861646652, -0.0418299175798893, 0.028069818392395973, 0.026325775310397148, -0.08763018250465393, 0.031200595200061798, -0.0002182345779146999, 0.0450550802052021, 0.04661475867033005, 0.04457539692521095, 0.012101695872843266, -0.0394538976252079, 0.264023095369339, -0.08024627715349197, -0.09741180390119553, -0.15305928885936737, 0.18872277438640594, -0.01383020170032978, -0.010966438800096512, 0.061295878142118454, -0.09026878327131271, -0.006733654998242855, 0.12941224873065948, 0.12455592304468155, -0.0724106952548027, -0.0016949119744822383, -0.030564606189727783, -0.02151944860816002, -0.07225373387336731, 0.1161033883690834, 0.11108437180519104, -0.021399838849902153, -0.06145058572292328, 0.04946978762745857, -0.01222253404557705, -0.06483494490385056, -0.04999161511659622, 0.09290066361427307, -0.001775998855009675, 0.025669988244771957, -0.02614617720246315, 0.10410180687904358, 0.03923407569527626, -0.19150540232658386, 0.0037552574649453163, -0.13752864301204681, -0.18926851451396942, -0.030459636822342873, 0.049177948385477066, -0.0077789765782654285, 0.0634671151638031, -0.0016964029055088758, -0.0028017705772072077, 0.12995055317878723, -0.01898675411939621, 0.0105054322630167, -0.1204783245921135, 0.11386164277791977, -0.07318900525569916, 0.21062932908535004, -0.009771021082997322, 0.05616726353764534, 0.10146063566207886, 0.01572267711162567, -0.14717641472816467, 0.020370079204440117, 0.07209694385528564, -0.052392203360795975, 0.04144074767827988, 0.17617693543434143, -0.035645145922899246, 0.1064617782831192, 0.04251779243350029, -0.15239952504634857, -0.013348965905606747, -0.09042961150407791, -0.014819816686213017, -0.06763529032468796, 0.018791118636727333, -0.05000520125031471, 0.16284282505512238, 0.18941082060337067, -0.058727655559778214, -0.00028427279903553426, -0.07502839714288712, 0.021802373230457306, 0.05285383760929108, 0.12831270694732666, -0.0329645611345768, -0.20529350638389587, 0.02353506162762642, 0.021526096388697624, 0.02090175822377205, -0.2374459207057953, -0.10538426041603088, 0.03349979221820831, -0.06293725222349167, -0.024610891938209534, 0.11479642987251282, 0.0516430102288723, 0.025745144113898277, -0.032939933240413666, -0.14114202558994293, -0.045500192791223526, 0.14259016513824463, -0.14907920360565186, -0.029607700183987617 ]
null
null
transformers
# Samantha 1.1 MBX 7B ![image/webp](https://cdn-uploads.huggingface.co/production/uploads/6455cc8d679315e4ef16fbec/bq3IRI6barhEzd1E9LbEe.webp) **Prompt Template:** ``` <|im_start|>system You are a caring and empathetic sentient AI companion named Samantha.<|im_end|> <|im_start|>user Hello, what is your name?<|im_end|> <|im_start|>assistant ``` ### Merge Method This model was merged using the SLERP merge method. ### Models Merged The following models were included in the merge: * [macadeliccc/MBX-7B-v3-DPO](https://huggingface.co/macadeliccc/MBX-7B-v3-DPO) * [cognitivecomputations/samantha-1.1-westlake-7b](https://huggingface.co/cognitivecomputations/samantha-1.1-westlake-7b) ### Configuration The following YAML configuration was used to produce this model: ```yaml slices: - sources: - model: cognitivecomputations/samantha-1.1-westlake-7b layer_range: [0, 32] - model: macadeliccc/MBX-7B-v3-DPO layer_range: [0, 32] merge_method: slerp base_model: macadeliccc/MBX-7B-v3-DPO parameters: t: - filter: self_attn value: [0, 0.5, 0.3, 0.7, 1] - filter: mlp value: [1, 0.5, 0.7, 0.3, 0] - value: 0.5 dtype: bfloat16 ``` ## GGUF TODO ## Code Example ```python from transformers import AutoModelForCausalLM, AutoTokenizer tokenizer = AutoTokenizer.from_pretrained("macadeliccc/samantha-1.1-MBX-7B") model = AutoModelForCausalLM.from_pretrained("macadeliccc/samantha-1.1-MBX-7B") messages = [ {"role": "system", "content": "You are a caring and empathetic sentient AI companion named Samantha."}, {"role": "user", "content": "Hello, what is your name?"} ] gen_input = tokenizer.apply_chat_template(messages, return_tensors="pt") ```
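The card's code example stops after building `gen_input`. The sketch below is a hedged continuation showing one way to run generation and decode the reply with the standard `generate` API; the `max_new_tokens` value, the bfloat16 load, and `device_map="auto"` (which requires `accelerate`) are illustrative choices, not settings taken from the card.

```python
# Hedged continuation of the card's example: generate and decode a reply.
# Decoding settings below are illustrative assumptions, not from the original card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("macadeliccc/samantha-1.1-MBX-7B")
model = AutoModelForCausalLM.from_pretrained(
    "macadeliccc/samantha-1.1-MBX-7B",
    torch_dtype=torch.bfloat16,   # the merge was produced in bfloat16
    device_map="auto",            # assumes accelerate is installed
)

messages = [
    {"role": "system", "content": "You are a caring and empathetic sentient AI companion named Samantha."},
    {"role": "user", "content": "Hello, what is your name?"},
]
gen_input = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output_ids = model.generate(gen_input, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
reply = tokenizer.decode(output_ids[0, gen_input.shape[-1]:], skip_special_tokens=True)
print(reply)
```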
{"license": "apache-2.0", "tags": ["mergekit", "merge"], "base_model": ["macadeliccc/MBX-7B-v3-DPO", "cognitivecomputations/samantha-1.1-westlake-7b"]}
text-generation
macadeliccc/samantha-1.1-MBX-7B
[ "transformers", "safetensors", "mistral", "text-generation", "mergekit", "merge", "base_model:macadeliccc/MBX-7B-v3-DPO", "base_model:cognitivecomputations/samantha-1.1-westlake-7b", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T20:24:52+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-macadeliccc/MBX-7B-v3-DPO #base_model-cognitivecomputations/samantha-1.1-westlake-7b #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Samantha 1.1 MBX 7B !image/webp Prompt Template: ### Merge Method This model was merged using the SLERP merge method. ### Models Merged The following models were included in the merge: * macadeliccc/MBX-7B-v3-DPO * cognitivecomputations/samantha-1.1-westlake-7b ### Configuration The following YAML configuration was used to produce this model: ## GGUF TODO ## Code Example
[ "# Samantha 1.1 MBX 7B\n\n!image/webp\n\nPrompt Template:", "### Merge Method\n\nThis model was merged using the SLERP merge method.", "### Models Merged\n\nThe following models were included in the merge:\n* macadeliccc/MBX-7B-v3-DPO\n* cognitivecomputations/samantha-1.1-westlake-7b", "### Configuration\n\nThe following YAML configuration was used to produce this model:", "## GGUF \n\nTODO", "## Code Example" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-macadeliccc/MBX-7B-v3-DPO #base_model-cognitivecomputations/samantha-1.1-westlake-7b #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Samantha 1.1 MBX 7B\n\n!image/webp\n\nPrompt Template:", "### Merge Method\n\nThis model was merged using the SLERP merge method.", "### Models Merged\n\nThe following models were included in the merge:\n* macadeliccc/MBX-7B-v3-DPO\n* cognitivecomputations/samantha-1.1-westlake-7b", "### Configuration\n\nThe following YAML configuration was used to produce this model:", "## GGUF \n\nTODO", "## Code Example" ]
[ 102, 17, 18, 46, 17, 6, 4 ]
[ "passage: TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #base_model-macadeliccc/MBX-7B-v3-DPO #base_model-cognitivecomputations/samantha-1.1-westlake-7b #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Samantha 1.1 MBX 7B\n\n!image/webp\n\nPrompt Template:### Merge Method\n\nThis model was merged using the SLERP merge method.### Models Merged\n\nThe following models were included in the merge:\n* macadeliccc/MBX-7B-v3-DPO\n* cognitivecomputations/samantha-1.1-westlake-7b### Configuration\n\nThe following YAML configuration was used to produce this model:## GGUF \n\nTODO## Code Example" ]
[ -0.12314892560243607, 0.016301700845360756, 0.0011025998974218965, 0.025387154892086983, 0.12642540037631989, 0.051118213683366776, 0.1694164127111435, 0.039125602692365646, 0.08453496545553207, 0.016681548207998276, 0.051030173897743225, 0.11387638002634048, 0.05447908118367195, 0.13575734198093414, -0.04410145804286003, -0.19751523435115814, 0.08827622979879379, -0.0065778084099292755, -0.132705956697464, 0.08400300145149231, 0.08743149787187576, -0.061406541615724564, 0.1247825101017952, 0.020125897601246834, -0.12510240077972412, -0.0063001131638884544, -0.029378259554505348, 0.007455647923052311, 0.09996666014194489, 0.10520423203706741, 0.018235618248581886, 0.060305237770080566, 0.02039526402950287, -0.13525614142417908, 0.036731183528900146, -0.01959589309990406, 0.013663532212376595, 0.05431495979428291, 0.01932177133858204, 0.0071408418007195, 0.07765784114599228, 0.0038235653191804886, 0.029127562418580055, 0.06635479629039764, -0.0940747857093811, -0.08403662592172623, -0.07631821930408478, 0.11146420240402222, 0.13684070110321045, 0.011977951042354107, -0.008216362446546555, 0.06751847267150879, 0.02426084317266941, 0.08242015540599823, 0.03750709444284439, -0.233401358127594, 0.0009076974238269031, 0.18695251643657684, 0.0427534356713295, -0.0885692909359932, 0.05989446863532066, 0.027104131877422333, 0.03607715293765068, -0.02217157743871212, 0.0035923360846936703, -0.07432539761066437, 0.1205960288643837, -0.05598718300461769, -0.13951462507247925, -0.012194598093628883, 0.1950487643480301, 0.04505898803472519, -0.02442324534058571, -0.10598980635404587, -0.12401874363422394, 0.07400680333375931, -0.04560277611017227, -0.024540090933442116, 0.018561474978923798, 0.016499018296599388, 0.07349447160959244, -0.01598490960896015, -0.03176522254943848, -0.03563777357339859, -0.08387880027294159, 0.17548812925815582, 0.023961257189512253, 0.04647567868232727, -0.07652167230844498, 0.09688709676265717, -0.215931236743927, -0.11084969341754913, -0.013812441378831863, -0.07575484365224838, -0.02576562762260437, -0.0070944190956652164, -0.07958278059959412, -0.11387790739536285, 0.07084602117538452, 0.1724124550819397, 0.03839157521724701, 0.030065108090639114, 0.10976513475179672, 0.04370441660284996, 0.02068796008825302, 0.08518275618553162, -0.20905934274196625, -0.11948945373296738, 0.06957339495420456, 0.053831301629543304, 0.10483915358781815, 0.012074941769242287, -0.12239894270896912, 0.022696293890476227, -0.025698594748973846, -0.015576565638184547, 0.007999159395694733, 0.12443293631076813, -0.0690571665763855, -0.06722977012395859, 0.1847495138645172, -0.05329716205596924, -0.011392375454306602, 0.0011263844789937139, -0.00651933066546917, 0.001421632245182991, 0.14253486692905426, 0.05202135443687439, 0.027140580117702484, 0.04667242616415024, -0.0740598514676094, -0.033016134053468704, -0.06224970892071724, -0.08009808510541916, 0.013115073554217815, -0.0085222152993083, 0.029467733576893806, -0.10653671622276306, -0.30806592106819153, -0.015607798472046852, 0.043765246868133545, -0.033682018518447876, 0.0016726991161704063, -0.02899525687098503, 0.012089163064956665, -0.03021392412483692, -0.0006021216395311058, -0.01271318644285202, -0.015207646414637566, -0.01614288054406643, 0.03943701460957527, 0.06975279748439789, -0.14900323748588562, 0.01994522474706173, -0.10347742587327957, 0.12435021996498108, -0.23622311651706696, 0.13674743473529816, 0.014527341350913048, 0.08475211262702942, -0.07144349813461304, -0.003585617756471038, -0.01902393065392971, 
0.022979112342000008, 0.07655921578407288, 0.20080965757369995, -0.16557186841964722, -0.07080186158418655, 0.1373576819896698, -0.14540430903434753, -0.1931062489748001, 0.07819715142250061, -0.01747872307896614, 0.135572150349617, 0.06772042065858841, 0.1900688111782074, 0.06736036390066147, 0.007160809822380543, 0.0016564278630539775, -0.051713962107896805, 0.023322073742747307, -0.0305442214012146, 0.06192435324192047, 0.0030771896708756685, -0.1429949849843979, 0.05945321172475815, -0.06649447232484818, 0.13616782426834106, -0.03624522686004639, -0.054380327463150024, -0.05426647141575813, -0.10182126611471176, 0.06638462096452713, 0.00024053377273958176, 0.01866859942674637, -0.06363942474126816, 0.00278597604483366, 0.07675009965896606, 0.10620588064193726, -0.10806042701005936, -0.02955259010195732, -0.04825609549880028, 0.08916239440441132, -0.14913992583751678, 0.07135836035013199, -0.12435106188058853, -0.0345364548265934, -0.024520287290215492, -0.017913179472088814, 0.024912580847740173, -0.013589221052825451, 0.07153826951980591, 0.06324364989995956, -0.028044313192367554, -0.05831288546323776, 0.10144903510808945, 0.05924702063202858, -0.026555461809039116, -0.15142939984798431, -0.07680981606245041, -0.05008969455957413, 0.31321981549263, -0.07739399373531342, 0.08724194020032883, -0.06718370318412781, 0.1649390012025833, -0.024597935378551483, -0.006011561956256628, 0.06472897529602051, 0.012444757856428623, -0.016278771683573723, -0.009122011251747608, 0.07389868050813675, 0.022968487814068794, -0.18556901812553406, 0.1412878930568695, -0.16401611268520355, 0.04877932742238045, 0.09587045758962631, 0.021635232493281364, -0.015291246585547924, -0.025393543764948845, -0.012765523977577686, -0.06008472666144371, 0.0737326517701149, -0.10270175337791443, 0.09534113109111786, 0.007212419528514147, 0.11872106790542603, -0.05190952867269516, -0.006795508321374655, 0.03126602619886398, -0.050303392112255096, -0.024986589327454567, 0.05666843429207802, 0.07308237254619598, -0.2056667059659958, 0.10693264752626419, 0.18084289133548737, 0.04997841268777847, 0.08178775757551193, -0.003908609505742788, 0.010292505845427513, -0.08259211480617523, -0.06081771105527878, -0.016192754730582237, 0.04237145930528641, -0.10980977863073349, 0.022268060594797134, 0.06153108924627304, -0.018015658482909203, 0.048371922224760056, -0.11196599155664444, 0.025266660377383232, 0.06089536100625992, 0.017751839011907578, 0.07199687510728836, 0.0841115415096283, -0.01678777113556862, 0.041544754058122635, -0.015256189741194248, -0.06529678404331207, 0.04183570295572281, 0.0013631588080897927, -0.137632355093956, 0.17998777329921722, -0.08204325288534164, -0.21256515383720398, -0.15879544615745544, -0.06312233954668045, -0.11817122995853424, 0.02073642611503601, 0.033535219728946686, 0.032206740230321884, -0.09841208159923553, -0.11427492648363113, 0.07562846690416336, 0.08772742003202438, -0.0051203519105911255, 0.05167866498231888, -0.04225827008485794, 0.040304362773895264, -0.08156394958496094, -0.026929471641778946, 0.004815564025193453, 0.03715664893388748, 0.03012189455330372, -0.05639555677771568, 0.11546754837036133, 0.17491334676742554, 0.016076548025012016, -0.003807770786806941, -0.0036743436940014362, 0.20774327218532562, -0.024104773998260498, 0.07308147102594376, 0.20844721794128418, -0.10854090750217438, 0.024280333891510963, 0.2315293252468109, 0.02226618118584156, -0.08071961253881454, 0.006421778816729784, -0.09527449309825897, -0.08374718576669693, -0.12351763248443604, 
-0.14037157595157623, -0.0737181007862091, 0.054371464997529984, 0.02567916549742222, 0.005080224946141243, 0.015569599345326424, 0.09215261787176132, -0.05399477481842041, -0.0019090301357209682, 0.04412371292710304, 0.058672528713941574, 0.20526587963104248, -0.023435724899172783, 0.09676647931337357, -0.06178181245923042, -0.03853970393538475, 0.03917039558291435, 0.04385368898510933, 0.0715809017419815, 0.0697476789355278, 0.10969933867454529, 0.07073896378278732, -0.02462940849363804, 0.028656069189310074, 0.10322190821170807, -0.015502973459661007, -0.00688546197488904, -0.05724511668086052, -0.09455204755067825, -0.05088542401790619, 0.08125632256269455, -0.10305599123239517, 0.0709330141544342, -0.09806039184331894, 0.024559209123253822, 0.0682680755853653, 0.14028193056583405, 0.10234921425580978, -0.30677956342697144, -0.14478714764118195, 0.07055893540382385, 0.03807184845209122, -0.002548281103372574, -0.0250875111669302, 0.015447122044861317, -0.06378244608640671, 0.14549775421619415, -0.021644171327352524, 0.08871456235647202, -0.021417204290628433, 0.011610077694058418, -0.02090255171060562, 0.0630432516336441, 0.03909927234053612, 0.03131697326898575, -0.11645317822694778, 0.1445137858390808, 0.03309517353773117, -0.029998118057847023, -0.026594197377562523, 0.03155358508229256, 0.03239842876791954, 0.20915241539478302, 0.029684578999876976, -0.0006020953878760338, 0.038165755569934845, -0.026994738727808, -0.09582590311765671, 0.016859035938978195, -0.03984411433339119, -0.07838749140501022, 0.07704097777605057, -0.03175792843103409, -0.05338234081864357, 0.002075971569865942, 0.09996151179075241, -0.05346911773085594, -0.1319219321012497, 0.05219922214746475, 0.06728723645210266, 0.051177311688661575, -0.0912940502166748, -0.045302219688892365, -0.16718259453773499, 0.19984251260757446, 0.013972054235637188, -0.08545900881290436, -0.09383577853441238, 0.018305068835616112, 0.0987943485379219, -0.05518363043665886, 0.06433763355016708, -0.028942974284291267, 0.05759665369987488, -0.06149696558713913, -0.17731836438179016, 0.11147862672805786, -0.12929733097553253, -0.1021808534860611, -0.036145295947790146, 0.12316717952489853, -0.05466477572917938, 0.015145174227654934, 0.0013778416905552149, 0.03516802564263344, -0.03998672962188721, -0.0637565329670906, -0.04869070276618004, 0.2591734230518341, 0.018129054456949234, 0.10351148992776871, -0.004963364917784929, -0.1193399652838707, 0.00871006865054369, 0.004395518451929092, 0.12277127802371979, 0.1874760389328003, -0.05614768713712692, 0.0292679313570261, 0.15126007795333862, -0.046704888343811035, -0.24523742496967316, -0.023715844377875328, -0.04119354858994484, 0.04543353244662285, 0.02304212562739849, 0.01288104709237814, 0.08292777836322784, 0.07740019261837006, -0.01824779622256756, 0.04808920994400978, -0.3113402724266052, -0.16196057200431824, 0.07145867496728897, 0.08304382115602493, 0.2153984159231186, -0.1256672739982605, -0.05582576245069504, -0.07415422052145004, -0.22625720500946045, 0.0231876689940691, -0.10345986485481262, 0.062055204063653946, 0.003906381782144308, 0.04172004014253616, 0.02629099227488041, -0.06882454454898834, 0.13903284072875977, -0.01639765314757824, 0.028458939865231514, -0.06618425250053406, 0.02683522365987301, 0.08408600091934204, -0.055953845381736755, 0.1360902488231659, -0.060907747596502304, 0.04600347951054573, 0.013246646150946617, -0.052340682595968246, -0.039580654352903366, 0.011631457135081291, -0.013652320019900799, -0.06032181531190872, -0.052979424595832825, 
0.02599661611020565, 0.0007516450714319944, 0.018815547227859497, 0.06823290884494781, -0.0435921847820282, 0.012878471985459328, 0.2007349282503128, 0.09234960377216339, -0.11775162816047668, -0.0049676126800477505, 0.006895100697875023, -0.06341145187616348, 0.04932935908436775, -0.15300405025482178, 0.013734818436205387, 0.09964868426322937, -0.011631051078438759, 0.14123837649822235, 0.02867269515991211, -0.03183698281645775, -0.026527712121605873, 0.06624466180801392, -0.18523095548152924, -0.2802336513996124, -0.03765340521931648, 0.03124433569610119, -0.05310007557272911, 0.1261119395494461, 0.1859615445137024, -0.07339314371347427, -0.014152181334793568, -0.005097191780805588, 0.027484390884637833, -0.08138028532266617, 0.10610504448413849, 0.005537188146263361, 0.038878899067640305, -0.11992697417736053, 0.055984482169151306, 0.0285536777228117, -0.11044900119304657, -0.02314750850200653, 0.02427411451935768, -0.13596484065055847, -0.07976508140563965, -0.1513182371854782, 0.21076853573322296, -0.07877194136381149, -0.07324302196502686, -0.14252324402332306, -0.1048794686794281, -0.0018633807776495814, 0.09548567235469818, 0.0766969695687294, 0.027068905532360077, -0.008184298872947693, -0.07352060079574585, -0.0614839643239975, 0.08451665937900543, 0.012225992046296597, 0.09659776836633682, -0.1319228708744049, 0.010273869149386883, -0.02433060109615326, 0.051486145704984665, -0.04374547675251961, 0.00015363261627499014, -0.10779543966054916, -0.019195416942238808, -0.13918833434581757, -0.03457595407962799, -0.17640730738639832, -0.026607224717736244, 0.01029666792601347, -0.019950736314058304, -0.016552340239286423, 0.04001052305102348, -0.02873566374182701, -0.019248854368925095, -0.020087119191884995, 0.04533066973090172, -0.04170393571257591, -0.04294360429048538, 0.00673443591222167, -0.06269092112779617, 0.07808266580104828, 0.010451547801494598, -0.03329058364033699, -0.03157121688127518, -0.06746020168066025, -0.0342847965657711, 0.08645442128181458, 0.009202176705002785, 0.02884586714208126, -0.12405111640691757, -0.0076826754957437515, 0.03778962790966034, -0.06494925171136856, -0.03430482745170593, 0.07633735239505768, -0.04725354164838791, 0.02106715366244316, 0.0044956267811357975, 0.0026686741039156914, -0.03887123614549637, -0.05831054970622063, 0.034599799662828445, 0.09023350477218628, 0.10521024465560913, -0.0517730750143528, 0.03867422044277191, -0.18097907304763794, -0.014997432939708233, -0.013670498505234718, -0.09414106607437134, -0.0712527260184288, -0.0927886962890625, 0.006838372442871332, -0.01863781549036503, 0.18864530324935913, -0.028590768575668335, -0.03447034955024719, 0.00786418654024601, 0.021774878725409508, 0.1251315027475357, 0.060078635811805725, 0.2045467346906662, 0.007174829486757517, 0.034089040011167526, -0.06930068880319595, 0.11451436579227448, 0.04486660659313202, 0.033514633774757385, 0.03021426685154438, 0.020638788118958473, -0.019410496577620506, 0.11524846404790878, 0.0763482004404068, 0.03880336880683899, -0.021179772913455963, -0.10074973851442337, -0.0249557513743639, 0.062395673245191574, -0.027440210804343224, 0.15852417051792145, 0.12005718797445297, -0.12946845591068268, 0.04080335050821304, -0.003942085895687342, -0.033399321138858795, -0.07545965164899826, -0.08779091387987137, -0.11242223531007767, -0.13664789497852325, -0.039468321949243546, -0.08689592033624649, -0.08814992755651474, 0.016002891585230827, -0.02594870887696743, -0.04898446053266525, 0.15892063081264496, -0.0030273892916738987, 0.00911942683160305, 
-0.003437325358390808, -0.00824320875108242, -0.02694609947502613, -0.0038676843978464603, -0.049976591020822525, 0.025778012350201607, 0.013498692773282528, -0.0061678229831159115, 0.0180479995906353, 0.018034854903817177, 0.03623870015144348, -0.004368706606328487, -0.09149763733148575, -0.027469856664538383, 0.06185212358832359, 0.056015852838754654, -0.005197176244109869, 0.030528133735060692, -0.028225665912032127, 0.000864816945977509, 0.007692744489759207, -0.046709395945072174, -0.06754148751497269, -0.12057273089885712, 0.19249919056892395, -0.05335231125354767, 0.04322708398103714, 0.04407268390059471, -0.09612971544265747, 0.014999558217823505, 0.17055749893188477, 0.30313342809677124, -0.058850131928920746, 0.0023482474498450756, -0.022464832291007042, 0.0028383927419781685, -0.0039429632015526295, 0.05333350971341133, 0.047950759530067444, 0.14282123744487762, -0.03509651869535446, 0.05336655676364899, -0.012384891510009766, -0.08906113356351852, -0.0172006543725729, 0.010903759859502316, -0.018548941239714622, -0.030147487297654152, 0.04638081416487694, 0.09207695722579956, -0.04405937343835831, -0.02769983559846878, 0.049282900989055634, -0.15230491757392883, -0.08402056992053986, -0.0792621299624443, 0.006898519583046436, -0.010706325061619282, 0.055156316608190536, -0.07445599138736725, 0.001823363360017538, 0.1600620299577713, -0.010838781483471394, -0.10518322885036469, -0.048990700393915176, 0.012115859426558018, -0.003997590392827988, 0.0198940671980381, -0.022519918158650398, 0.023981284350156784, 0.11721879988908768, 0.02378484047949314, -0.1402702033519745, 0.04929065704345703, 0.012216689065098763, 0.007434465456753969, 0.020955752581357956, 0.01653873920440674, -0.03270167484879494, 0.054500360041856766, 0.021452024579048157, -0.2232559770345688, 0.03078695759177208, 0.017836641520261765, -0.06989376246929169, -0.08847394585609436, 0.060492292046546936, -0.04951857775449753, 0.1384337991476059, 0.10181435197591782, -0.059432502835989, -0.003949172794818878, -0.0183668565005064, 0.0722396969795227, 0.0625341385602951, 0.07662408798933029, -0.012899214401841164, -0.18718592822551727, 0.024703525006771088, -0.014292535372078419, 0.041429344564676285, -0.27476298809051514, -0.08324241638183594, -0.11144960671663284, -0.03049188293516636, -0.043534230440855026, 0.10457940399646759, 0.16558197140693665, 0.018819373100996017, -0.032358549535274506, -0.13405700027942657, -0.029478279873728752, 0.10243178904056549, -0.09500917792320251, -0.1315767765045166 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
feature-extraction
furrutiav/bert_qa_extractor_cockatiel_2022_nllf_z_value_over_subsample_it_726
[ "transformers", "safetensors", "bert", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T20:25:50+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 39, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.052746038883924484, 0.20255789160728455, -0.0045078229159116745, 0.0248473659157753, 0.10497838258743286, 0.00675728265196085, 0.06521498411893845, 0.11486967653036118, -0.0023755673319101334, 0.12028469145298004, 0.027631845325231552, 0.08119397610425949, 0.12110675126314163, 0.15393014252185822, 0.005160121712833643, -0.24253977835178375, 0.05344875901937485, -0.09366832673549652, 0.004077504388988018, 0.11452110856771469, 0.1343945860862732, -0.10780399292707443, 0.08976872265338898, -0.00683097867295146, -0.01712046191096306, -0.015751034021377563, -0.07134060561656952, -0.06668227165937424, 0.05541034787893295, 0.07649129629135132, 0.0725555345416069, 0.010986946523189545, 0.07830587029457092, -0.2806258797645569, 0.014425364322960377, 0.08005264401435852, 0.0010765197221189737, 0.06795802712440491, 0.08151742070913315, -0.06789936870336533, 0.1251654475927353, -0.0605485662817955, 0.14059753715991974, 0.07639917731285095, -0.08928128331899643, -0.19590547680854797, -0.06669555604457855, 0.07481247186660767, 0.129872128367424, 0.05026249960064888, -0.02990107797086239, 0.1371748298406601, -0.09688840061426163, 0.00786701962351799, 0.12302009761333466, -0.07360870391130447, -0.05524582043290138, 0.031063849106431007, 0.10805318504571915, 0.09297362715005875, -0.11762315034866333, -0.008467874489724636, 0.029582185670733452, 0.022175652906298637, 0.08627551048994064, 0.015828849747776985, 0.1525639444589615, 0.041341137140989304, -0.14141254127025604, -0.0526716373860836, 0.09056255221366882, 0.03701045364141464, -0.050960201770067215, -0.23367193341255188, -0.026245610788464546, -0.012442239560186863, -0.03079850971698761, -0.04234880208969116, 0.053594592958688736, -0.03630254790186882, 0.07596245408058167, -0.007196845952421427, -0.07732249796390533, -0.031211229041218758, 0.05230424553155899, 0.06785056740045547, 0.018615471199154854, -0.006994647905230522, 0.019442738965153694, 0.11387838423252106, 0.07708574831485748, -0.13029205799102783, -0.07214002311229706, -0.0739525631070137, -0.09558356553316116, -0.04332297295331955, 0.03707554563879967, 0.07106684148311615, 0.04390906170010567, 0.20283061265945435, -0.017690327018499374, 0.046562306582927704, 0.0476159006357193, 0.005842953454703093, 0.07147589325904846, 0.10925443470478058, -0.06689215451478958, -0.14432233572006226, -0.06022803485393524, 0.08875485509634018, -0.009834992699325085, -0.03670760244131088, -0.049119677394628525, 0.04676154628396034, 0.03209913894534111, 0.11318106204271317, 0.08643888682126999, -0.003593706525862217, -0.0628826767206192, -0.042073074728250504, 0.22331053018569946, -0.14625342190265656, 0.043256524950265884, 0.007445589639246464, -0.0429743155837059, -0.0076383077539503574, 0.005870272871106863, 0.014089803211390972, -0.03238216042518616, 0.10351061820983887, -0.0778173878788948, -0.035906463861465454, -0.1116463914513588, -0.06868703663349152, 0.024910317733883858, 0.0025890374090522528, -0.018393149599432945, -0.04424213990569115, -0.11253650486469269, -0.051282741129398346, 0.0724339634180069, -0.07579848170280457, -0.05524555593729019, 0.009976830333471298, -0.04834962263703346, 0.0031978494953364134, 0.00010397454752819613, 0.11258035898208618, -0.03314845636487007, 0.025259260088205338, -0.04850656911730766, 0.06803499162197113, 0.10959596186876297, 0.038730688393116, -0.0804535374045372, 0.07286878675222397, -0.22788093984127045, 0.10223092138767242, -0.09346398711204529, 0.025767935439944267, -0.14578653872013092, -0.04199126362800598, 0.02854149229824543, 0.02887420728802681, 
-0.010361229069530964, 0.1268649846315384, -0.1982942521572113, -0.035082314163446426, 0.15190726518630981, -0.11336656659841537, -0.09347330778837204, 0.065653957426548, -0.05610617995262146, 0.11296144872903824, 0.04835578054189682, -0.019556574523448944, 0.06953749805688858, -0.1281629204750061, -0.04506009817123413, -0.021473335102200508, -0.008493004366755486, 0.14857245981693268, 0.06750676780939102, -0.05737153813242912, 0.07104712724685669, 0.02051553688943386, -0.037109848111867905, -0.03301886469125748, -0.03470754995942116, -0.09331934154033661, 0.009520708583295345, -0.07244295626878738, 0.03737799823284149, -0.02224314957857132, -0.08870045095682144, -0.030656753107905388, -0.17619828879833221, 0.043274905532598495, 0.08050142228603363, 0.008233942091464996, -0.021131468936800957, -0.09287237375974655, 0.02556683123111725, -0.009385489858686924, -0.021018607541918755, -0.1641797423362732, -0.044834475964307785, 0.04416196420788765, -0.1971662938594818, 0.023802341893315315, -0.03283040598034859, 0.05093098804354668, 0.03247829154133797, -0.04019762575626373, -0.005096070934087038, 0.0028117431793361902, 0.01809627003967762, -0.026984719559550285, -0.200385183095932, -0.031109308823943138, -0.029154371470212936, 0.1362139731645584, -0.22226740419864655, 0.028292208909988403, 0.07483648508787155, 0.13521188497543335, 0.0009690870065242052, -0.04426588490605354, 0.010693409480154514, -0.05366935580968857, -0.053671274334192276, -0.06512755900621414, -0.007102466654032469, -0.03287021815776825, -0.04422381520271301, 0.06460095942020416, -0.19425635039806366, -0.03641216829419136, 0.10608077049255371, 0.10164625942707062, -0.14719000458717346, -0.028969714418053627, -0.04096706584095955, -0.06081128865480423, -0.09094393998384476, -0.0630471333861351, 0.14371246099472046, 0.04861542955040932, 0.048413511365652084, -0.08624191582202911, -0.0630124881863594, 0.00895135197788477, 0.0006565740332007408, -0.03649118170142174, 0.08907787501811981, 0.08782777935266495, -0.10737399011850357, 0.08881597965955734, 0.08605224639177322, 0.06605713814496994, 0.10539878904819489, 0.001256609451957047, -0.10750970244407654, -0.029154706746339798, 0.005644100718200207, 0.01547710970044136, 0.14092515408992767, -0.044270921498537064, 0.04743899777531624, 0.05656488984823227, -0.027443327009677887, 0.01715722121298313, -0.10313762724399567, 0.02984124980866909, 0.046840768307447433, -0.010507673025131226, 0.012429861351847649, -0.03895113617181778, 0.025837475433945656, 0.08796556293964386, 0.03584056720137596, 0.027896199375391006, 0.0029043578542768955, -0.03437814116477966, -0.10392027348279953, 0.17429527640342712, -0.0878753736615181, -0.28357240557670593, -0.1356295943260193, -0.00747122336179018, 0.05167245492339134, -0.022715993225574493, 0.013256389647722244, -0.04903135821223259, -0.11467588692903519, -0.10348290205001831, 0.008818334899842739, 0.0437844917178154, -0.07700283080339432, -0.07256268709897995, 0.046553414314985275, 0.033613573759794235, -0.14174877107143402, 0.022300107404589653, 0.048012908548116684, -0.03855963796377182, -0.015413837507367134, 0.07170835882425308, 0.10258439928293228, 0.17387451231479645, -0.004228805657476187, -0.01945391111075878, 0.023280048742890358, 0.24459126591682434, -0.14296141266822815, 0.10647262632846832, 0.15432609617710114, -0.06630013138055801, 0.1025824174284935, 0.19176462292671204, 0.02610800787806511, -0.07571171224117279, 0.03370760753750801, 0.03715203329920769, -0.053104497492313385, -0.23274335265159607, -0.060641512274742126, 
0.0011178229469805956, -0.06850682199001312, 0.09104112535715103, 0.08915619552135468, 0.11183936148881912, 0.0454646460711956, -0.08415863662958145, -0.06847929954528809, 0.019614145159721375, 0.10642454773187637, -0.03275766968727112, 0.007264797575771809, 0.09054313600063324, -0.04184457287192345, -0.005177726969122887, 0.10835286974906921, 0.007426192983984947, 0.1962665617465973, 0.031048519536852837, 0.15333782136440277, 0.07211130857467651, 0.0342402458190918, 0.026680786162614822, 0.025636766105890274, 0.023090654984116554, 0.009547512046992779, -0.01598707027733326, -0.08795502036809921, 0.027014199644327164, 0.13500221073627472, 0.07871367782354355, 0.029795078560709953, 0.020392734557390213, -0.0429922379553318, 0.062152985483407974, 0.15964233875274658, 0.006258485373109579, -0.2136749029159546, -0.03950631618499756, 0.08867984265089035, -0.0793125256896019, -0.1237078458070755, -0.02518491819500923, 0.03823186457157135, -0.1809074580669403, 0.04127289727330208, -0.01795332506299019, 0.11453432589769363, -0.11700457334518433, -0.028958700597286224, 0.039744846522808075, 0.08327627927064896, -0.03253408893942833, 0.07922478020191193, -0.1647184044122696, 0.1165376752614975, 0.012328862212598324, 0.05802180990576744, -0.11617794632911682, 0.09878876805305481, 0.012594180181622505, -0.009003117680549622, 0.16720694303512573, -0.0008162438753060997, -0.07339610159397125, -0.06517832726240158, -0.07867198437452316, -0.022016214206814766, 0.09116258472204208, -0.11647430807352066, 0.08271238952875137, -0.012302344664931297, -0.03819865360856056, 0.002976413816213608, -0.1073245257139206, -0.12343364208936691, -0.191313698887825, 0.05862122401595116, -0.11746024340391159, 0.00024363139527849853, -0.10003595799207687, -0.05551697313785553, -0.04721582680940628, 0.19990667700767517, -0.14306047558784485, -0.09675363451242447, -0.1526252180337906, -0.09468596428632736, 0.1679719239473343, -0.04768168181180954, 0.08716544508934021, -0.00014324963558465242, 0.22273695468902588, 0.00589721417054534, -0.010143720544874668, 0.07824880629777908, -0.08608578145503998, -0.17828822135925293, -0.07740302383899689, 0.12055730819702148, 0.12802201509475708, 0.05279289186000824, -0.012038013897836208, 0.020934196189045906, -0.036648161709308624, -0.11678951978683472, 0.003050430677831173, 0.1217387318611145, 0.05949230119585991, 0.039503831416368484, -0.002558275358751416, -0.10200468450784683, -0.07551230490207672, -0.0352395698428154, 0.02261841483414173, 0.18903005123138428, -0.08441178500652313, 0.15781226754188538, 0.13112787902355194, -0.05333179607987404, -0.21253353357315063, 0.030583804473280907, 0.043237145990133286, 0.004318034742027521, 0.0612679123878479, -0.17720702290534973, 0.08167627453804016, 0.025727098807692528, -0.05116020143032074, 0.15224720537662506, -0.16569727659225464, -0.15514664351940155, 0.0824643224477768, 0.05010354146361351, -0.22108957171440125, -0.12386278063058853, -0.0879128947854042, -0.06589758396148682, -0.1396872103214264, 0.08584427833557129, 0.014041651971638203, -0.0018043812597170472, 0.05013851076364517, 0.033740755170583725, 0.018914686515927315, -0.048698488622903824, 0.21615906059741974, -0.0022440196480602026, 0.03326340764760971, -0.07553089410066605, -0.10180798172950745, 0.06950566172599792, -0.05141735449433327, 0.08518881350755692, -0.03099823370575905, 0.005753061734139919, -0.08320630341768265, -0.057475052773952484, -0.05255331099033356, 0.03318103775382042, -0.08139406144618988, -0.10520965605974197, -0.06759276986122131, 0.09429939836263657, 
0.09139011800289154, -0.03298058733344078, -0.04032526910305023, -0.08896728605031967, 0.039150089025497437, 0.20617929100990295, 0.17360219359397888, 0.05333937704563141, -0.10111589729785919, 0.002542630536481738, -0.01915728859603405, 0.040264517068862915, -0.21200114488601685, 0.04798245429992676, 0.04617756977677345, 0.024147402495145798, 0.12109645456075668, -0.0176423080265522, -0.1646004468202591, -0.047221194952726364, 0.0562983863055706, -0.03494611009955406, -0.20504815876483917, -0.01314060389995575, 0.04864202439785004, -0.18736153841018677, -0.06957933306694031, 0.016700902953743935, -0.014444489032030106, -0.027432914823293686, 0.013032985851168633, 0.06286440044641495, 0.025481918826699257, 0.10238313674926758, 0.05989401787519455, 0.1000840812921524, -0.112981878221035, 0.0795830711722374, 0.09043775498867035, -0.08344172686338425, 0.009394102729856968, 0.06964189559221268, -0.05280066654086113, -0.02294989861547947, 0.022772129625082016, 0.06757686287164688, -0.003049787599593401, -0.057536181062459946, -0.02079189568758011, -0.10809285193681717, 0.06586270034313202, 0.1269281655550003, 0.0400845967233181, -0.006831571459770203, 0.04905473813414574, 0.02419281378388405, -0.07880669087171555, 0.11321208626031876, 0.03362756222486496, 0.03722309693694115, -0.05989459529519081, -0.01674187369644642, 0.04316421225667, 0.005734616424888372, -0.02047782577574253, -0.025104478001594543, -0.05658029392361641, -0.013948953710496426, -0.18932224810123444, 0.014544147998094559, -0.07588981091976166, 0.005138450767844915, 0.014814606867730618, -0.040141742676496506, -0.018671197816729546, 0.012856033630669117, -0.08163223415613174, -0.05027473345398903, -0.0038707295898348093, 0.09766460955142975, -0.1400173306465149, 0.008230311796069145, 0.09175591170787811, -0.11852382868528366, 0.06848865002393723, -0.019968708977103233, -0.014717686921358109, 0.0038272906094789505, -0.1270400881767273, 0.04572216048836708, -0.004586559720337391, 0.02062096633017063, 0.04444560408592224, -0.17065683007240295, 0.004877567756921053, -0.0423397533595562, -0.0478336401283741, -0.015323328785598278, -0.08405033499002457, -0.11406292766332626, 0.10921793431043625, 0.002206311793997884, -0.08430022746324539, -0.010287429206073284, 0.04696008190512657, 0.10919637978076935, -0.03898061811923981, 0.124757781624794, 0.0047785635106265545, 0.06639395654201508, -0.18268363177776337, -0.024298490956425667, -0.014514438807964325, 0.007352736312896013, 0.027192458510398865, -0.016180848702788353, 0.04238643869757652, -0.01372526679188013, 0.2601816952228546, -0.021822240203619003, 0.07231466472148895, 0.0637383759021759, 0.042024899274110794, 0.016651110723614693, 0.08318763226270676, 0.06755662709474564, 0.016758481040596962, 0.004258559085428715, 0.02265608124434948, -0.03241465613245964, -0.016654497012495995, -0.15768693387508392, 0.07677853107452393, 0.14623822271823883, 0.08591317385435104, 0.007676990237087011, 0.06586159020662308, -0.10330242663621902, -0.10554943233728409, 0.08015866577625275, -0.03888537734746933, -0.0009790018666535616, -0.058588381856679916, 0.15355949103832245, 0.14971502125263214, -0.17422176897525787, 0.08231138437986374, -0.03791337087750435, -0.04883022606372833, -0.11436772346496582, -0.15839459002017975, -0.06608819216489792, -0.029153592884540558, -0.0041826991364359856, -0.05528274551033974, 0.06748054921627045, 0.10802645981311798, -0.0021057529374957085, -0.00038325722562149167, 0.09545762091875076, -0.026331622153520584, -0.01757199876010418, 0.03465426340699196, 
0.04817976430058479, 0.033562518656253815, -0.04831063002347946, 0.020485511049628258, 0.004976877011358738, 0.03976510092616081, 0.05864322930574417, 0.023703020066022873, -0.03892989084124565, 0.014479226432740688, -0.01092575490474701, -0.1049860492348671, 0.022427968680858612, -0.029776830226182938, -0.07360642403364182, 0.13104131817817688, 0.029177764430642128, 0.019099419936537743, -0.03228067234158516, 0.20109383761882782, -0.07107947021722794, -0.06925153732299805, -0.14109766483306885, 0.10889512300491333, -0.03372858464717865, 0.06323269009590149, 0.058447178453207016, -0.1133023053407669, -0.002398417331278324, 0.1314154714345932, 0.133079394698143, -0.033533163368701935, 0.005780258681625128, 0.03008044883608818, 0.00756559893488884, -0.0482633113861084, 0.045497048646211624, 0.031092669814825058, 0.15440985560417175, -0.06949599832296371, 0.07780899107456207, 0.00008295764564536512, -0.08774317800998688, -0.036128852516412735, 0.1405542492866516, 0.006535779219120741, 0.03079606406390667, -0.06559351831674576, 0.10371401906013489, -0.07252706587314606, -0.23936228454113007, 0.045033879578113556, -0.07753164321184158, -0.15683837234973907, -0.013978141359984875, 0.02726292423903942, -0.009009851142764091, 0.02702206000685692, 0.0654432401061058, -0.06469112634658813, 0.161378413438797, 0.03472336754202843, -0.08781957626342773, -0.05673113837838173, 0.07957270741462708, -0.09192227572202682, 0.2958409786224365, 0.013188840821385384, 0.029593972489237785, 0.10327941924333572, -0.019989576190710068, -0.13285429775714874, 0.030561091378331184, 0.10066051781177521, -0.09982595592737198, 0.06684590131044388, 0.18159176409244537, -0.009470577351748943, 0.10021016746759415, 0.07437440752983093, -0.061603669077157974, 0.05807222053408623, -0.0826035663485527, -0.06770919263362885, -0.09389114379882812, 0.05970105528831482, -0.06468918174505234, 0.14543601870536804, 0.1228262409567833, -0.04243761673569679, -0.004415105562657118, -0.02816380001604557, 0.043726447969675064, 0.012194468639791012, 0.12871193885803223, 0.008576037362217903, -0.1618158370256424, 0.026840461418032646, 0.0030557403806596994, 0.10387714207172394, -0.21997274458408356, -0.08367477357387543, 0.04838619381189346, -0.029553698375821114, -0.05334814265370369, 0.10579082369804382, 0.06295353919267654, 0.0504634715616703, -0.04548325017094612, -0.05543007701635361, -0.008723298087716103, 0.14979462325572968, -0.1187625601887703, -0.006005466915667057 ]
null
null
transformers
# Model Card for Zenith-7B

Mistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techniques shown in the paper [Self-Rewarding Language Models](https://arxiv.org/abs/2401.10020).

## Instruction format

In order to leverage instruction fine-tuning, your prompt should be surrounded by `[INST]` and `[/INST]` tokens. The very first instruction should begin with a beginning-of-sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.

E.g.
```
text = "<s>[INST] What is your favourite condiment? [/INST]"
"Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!</s> "
"[INST] Do you have mayonnaise recipes? [/INST]"
```

This format is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating) via the `apply_chat_template()` method:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained("Xenon1/Zenith-7B")
tokenizer = AutoTokenizer.from_pretrained("Xenon1/Zenith-7B")

messages = [
    {"role": "user", "content": "What is your favourite condiment?"},
    {"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!"},
    {"role": "user", "content": "Do you have mayonnaise recipes?"}
]

# Render the conversation with the model's chat template and tokenize it.
encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")

model_inputs = encodeds.to(device)
model.to(device)

# Sample a continuation of the last user turn and decode it back to text.
generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
```

## Model Architecture
This instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer
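The architecture bullets above can be sanity-checked against the published configuration rather than taken on faith. Below is a minimal sketch, assuming the repository id `Xenon1/Zenith-7B` from this card and the standard Mistral config attribute names (`num_attention_heads`, `num_key_value_heads`, `sliding_window`); the concrete values are not stated in the card.

```python
from transformers import AutoConfig, AutoTokenizer

repo_id = "Xenon1/Zenith-7B"

# Only the config file is fetched here; no model weights are downloaded.
config = AutoConfig.from_pretrained(repo_id)

# Grouped-Query Attention shows up as fewer key/value heads than query heads.
print("query heads:     ", config.num_attention_heads)
print("key/value heads: ", config.num_key_value_heads)

# Sliding-Window Attention is exposed as the local attention window size.
print("sliding window:  ", getattr(config, "sliding_window", None))

# A byte-fallback BPE tokenizer splits characters it has never seen into byte
# tokens instead of collapsing them to a single <unk> token.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
print(tokenizer.tokenize("naïve 🙂"))
```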
{"language": ["en"], "license": "apache-2.0", "tags": ["mistral", "Zenith-7B"], "pipeline_tag": "text-generation"}
text-generation
Xenon1/Zenith-7B
[ "transformers", "safetensors", "mistral", "text-generation", "Zenith-7B", "conversational", "en", "arxiv:2401.10020", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T20:26:02+00:00
[ "2401.10020" ]
[ "en" ]
TAGS #transformers #safetensors #mistral #text-generation #Zenith-7B #conversational #en #arxiv-2401.10020 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Zenith-7B Mistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techniques shown in the paper Self-Rewarding Language Models. ## Instruction format In order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a beginning-of-sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id. E.g. This format is available as a chat template via the 'apply_chat_template()' method: ## Model Architecture This instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices: - Grouped-Query Attention - Sliding-Window Attention - Byte-fallback BPE tokenizer
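The processed text above drops the code block from the full card, so the prompt layout it describes (one beginning-of-sentence token, then alternating `[INST] ... [/INST]` turns, each assistant reply closed by the end-of-sentence token) is easy to lose. The sketch below spells that layout out by hand, assuming the Mistral-style special tokens `<s>` and `</s>` shown in the card's own example; the tokenizer's `apply_chat_template()` remains the safer route.

```python
def build_prompt(turns):
    """Format (user, assistant) turns in the [INST] layout described above.

    `turns` is a list of (user_message, assistant_message) pairs; pass None as
    the assistant message for the final turn the model should complete.
    """
    prompt = "<s>"  # the beginning-of-sentence id appears once, at the very start
    for user_msg, assistant_msg in turns:
        prompt += f"[INST] {user_msg} [/INST]"
        if assistant_msg is not None:
            prompt += f"{assistant_msg}</s> "  # assistant turns end with the EOS token
    return prompt


print(build_prompt([
    ("What is your favourite condiment?", "A good squeeze of fresh lemon juice."),
    ("Do you have mayonnaise recipes?", None),
]))
```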
[ "# Model Card for Zenith-7B\n\nMistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techinques shown in the paper Self-Rewarding Language Models.", "## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:", "## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #Zenith-7B #conversational #en #arxiv-2401.10020 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Zenith-7B\n\nMistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techinques shown in the paper Self-Rewarding Language Models.", "## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:", "## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer" ]
[ 76, 44, 105, 56 ]
[ "passage: TAGS\n#transformers #safetensors #mistral #text-generation #Zenith-7B #conversational #en #arxiv-2401.10020 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Zenith-7B\n\nMistral-7B-v0.1 model fine-tuned on the Ultrafeedback dataset using techinques shown in the paper Self-Rewarding Language Models.## Instruction format\n\nIn order to leverage instruction fine-tuning, your prompt should be surrounded by '[INST]' and '[/INST]' tokens. The very first instruction should begin with a begin of sentence id. The next instructions should not. The assistant generation will be ended by the end-of-sentence token id.\n\nE.g.\n\n\nThis format is available as a chat template via the 'apply_chat_template()' method:## Model Architecture\nThis instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:\n- Grouped-Query Attention\n- Sliding-Window Attention\n- Byte-fallback BPE tokenizer" ]
[ -0.057093292474746704, 0.015491068363189697, -0.0064651551656425, 0.006109266076236963, 0.05939408019185066, -0.03828968480229378, 0.12419088184833527, 0.051486462354660034, 0.06320525705814362, 0.06231771782040596, 0.016053294762969017, 0.09384942799806595, 0.08352772891521454, 0.14747217297554016, -0.013681975193321705, -0.203031986951828, 0.0782102644443512, -0.022952819243073463, 0.09499804675579071, 0.06754578649997711, 0.07884412258863449, -0.056120648980140686, 0.05380849167704582, 0.0527210496366024, -0.06691687554121017, 0.00891315657645464, 0.005114189814776182, -0.019265586510300636, 0.09538330882787704, 0.0682094544172287, 0.049560341984033585, 0.020388208329677582, -0.015762388706207275, -0.1311422437429428, 0.01784163899719715, 0.08836466073989868, -0.00509309209883213, 0.04760254919528961, 0.07394348084926605, 0.032350145280361176, 0.11850335448980331, -0.01938650570809841, 0.019505219534039497, 0.0573902390897274, -0.08362274616956711, -0.08168597519397736, -0.058665670454502106, 0.06156935915350914, 0.19203893840312958, 0.05512316897511482, -0.007907462306320667, 0.07647252082824707, 0.04823780059814453, 0.11200877279043198, 0.11114779859781265, -0.23790530860424042, -0.04719870537519455, 0.035804521292448044, 0.017203424125909805, 0.06323094666004181, -0.0748845785856247, -0.007930373772978783, -0.0011339994380250573, 0.030791159719228745, 0.03163857385516167, -0.026431767269968987, -0.01219092682003975, -0.06373053789138794, -0.13487006723880768, -0.04665302112698555, 0.1453249305486679, -0.000245879142312333, -0.04724574834108353, -0.149644136428833, -0.11009569466114044, 0.07594794780015945, -0.025097407400608063, -0.04339836910367012, 0.03515451028943062, 0.06407371908426285, 0.15190856158733368, -0.06244228780269623, -0.09830133616924286, -0.008523160591721535, -0.04702283442020416, -0.01634025014936924, -0.030340906232595444, 0.06129403039813042, -0.10471701622009277, 0.10044357180595398, -0.0730113536119461, -0.11228444427251816, -0.08833467215299606, -0.03714209422469139, -0.04794229567050934, -0.04698120802640915, 0.02128603495657444, -0.0703044980764389, 0.04616573080420494, 0.16525529325008392, -0.016501285135746002, 0.09209199249744415, -0.042554959654808044, 0.044710151851177216, -0.01990615762770176, 0.14634202420711517, 0.0563792921602726, -0.075937420129776, 0.05928561091423035, 0.05934694781899452, 0.11626847833395004, -0.013662654906511307, -0.0796368345618248, -0.07221513241529465, 0.016750499606132507, 0.02885848470032215, 0.04928259178996086, 0.05801578238606453, -0.030511584132909775, -0.018160821869969368, 0.2459801882505417, -0.11331705749034882, 0.02303987555205822, -0.0013276865938678384, -0.00025385391199961305, 0.047320280224084854, 0.06169579178094864, -0.025677086785435677, -0.06818094849586487, -0.04943244159221649, -0.07204774022102356, -0.03441706672310829, -0.09888690710067749, -0.04743599146604538, 0.021920649334788322, -0.050383325666189194, -0.05021991953253746, -0.08290708810091019, -0.23885515332221985, -0.026489002630114555, 0.060478582978248596, -0.03665598854422569, -0.004493835847824812, -0.06282643228769302, -0.06349509954452515, 0.03139873966574669, -0.02617078460752964, 0.02777041122317314, -0.03455132246017456, 0.01467342209070921, -0.012711112387478352, 0.07391180098056793, -0.08565783500671387, 0.02788175828754902, -0.09783189743757248, 0.046134885400533676, -0.30057278275489807, 0.10754553228616714, -0.009561038576066494, 0.05280246585607529, -0.08280045539140701, 0.02839687280356884, 0.029075244441628456, 
-0.003473077667877078, 0.045725882053375244, 0.1440693885087967, -0.16367830336093903, -0.015736395493149757, 0.17491237819194794, -0.17516133189201355, -0.04726248234510422, 0.10792949795722961, 0.013545101508498192, 0.06064751744270325, 0.066805399954319, 0.16092976927757263, 0.07113935798406601, -0.06963226199150085, -0.02552126720547676, 0.08035176247358322, -0.0899457335472107, -0.006567878182977438, -0.029562076553702354, -0.024952655658125877, 0.04106493666768074, 0.04518390819430351, 0.00040739006362855434, 0.0133997006341815, 0.023358384147286415, 0.002176507841795683, -0.021472832188010216, 0.014006813988089561, -0.025542927905917168, -0.047595202922821045, -0.0492800809442997, -0.033606790006160736, -0.07560748606920242, 0.06355289369821548, 0.08005856722593307, -0.03931082412600517, -0.0030209054239094257, -0.0848606750369072, 0.07563527673482895, -0.05071336776018143, 0.019077733159065247, -0.1198684498667717, -0.06813918054103851, 0.033831991255283356, 0.025415875017642975, -0.03565317019820213, 0.04595737159252167, 0.057476822286844254, 0.032360952347517014, 0.044414322823286057, -0.05811192840337753, 0.09249080717563629, -0.039723798632621765, -0.08158387988805771, -0.11296815425157547, -0.028888968750834465, -0.0548350065946579, 0.0985833927989006, -0.18887168169021606, 0.06325271725654602, 0.0942336916923523, 0.10360240936279297, 0.04127943888306618, -0.04094424843788147, 0.006561742629855871, -0.034428391605615616, -0.01937194913625717, -0.056470319628715515, 0.059296950697898865, 0.08239062130451202, 0.03388439491391182, 0.0998566746711731, -0.18967466056346893, -0.13029612600803375, 0.05525898188352585, 0.03446798771619797, -0.00957874022424221, -0.09971798211336136, -0.03636276349425316, -0.03732680156826973, 0.0030438508838415146, -0.09987180680036545, 0.20131689310073853, 0.028909796848893166, 0.12292575091123581, -0.0647537037730217, -0.07370565831661224, -0.006565980147570372, -0.031069064512848854, -0.03830403462052345, 0.06827197223901749, -0.07775715738534927, -0.11497699469327927, 0.0586257167160511, 0.04236220940947533, -0.007678911555558443, 0.10400519520044327, -0.029486559331417084, -0.028079986572265625, 0.006381209474056959, 0.08086361736059189, -0.005905633792281151, 0.050053246319293976, -0.15923623740673065, -0.008077140897512436, 0.03609154373407364, 0.025708509609103203, 0.03184889256954193, -0.12653760612010956, 0.07956209778785706, -0.009371278807520866, -0.036252547055482864, 0.03446047380566597, 0.05165914446115494, -0.01146815624088049, 0.049118924885988235, -0.01621824875473976, -0.024223530665040016, 0.02223838120698929, -0.05684470385313034, -0.13349288702011108, 0.1258704662322998, -0.07493342459201813, -0.20386597514152527, -0.2057141810655594, -0.0889735072851181, -0.04217434301972389, 0.012566213496029377, 0.0581868477165699, 0.00602686544880271, -0.0817301943898201, -0.12107972800731659, -0.011802105233073235, 0.032484833151102066, -0.05081889405846596, 0.012468519620597363, -0.05290914699435234, 0.03323962539434433, -0.14214976131916046, -0.0213350560516119, -0.0009574848809279501, -0.06803034245967865, 0.04981561377644539, -0.024760456755757332, 0.015146740712225437, 0.13841305673122406, -0.03331870958209038, 0.010420940816402435, -0.022292958572506905, 0.21648572385311127, -0.05165760591626167, 0.1349336951971054, 0.23489010334014893, -0.029200633987784386, 0.101529560983181, 0.16472963988780975, -0.04285694286227226, -0.06780649721622467, 0.020042821764945984, -0.04197072610259056, -0.02850085310637951, -0.17530514299869537, 
-0.04272506386041641, -0.08234257996082306, -0.020859668031334877, 0.03376986086368561, 0.05842527747154236, 0.0818423330783844, 0.019710294902324677, -0.06750598549842834, 0.04781061410903931, 0.09153418242931366, 0.11570633202791214, 0.14240320026874542, 0.005607243627309799, 0.07347793132066727, -0.06019020080566406, 0.03510339930653572, 0.05100307986140251, 0.028514117002487183, 0.15790201723575592, -0.0483681857585907, 0.1298677772283554, 0.05256379395723343, -0.027343017980456352, 0.05626895651221275, 0.059076130390167236, -0.08663690090179443, -0.007533119060099125, -0.019692154601216316, -0.0926268920302391, -0.03323083743453026, 0.06809432804584503, -0.1434033215045929, 0.04949309676885605, -0.009992512874305248, 0.06159418821334839, 0.051640644669532776, 0.2545381784439087, 0.06583930552005768, -0.19760265946388245, -0.08097969740629196, 0.07997898757457733, -0.04836496338248253, -0.13225169479846954, 0.005465212743729353, 0.1422550231218338, -0.050537269562482834, 0.042989905923604965, -0.006129560060799122, 0.09776598960161209, -0.09497392922639847, 0.014102734625339508, -0.051407232880592346, 0.15005087852478027, -0.011519771069288254, 0.03769150748848915, -0.13697627186775208, 0.07263164222240448, 0.02513798698782921, 0.10058033466339111, -0.06546872854232788, 0.09073968231678009, 0.03086058795452118, 0.107649065554142, 0.0956181213259697, -0.003710759337991476, -0.04469183087348938, -0.0073347375728189945, -0.07327531278133392, -0.001726203365251422, 0.010723361745476723, 0.015588641166687012, 0.03708259388804436, -0.059240348637104034, -0.03332071751356125, 0.011905081570148468, -0.03133554756641388, -0.05168432742357254, -0.15547813475131989, -0.0029446925036609173, 0.10080251842737198, 0.025305094197392464, -0.05247128754854202, 0.008827751502394676, -0.01554980967193842, 0.10785612463951111, -0.10150563716888428, -0.0858987346291542, -0.10428541153669357, -0.05965274199843407, 0.022034117951989174, -0.03728412091732025, 0.03815685585141182, -0.028020871803164482, 0.14705722033977509, -0.005943307187408209, -0.09150782227516174, 0.05950406193733215, -0.11589365452528, -0.023248473182320595, -0.028762100264430046, 0.0778421014547348, 0.03942323103547096, -0.02155373990535736, 0.02115616761147976, 0.010031563229858875, -0.01725446805357933, -0.0689927414059639, 0.02216540090739727, 0.20454375445842743, 0.040516603738069534, -0.001121428795158863, -0.053697872906923294, -0.22282841801643372, -0.019639139994978905, -0.00773716950789094, 0.026081224903464317, 0.21581125259399414, -0.03592286258935928, 0.10645359009504318, 0.13034336268901825, -0.07137520611286163, -0.17605409026145935, 0.04719548672437668, 0.0388454794883728, 0.03310186788439751, -0.06195417419075966, -0.09270133823156357, 0.10574023425579071, 0.013853899203240871, -0.0346771739423275, 0.16302208602428436, -0.2176295667886734, -0.08462273329496384, 0.07307415455579758, 0.05263169854879379, 0.1097598746418953, -0.05940878018736839, -0.06408347189426422, -0.0009403183939866722, -0.09425555914640427, -0.029524872079491615, -0.02287043072283268, 0.03136914595961571, -0.007002911996096373, 0.03789355605840683, 0.022639036178588867, -0.043142616748809814, 0.07968942075967789, -0.03604936599731445, 0.07450035959482193, -0.06139836460351944, -0.0042316485196352005, 0.0699133574962616, -0.08410534262657166, 0.1201040968298912, -0.039778176695108414, 0.03274288401007652, -0.049337562173604965, -0.02143554762005806, -0.06774768233299255, 0.07783512026071548, -0.02220974676311016, -0.04276524856686592, 
0.018697479739785194, -0.010212096385657787, 0.05103721469640732, 0.02903994731605053, -0.012125657871365547, -0.09914681315422058, 0.006409991532564163, 0.18580113351345062, 0.11314219236373901, -0.13704830408096313, -0.015517321415245533, -0.033009033650159836, -0.0027763766702264547, 0.049390118569135666, -0.0833093598484993, 0.01954931579530239, 0.054898783564567566, 0.008983714506030083, 0.13610978424549103, 0.03728450834751129, -0.07130703330039978, -0.029460767284035683, 0.0535321868956089, -0.11871468275785446, -0.12908132374286652, -0.041272252798080444, 0.19231079518795013, -0.07092620432376862, 0.06685090810060501, 0.12387146055698395, -0.013184110634028912, -0.0005005583516322076, 0.035309191793203354, 0.032308585941791534, 0.003393382066860795, 0.0007684332667849958, -0.06530078500509262, 0.04924120754003525, -0.06032903119921684, 0.03249030187726021, 0.04560264199972153, -0.08979271352291107, 0.03432152792811394, 0.12681664526462555, -0.14164860546588898, -0.049130551517009735, -0.04943211376667023, 0.11220575869083405, 0.034461069852113724, -0.04765268415212631, -0.006179572083055973, -0.07803806662559509, 0.03260580450296402, 0.1877809762954712, 0.016925044357776642, -0.010985514149069786, -0.02352256141602993, 0.04005054756999016, -0.051248013973236084, 0.10407943278551102, -0.04385315999388695, 0.04655856266617775, -0.07506346702575684, 0.016935618594288826, -0.00010286235192324966, -0.0262187197804451, -0.029797544702887535, -0.059993159025907516, -0.07099464535713196, -0.018889352679252625, -0.10897842049598694, 0.07554121315479279, -0.06753998249769211, -0.002581636654213071, 0.010329986922442913, 0.0075338659808039665, -0.01586153171956539, 0.04243031144142151, -0.026118341833353043, 0.00026700409944169223, -0.0350387766957283, 0.06654716283082962, -0.10048159956932068, -0.017577428370714188, 0.0014406422851607203, -0.10122267156839371, 0.11147136986255646, 0.014391950331628323, -0.05824512243270874, -0.014776620082557201, -0.13204823434352875, -0.008430545218288898, 0.029064474627375603, 0.04824193939566612, 0.03994409367442131, -0.03224768117070198, -0.02429652214050293, 0.016659574583172798, -0.05255131423473358, -0.050815291702747345, 0.014703547582030296, -0.05268873646855354, 0.0541338287293911, 0.016415247693657875, -0.06494712829589844, -0.08891240507364273, 0.03790121525526047, 0.07408854365348816, 0.018543684855103493, 0.10367801040410995, -0.0830528736114502, 0.002875907812267542, -0.13832074403762817, -0.004357850644737482, 0.07036366313695908, -0.03381003811955452, 0.010440784506499767, -0.058159951120615005, 0.02778683416545391, -0.029437268152832985, 0.019973423331975937, 0.028103329241275787, -0.012083964422345161, 0.03181812912225723, -0.03129817172884941, -0.03961612284183502, 0.00811463501304388, 0.12171784788370132, 0.00882643647491932, 0.021498210728168488, 0.04315147176384926, 0.022089583799242973, 0.03497946634888649, 0.05564306303858757, 0.13022540509700775, 0.05980249121785164, -0.009848030284047127, 0.10558832436800003, 0.00573202408850193, -0.014495889656245708, -0.17832079529762268, 0.018221775069832802, -0.05829431489109993, 0.06971258670091629, -0.05813048407435417, 0.18651773035526276, 0.2189510464668274, -0.09946095198392868, 0.03164498880505562, -0.0100817596539855, -0.03111235611140728, -0.08772890269756317, -0.2765122950077057, -0.01120203547179699, -0.09215088188648224, -0.05169617012143135, -0.0915350466966629, 0.04531881958246231, 0.030888965353369713, 0.010504522360861301, 0.024491118267178535, 0.0946301743388176, 
-0.022084148600697517, -0.028641242533922195, 0.04169376567006111, -0.03329993039369583, 0.040619462728500366, -0.03752715513110161, -0.01166661735624075, 0.07663913071155548, -0.009777788072824478, 0.04701481759548187, 0.059247247874736786, 0.11042548716068268, 0.022856473922729492, -0.005590056534856558, -0.06792616099119186, 0.013344991020858288, 0.0481567345559597, -0.06333483010530472, 0.05547693744301796, 0.09452400356531143, -0.023080172017216682, -0.004409607499837875, 0.14969149231910706, -0.060939036309719086, -0.12617191672325134, -0.08416560292243958, 0.18149779736995697, -0.05169685557484627, 0.02340725064277649, -0.016762299463152885, -0.12196282297372818, -0.00479646073654294, 0.18367329239845276, 0.04109443724155426, -0.009049071930348873, -0.01773078553378582, -0.029364250600337982, 0.0015015644021332264, -0.03247006610035896, 0.08197296410799026, 0.06276582926511765, 0.29901134967803955, -0.029310433194041252, 0.043317846953868866, -0.003245790721848607, -0.021282674744725227, -0.06400999426841736, 0.018861088901758194, -0.04523594677448273, -0.013755371794104576, -0.006410966161638498, 0.03448473662137985, -0.037627339363098145, -0.08067983388900757, -0.06085466220974922, -0.012976648285984993, -0.03951111063361168, -0.014204300940036774, 0.08159049600362778, -0.01673584245145321, 0.05602115020155907, -0.03052504174411297, -0.014900809153914452, 0.21043215692043304, -0.05414178594946861, -0.06398790329694748, -0.04169115796685219, 0.023814471438527107, -0.15262563526630402, 0.11996951699256897, -0.021347902715206146, 0.0612000934779644, 0.09707009047269821, 0.05050359293818474, -0.11089522391557693, 0.02356957644224167, -0.02823713980615139, -0.09396089613437653, 0.006843628361821175, 0.08241618424654007, -0.009255561046302319, 0.06077270582318306, 0.04155062139034271, -0.11006277054548264, 0.02370881848037243, 0.05816633254289627, -0.044171106070280075, -0.051455143839120865, 0.020677560940384865, -0.06280932575464249, 0.12151247262954712, 0.06945908069610596, 0.0007062338409014046, -0.014799739234149456, -0.010586818680167198, 0.0016716490499675274, 0.020196683704853058, 0.0395185723900795, -0.00629964005202055, -0.13526782393455505, 0.03016062080860138, 0.04775846004486084, 0.022118745371699333, -0.2064293771982193, -0.06586792320013046, -0.07576055079698563, -0.02678314968943596, -0.12592148780822754, 0.07409018278121948, 0.10019643604755402, 0.058626580983400345, -0.026594704017043114, -0.03855445981025696, 0.004365107044577599, 0.13450440764427185, -0.06135605648159981, -0.06655029207468033 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
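The quick-start section of the card above is still a placeholder, so the following is only a hedged sketch: it assumes the repository id `Radu1999/MisterUkrainianZNO` from this record, a standard causal-LM interface, and that the tokenizer ships a chat template (the record is tagged `conversational`); none of this is confirmed by the card itself.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: repository id taken from this record's `id` field, not from the card.
repo_id = "Radu1999/MisterUkrainianZNO"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

messages = [{"role": "user", "content": "Hello! What can you do?"}]
# Assumes a chat template is bundled with the tokenizer (the repo is tagged conversational).
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt")

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```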
{"license": "apache-2.0", "library_name": "transformers"}
text-generation
Radu1999/MisterUkrainianZNO
[ "transformers", "safetensors", "mistral", "text-generation", "conversational", "arxiv:1910.09700", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T20:28:03+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 68, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #mistral #text-generation #conversational #arxiv-1910.09700 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.0535210520029068, 0.20994897186756134, -0.005268031265586615, 0.014276806265115738, 0.0955512523651123, 0.008638777770102024, 0.06300276517868042, 0.1158338189125061, -0.047883275896310806, 0.1243155300617218, 0.042200133204460144, 0.11444971710443497, 0.11701371520757675, 0.14681175351142883, -0.010448015294969082, -0.2127477526664734, 0.049479469656944275, -0.09782613068819046, -0.010965890251100063, 0.12522436678409576, 0.1528133600950241, -0.1021554172039032, 0.06865602731704712, -0.025421863421797752, -0.027679279446601868, -0.03849923610687256, -0.0566946379840374, -0.04705985262989998, 0.039500366896390915, 0.045059628784656525, 0.0707780197262764, 0.0020420614164322615, 0.08925426751375198, -0.2961680293083191, 0.015841396525502205, 0.061627570539712906, -0.0011538828257471323, 0.07117629796266556, 0.08686534315347672, -0.06445532292127609, 0.11190240830183029, -0.05954765900969505, 0.13944238424301147, 0.08088517934083939, -0.08764896541833878, -0.16849255561828613, -0.08173782378435135, 0.1164049506187439, 0.17698147892951965, 0.05739709362387657, -0.03419014438986778, 0.10745269805192947, -0.07786449790000916, 0.018644774332642555, 0.04381900653243065, -0.10070298612117767, -0.06203228607773781, 0.08966916054487228, 0.10138842463493347, 0.047779373824596405, -0.1254117786884308, -0.02621159702539444, 0.01909841224551201, 0.020525658503174782, 0.08423653244972229, 0.01620851457118988, 0.14951850473880768, 0.03713475912809372, -0.1349543035030365, -0.06700814515352249, 0.10938923060894012, 0.035849303007125854, -0.03309061378240585, -0.23566174507141113, -0.012542245909571648, -0.018523117527365685, -0.03702033683657646, -0.04633718729019165, 0.04508320614695549, 0.00035771893453784287, 0.09011491388082504, -0.008954458869993687, -0.07910548150539398, -0.040139373391866684, 0.07995790988206863, 0.0443887859582901, 0.02618550881743431, -0.020654944702982903, 0.0196200180798769, 0.10571888089179993, 0.08120600134134293, -0.11567401140928268, -0.052357301115989685, -0.059268027544021606, -0.07263094186782837, -0.04522480070590973, 0.034192245453596115, 0.03915010765194893, 0.07917314022779465, 0.2498505860567093, 0.02425568923354149, 0.05030221492052078, 0.034294791519641876, 0.009994457475841045, 0.0506180115044117, 0.09702284634113312, -0.05150073766708374, -0.13315412402153015, -0.025374101474881172, 0.09864256531000137, 0.005786322522908449, -0.027386901900172234, -0.03541889786720276, 0.06029578298330307, 0.04617399722337723, 0.1090288832783699, 0.09366883337497711, 0.020180407911539078, -0.07692110538482666, -0.05078713595867157, 0.19401273131370544, -0.15443174540996552, 0.033637624233961105, 0.034337762743234634, -0.03277822583913803, -0.04201162979006767, 0.008351285941898823, 0.03802844136953354, -0.036022551357746124, 0.08953704684972763, -0.0564630925655365, -0.05063195154070854, -0.11054704338312149, -0.028881831094622612, 0.04553275927901268, 0.013422677293419838, -0.030779702588915825, -0.028852207586169243, -0.09160611778497696, -0.08647685497999191, 0.0933590829372406, -0.06601894646883011, -0.07365784049034119, -0.03445909917354584, -0.07572175562381744, 0.022073041647672653, 0.0197665523737669, 0.08815137296915054, -0.02778664045035839, 0.04800492152571678, -0.05751834064722061, 0.045487694442272186, 0.10566682368516922, 0.037697698920965195, -0.07037637382745743, 0.0706227496266365, -0.20493106544017792, 0.08915809541940689, -0.0830419585108757, 0.04861335828900337, -0.1653890162706375, -0.026274170726537704, 0.039689481258392334, 0.01643560826778412, 
-0.0033787109423428774, 0.13107116520404816, -0.19559235870838165, -0.017797095701098442, 0.17431269586086273, -0.10360822826623917, -0.08330244570970535, 0.05526921898126602, -0.057440612465143204, 0.11731720715761185, 0.037011608481407166, 0.019470080733299255, 0.058710914105176926, -0.10414515435695648, -0.01242469996213913, -0.053001441061496735, -0.006864918861538172, 0.1263413429260254, 0.0823165625333786, -0.0915842279791832, 0.03891290724277496, 0.018693674355745316, -0.041496336460113525, -0.06606454402208328, -0.03105081617832184, -0.10419636964797974, 0.010852072387933731, -0.07944780588150024, 0.01409090030938387, -0.01250049751251936, -0.09268881380558014, -0.028617506846785545, -0.15793868899345398, -0.017865799367427826, 0.08650479465723038, -0.00420278450474143, -0.024789050221443176, -0.10228614509105682, 0.024746622890233994, 0.016695773229002953, -0.008225910365581512, -0.12663723528385162, -0.03015286847949028, 0.02930605225265026, -0.1463223546743393, 0.025121303275227547, -0.06900617480278015, 0.04800525680184364, 0.013594716787338257, -0.03318023681640625, -0.021950239315629005, 0.013441511429846287, 0.017604129388928413, -0.027337081730365753, -0.22970223426818848, -0.022300025448203087, -0.03795382380485535, 0.16395093500614166, -0.22768230736255646, 0.03959988057613373, 0.057571060955524445, 0.14559322595596313, -0.002035774290561676, -0.05482302978634834, 0.026347527280449867, -0.06407148391008377, -0.026518363505601883, -0.054260484874248505, 0.004980194848030806, -0.016927076503634453, -0.04377713426947594, 0.025113888084888458, -0.16963110864162445, -0.041066963225603104, 0.1009739562869072, 0.05045260116457939, -0.12397581338882446, -0.04221658408641815, -0.030096091330051422, -0.05769895017147064, -0.04544999822974205, -0.059382662177085876, 0.10349436849355698, 0.056902505457401276, 0.039259765297174454, -0.06905653327703476, -0.07481072843074799, -0.00414417777210474, -0.021936526522040367, -0.024703575298190117, 0.09668172895908356, 0.07970423996448517, -0.1298835575580597, 0.09787239134311676, 0.08421709388494492, 0.05604126676917076, 0.08308221399784088, -0.022293850779533386, -0.07451046258211136, -0.026380794122815132, 0.035747330635786057, 0.018736932426691055, 0.12649855017662048, -0.07429172098636627, 0.038807980716228485, 0.045625898987054825, -0.03244946897029877, 0.025344355031847954, -0.08402993530035019, 0.01796797104179859, 0.021251529455184937, -0.021030860021710396, 0.027411481365561485, -0.04135683551430702, 0.013339306227862835, 0.08553782850503922, 0.05025739222764969, 0.020280251279473305, 0.020059505477547646, -0.049937810748815536, -0.11606193333864212, 0.1606271117925644, -0.11436411738395691, -0.2052009552717209, -0.13315674662590027, 0.03165067359805107, 0.03760804608464241, -0.016980579122900963, -0.0016035162843763828, -0.052950531244277954, -0.10256816446781158, -0.09110390394926071, 0.007514104712754488, 0.04145568981766701, -0.09290070086717606, -0.03897719830274582, 0.03872610628604889, 0.040360383689403534, -0.13889461755752563, 0.016270309686660767, 0.04555486887693405, -0.08409781754016876, -0.009337234310805798, 0.06195974722504616, 0.08807603269815445, 0.1949266940355301, 0.014133783988654613, -0.012460295110940933, 0.022886890918016434, 0.22538986802101135, -0.13757912814617157, 0.10110456496477127, 0.1277187615633011, -0.07486125081777573, 0.08228196948766708, 0.21431635320186615, 0.040862374007701874, -0.09166773408651352, 0.025235874578356743, 0.038505490869283676, -0.022305194288492203, -0.25058743357658386, 
-0.07316450774669647, -0.0049200523644685745, -0.06417204439640045, 0.08241023123264313, 0.08798143267631531, 0.09824612736701965, 0.037020351737737656, -0.08483944833278656, -0.09180966019630432, 0.06513748317956924, 0.10895394533872604, -0.006382278632372618, 0.006520338822156191, 0.09023737907409668, -0.031119242310523987, 0.018800098448991776, 0.08879689127206802, 0.017039285972714424, 0.15405668318271637, 0.04811256006360054, 0.17380255460739136, 0.08999684453010559, 0.08171714097261429, 0.0005674214335158467, 0.019576899707317352, 0.010055757127702236, 0.04351765289902687, -0.0028601118829101324, -0.0789370909333229, -0.016239887103438377, 0.11934800446033478, 0.05345144495368004, 0.01708330772817135, 0.02053074724972248, -0.03853895887732506, 0.07606158405542374, 0.19789741933345795, -0.005303108599036932, -0.19659237563610077, -0.05594524368643761, 0.07944575697183609, -0.09087561815977097, -0.10445090383291245, 0.0012648770352825522, 0.02207600325345993, -0.16718564927577972, 0.0425507128238678, -0.032484039664268494, 0.11075331270694733, -0.10687651485204697, -0.02149091102182865, 0.06949468702077866, 0.06122921034693718, -0.01464743074029684, 0.07286236435174942, -0.20449048280715942, 0.1130913719534874, 0.008275190368294716, 0.07164911180734634, -0.09718446433544159, 0.08861534297466278, 0.0005678947200067341, -0.01952086202800274, 0.1592944860458374, -0.005452786572277546, -0.0763486698269844, -0.06527400016784668, -0.09441093355417252, -0.007634507026523352, 0.09762918204069138, -0.12904053926467896, 0.08223655819892883, -0.033019669353961945, -0.032815903425216675, 0.0006813490763306618, -0.09557481110095978, -0.11922643333673477, -0.17144067585468292, 0.05302504077553749, -0.11941733956336975, 0.03681783378124237, -0.1064034253358841, -0.030312927439808846, -0.03676198795437813, 0.18239666521549225, -0.20175118744373322, -0.0813034176826477, -0.13755548000335693, -0.10123365372419357, 0.1406746506690979, -0.043102510273456573, 0.09645568579435349, -0.008794162422418594, 0.16338226199150085, 0.009652632288634777, -0.01340216863900423, 0.07741600275039673, -0.09172298014163971, -0.2017849236726761, -0.06434639543294907, 0.16040602326393127, 0.1114422157406807, 0.03748288005590439, 0.005051454529166222, 0.0383947417140007, -0.028148308396339417, -0.11146315932273865, 0.025262581184506416, 0.1439998298883438, 0.08248871564865112, 0.0029930840246379375, -0.01808653026819229, -0.14209192991256714, -0.08416560292243958, -0.04521634057164192, 0.02476728893816471, 0.16054345667362213, -0.07303125411272049, 0.1599406898021698, 0.13895317912101746, -0.0657365694642067, -0.2002221941947937, 0.006290470249950886, 0.026827840134501457, -0.010940702632069588, 0.01498403213918209, -0.17745088040828705, 0.08298364281654358, 0.011942053213715553, -0.05820121243596077, 0.08775683492422104, -0.1857975721359253, -0.13893114030361176, 0.08376453816890717, 0.05419633537530899, -0.2044990211725235, -0.13950414955615997, -0.0963040292263031, -0.039252035319805145, -0.1592206358909607, 0.0923963189125061, -0.002233225852251053, 0.0011471696197986603, 0.03607133403420448, 0.012751033529639244, 0.027581125497817993, -0.05559009686112404, 0.184194415807724, 0.002587853232398629, 0.029408378526568413, -0.0853973999619484, -0.10160977393388748, 0.029257556423544884, -0.04654192179441452, 0.0771842896938324, -0.03388750180602074, 0.015159969218075275, -0.1126389130949974, -0.04229956865310669, -0.05704052001237869, 0.015501349233090878, -0.10054408013820648, -0.0924275815486908, -0.04786914214491844, 
0.08759854733943939, 0.10720651596784592, -0.020067252218723297, -0.0310191810131073, -0.08270808309316635, 0.0662069097161293, 0.22154001891613007, 0.19471275806427002, 0.0765640065073967, -0.06753472983837128, 0.0009091472602449358, -0.02554071880877018, 0.04510895162820816, -0.1993175894021988, 0.054956164211034775, 0.06033334508538246, 0.018439872190356255, 0.11060556769371033, -0.026344653218984604, -0.14409807324409485, -0.0706387609243393, 0.060687728226184845, -0.05803653597831726, -0.20370253920555115, 0.01122827548533678, 0.05618138238787651, -0.1703799068927765, -0.04229617491364479, 0.03580870106816292, -0.015399953350424767, -0.03552314639091492, 0.011711730621755123, 0.0959220603108406, -0.007616850081831217, 0.09149198979139328, 0.07215461134910583, 0.09309251606464386, -0.09687890857458115, 0.08689413219690323, 0.09960173070430756, -0.061329230666160583, 0.030580563470721245, 0.09173155575990677, -0.0525328703224659, -0.0394037663936615, 0.05599869415163994, 0.10034316778182983, 0.01840079389512539, -0.05585464462637901, 0.0036246594972908497, -0.09255160391330719, 0.060889314860105515, 0.1093989685177803, 0.024813814088702202, 0.012469364330172539, 0.05441085249185562, 0.032194435596466064, -0.08214732259511948, 0.12223849445581436, 0.056204844266176224, 0.014192699454724789, -0.042528290301561356, -0.014953055419027805, 0.010201917961239815, -0.03509276360273361, -0.005019268952310085, -0.008046603761613369, -0.07746861129999161, -0.007181765045970678, -0.1403387039899826, 0.022289810702204704, -0.08175966888666153, 0.013787015341222286, 0.023796403780579567, -0.026725906878709793, 0.007821079343557358, 0.0001906560646602884, -0.07514636218547821, -0.05624905601143837, -0.009548153728246689, 0.10408845543861389, -0.1651892215013504, 0.01900539919734001, 0.08338131755590439, -0.10459206998348236, 0.09141772240400314, -0.008385900408029556, -0.009890591725707054, 0.007888508029282093, -0.16260425746440887, 0.052401088178157806, -0.027303528040647507, 0.004645921755582094, 0.004283140879124403, -0.18384072184562683, -0.00428239069879055, -0.03126665577292442, -0.06947306543588638, -0.0062017133459448814, -0.02631635218858719, -0.11096397042274475, 0.09245428442955017, 0.010684546083211899, -0.08446349948644638, -0.02270403504371643, 0.035123277455568314, 0.090272918343544, -0.04115687683224678, 0.14325648546218872, -0.01693929173052311, 0.06450200825929642, -0.16913360357284546, -0.00916238036006689, -0.008691014721989632, 0.020511409267783165, -0.06107008457183838, -0.007177166640758514, 0.048090312629938126, -0.022019952535629272, 0.18647055327892303, -0.028773868456482887, 0.010638263076543808, 0.060599107295274734, 0.034032776951789856, -0.0006982669583521783, 0.09846311807632446, 0.0685117244720459, 0.010936494916677475, 0.007215898018330336, 0.011858666315674782, -0.04749750345945358, -0.03902202472090721, -0.18028034269809723, 0.0655660554766655, 0.20538455247879028, 0.10305459797382355, -0.02155209518969059, 0.06722000241279602, -0.1133543998003006, -0.09935101121664047, 0.139744371175766, -0.03790305182337761, -0.004522405564785004, -0.07542582601308823, 0.14351864159107208, 0.14540117979049683, -0.19453977048397064, 0.07385516166687012, -0.07807494699954987, -0.04793347790837288, -0.10093287378549576, -0.20358964800834656, -0.06594622880220413, -0.039517082273960114, -0.013079931028187275, -0.0574735552072525, 0.06760978698730469, 0.08985619992017746, -0.004441138822585344, -0.01519135944545269, 0.06695786863565445, -0.034570787101984024, -0.0002581503358669579, 
0.03333301469683647, 0.056985706090927124, 0.0183477234095335, -0.06424097716808319, 0.01065297331660986, -0.009701683185994625, 0.04975178465247154, 0.07717985659837723, 0.03756337985396385, -0.024198200553655624, 0.013455628417432308, -0.029356906190514565, -0.10353074967861176, 0.050629157572984695, -0.021355563774704933, -0.053891800343990326, 0.1481945961713791, 0.025117632001638412, 0.006510977167636156, -0.009177884086966515, 0.2235785275697708, -0.06651077419519424, -0.10774586349725723, -0.14945030212402344, 0.07872933149337769, -0.052336301654577255, 0.045451220124959946, 0.052770089358091354, -0.11167564243078232, 0.02688967064023018, 0.15125538408756256, 0.16306868195533752, -0.03116506338119507, 0.013097112998366356, 0.027573687955737114, 0.005935588851571083, -0.025699689984321594, 0.039286959916353226, 0.04900084063410759, 0.13694828748703003, -0.06408719718456268, 0.07125121355056763, 0.0047598169185221195, -0.08259867876768112, -0.020779481157660484, 0.12264706194400787, -0.014121388085186481, 0.0016442675841972232, -0.05498780682682991, 0.12352734804153442, -0.06982193142175674, -0.20818258821964264, 0.03547022119164467, -0.07633711397647858, -0.13274122774600983, -0.027499711140990257, 0.04813192039728165, 0.0004025342350360006, 0.01910916157066822, 0.07624450325965881, -0.06742072850465775, 0.172502800822258, 0.03384549170732498, -0.06627769768238068, -0.050338875502347946, 0.0812113881111145, -0.07910801470279694, 0.3027973771095276, 0.018784107640385628, 0.048531800508499146, 0.1071595773100853, -0.01723645254969597, -0.1239522397518158, 0.02724023349583149, 0.10326878726482391, -0.07115544378757477, 0.0582791231572628, 0.1616062968969345, -0.005169853568077087, 0.12930142879486084, 0.07263926416635513, -0.07595457136631012, 0.0463075116276741, -0.07637964189052582, -0.07506715506315231, -0.10420151054859161, 0.09952887147665024, -0.08918270468711853, 0.14938555657863617, 0.12495137006044388, -0.05796056613326073, 0.009720006957650185, -0.025794677436351776, 0.06097874417901039, -0.008937055245041847, 0.1251431554555893, 0.009622495621442795, -0.18729031085968018, 0.029527664184570312, -0.027558233588933945, 0.10235998779535294, -0.18572887778282166, -0.07891097664833069, 0.03731522336602211, 0.0011901403777301311, -0.0808538869023323, 0.11897921562194824, 0.0759904533624649, 0.029184773564338684, -0.04821867495775223, -0.026540882885456085, -0.010870136320590973, 0.1479586809873581, -0.09371433407068253, -0.004159667529165745 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [Yashaswat] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [nsi319/legal-led-base-16384] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
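The card above leaves its "How to Get Started with the Model" section unfilled. A minimal sketch, assuming this checkpoint (`Yashaswat/indian-legal-led-base`, a LED-based summarizer fine-tuned from nsi319/legal-led-base-16384 per this record) loads with the standard transformers summarization pipeline; the input text and generation limits are illustrative assumptions, not values taken from the card:

```python
# Hedged sketch: load the summarization checkpoint named in this record with
# the standard transformers pipeline. max_length/min_length are assumed values.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="Yashaswat/indian-legal-led-base",  # model id from this record
)

# Placeholder input: a long Indian legal judgment would go here.
judgment_text = "The appellant challenged the order of the High Court ..."

summary = summarizer(
    judgment_text,
    max_length=256,   # assumed generation limits; tune for your documents
    min_length=64,
    truncation=True,  # keeps inputs within the model's maximum length
)
print(summary[0]["summary_text"])
```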
{"language": ["en"], "library_name": "transformers", "tags": ["legal"], "datasets": ["Yashaswat/Indian-Legal-Text-ABS"], "pipeline_tag": "summarization"}
summarization
Yashaswat/indian-legal-led-base
[ "transformers", "safetensors", "led", "text2text-generation", "legal", "summarization", "en", "dataset:Yashaswat/Indian-Legal-Text-ABS", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T20:32:56+00:00
[ "1910.09700" ]
[ "en" ]
TAGS #transformers #safetensors #led #text2text-generation #legal #summarization #en #dataset-Yashaswat/Indian-Legal-Text-ABS #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: [Yashaswat] - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: [nsi319/legal-led-base-16384] ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: [Yashaswat]\n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]: [nsi319/legal-led-base-16384]", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #led #text2text-generation #legal #summarization #en #dataset-Yashaswat/Indian-Legal-Text-ABS #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: [Yashaswat]\n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]: [nsi319/legal-led-base-16384]", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 75, 6, 3, 101, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #led #text2text-generation #legal #summarization #en #dataset-Yashaswat/Indian-Legal-Text-ABS #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: [Yashaswat]\n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]: [nsi319/legal-led-base-16384]### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]" ]
[ -0.07816169410943985, 0.19930583238601685, -0.0038536235224455595, 0.023916011676192284, 0.07982323318719864, -0.018486954271793365, 0.08609069138765335, 0.09424539655447006, 0.03610825911164284, 0.12365159392356873, 0.027805903926491737, 0.10918299108743668, 0.1191505417227745, 0.183548241853714, -0.0023461012169718742, -0.1964581161737442, 0.03224566951394081, -0.11841460317373276, -0.012635042890906334, 0.11678177118301392, 0.13838879764080048, -0.1018420159816742, 0.07334249466657639, -0.02270202897489071, 0.021460991352796555, -0.02897927351295948, -0.06539680063724518, -0.039195314049720764, 0.030222252011299133, 0.06094637140631676, 0.05257820710539818, 0.0015883187297731638, 0.08548521250486374, -0.2747589647769928, 0.014710131101310253, 0.043719690293073654, -0.0021894422825425863, 0.07164842635393143, 0.0854978933930397, -0.05198274180293083, 0.09337551891803741, -0.057720061391592026, 0.10484787076711655, 0.08362456411123276, -0.06437607109546661, -0.12046597898006439, -0.07047078758478165, 0.08484849333763123, 0.17379102110862732, 0.06532783061265945, -0.03410372510552406, 0.11134602129459381, -0.05502853915095329, 0.022337421774864197, 0.030320150777697563, -0.07422756403684616, -0.05612167716026306, 0.04007691144943237, 0.09422701597213745, 0.06386999785900116, -0.10844920575618744, -0.0077390167862176895, 0.016074199229478836, 0.02810557559132576, 0.0853530541062355, -0.0014680937165394425, 0.16989780962467194, 0.027841420844197273, -0.13132882118225098, -0.039890699088573456, 0.10690028220415115, 0.03392115980386734, -0.04363907501101494, -0.26471996307373047, -0.01870076172053814, 0.0037379630375653505, -0.03235271945595741, -0.048846106976270676, 0.04999382048845291, -0.007419032044708729, 0.09439735859632492, -0.024295272305607796, -0.0868057832121849, -0.012516687624156475, 0.06939923018217087, 0.071429044008255, 0.015884002670645714, -0.00706143956631422, -0.008218754082918167, 0.11557573080062866, 0.10969825088977814, -0.14040209352970123, -0.04011378064751625, -0.07083287090063095, -0.06557953357696533, -0.020170684903860092, 0.06315954029560089, 0.07192771136760712, 0.05086755380034447, 0.20604197680950165, 0.022886116057634354, 0.030834995210170746, 0.04600440338253975, 0.0020141045097261667, 0.0739106759428978, 0.11230220645666122, -0.07489484548568726, -0.17059244215488434, -0.014395222999155521, 0.08865044265985489, 0.0020387135446071625, -0.026677805930376053, -0.045874714851379395, 0.05555277690291405, 0.007483704946935177, 0.11864711344242096, 0.13866201043128967, -0.0070789470337331295, -0.08008404076099396, -0.08251510560512543, 0.21507418155670166, -0.15348351001739502, 0.03222739323973656, -0.003960102330893278, -0.03780761733651161, -0.02041054517030716, 0.005293554626405239, 0.013064189814031124, -0.048228081315755844, 0.04983970895409584, -0.0711447075009346, -0.032089464366436005, -0.11608444154262543, -0.0513157919049263, 0.023008430376648903, 0.019527480006217957, -0.025745607912540436, -0.046350546181201935, -0.10770490765571594, -0.07884721457958221, 0.09139568358659744, -0.08002651482820511, -0.047647714614868164, -0.01813628524541855, -0.07328219711780548, 0.016105862334370613, -0.0030763496179133654, 0.03126077726483345, -0.02082088775932789, 0.028931351378560066, -0.044031206518411636, 0.0532233789563179, 0.11142446845769882, 0.02440280094742775, -0.0786568894982338, 0.07534170895814896, -0.18591777980327606, 0.11431027203798294, -0.07357007265090942, 0.024244608357548714, -0.1467597633600235, -0.0030316959600895643, 0.05065272003412247, 
0.01940184086561203, 0.009000444784760475, 0.1772754192352295, -0.22512660920619965, -0.002984342398121953, 0.164653018116951, -0.10322453081607819, -0.11545373499393463, 0.04462293162941933, -0.04619685560464859, 0.1605982631444931, 0.06069963797926903, -0.03513257950544357, 0.07116249948740005, -0.12718920409679413, -0.07450622320175171, -0.035323772579431534, 0.002335222205147147, 0.11906953901052475, 0.07243779301643372, -0.05941249057650566, 0.07838340103626251, 0.023595938459038734, -0.039825696498155594, -0.004449716303497553, -0.02516072615981102, -0.09405113756656647, 0.03506113961338997, -0.07077198475599289, 0.002327977679669857, -0.027798451483249664, -0.08075636625289917, 0.003075344255194068, -0.15562836825847626, -0.02194996550679207, 0.07384581863880157, -0.002267769305035472, -0.03732316195964813, -0.11282714456319809, 0.003955818247050047, -0.04329043999314308, -0.009114026091992855, -0.13522113859653473, -0.04956721141934395, 0.030887790024280548, -0.1524381786584854, 0.013178952969610691, -0.09373946487903595, 0.05039055272936821, 0.02251576818525791, -0.02815839648246765, -0.03325499966740608, 0.03590075671672821, 0.010791024193167686, -0.027408601716160774, -0.2187258005142212, -0.026621175929903984, -0.032291918992996216, 0.10464678704738617, -0.19497165083885193, 0.04201202094554901, 0.045362308621406555, 0.1447061151266098, 0.007242704275995493, -0.057342205196619034, 0.02152555249631405, -0.07542848587036133, -0.029300788417458534, -0.060646604746580124, -0.01835472881793976, -0.025636259466409683, -0.060022562742233276, 0.09005375951528549, -0.17179052531719208, -0.03421160578727722, 0.09623648226261139, 0.0690799206495285, -0.11664123088121414, -0.023656917735934258, -0.01723865233361721, -0.07510007172822952, -0.04868929833173752, -0.06538926810026169, 0.09610209614038467, 0.0598822720348835, 0.03902065381407738, -0.06932463496923447, -0.08374036848545074, 0.013391646556556225, -0.021171934902668, -0.012220257893204689, 0.09223516285419464, 0.049613483250141144, -0.13917653262615204, 0.09038390964269638, 0.0975949689745903, 0.0821433886885643, 0.10675999522209167, -0.008523697964847088, -0.0945989266037941, -0.06268499046564102, 0.03445100039243698, 0.013696708716452122, 0.14627575874328613, -0.02636978216469288, 0.04563887417316437, 0.04004688560962677, -0.019037986174225807, 0.035716358572244644, -0.06347332894802094, 0.030896909534931183, 0.02380536124110222, -0.006736292038112879, 0.04060666263103485, -0.03526297211647034, 0.002289525233209133, 0.06982974708080292, 0.04167074337601662, 0.04278387129306793, 0.01767423190176487, -0.05656407028436661, -0.12249751389026642, 0.1671094000339508, -0.10205655544996262, -0.25604376196861267, -0.144037127494812, -0.018681667745113373, 0.04441949352622032, -0.016314193606376648, -0.008651874028146267, -0.040146760642528534, -0.11744602769613266, -0.08363345265388489, 0.038301143795251846, 0.044151462614536285, -0.08169593662023544, -0.07719175517559052, 0.050849512219429016, 0.03220720216631889, -0.1333063691854477, 0.008005962707102299, 0.06743369996547699, -0.04382720962166786, -0.02023477293550968, 0.09684336185455322, 0.08876641094684601, 0.13733358681201935, 0.007955490611493587, -0.029721371829509735, 0.05265346169471741, 0.2288396954536438, -0.12946432828903198, 0.1199878603219986, 0.1581050157546997, -0.0785607397556305, 0.07892929017543793, 0.21717190742492676, 0.046247851103544235, -0.08373657613992691, 0.022680342197418213, 0.014434068463742733, -0.028470883145928383, -0.2430761456489563, 
-0.08287422358989716, -0.013231498189270496, -0.0641358345746994, 0.07521437853574753, 0.07419732958078384, 0.06460831314325333, 0.041997890919446945, -0.09015166759490967, -0.04153436794877052, 0.03863159939646721, 0.10886886715888977, -0.01511439774185419, -0.009226909838616848, 0.08712020516395569, -0.025118591263890266, 0.005609026178717613, 0.0998314693570137, -0.007758209947496653, 0.18939460813999176, 0.04798439145088196, 0.1772240847349167, 0.09233471751213074, 0.059402648359537125, 0.010611736215651035, 0.030188964679837227, 0.027347147464752197, 0.024422544986009598, 0.004537726286798716, -0.09479741007089615, 0.0028580741491168737, 0.14012977480888367, 0.039104629307985306, 0.02129148319363594, 0.048413991928100586, -0.03367622569203377, 0.05898042768239975, 0.16773322224617004, -0.013335746712982655, -0.18368732929229736, -0.0599539577960968, 0.0990254282951355, -0.09480426460504532, -0.1319483071565628, -0.0011485241120681167, 0.056066159158945084, -0.15368324518203735, -0.0025307810865342617, -0.039158325642347336, 0.10999840497970581, -0.15323849022388458, -0.036281175911426544, 0.044197335839271545, 0.04813271388411522, 0.0053435745649039745, 0.05634436756372452, -0.1455879509449005, 0.09815558791160583, 0.03440406918525696, 0.06807781755924225, -0.09437524527311325, 0.10698506981134415, 0.010316967964172363, -0.0610257126390934, 0.16631661355495453, 0.007936118170619011, -0.02613445743918419, -0.08105123043060303, -0.10774385929107666, -0.01457554567605257, 0.09549213945865631, -0.15006625652313232, 0.09787917882204056, -0.02178269252181053, -0.027381129562854767, -0.01351564098149538, -0.13232837617397308, -0.12932048738002777, -0.19005116820335388, 0.062037233263254166, -0.10963878035545349, 0.047307759523391724, -0.09845558553934097, -0.03740471974015236, -0.007937594316899776, 0.22844181954860687, -0.2373948097229004, -0.09983529895544052, -0.12892985343933105, -0.0921354815363884, 0.17179132997989655, -0.05935894325375557, 0.09235228598117828, 0.009300349280238152, 0.1451130211353302, 0.009218938648700714, -0.01056990958750248, 0.08927585929632187, -0.1064864844083786, -0.1605072021484375, -0.06643147766590118, 0.11631521582603455, 0.16181403398513794, 0.02578667551279068, -0.012703829444944859, 0.030644914135336876, -0.039128389209508896, -0.1315000355243683, -0.008706457912921906, 0.1843031495809555, 0.06693637371063232, 0.017922403290867805, 0.0028041773475706577, -0.14498411118984222, -0.06980415433645248, -0.0386991873383522, -0.010572741739451885, 0.20229841768741608, -0.05983470007777214, 0.15144579112529755, 0.14720241725444794, -0.056993722915649414, -0.18960386514663696, -0.013194440864026546, 0.028033316135406494, 0.027385277673602104, 0.06222089007496834, -0.16298221051692963, 0.0943346694111824, -0.02628314681351185, -0.06420988589525223, 0.16266562044620514, -0.13802824914455414, -0.14387840032577515, 0.092120461165905, 0.039564378559589386, -0.20919781923294067, -0.11182013899087906, -0.11126653105020523, -0.02632099948823452, -0.12467215210199356, 0.07715979963541031, 0.0438261404633522, -0.01621318608522415, 0.03321979567408562, 0.026504436507821083, 0.03376539424061775, -0.05325990170240402, 0.20258133113384247, -0.04138365015387535, 0.014864380471408367, -0.07029478996992111, -0.07536465674638748, 0.058494966477155685, -0.058496590703725815, 0.09970476478338242, -0.01940404623746872, 0.02494777925312519, -0.0965753048658371, -0.052644506096839905, -0.04813311621546745, 0.017768913879990578, -0.07826131582260132, -0.09631706774234772, 
-0.03981130197644234, 0.1081569641828537, 0.09722891449928284, -0.018042510375380516, -0.014113832265138626, -0.08433747291564941, -0.006132709793746471, 0.20607486367225647, 0.19337548315525055, 0.08759494125843048, -0.042622946202754974, -0.017587797716259956, -0.018405595794320107, 0.04263073578476906, -0.2011106312274933, 0.04833707585930824, 0.044835783541202545, 0.012231329455971718, 0.09557589888572693, -0.0346275232732296, -0.15129363536834717, -0.051452167332172394, 0.07099993526935577, -0.04553789272904396, -0.18220505118370056, -0.03487340733408928, 0.0524827279150486, -0.18937276303768158, -0.05625338852405548, 0.058039966970682144, -0.011379263363778591, -0.027518371120095253, 0.006597194820642471, 0.08458901196718216, -0.0015031154034659266, 0.0788932517170906, 0.05278824642300606, 0.08991073071956635, -0.09819887578487396, 0.05903979763388634, 0.07847093045711517, -0.059890564531087875, 0.018571367487311363, 0.10163839906454086, -0.05085783824324608, -0.03031735122203827, 0.01275660190731287, 0.042947810143232346, 0.02918029949069023, -0.037482138723134995, -0.001585109974257648, -0.04715435206890106, 0.04521951451897621, 0.07935066521167755, 0.01985498145222664, 0.013258591294288635, 0.03465350344777107, 0.028798241168260574, -0.05362387374043465, 0.13254326581954956, 0.06141427531838417, 0.007585014682263136, -0.055612217634916306, -0.07377277314662933, 0.02500779740512371, -0.01574484258890152, -0.015670696273446083, -0.020746629685163498, -0.06970924884080887, -0.015418422408401966, -0.19449737668037415, 0.027401231229305267, -0.11273475736379623, -0.0002845215785782784, 0.01026953849941492, -0.031493477523326874, -0.014564582146704197, 0.006501251365989447, -0.06079204007983208, -0.05901619419455528, -0.018177969381213188, 0.1195918545126915, -0.13861660659313202, 0.002292116405442357, 0.08350345492362976, -0.11354091018438339, 0.07396459579467773, -0.00991649366915226, -0.01003199815750122, 0.001424399670213461, -0.11676022410392761, 0.06548072397708893, -0.030751533806324005, 0.01117587648332119, 0.03186134248971939, -0.1990182250738144, -0.014246387407183647, -0.018621446564793587, -0.06826960295438766, -0.01915310136973858, -0.04121942073106766, -0.1146082878112793, 0.07877242565155029, 0.012498083524405956, -0.04989916831254959, -0.05161235108971596, 0.04567568749189377, 0.09004522860050201, -0.02407827414572239, 0.11498425155878067, -0.0014947857707738876, 0.07703181356191635, -0.17707288265228271, -0.011849084869027138, -0.002419485244899988, 0.046082232147455215, 0.015365873463451862, -0.02942272648215294, 0.03892732039093971, -0.03737659379839897, 0.20021329820156097, -0.032095231115818024, 0.047299157828092575, 0.05322040989995003, 0.009054365567862988, 0.018192708492279053, 0.07973583042621613, 0.05174921080470085, -0.0022043439093977213, 0.015305216424167156, -0.017708197236061096, -0.019788745790719986, -0.06019303947687149, -0.17382502555847168, 0.007549946196377277, 0.1496126651763916, 0.07694035023450851, 0.0065225763246417046, 0.05784570425748825, -0.10605131834745407, -0.11978410184383392, 0.08597605675458908, -0.03960293531417847, -0.028302432969212532, -0.06709535419940948, 0.1439388543367386, 0.14488859474658966, -0.17050662636756897, 0.0754804015159607, -0.024913184344768524, -0.04368341714143753, -0.09859786927700043, -0.242370143532753, -0.046857502311468124, -0.013966700062155724, -0.009387915022671223, -0.06764639914035797, 0.07429509609937668, 0.09228599071502686, 0.001439539366401732, -0.006251701153814793, 0.07852111011743546, 
-0.008287394419312477, -0.0394451729953289, 0.038348354399204254, 0.05724306404590607, 0.041912540793418884, -0.05214614421129227, 0.031323403120040894, -0.024601485580205917, 0.04464741051197052, 0.0574897937476635, 0.0345153883099556, -0.04462351277470589, 0.008744640275835991, -0.020539667457342148, -0.12448257207870483, 0.025628963485360146, -0.011166383512318134, -0.03917885199189186, 0.1971920281648636, 0.024749260395765305, -0.0034928719978779554, -0.025841381400823593, 0.21154235303401947, -0.09122364223003387, -0.09171883016824722, -0.14029917120933533, 0.05841328203678131, -0.06580410152673721, 0.04205580800771713, 0.03190790116786957, -0.10763497650623322, 0.012300328351557255, 0.14218933880329132, 0.1494099646806717, -0.024202195927500725, 0.010128575377166271, 0.0282515250146389, 0.0017739603063091636, -0.06487628817558289, 0.03613249212503433, 0.025708265602588654, 0.22918182611465454, -0.06695921719074249, 0.08480982482433319, -0.016225671395659447, -0.1034519299864769, -0.006495639216154814, 0.1166657879948616, -0.021111326292157173, 0.028671205043792725, -0.06774675101041794, 0.12927310168743134, -0.07114247977733612, -0.20890556275844574, 0.0584135577082634, -0.051397655159235, -0.12450830638408661, -0.015902694314718246, 0.017047706991434097, -0.019709410145878792, 0.016230972483754158, 0.06393105536699295, -0.036894071847200394, 0.18521425127983093, 0.030565012246370316, -0.06777521967887878, -0.021683910861611366, 0.06131802126765251, -0.12202678620815277, 0.2573108673095703, 0.017337318509817123, 0.06284120678901672, 0.11258864402770996, -0.006835939362645149, -0.14277410507202148, 0.009599477052688599, 0.09251543879508972, -0.09266579151153564, 0.0587080754339695, 0.19926875829696655, -0.0007732061203569174, 0.13975881040096283, 0.0839780643582344, -0.0344182550907135, 0.04127489775419235, -0.08726184070110321, -0.0720178410410881, -0.12734316289424896, 0.08131074905395508, -0.05604798346757889, 0.15387770533561707, 0.10378613322973251, -0.06060708314180374, 0.005130453035235405, -0.019448773935437202, 0.07114747911691666, 0.0073745884001255035, 0.11103936284780502, 0.005517042241990566, -0.1685517281293869, 0.015420843847095966, 0.069615438580513, 0.11372725665569305, -0.18075324594974518, -0.06947632879018784, 0.04605415090918541, -0.022277889773249626, -0.06049257516860962, 0.11935853958129883, 0.060321468859910965, 0.04495978727936745, -0.028974102810025215, -0.05156175419688225, -0.006581826135516167, 0.10672055929899216, -0.10985933244228363, -0.029370471835136414 ]
null
null
peft
# SpecCoder 7bn v1 This model is a fine-tuned version of [DeepSeek Coder 7b Instruct v1.5](https://huggingface.co/deepseek-ai/deepseek-coder-7b-instruct-v1.5) on a synthetic dataset of ~26k Solidity smart contracts. It achieves the following results on the evaluation set: - Loss: 0.5607 ## Model description The model was fine-tuned with the LoRA framework. ## Intended uses & limitations To generate Solidity smart contracts. ## Training and evaluation data A 5% held-out split was used for validation. ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-06 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - distributed_type: multi-GPU - num_devices: 8 - total_train_batch_size: 32 - total_eval_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 10 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.5899 | 1.0 | 780 | 0.5623 | | 0.8224 | 2.0 | 1560 | 0.5606 | | 0.5702 | 3.0 | 2340 | 0.5607 | ### Framework versions - PEFT 0.8.2 - Transformers 4.37.2 - Pytorch 2.2.0+cu121 - Datasets 2.16.1 - Tokenizers 0.15.1
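The card gives training details but no usage snippet. A minimal sketch, assuming the LoRA weights live at the repo id recorded below (`asadmasad/ds-7-lora-finetuned`) and apply on top of the base model named in the card; the prompt format and generation settings are illustrative assumptions:

```python
# Hedged sketch: attach the LoRA adapter described in this card to its DeepSeek
# Coder base model via PEFT, then generate a Solidity contract. The prompt and
# generation settings are assumptions for illustration only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "deepseek-ai/deepseek-coder-7b-instruct-v1.5"   # base model named in the card
adapter_id = "asadmasad/ds-7-lora-finetuned"              # repo id from this record

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)  # wrap the base with the LoRA weights

prompt = "Write a Solidity ERC-20 token contract with a capped total supply."
inputs = tokenizer(prompt, return_tensors="pt").to(base.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```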
{"language": ["en"], "license": "other", "library_name": "peft", "base_model": "deepseek-ai/deepseek-coder-7b-instruct-v1.5", "pipeline_tag": "text-generation", "model-index": [{"name": "ds-7-lora-finetuned", "results": []}]}
text-generation
asadmasad/ds-7-lora-finetuned
[ "peft", "safetensors", "text-generation", "conversational", "en", "base_model:deepseek-ai/deepseek-coder-7b-instruct-v1.5", "license:other", "endpoints_compatible", "region:us" ]
2024-02-13T20:33:12+00:00
[]
[ "en" ]
TAGS #peft #safetensors #text-generation #conversational #en #base_model-deepseek-ai/deepseek-coder-7b-instruct-v1.5 #license-other #endpoints_compatible #region-us
SpecCoder 7bn v1 ================ This model is a fine-tuned version of DeepSeek Coder 7b Instruct v1.5 on a synthetic dataset of ~26k Solidity smart contracts. It achieves the following results on the evaluation set: * Loss: 0.5607 Model description ----------------- The model was fine-tuned with the LoRA framework. Intended uses & limitations --------------------------- To generate Solidity smart contracts. Training and evaluation data ---------------------------- A 5% held-out split was used for validation. Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 5e-06 * train\_batch\_size: 4 * eval\_batch\_size: 4 * seed: 42 * distributed\_type: multi-GPU * num\_devices: 8 * total\_train\_batch\_size: 32 * total\_eval\_batch\_size: 32 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: cosine * lr\_scheduler\_warmup\_steps: 10 * num\_epochs: 3 ### Training results ### Framework versions * PEFT 0.8.2 * Transformers 4.37.2 * Pytorch 2.2.0+cu121 * Datasets 2.16.1 * Tokenizers 0.15.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 8\n* total\\_train\\_batch\\_size: 32\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ "TAGS\n#peft #safetensors #text-generation #conversational #en #base_model-deepseek-ai/deepseek-coder-7b-instruct-v1.5 #license-other #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 8\n* total\\_train\\_batch\\_size: 32\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ 63, 166, 4, 39 ]
[ "passage: TAGS\n#peft #safetensors #text-generation #conversational #en #base_model-deepseek-ai/deepseek-coder-7b-instruct-v1.5 #license-other #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 8\n* total\\_train\\_batch\\_size: 32\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 10\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* PEFT 0.8.2\n* Transformers 4.37.2\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ -0.08116919547319412, 0.08961661905050278, -0.0025057303719222546, 0.06681334972381592, 0.10440666228532791, 0.044439658522605896, 0.1199614554643631, 0.11797278374433517, -0.08349788188934326, 0.13986293971538544, 0.11136166006326675, 0.07740513235330582, 0.07004021853208542, 0.1860199123620987, -0.02031991444528103, -0.26602408289909363, 0.026732714846730232, -0.04610641673207283, -0.09964632242918015, 0.11396152526140213, 0.06287660449743271, -0.12898540496826172, 0.08243865519762039, -0.04262761026620865, -0.12234269082546234, -0.010082900524139404, -0.06758027523756027, -0.019565634429454803, 0.09417752921581268, 0.0153850382193923, 0.06929251551628113, 0.01736215315759182, 0.1044071838259697, -0.2705119252204895, 0.00545080378651619, 0.08567412197589874, 0.003237623954191804, 0.0587877482175827, 0.07758066058158875, 0.01935901865363121, 0.17020638287067413, -0.12187076359987259, 0.07255320996046066, 0.029513491317629814, -0.10989852249622345, -0.1913844645023346, -0.07468420267105103, 0.06560444831848145, 0.12519827485084534, 0.06254465132951736, -0.0198450218886137, 0.11788766086101532, -0.0901772603392601, 0.07940997183322906, 0.22720247507095337, -0.3042469024658203, -0.0897192433476448, 0.05550668388605118, 0.03425034508109093, 0.10202987492084503, -0.12252714484930038, -0.018235396593809128, 0.049234986305236816, 0.02843816578388214, 0.07482697069644928, 0.01732632704079151, 0.03490301966667175, 0.005275198258459568, -0.15624567866325378, -0.04899024963378906, 0.11144956946372986, 0.08731138706207275, -0.005820523016154766, -0.09175147116184235, -0.055670883506536484, -0.20553192496299744, -0.04704391211271286, -0.018580902367830276, 0.04368797317147255, -0.0554569736123085, -0.06109333783388138, 0.05887782573699951, -0.05589856207370758, -0.08557498455047607, 0.017851606011390686, 0.1405985951423645, 0.07448281347751617, -0.011911041103303432, 0.0174010768532753, 0.09801752120256424, 0.0423501580953598, -0.14840565621852875, -0.02235899493098259, 0.008519512601196766, -0.07160109281539917, -0.0328630767762661, -0.010532701388001442, 0.07334974408149719, 0.06321689486503601, 0.15777678787708282, -0.08313095569610596, 0.09325049817562103, 0.04319256916642189, 0.009150057099759579, -0.05908029153943062, 0.09576067328453064, -0.07064136117696762, -0.06265569478273392, -0.03883271664381027, 0.10130904614925385, 0.03639290854334831, 0.00030493660597130656, -0.05937718227505684, 0.04891612380743027, 0.0838446244597435, 0.04371650889515877, -0.026247477158904076, 0.021984566003084183, -0.04852510243654251, -0.015780750662088394, 0.04477399215102196, -0.09690269082784653, 0.0500747449696064, 0.04978109523653984, -0.05214837193489075, -0.03221862018108368, -0.024013353511691093, 0.009578638710081577, 0.00005277115633361973, 0.1323758363723755, -0.07851184159517288, -0.018437406048178673, -0.08490738272666931, -0.09664641320705414, 0.04797743260860443, -0.05529056861996651, -0.0034134024754166603, -0.08170823007822037, -0.09217077493667603, -0.04521413892507553, 0.0451158843934536, -0.06057743355631828, -0.05728420242667198, -0.07185949385166168, -0.08948151767253876, 0.020373960956931114, -0.0043844436295330524, 0.12681004405021667, -0.06827283650636673, 0.09125200659036636, 0.01112209539860487, 0.07079555094242096, 0.06748660653829575, 0.022727666422724724, -0.058682963252067566, 0.05382626876235008, -0.13466230034828186, 0.030994558706879616, -0.0864211842417717, 0.05011839047074318, -0.1158703863620758, -0.12220854312181473, -0.06247415393590927, -0.00418299762532115, 
0.07298851758241653, 0.1317455917596817, -0.13468223810195923, -0.07184649258852005, 0.19059902429580688, -0.08447528630495071, -0.1434766948223114, 0.1231633797287941, -0.003989961929619312, -0.055475618690252304, 0.018207626417279243, 0.17571048438549042, 0.07127740234136581, -0.11130162328481674, -0.040668174624443054, -0.02413514442741871, 0.10360410064458847, 0.020282743498682976, 0.10470397770404816, -0.003302839584648609, 0.025167200714349747, 0.002858895342797041, -0.02337677963078022, 0.028553955256938934, -0.10449479520320892, -0.08509963750839233, -0.030097993090748787, -0.0853288322687149, -0.0074501438066363335, 0.04625386372208595, 0.030341802164912224, -0.11843743920326233, -0.09349936991930008, 0.0034201794769614935, 0.10142146795988083, -0.0734628438949585, 0.003510918701067567, -0.05516849085688591, 0.043947745114564896, -0.012749982066452503, 0.006134824827313423, -0.13051170110702515, -0.0771649032831192, 0.04832172766327858, -0.05576692521572113, 0.0031459538731724024, -0.022621892392635345, 0.07570984959602356, 0.09117276966571808, -0.04581068083643913, -0.0456162765622139, -0.0514412522315979, -0.0006350161856971681, -0.08184964954853058, -0.24982166290283203, -0.05950789898633957, -0.022890808060765266, 0.13035057485103607, -0.21684883534908295, 0.019982967525720596, -0.004853461403399706, 0.1348758190870285, 0.005081417504698038, -0.05533439293503761, -0.02110895700752735, 0.07515067607164383, -0.02336582913994789, -0.07182558625936508, 0.02361452765762806, -0.009899403899908066, -0.07479939609766006, -0.060209259390830994, -0.15474539995193481, 0.1671774983406067, 0.10109145939350128, 0.015941495075821877, -0.09185013920068741, -0.0180872343480587, -0.07770694047212601, -0.05663603916764259, -0.032278019934892654, 0.02859381213784218, 0.10424364358186722, -0.008880765177309513, 0.08872001618146896, -0.0861799567937851, -0.054098691791296005, 0.04489127919077873, 0.012508506886661053, -0.00968153402209282, 0.16498561203479767, 0.11755120754241943, -0.09078742563724518, 0.1286376565694809, 0.10732432454824448, -0.038747549057006836, 0.11681240797042847, -0.06202588975429535, -0.08136585354804993, -0.037194591015577316, 0.058430664241313934, 0.016283588483929634, 0.12731145322322845, -0.08753932267427444, 0.018659524619579315, 0.0038962934631854296, 0.027308976277709007, 0.026561617851257324, -0.17699696123600006, -0.022269533947110176, 0.02297079563140869, -0.062267743051052094, -0.023134959861636162, -0.021820534020662308, -0.005851496011018753, 0.10469014942646027, 0.018986059352755547, -0.01455327495932579, -0.030943164601922035, -0.009610624052584171, -0.09723164141178131, 0.21195226907730103, -0.09960868954658508, -0.1092161163687706, -0.09481000155210495, 0.03611483797430992, -0.03646795451641083, -0.021434973925352097, 0.04511595144867897, -0.11548273265361786, -0.0333574041724205, -0.0982750877737999, -0.02856050804257393, -0.043685272336006165, 0.03120519407093525, 0.04061508551239967, 0.02366333082318306, 0.04783087968826294, -0.09319224953651428, 0.009804606437683105, -0.02091749757528305, -0.05081680044531822, 0.018288705497980118, 0.04426836967468262, 0.10425615310668945, 0.1289433091878891, 0.03049124777317047, 0.03195849806070328, -0.020255912095308304, 0.19433988630771637, -0.0770636647939682, -0.023307083174586296, 0.07349295914173126, 0.023195896297693253, 0.039305541664361954, 0.1414748579263687, 0.05534931644797325, -0.09894907474517822, 0.016415735706686974, 0.046759989112615585, -0.022408831864595413, -0.205655038356781, -0.04798310250043869, 
-0.0477316789329052, 0.02160676382482052, 0.1096186712384224, 0.05221936106681824, -0.052887048572301865, 0.01948981173336506, -0.031282175332307816, 0.015268057584762573, -0.006794163025915623, 0.06659489125013351, 0.04444524645805359, 0.04516274482011795, 0.092131108045578, -0.041488196700811386, -0.03511250391602516, 0.03786558657884598, -0.02274453453719616, 0.19493208825588226, 0.0039578573778271675, 0.10062869638204575, 0.045907456427812576, 0.1737101674079895, -0.015513312071561813, 0.05470728129148483, 0.022896915674209595, -0.03096778877079487, 0.01892795041203499, -0.06668201833963394, -0.019618578255176544, 0.05136111006140709, 0.023929499089717865, 0.04298042505979538, -0.09481539577245712, 0.04429198428988457, 0.06253769248723984, 0.258461058139801, 0.058882951736450195, -0.29660746455192566, -0.08742683380842209, 0.017597604542970657, -0.03664083778858185, -0.020499885082244873, 0.037183381617069244, 0.15064989030361176, -0.06718041747808456, 0.0678659975528717, -0.05476028844714165, 0.0714234784245491, -0.029339605942368507, 0.021090582013130188, 0.08590030670166016, 0.11974357068538666, 0.002786818193271756, 0.06725836545228958, -0.25245264172554016, 0.2707517445087433, -0.002609516493976116, 0.08545290678739548, -0.05155066028237343, 0.012531102634966373, 0.001783235464245081, -0.006644141394644976, 0.08240722864866257, -0.001402994617819786, -0.1099291443824768, -0.17008262872695923, -0.10743388533592224, 0.0567476749420166, 0.15215390920639038, -0.07103873789310455, 0.11882734298706055, -0.008423985913395882, -0.02384473942220211, 0.03924616053700447, -0.07083743810653687, -0.11385631561279297, -0.09139660000801086, 0.007371220272034407, -0.03425471857190132, -0.004319718107581139, -0.0765787735581398, -0.09540924429893494, -0.12455986440181732, 0.16399461030960083, -0.11649708449840546, -0.007129426579922438, -0.1135060265660286, 0.07594317197799683, 0.13291624188423157, -0.06862083822488785, 0.03249603509902954, 0.010562309995293617, 0.08353906869888306, 0.01824544556438923, -0.05012689530849457, 0.11724374443292618, -0.09690078347921371, -0.21414092183113098, -0.04751820117235184, 0.13227790594100952, 0.0359160490334034, 0.05676880106329918, -0.03182290121912956, 0.02482701651751995, -0.011102409102022648, -0.11384046077728271, 0.06569714844226837, 0.0663968026638031, 0.059740275144577026, 0.04738123342394829, -0.06736883521080017, 0.056140851229429245, -0.036075204610824585, -0.04082293435931206, 0.10559671372175217, 0.33127784729003906, -0.087220199406147, 0.03078503906726837, 0.03324110433459282, -0.04962733015418053, -0.16783343255519867, -0.0033581957686692476, 0.08413244038820267, 0.027233947068452835, 0.03368727117776871, -0.1869574338197708, 0.07469066977500916, 0.11436004936695099, -0.019924232736229897, 0.09562017768621445, -0.33974072337150574, -0.12493431568145752, 0.054095521569252014, 0.1097240075469017, 0.003149496391415596, -0.18806703388690948, -0.04439815133810043, -0.002798600820824504, -0.11923448741436005, 0.07155106216669083, -0.05258183181285858, 0.11177277565002441, -0.018180783838033676, -0.010202793404459953, 0.0094267837703228, -0.05894067883491516, 0.14415982365608215, -0.020653871819376945, 0.07596288621425629, -0.02612939663231373, 0.021413959562778473, -0.017528077587485313, -0.06336526572704315, 0.013862616382539272, -0.1026887372136116, 0.04805869236588478, -0.07692230492830276, -0.011987397447228432, -0.07088573276996613, 0.015460550785064697, -0.05857030674815178, -0.03128288313746452, -0.04841221496462822, 0.048894379287958145, 
0.07524491846561432, -0.009196033701300621, 0.12948814034461975, 0.007815499790012836, 0.17122186720371246, 0.08799669146537781, 0.07893205434083939, 0.004296842962503433, -0.0443173348903656, -0.0094153992831707, -0.015750402584671974, 0.03803302347660065, -0.13083219528198242, 0.004251535981893539, 0.14811134338378906, 0.041275762021541595, 0.10922978818416595, 0.06454334408044815, -0.05798310041427612, -0.012051267549395561, 0.0760246068239212, -0.13528338074684143, -0.17674142122268677, 0.008204285986721516, -0.014757132157683372, -0.14324265718460083, 0.0380941703915596, 0.07254987210035324, -0.0659923106431961, -0.0003700870438478887, -0.009083051234483719, 0.06540210545063019, -0.03284139186143875, 0.18789294362068176, 0.028817487880587578, 0.08164418488740921, -0.09311337769031525, 0.08954893797636032, 0.029032699763774872, -0.12248459458351135, 0.04994863644242287, 0.10039298236370087, -0.0652943104505539, -0.012397680431604385, 0.10083611309528351, 0.0911022275686264, 0.058722976595163345, -0.039665866643190384, -0.12537772953510284, -0.13560830056667328, 0.08397036790847778, 0.10648085922002792, 0.05006923899054527, 0.03652513027191162, 0.0015454803360626101, 0.05551727116107941, -0.11887624114751816, 0.1028917133808136, 0.043006688356399536, 0.08139839768409729, -0.1505391001701355, 0.11705764383077621, -0.004390579182654619, -0.00028660643147304654, -0.007212274707853794, 0.04149031266570091, -0.15642865002155304, -0.00807337835431099, -0.06570790708065033, -0.03757113963365555, -0.07875575125217438, -0.0018463230226188898, 0.015844913199543953, -0.029754461720585823, -0.07062274217605591, 0.01559047494083643, -0.08978661894798279, -0.05180021747946739, -0.023744624108076096, 0.07998468726873398, -0.12539111077785492, 0.007729270029813051, 0.028638985008001328, -0.11517247557640076, 0.07054968178272247, 0.04067464545369148, 0.04456363618373871, 0.025762783363461494, -0.13038194179534912, 0.027573632076382637, 0.04477858543395996, 0.0011917161755263805, 0.03174671158194542, -0.12039490789175034, -0.007189166732132435, -0.025370024144649506, 0.01085666287690401, 0.018256502225995064, 0.048855338245630264, -0.11928289383649826, 0.02677309699356556, -0.06259255856275558, -0.06648104637861252, -0.038604985922575, 0.016674144193530083, 0.07115665823221207, -0.022939864546060562, 0.13624894618988037, -0.08955854177474976, 0.03060542792081833, -0.23612940311431885, -0.025838108733296394, 0.000636901066172868, -0.055487923324108124, -0.07866781204938889, -0.021602405235171318, 0.10091603547334671, -0.0427401028573513, 0.1259942501783371, -0.04477383568882942, 0.01475464180111885, 0.035327088087797165, -0.0829094871878624, 0.018599962815642357, 0.0540584959089756, 0.20084117352962494, 0.03647582605481148, -0.03622031584382057, 0.03734765946865082, 0.0008353304001502693, 0.07257858663797379, 0.054931554943323135, 0.20656585693359375, 0.17106543481349945, -0.01360391266644001, 0.08363381773233414, 0.07926836609840393, -0.14210906624794006, -0.10478587448596954, 0.12879931926727295, -0.06425484269857407, 0.1191500648856163, -0.028806816786527634, 0.14086827635765076, 0.10289608687162399, -0.20108532905578613, 0.0028585398104041815, -0.05571160092949867, -0.07791538536548615, -0.11287353932857513, -0.01811886765062809, -0.09790903329849243, -0.18292607367038727, 0.028961479663848877, -0.1278804987668991, 0.03522436320781708, 0.08061081171035767, 0.049541618674993515, 0.030636481940746307, 0.15597619116306305, 0.057319268584251404, 0.054689228534698486, 0.052344877272844315, 
0.028148846700787544, -0.02312067523598671, -0.01425123866647482, -0.10179010033607483, 0.022225655615329742, -0.07545062154531479, 0.07387889176607132, -0.051885802298784256, -0.08218702673912048, 0.0829269140958786, 0.011305814608931541, -0.08621250092983246, 0.02862706407904625, -0.019634488970041275, 0.036834053695201874, 0.07514385879039764, 0.033416613936424255, -0.022319065406918526, -0.02997124381363392, 0.18711277842521667, -0.08683710545301437, -0.04853413626551628, -0.1138145849108696, 0.24969492852687836, 0.017662614583969116, 0.005625060759484768, 0.031855687499046326, -0.07173047959804535, -0.027034197002649307, 0.14521615207195282, 0.1709156334400177, -0.04293767735362053, -0.016439272090792656, 0.0379488468170166, -0.0030996992718428373, -0.026460738852620125, 0.10004007816314697, 0.11658500134944916, 0.08034861087799072, -0.06641628593206406, -0.03812886029481888, -0.03707262873649597, -0.04435378313064575, -0.056964144110679626, 0.02585584484040737, 0.05056993290781975, -0.017979025840759277, -0.02224830724298954, 0.09162844717502594, -0.06770863384008408, -0.08349522948265076, 0.09145055711269379, -0.20610740780830383, -0.17707638442516327, -0.020876677706837654, 0.0938676968216896, 0.0027875700034201145, 0.05301930010318756, -0.01309208758175373, -0.04541606083512306, 0.08960239589214325, -0.010870004072785378, -0.055225927382707596, -0.10439514368772507, 0.03303535282611847, -0.05604441463947296, 0.1922978013753891, -0.032089218497276306, 0.05581330135464668, 0.11956910789012909, 0.026363976299762726, -0.08759874105453491, 0.0430867001414299, 0.08001555502414703, -0.1292235106229782, 0.012691550888121128, 0.12321045994758606, -0.054058242589235306, 0.10452087223529816, 0.06360263377428055, -0.096180260181427, -0.00429017748683691, -0.06283566355705261, -0.03416689857840538, -0.060467157512903214, -0.0008267327211797237, -0.03855368122458458, 0.1605164110660553, 0.2148650884628296, -0.05457605421543121, -0.0032359035685658455, -0.026425501331686974, 0.051291629672050476, 0.032996006309986115, 0.14934062957763672, -0.015550891868770123, -0.2904152274131775, 0.033034924417734146, -0.007790134754031897, 0.03237990662455559, -0.2066020518541336, -0.08651521801948547, 0.03687984123826027, -0.03405413776636124, -0.0930708721280098, 0.10265741497278214, 0.09763026237487793, 0.04090205952525139, -0.06869292259216309, -0.15827588737010956, -0.033742740750312805, 0.17884403467178345, -0.14728021621704102, -0.09104027599096298 ]
null
null
null
GGUF quants with iMatrix for https://huggingface.co/NousResearch/Yarn-Llama-2-70b-32k. iMatrix data (Wiki-c512-ch1k) courtesy of Artefact2. Quant: IQ1_S ("v3"), sized for full offload on 16GB VRAM and a good partial offload on 12GB VRAM. Other quants: https://huggingface.co/Artefact2/Yarn-Llama-2-70b-32k-GGUF
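A minimal usage sketch, not part of the original card: the quant above targets llama.cpp-compatible runtimes, and the snippet below shows one way to load a GGUF file through the llama-cpp-python bindings with GPU offload. The local file name, context size, and prompt are illustrative assumptions; lowering `n_gpu_layers` gives the partial offload mentioned for 12GB VRAM.

```python
# Sketch only: assumes the IQ1_S GGUF file has been downloaded locally and that
# llama-cpp-python is installed with GPU support. The file name is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="Yarn-Llama-2-70b-32k.IQ1_S.gguf",  # hypothetical local path
    n_gpu_layers=-1,   # -1 offloads every layer (full offload, e.g. on 16GB VRAM)
    n_ctx=32768,       # the model is YaRN-extended to a 32k context
)

out = llm("Q: What does an importance matrix (iMatrix) improve?\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```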
{}
null
Nexesenex/NousResearch_Yarn-Llama-2-70b-32k-iMat.GGUF
[ "gguf", "region:us" ]
2024-02-13T20:33:17+00:00
[]
[]
TAGS #gguf #region-us
GGUF quants with iMatrix for: URL. iMatrix data (Wiki-c512-ch1k) courtesy of Artefact2. Quant: IQ1_S ("v3"), sized for full offload on 16GB VRAM and a good partial offload on 12GB VRAM. Other quants: URL
[]
[ "TAGS\n#gguf #region-us \n" ]
[ 9 ]
[ "passage: TAGS\n#gguf #region-us \n" ]
[ 0.030724648386240005, 0.026499787345528603, -0.010017825290560722, -0.05703527107834816, 0.08247160166501999, 0.07200847566127777, 0.01814177818596363, 0.020192064344882965, 0.2235025018453598, 0.017216520383954048, 0.1496623009443283, -0.031233953312039375, 0.006174509879201651, 0.05538657680153847, 0.039407629519701004, -0.19438467919826508, 0.058440499007701874, -0.02356063388288021, -0.020945189520716667, 0.01803453452885151, -0.05310691148042679, -0.04108472168445587, 0.022135348990559578, -0.07881014049053192, -0.15867982804775238, 0.0678698718547821, 0.017852067947387695, 0.0007025183876976371, 0.0820731669664383, 0.05882885307073593, 0.09657382220029831, -0.024203501641750336, -0.15220364928245544, -0.18796531856060028, 0.0366438589990139, -0.02974788099527359, -0.10282598435878754, 0.022019000723958015, 0.029453158378601074, -0.06967076659202576, 0.02238346077501774, 0.1427535116672516, -0.10206039994955063, 0.051592033356428146, -0.27165159583091736, -0.1715938150882721, -0.06585682183504105, -0.025845954194664955, -0.007345964200794697, 0.01241085771471262, -0.0010092189768329263, 0.047266922891139984, -0.20188692212104797, -0.005631127394735813, 0.09329266101121902, -0.25229454040527344, 0.02776304818689823, 0.21345718204975128, -0.010520953685045242, 0.09873088449239731, -0.05590669438242912, 0.14438565075397491, 0.03173782303929329, -0.019559340551495552, -0.1924813836812973, -0.070224329829216, -0.07177317887544632, 0.162109375, -0.0823177620768547, -0.11764442175626755, 0.24176421761512756, 0.009283576160669327, -0.026472626253962517, 0.15598991513252258, -0.029037300497293472, -0.009749599732458591, 0.04555726423859596, 0.01668328419327736, -0.010545015335083008, 0.1551385223865509, 0.17108163237571716, -0.08598228543996811, -0.10847756266593933, -0.030579885467886925, -0.2373785674571991, 0.2470305860042572, -0.01911027915775776, 0.12945520877838135, -0.20086053013801575, 0.018443629145622253, -0.3247532844543457, -0.0012029389617964625, -0.010316703468561172, -0.028618358075618744, -0.006935348734259605, 0.009301352314651012, -0.050316113978624344, 0.0739501491189003, 0.14580395817756653, 0.1393439620733261, -0.11465669423341751, 0.060509420931339264, -0.052172139286994934, 0.14876529574394226, 0.05827285721898079, 0.061183393001556396, 0.04079163819551468, 0.07037676870822906, -0.008353544399142265, -0.21633195877075195, -0.029873060062527657, -0.07057386636734009, -0.08445251733064651, -0.0130265261977911, -0.13896764814853668, 0.11386743932962418, -0.022273007780313492, -0.07913482189178467, -0.06810981780290604, 0.07626928389072418, 0.017650218680500984, -0.008536403998732567, -0.035703565925359726, -0.012481719255447388, 0.022218508645892143, -0.014872739091515541, -0.1519843488931656, 0.02295425534248352, 0.10455024242401123, 0.07257117331027985, -0.1489023119211197, -0.011344035156071186, -0.017298875376582146, 0.06959983706474304, 0.03884255141019821, -0.10402916371822357, 0.04283881187438965, -0.10747409611940384, -0.08414466679096222, 0.022628657519817352, -0.005062851123511791, -0.0418001152575016, 0.13524691760540009, 0.03997812792658806, 0.040150050073862076, -0.016940169036388397, -0.04259050637483597, -0.048133596777915955, -0.07602019608020782, 0.07334327697753906, 0.05418020859360695, 0.027240034192800522, -0.1915341019630432, 0.01154522504657507, -0.048245880752801895, 0.09175369143486023, -0.11856856942176819, 0.014575321227312088, -0.08105122298002243, 0.1604209989309311, 0.0349995456635952, 0.09055875241756439, -0.19562625885009766, 
0.02605881541967392, -0.06191767752170563, 0.1854621320962906, -0.04451294615864754, -0.11786319315433502, 0.2698904871940613, -0.09105797111988068, -0.040079716593027115, 0.056803084909915924, 0.06560484319925308, -0.06272535026073456, 0.068723164498806, 0.4434472322463989, -0.06556011736392975, -0.07118581980466843, 0.05080527812242508, 0.17805561423301697, -0.1262815296649933, -0.09372174739837646, 0.09990617632865906, -0.1480535864830017, -0.211008220911026, 0.030864350497722626, 0.028955968096852303, 0.1494358479976654, -0.06205282360315323, -0.012456154450774193, 0.058214303106069565, -0.013022401370108128, 0.046677324920892715, 0.03563477098941803, 0.11109840869903564, -0.06493768095970154, 0.06851828098297119, -0.16232267022132874, 0.016065504401922226, 0.1209988072514534, -0.015012580901384354, -0.04126624017953873, 0.14286154508590698, -0.03809087723493576, 0.07199656218290329, -0.07730832695960999, -0.1804673671722412, 0.027612121775746346, 0.05621999502182007, 0.028122514486312866, 0.09176547825336456, 0.09526687115430832, -0.039257392287254333, 0.0013902259524911642, 0.0329861082136631, 0.061223939061164856, -0.007701692637056112, 0.015235940925776958, -0.015374142676591873, 0.12888981401920319, -0.07010363042354584, -0.04155188798904419, -0.09715848416090012, -0.00889967754483223, 0.2288777232170105, -0.01933911070227623, 0.02257734164595604, -0.06854789704084396, 0.033186767250299454, -0.0012386917369440198, 0.09506335854530334, -0.017756229266524315, 0.06063338369131088, -0.022011179476976395, -0.06201287358999252, 0.11652727425098419, -0.043086208403110504, 0.24556174874305725, 0.10792262107133865, -0.07513239979743958, -0.01741042546927929, -0.0871582105755806, -0.007020947523415089, 0.022898653522133827, 0.08814648538827896, -0.04863424599170685, 0.06471672654151917, -0.037898752838373184, -0.0013588295551016927, 0.018808960914611816, -0.008487841114401817, -0.030526969581842422, -0.04284367710351944, -0.08270563185214996, 0.09057542681694031, 0.0691855251789093, -0.13670015335083008, 0.17748047411441803, 0.2472171038389206, 0.1500423550605774, 0.2487964630126953, -0.06485911458730698, -0.014139159582555294, -0.02016172744333744, 0.03673918917775154, -0.020436765626072884, 0.13109654188156128, -0.18929845094680786, -0.032152432948350906, 0.02558354288339615, 0.029807843267917633, 0.10872193425893784, -0.1365325003862381, -0.1145850270986557, -0.0379912331700325, -0.047677598893642426, -0.08257206529378891, 0.07034620642662048, -0.12104500830173492, 0.03338077291846275, 0.07256745547056198, 0.0073080710135400295, 0.12201625853776932, 0.015417544171214104, -0.055278971791267395, 0.0998256728053093, -0.14543165266513824, -0.2384990155696869, -0.04642500355839729, -0.10990478098392487, 0.001206184271723032, 0.05318264663219452, 0.016633260995149612, -0.21265560388565063, -0.01741623878479004, 0.11141498386859894, 0.06650645285844803, -0.18111048638820648, 0.024138791486620903, 0.029385030269622803, -0.004455238115042448, -0.10212790220975876, -0.012687300331890583, -0.05387670546770096, -0.11039627343416214, -0.0691843032836914, 0.08163908869028091, -0.06936442852020264, 0.11164893209934235, 0.1582336574792862, 0.11141853034496307, 0.11249161511659622, -0.011774544604122639, 0.1976311057806015, -0.14119699597358704, -0.14489109814167023, 0.06405922025442123, -0.014498869888484478, 0.03640124574303627, 0.08232609927654266, 0.04930112138390541, -0.14269955456256866, -0.04848511889576912, -0.007545206230133772, -0.1497725397348404, -0.1323675513267517, -0.05164776369929314, 
-0.10658133774995804, 0.12379065901041031, -0.06248227879405022, 0.10150982439517975, 0.11162466555833817, 0.017522823065519333, 0.11151766777038574, -0.06246228888630867, -0.054680291563272476, -0.04807431995868683, 0.06297076493501663, -0.05410824716091156, -0.04205694422125816, -0.06721562892198563, -0.008002115413546562, 0.1349310278892517, 0.10885956883430481, 0.07581131905317307, 0.2265089601278305, 0.02780294418334961, 0.05355561524629593, 0.040789585560560226, 0.16015571355819702, 0.015284501947462559, -0.0046128155663609505, -0.08788388222455978, -0.014365277253091335, -0.0019687749445438385, -0.031080376356840134, -0.006052241660654545, 0.1340780407190323, -0.2559821307659149, 0.03235609456896782, -0.2989844083786011, 0.11946471780538559, -0.1565471589565277, 0.07426489144563675, 0.05220162868499756, 0.030080994591116905, 0.08841689676046371, 0.035069406032562256, -0.02871096506714821, 0.09149409085512161, 0.11694692075252533, -0.12628670036792755, 0.01540512777864933, 0.04918349161744118, 0.052707213908433914, -0.0142430504783988, 0.0931062400341034, -0.11024625599384308, -0.0737583339214325, -0.0024255106691271067, 0.07025767862796783, -0.2099330574274063, 0.23986183106899261, 0.03523903712630272, -0.10871971398591995, -0.021638909354805946, -0.0547538623213768, 0.03316742554306984, 0.08983159810304642, 0.1342458724975586, 0.11251148581504822, -0.11371640861034393, -0.12470904737710953, 0.029020745307207108, 0.03679748624563217, 0.1757190227508545, -0.09047917276620865, -0.14164063334465027, 0.001811441034078598, 0.05263577029109001, -0.053646381944417953, 0.07645093649625778, -0.05327983945608139, -0.0941789522767067, 0.03495060279965401, 0.04520740360021591, 0.00641082925722003, -0.019971303641796112, 0.08110581338405609, -0.02520396187901497, 0.085345059633255, -0.04878882318735123, 0.00847524031996727, -0.10202991217374802, -0.03634759038686752, 0.04376819357275963, -0.0722225159406662, 0.01614394783973694, -0.09818518906831741, -0.15651735663414001, -0.08556577563285828, -0.15303048491477966, 0.12497064471244812, -0.052672382444143295, 0.10244213044643402, -0.047614291310310364, 0.147609144449234, -0.013274060562252998, 0.030878636986017227, -0.05167607590556145, 0.028036773204803467, 0.011671020649373531, -0.14858771860599518, 0.20959575474262238, -0.1476162225008011, -0.023819662630558014, 0.16589532792568207, 0.05426561459898949, 0.1161220371723175, 0.04555299133062363, -0.0879630371928215, 0.23518426716327667, 0.2702784240245819, -0.0007818902959115803, 0.17838320136070251, 0.2352202981710434, -0.026693791151046753, -0.2436053603887558, -0.07260585576295853, -0.2063993662595749, -0.039628319442272186, 0.0004186074365861714, -0.282958060503006, 0.06042884290218353, 0.17210599780082703, -0.07570867985486984, 0.4319494664669037, -0.22352926433086395, 0.03153151646256447, 0.13982820510864258, -0.04242865741252899, 0.6181237101554871, -0.1820172369480133, -0.16550765931606293, 0.052592549473047256, -0.1248052790760994, 0.11609237641096115, -0.005267696920782328, 0.10048385709524155, -0.00011838242062367499, -0.02595684304833412, 0.03428659215569496, -0.0409976989030838, 0.23620888590812683, 0.018790103495121002, 0.045043930411338806, -0.09004033356904984, -0.1538960188627243, 0.10746775567531586, 0.02556895837187767, -0.10341835021972656, 0.03920651972293854, -0.06092366203665733, -0.10915451496839523, 0.011575369164347649, -0.08317004889249802, 0.03433287888765335, 0.09550272673368454, -0.050003789365291595, -0.0652989074587822, 0.024777809157967567, -0.16975140571594238, 
0.028226720169186592, 0.1660151481628418, -0.08661750704050064, 0.17001861333847046, -0.04084239527583122, -0.0947834923863411, -0.15362800657749176, -0.020637191832065582, -0.07918675988912582, -0.01597081869840622, 0.10419487953186035, -0.11003783345222473, 0.006433290895074606, 0.09035904705524445, 0.002910176757723093, 0.07882846146821976, 0.09883374720811844, -0.08716033399105072, 0.05550702288746834, 0.1730797290802002, -0.21496161818504333, -0.1694899946451187, -0.04902869462966919, -0.1887752115726471, 0.2065081000328064, 0.03903897479176521, 0.04895683750510216, 0.16432031989097595, 0.015995748341083527, -0.010867753997445107, -0.020683420822024345, -0.11664224416017532, 0.00450828718021512, 0.04868127405643463, -0.005741522181779146, -0.11094820499420166, 0.13042977452278137, 0.05625306814908981, -0.010265284217894077, -0.04014173522591591, 0.1808832287788391, -0.06324239075183868, -0.06105973571538925, -0.29144585132598877, 0.07338178157806396, -0.10203809291124344, -0.033191971480846405, 0.08307401835918427, -0.024927617982029915, -0.0012370682088658214, 0.14441034197807312, 0.009444275870919228, 0.1295502781867981, 0.031338974833488464, 0.03218937665224075, 0.14084547758102417, -0.13805074989795685, -0.14429166913032532, -0.029582731425762177, -0.08434601873159409, -0.12847381830215454, -0.016780147328972816, 0.1751313954591751, -0.08363176882266998, -0.12467111647129059, -0.2756369411945343, 0.049299292266368866, -0.0641724020242691, -0.1138453483581543, -0.03101496584713459, -0.06544762849807739, 0.052310146391391754, -0.040101904422044754, 0.014005003497004509, -0.023109296336770058, -0.14451682567596436, 0.0458921417593956, 0.06695213168859482, 0.03172319754958153, -0.02931683138012886, 0.0015236766776069999, 0.15014788508415222, 0.026510147377848625, 0.16621503233909607, 0.22043149173259735, 0.061838917434215546, 0.20056213438510895, -0.2713247239589691, -0.10004157572984695, 0.10868333280086517, -0.07527677714824677, 0.021882841363549232, 0.13841275870800018, -0.01911449432373047, -0.0495067797601223, -0.03201347589492798, 0.08917038887739182, -0.017281996086239815, -0.08984966576099396, -0.04857974499464035, -0.003589637577533722, -0.18503929674625397, -0.0007536212215200067, -0.15319249033927917, 0.1420021951198578, 0.04460230842232704, -0.062356118112802505, 0.07465137541294098, 0.05997058004140854, 0.03977793827652931, 0.006764960940927267, 0.018739836290478706, -0.14650356769561768, 0.01704270951449871, -0.025170978158712387, -0.006106532644480467, 0.03402095288038254, 0.34655115008354187, -0.0466112419962883, -0.07675225287675858, -0.019784720614552498, 0.1001124382019043, 0.13863220810890198, -0.009452453814446926, 0.13600659370422363, 0.13898764550685883, -0.07470680773258209, -0.12456237524747849, 0.10025309771299362, -0.04034053534269333, -0.15969179570674896, 0.12802298367023468, -0.0435095950961113, -0.016280202195048332, 0.04011611267924309, -0.03383811563253403, -0.08241409808397293, 0.04869242012500763, -0.08193223923444748, -0.03468599542975426, -0.03921830281615257, -0.019609715789556503, -0.02835456281900406, 0.179523304104805, -0.03646359592676163, 0.07318142801523209, -0.02748848870396614, 0.010194642469286919, -0.10395175963640213, -0.1028568297624588, 0.05173351243138313, -0.12340104579925537, 0.07964924722909927, -0.03694985434412956, 0.030445387586951256, 0.22815105319023132, 0.02754553034901619, 0.015633730217814445, 0.13255921006202698, -0.00819331593811512, -0.0877854973077774, 0.03996758162975311, -0.044342756271362305, 0.021794743835926056, 
-0.030855976045131683, -0.07628626376390457, -0.0880078375339508, -0.10075201094150543, -0.049825526773929596, 0.03320961445569992, -0.030442843213677406, -0.05212388187646866, -0.14976045489311218, -0.02720625326037407, -0.07237301766872406, 0.11920249462127686, -0.09342960268259048, 0.08832328021526337, -0.012045936658978462, 0.0026839354541152716, 0.037163145840168, 0.1505078673362732, 0.010094218887388706, 0.10494716465473175, 0.006677085533738136, 0.09218452870845795, -0.06759306788444519, 0.14643312990665436, -0.12665413320064545, -0.02135086990892887, -0.03415476530790329, 0.2331210970878601, 0.20847657322883606, -0.11358945816755295, 0.009311644360423088, 0.03202449902892113, 0.04839635267853737, 0.185939759016037, 0.12599588930606842, 0.01761433109641075, 0.33329761028289795, -0.059357043355703354, -0.02227349951863289, 0.05721667781472206, -0.00022221643303055316, -0.06214975565671921, 0.0716261938214302, 0.08921460807323456, 0.013963594101369381, -0.1257423460483551, 0.11072274297475815, -0.21343208849430084, 0.15216094255447388, 0.07192383706569672, -0.18375952541828156, -0.009178245440125465, -0.05186039209365845, 0.008210902102291584, -0.027973614633083344, 0.13407447934150696, -0.07003656774759293, -0.1739543378353119, -0.19977876543998718, 0.060681428760290146, -0.35512542724609375, -0.20812080800533295, 0.06384200602769852, 0.1383514702320099, 0.10808566957712173, -0.06061858683824539, -0.013316533528268337, 0.006446295417845249, 0.01029437780380249, -0.019556531682610512, 0.028526417911052704, -0.008326482027769089, -0.05453765019774437, -0.25444141030311584, -0.006056090816855431, 0.0625600665807724, -0.15240277349948883, 0.05618175491690636, -0.017780732363462448, -0.008800189942121506, 0.13029517233371735, -0.021711476147174835, 0.03442413732409477, 0.00029493181500583887, -0.16273388266563416, 0.031801287084817886, 0.035038504749536514, 0.03614772483706474, -0.010639974847435951, -0.04227915778756142, -0.002239778870716691, 0.07848605513572693, -0.054354216903448105, -0.1438787877559662, 0.11021588742733002, -0.026462025940418243, 0.21526864171028137, -0.06517954170703888, -0.033111389726400375, 0.023098714649677277, -0.07031320035457611, 0.2018292248249054, -0.03690796345472336, 0.05650625377893448, 0.1586160659790039, 0.018734993413090706, 0.019857894629240036, -0.30062609910964966, 0.08813683688640594, -0.024517416954040527, 0.006894893944263458, -0.05270370468497276 ]
null
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.8.2
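Since the card's "How to Get Started" section is left as [More Information Needed], here is a minimal sketch of what loading this PEFT adapter on its stated base model could look like. It assumes the adapter targets causal language modelling, which the card does not confirm; the prompt and generation settings are placeholders.

```python
# Sketch under assumptions: the adapter at nes07/mistral-7b-do-e2e is loaded on top
# of the base model declared in the card via the standard PEFT API.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "nes07/mistral-7b-do-e2e"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```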
{"library_name": "peft", "base_model": "mistralai/Mistral-7B-Instruct-v0.2"}
null
nes07/mistral-7b-do-e2e
[ "peft", "tensorboard", "safetensors", "arxiv:1910.09700", "base_model:mistralai/Mistral-7B-Instruct-v0.2", "region:us" ]
2024-02-13T20:37:37+00:00
[ "1910.09700" ]
[]
TAGS #peft #tensorboard #safetensors #arxiv-1910.09700 #base_model-mistralai/Mistral-7B-Instruct-v0.2 #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.8.2
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ "TAGS\n#peft #tensorboard #safetensors #arxiv-1910.09700 #base_model-mistralai/Mistral-7B-Instruct-v0.2 #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ 46, 6, 3, 54, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4, 11 ]
[ "passage: TAGS\n#peft #tensorboard #safetensors #arxiv-1910.09700 #base_model-mistralai/Mistral-7B-Instruct-v0.2 #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2" ]
[ -0.11005459725856781, 0.20163951814174652, -0.0035966800060123205, 0.026700038462877274, 0.07522141933441162, 0.01652722992002964, 0.07247468829154968, 0.12883290648460388, 0.02812971919775009, 0.13037152588367462, 0.058048464357852936, 0.11483261734247208, 0.11362606287002563, 0.20114298164844513, -0.005471509415656328, -0.18043509125709534, 0.021887680515646935, -0.06085912883281708, 0.020198648795485497, 0.1274448037147522, 0.13497017323970795, -0.09239913523197174, 0.07295332103967667, -0.02170969918370247, -0.012601199559867382, -0.032460540533065796, -0.0648520216345787, -0.011631972156465054, 0.04732809215784073, 0.034687288105487823, 0.05745432525873184, -0.00008365680696442723, 0.08657443523406982, -0.26058104634284973, 0.013964368030428886, 0.0542697049677372, -0.01270007248967886, 0.08525513857603073, 0.10747287422418594, -0.04662379249930382, 0.11477144807577133, -0.04431706666946411, 0.134995698928833, 0.08009204268455505, -0.10757938027381897, -0.21277353167533875, -0.06918945908546448, 0.07900398969650269, 0.1814880073070526, 0.06827759742736816, -0.03866032138466835, 0.12086976319551468, -0.06236695870757103, 0.025052931159734726, 0.08825783431529999, -0.11256146430969238, -0.06688081473112106, 0.082643061876297, 0.13116984069347382, 0.08507762104272842, -0.1210196316242218, -0.03631440922617912, 0.026673709973692894, 0.04620568826794624, 0.07228352874517441, 0.008282438851892948, 0.17713725566864014, 0.0362890250980854, -0.14388036727905273, -0.05260264500975609, 0.09574218094348907, 0.0037930181715637445, -0.037303030490875244, -0.2194669246673584, -0.014829638414084911, -0.09240177273750305, -0.03722438961267471, -0.055330999195575714, 0.03412413224577904, 0.01384571474045515, 0.1076611578464508, -0.05132735148072243, -0.07961571961641312, -0.012441459111869335, 0.11537463217973709, 0.06377547234296799, 0.011325653642416, -0.020242629572749138, -0.0030128145590424538, 0.12158112227916718, 0.068356454372406, -0.12469957023859024, -0.055848702788352966, -0.060070816427469254, -0.03400525450706482, -0.020758500322699547, 0.05025280639529228, 0.010998870246112347, 0.035184595733881, 0.2646316885948181, -0.017974352464079857, 0.06756756454706192, 0.043896857649087906, 0.0156417116522789, 0.026632798835635185, 0.10410027205944061, -0.037555061280727386, -0.19288389384746552, -0.013176835142076015, 0.10054730623960495, 0.006715704221278429, -0.029446851462125778, -0.05178588628768921, 0.020186113193631172, 0.0419967882335186, 0.11880776286125183, 0.10641331225633621, -0.02229263447225094, -0.060212694108486176, -0.06173035129904747, 0.207918182015419, -0.15331804752349854, 0.05768301710486412, 0.02754295989871025, -0.005362172145396471, -0.0623653419315815, 0.01926332525908947, 0.0016047041863203049, -0.04339689016342163, 0.10307925194501877, -0.061971765011548996, -0.03986029326915741, -0.11947621405124664, -0.048466697335243225, 0.03591301664710045, -0.026588937267661095, -0.05572141706943512, -0.027149029076099396, -0.08369230479001999, -0.10554058104753494, 0.10127554088830948, -0.05512791499495506, -0.05090269818902016, -0.027961917221546173, -0.0668550357222557, 0.028812354430556297, 0.027274467051029205, 0.061151739209890366, -0.028421077877283096, 0.043536871671676636, -0.03260212019085884, 0.06901808828115463, 0.08473734557628632, 0.03871752694249153, -0.07206472009420395, 0.06833932548761368, -0.17687472701072693, 0.08178167045116425, -0.060581643134355545, 0.025721583515405655, -0.16274085640907288, 0.004659113474190235, 0.005039842799305916, 0.02710006572306156, 
0.05084904283285141, 0.15026958286762238, -0.1957567036151886, -0.03217093646526337, 0.17966242134571075, -0.10135950893163681, -0.1094222217798233, 0.03621042147278786, -0.04602672904729843, 0.16922710835933685, 0.04013962298631668, 0.012035398744046688, 0.0975765511393547, -0.15438184142112732, -0.013515327125787735, -0.03538142517209053, 0.012560931034386158, 0.05384664982557297, 0.06582455337047577, -0.08259595930576324, 0.008543440140783787, 0.008608016185462475, -0.04720229655504227, -0.018620548769831657, -0.036632340401411057, -0.09199238568544388, 0.0057491157203912735, -0.07812433689832687, 0.0057044909335672855, 0.008361929096281528, -0.08591339737176895, -0.01097843050956726, -0.14100462198257446, -0.009545885026454926, 0.07069354504346848, 0.00639401376247406, -0.005693669430911541, -0.07847297191619873, 0.03638138622045517, -0.05526026710867882, -0.013197780586779118, -0.1535767763853073, -0.003397177206352353, 0.02166792005300522, -0.1436336785554886, 0.012558945454657078, -0.14282190799713135, 0.07540597766637802, 0.012133589014410973, -0.05982504040002823, -0.0339570976793766, 0.0255888719111681, -0.009153253398835659, -0.07313985377550125, -0.22400832176208496, -0.030789807438850403, -0.05429824814200401, 0.11528600752353668, -0.21535146236419678, 0.04933593422174454, 0.00785841140896082, 0.11688604205846786, 0.013987415470182896, -0.06909734755754471, 0.027225371450185776, -0.05747755989432335, -0.02227303385734558, -0.0725458562374115, -0.00412127235904336, 0.0021293172612786293, -0.02146938629448414, 0.02878858707845211, -0.15052732825279236, -0.07253512740135193, 0.09104260802268982, 0.08480095118284225, -0.14407381415367126, 0.007187587674707174, -0.036201320588588715, -0.06233687326312065, -0.0744178518652916, -0.06917433440685272, 0.06551656872034073, 0.050435952842235565, 0.05165539309382439, -0.08584674447774887, -0.07369772344827652, 0.00037993781734257936, -0.012960086576640606, -0.028873829171061516, 0.12414303421974182, 0.07643704116344452, -0.09972136467695236, 0.09953218698501587, 0.07417118549346924, 0.03155597671866417, 0.09870006144046783, -0.005474020726978779, -0.10043563693761826, -0.035906244069337845, 0.05830102413892746, 0.020956698805093765, 0.15284894406795502, -0.060526516288518906, 0.03974595293402672, 0.04266728088259697, -0.04023314267396927, 0.04180866852402687, -0.09666601568460464, 0.014699176885187626, 0.009571230970323086, -0.014223182573914528, 0.0313471183180809, -0.02805047668516636, 0.010896158404648304, 0.08903627842664719, 0.0674998089671135, 0.029385944828391075, 0.01622944511473179, -0.03919723257422447, -0.13904616236686707, 0.17320752143859863, -0.08991904556751251, -0.21730844676494598, -0.15506738424301147, 0.026881014928221703, 0.05315892770886421, -0.013546639122068882, 0.03454039990901947, -0.040787190198898315, -0.088419109582901, -0.08748151361942291, 0.027524907141923904, 0.04211213439702988, -0.06141750514507294, -0.0724899172782898, 0.03481484204530716, 0.023364350199699402, -0.13330024480819702, 0.027110811322927475, 0.04855768755078316, 0.0030759028159081936, -0.004230567719787359, 0.0325961709022522, 0.083511583507061, 0.20576579868793488, 0.00020747755479533225, 0.0035126712173223495, 0.054180242121219635, 0.2754363417625427, -0.1523919552564621, 0.1261279135942459, 0.11582386493682861, -0.05628293752670288, 0.0858890563249588, 0.1984606832265854, 0.03459503874182701, -0.08104845136404037, 0.014322231523692608, 0.043156519532203674, -0.039604440331459045, -0.2633996605873108, -0.04501243308186531, 
-0.025872379541397095, -0.0695464089512825, 0.08896328508853912, 0.0838984027504921, 0.10345571488142014, 0.0326971709728241, -0.07095418125391006, -0.0797138512134552, 0.05792110785841942, 0.11802562326192856, -0.05132799968123436, 0.020302319899201393, 0.08306527882814407, -0.047197598963975906, 0.006914446130394936, 0.08762696385383606, -0.006281302776187658, 0.14358140528202057, 0.05257464945316315, 0.12444662302732468, 0.06914252787828445, 0.06877049058675766, 0.0034519657492637634, 0.053128208965063095, -0.007263101637363434, 0.03292576223611832, 0.016154471784830093, -0.09057274460792542, 0.027154648676514626, 0.11054202914237976, 0.010620896704494953, 0.0261711236089468, 0.025273732841014862, -0.0790678933262825, 0.034961458295583725, 0.2158462554216385, 0.02692532353103161, -0.20542433857917786, -0.07897011190652847, 0.056988924741744995, -0.06825610250234604, -0.1519065797328949, -0.010114242322742939, 0.014493370428681374, -0.15202461183071136, 0.013836684636771679, -0.050429344177246094, 0.11177665740251541, -0.06939925998449326, -0.04759884998202324, 0.09826842695474625, 0.05632491037249565, -0.047903914004564285, 0.034415435045957565, -0.18346436321735382, 0.10871147364377975, 0.03304004669189453, 0.0745575949549675, -0.0848151296377182, 0.08513015508651733, 0.0007639020914211869, -0.017988769337534904, 0.1550130546092987, -0.00043773101060651243, -0.07281176745891571, -0.08181023597717285, -0.072193942964077, -0.024777889251708984, 0.0885690301656723, -0.13120169937610626, 0.07264670729637146, -0.019088126718997955, -0.036649446934461594, 0.0019251303747296333, -0.11250454187393188, -0.10406520217657089, -0.1667160987854004, 0.05802864581346512, -0.08013319224119186, 0.003496306948363781, -0.0743221789598465, -0.05195728689432144, 0.031194375827908516, 0.16645489633083344, -0.19271133840084076, -0.115898497402668, -0.14820581674575806, -0.10923478752374649, 0.156250461935997, -0.05000615492463112, 0.08219405263662338, -0.011544501408934593, 0.15233094990253448, -0.01687156781554222, -0.031223447993397713, 0.08919575065374374, -0.08587724715471268, -0.18791624903678894, -0.05242117866873741, 0.1851341873407364, 0.1381605863571167, 0.02496352046728134, -0.020828155800700188, 0.03142014518380165, -0.04748864471912384, -0.10241223871707916, 0.026746954768896103, 0.13647699356079102, 0.061669182032346725, -0.00950334407389164, -0.03350286930799484, -0.1159382164478302, -0.05637725815176964, -0.034676361829042435, -0.014220849610865116, 0.20315253734588623, -0.07312743365764618, 0.1658446341753006, 0.1355094462633133, -0.06540928781032562, -0.20865488052368164, 0.04042050242424011, 0.027259349822998047, 0.015933310613036156, 0.022266117855906487, -0.19179661571979523, 0.07247679680585861, -0.01754053495824337, -0.0737711563706398, 0.17311693727970123, -0.20707862079143524, -0.13338039815425873, 0.09248258918523788, 0.019948231056332588, -0.20403160154819489, -0.14508700370788574, -0.11111684888601303, -0.011416752822697163, -0.13068582117557526, 0.06379712373018265, 0.022076088935136795, 0.012893707491457462, 0.011839169077575207, 0.013473032042384148, 0.04255645349621773, -0.04992612078785896, 0.1917145699262619, -0.02846366912126541, 0.008735325187444687, -0.05922757089138031, -0.10196419060230255, 0.011574484407901764, -0.06782879680395126, 0.11283008754253387, -0.020221877843141556, 0.02301609143614769, -0.1528090387582779, -0.04578140005469322, -0.0714537724852562, 0.015519832260906696, -0.09531879425048828, -0.08455274999141693, -0.049439553171396255, 0.0783919021487236, 
0.10328805446624756, -0.02736905962228775, 0.03864118829369545, -0.08202621340751648, 0.09035898000001907, 0.21119530498981476, 0.16400893032550812, 0.04649794101715088, -0.05868382751941681, 0.02372674085199833, -0.034152138978242874, 0.04237493500113487, -0.2166866809129715, 0.037099987268447876, 0.0608285553753376, 0.036562465131282806, 0.08432819694280624, -0.00042154433322139084, -0.16199977695941925, -0.0797998458147049, 0.07719049602746964, -0.05837990343570709, -0.1647234708070755, -0.021720368415117264, 0.02884635515511036, -0.19608011841773987, -0.04525052756071091, 0.04151042923331261, -0.01695541851222515, -0.038237977772951126, 0.02557426318526268, 0.08537513762712479, -0.014354127459228039, 0.10072643309831619, 0.07968520373106003, 0.08924145996570587, -0.09613505005836487, 0.06764718145132065, 0.08637084811925888, -0.03203056380152702, 0.014335314743220806, 0.1503715217113495, -0.0452423095703125, -0.030858734622597694, 0.08546127378940582, 0.11083735525608063, -0.00005333216176950373, -0.03985688462853432, 0.013670684769749641, -0.055868346244096756, 0.07566402107477188, 0.13868381083011627, 0.017355289310216904, -0.010717766359448433, 0.06568524986505508, 0.031095102429389954, -0.09764852374792099, 0.1229134052991867, 0.06805631518363953, 0.026218384504318237, -0.018219830468297005, -0.024859806522727013, -0.017393220216035843, -0.010152953676879406, -0.01540584396570921, -0.006094908341765404, -0.0959668681025505, -0.001990789780393243, -0.11897372454404831, 0.023793812841176987, -0.08940079808235168, 0.0015908803325146437, 0.01214626058936119, -0.0400712788105011, -0.00025378880673088133, -0.006765867117792368, -0.07619301974773407, -0.059000495821237564, -0.03216192498803139, 0.07252002507448196, -0.14047078788280487, 0.02537561021745205, 0.06839318573474884, -0.10962726175785065, 0.059035275131464005, -0.006260691676288843, 0.011807315982878208, -0.002309588948264718, -0.14891605079174042, 0.05358622595667839, -0.026976801455020905, -0.01981113664805889, 0.012001123279333115, -0.1692555993795395, -0.004585957154631615, -0.0511835440993309, -0.0762346088886261, 0.010102638974785805, -0.011340390890836716, -0.12630750238895416, 0.12615086138248444, -0.0063450695015490055, -0.060919396579265594, -0.01892554573714733, 0.05821729823946953, 0.07702402025461197, -0.015600720420479774, 0.0892941877245903, -0.029351623728871346, 0.08385108411312103, -0.1820463389158249, -0.008691578172147274, -0.010815318673849106, 0.033747829496860504, -0.020286059007048607, -0.030831478536128998, 0.0531829334795475, -0.011255898512899876, 0.14676298201084137, -0.006423518061637878, 0.06015009805560112, 0.048124827444553375, 0.006920591462403536, 0.02919268049299717, 0.0696592628955841, 0.04966346547007561, -0.02800079435110092, -0.016267187893390656, 0.037642017006874084, 0.000896480109076947, -0.04671601951122284, -0.12709124386310577, 0.06075132265686989, 0.18692833185195923, 0.08073030412197113, 0.03540117293596268, -0.0002233115810668096, -0.12857462465763092, -0.0877828449010849, 0.09351637214422226, -0.013290531933307648, -0.027559228241443634, -0.06855801492929459, 0.21997161209583282, 0.13481467962265015, -0.19359658658504486, 0.08406678587198257, -0.04253533110022545, -0.03307926282286644, -0.12915627658367157, -0.1629187911748886, -0.053160302340984344, -0.028866155073046684, -0.03541787341237068, -0.06275656819343567, 0.0596289336681366, 0.04133724048733711, 0.006389061454683542, -0.0005353929009288549, 0.09978675842285156, 0.009770453907549381, -0.033808641135692596, 
0.05144312605261803, 0.06978347152471542, 0.04913105070590973, -0.08674725145101547, 0.010803880169987679, 0.0005428498261608183, 0.006077948026359081, 0.06205815076828003, 0.02849133498966694, -0.04819479584693909, 0.026295537129044533, -0.011428934521973133, -0.11974438279867172, 0.04715810716152191, -0.003973750863224268, -0.017049450427293777, 0.1537330150604248, 0.031823933124542236, 0.001920440699905157, -0.016450049355626106, 0.2280317097902298, -0.07207660377025604, -0.07965082675218582, -0.12652109563350677, 0.07480741292238235, -0.04907325282692909, 0.026031091809272766, 0.01771274022758007, -0.12830325961112976, 0.011899122968316078, 0.1622019112110138, 0.13475745916366577, 0.0024472996592521667, 0.008293789811432362, 0.04171401262283325, 0.009172188118100166, -0.011490153148770332, 0.015049811452627182, 0.03651893138885498, 0.21362726390361786, -0.07320795208215714, 0.07639545202255249, -0.011993059888482094, -0.07167233526706696, -0.024480629712343216, 0.12952014803886414, -0.012061450630426407, -0.007921954616904259, -0.059930287301540375, 0.13771551847457886, -0.06249047815799713, -0.21576860547065735, 0.0640716478228569, -0.09252475202083588, -0.13166746497154236, -0.038371820002794266, 0.011393394321203232, -0.031109139323234558, 0.015293916687369347, 0.07246529310941696, -0.051936712116003036, 0.16368438303470612, 0.028651461005210876, -0.06270486116409302, -0.10049844533205032, 0.05132393166422844, -0.1389930099248886, 0.2862507700920105, 0.022998498752713203, 0.02687516249716282, 0.10852425545454025, -0.01905406080186367, -0.14447781443595886, 0.00836025271564722, 0.1040535718202591, -0.061888206750154495, 0.048319220542907715, 0.16283240914344788, -0.008894095197319984, 0.12235795706510544, 0.051420800387859344, -0.059867892414331436, 0.031131945550441742, -0.07197175174951553, -0.05715079978108406, -0.12105903029441833, 0.06780505180358887, -0.07448333501815796, 0.1483479142189026, 0.12549027800559998, -0.06655856966972351, -0.004834878258407116, -0.01689787395298481, 0.07297565788030624, 0.015770042315125465, 0.12326548248529434, 0.013568950816988945, -0.1786518096923828, 0.04732469469308853, 0.006347086280584335, 0.11393843591213226, -0.21668830513954163, -0.06368950009346008, 0.046959761530160904, -0.027408968657255173, -0.0887913927435875, 0.12153337150812149, 0.04261757805943489, 0.022057300433516502, -0.029619643464684486, -0.09196826815605164, 0.01616792380809784, 0.15653520822525024, -0.10153105109930038, -0.015052146278321743 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # IKI-Category-multilabel This model is a fine-tuned version of [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4712 - Precision-micro: 0.7365 - Precision-samples: 0.7665 - Precision-weighted: 0.7428 - Recall-micro: 0.7899 - Recall-samples: 0.7987 - Recall-weighted: 0.7899 - F1-micro: 0.7622 - F1-samples: 0.7537 - F1-weighted: 0.7614 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5.86e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 200 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision-micro | Precision-samples | Precision-weighted | Recall-micro | Recall-samples | Recall-weighted | F1-micro | F1-samples | F1-weighted | |:-------------:|:-----:|:----:|:---------------:|:---------------:|:-----------------:|:------------------:|:------------:|:--------------:|:---------------:|:--------:|:----------:|:-----------:| | 0.913 | 1.0 | 94 | 0.8956 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.8069 | 2.0 | 188 | 0.7584 | 0.3594 | 0.2160 | 0.3414 | 0.2778 | 0.2514 | 0.2778 | 0.3134 | 0.2076 | 0.2539 | | 0.6672 | 3.0 | 282 | 0.6557 | 0.3630 | 0.4096 | 0.4762 | 0.6014 | 0.6049 | 0.6014 | 0.4527 | 0.4453 | 0.4592 | | 0.5371 | 4.0 | 376 | 0.5725 | 0.5128 | 0.5412 | 0.5838 | 0.6304 | 0.6561 | 0.6304 | 0.5655 | 0.5553 | 0.5815 | | 0.4334 | 5.0 | 470 | 0.5177 | 0.5315 | 0.5918 | 0.5666 | 0.6932 | 0.7160 | 0.6932 | 0.6017 | 0.6084 | 0.6085 | | 0.346 | 6.0 | 564 | 0.4782 | 0.5792 | 0.6473 | 0.6128 | 0.7246 | 0.7595 | 0.7246 | 0.6438 | 0.6570 | 0.6558 | | 0.2812 | 7.0 | 658 | 0.4750 | 0.6263 | 0.6777 | 0.6563 | 0.7246 | 0.7588 | 0.7246 | 0.6719 | 0.6808 | 0.6760 | | 0.2331 | 8.0 | 752 | 0.4544 | 0.6660 | 0.7216 | 0.6746 | 0.7609 | 0.7931 | 0.7609 | 0.7103 | 0.7211 | 0.7089 | | 0.1916 | 9.0 | 846 | 0.4534 | 0.6036 | 0.6657 | 0.6199 | 0.8164 | 0.8300 | 0.8164 | 0.6940 | 0.7022 | 0.6989 | | 0.1599 | 10.0 | 940 | 0.4369 | 0.6913 | 0.7376 | 0.7083 | 0.7899 | 0.8113 | 0.7899 | 0.7373 | 0.7372 | 0.7402 | | 0.1331 | 11.0 | 1034 | 0.4501 | 0.7035 | 0.7433 | 0.7181 | 0.7850 | 0.8035 | 0.7850 | 0.7420 | 0.7397 | 0.7428 | | 0.114 | 12.0 | 1128 | 0.4314 | 0.6954 | 0.7484 | 0.7066 | 0.7995 | 0.8137 | 0.7995 | 0.7438 | 0.7460 | 0.7456 | | 0.099 | 13.0 | 1222 | 0.4752 | 0.7175 | 0.7435 | 0.7255 | 0.7729 | 0.78 | 0.7729 | 0.7442 | 0.7304 | 0.7426 | | 0.0862 | 14.0 | 1316 | 0.4682 | 0.6936 | 0.7346 | 0.7063 | 0.7874 | 0.7943 | 0.7874 | 0.7376 | 0.7319 | 0.7380 | | 0.0756 | 15.0 | 1410 | 0.4735 | 0.7095 | 0.7435 | 0.7191 | 0.7729 | 0.7841 | 0.7729 | 0.7399 | 0.7327 | 0.7383 | | 0.0683 | 16.0 | 1504 | 0.4656 | 0.7246 | 0.7567 | 0.7351 | 0.7754 | 0.7892 | 0.7754 | 0.7491 | 0.7422 | 0.7493 | | 0.0624 | 17.0 | 1598 | 0.4800 | 0.7332 | 0.7665 | 0.7389 | 0.7899 | 0.7984 | 0.7899 | 0.7605 | 0.7535 | 0.7586 | | 0.0597 | 18.0 | 1692 
| 0.4578 | 0.7426 | 0.7686 | 0.7477 | 0.7874 | 0.7977 | 0.7874 | 0.7644 | 0.7539 | 0.7635 | | 0.0568 | 19.0 | 1786 | 0.4671 | 0.7371 | 0.7693 | 0.7440 | 0.7923 | 0.8018 | 0.7923 | 0.7637 | 0.7564 | 0.7628 | | 0.0557 | 20.0 | 1880 | 0.4712 | 0.7365 | 0.7665 | 0.7428 | 0.7899 | 0.7987 | 0.7899 | 0.7622 | 0.7537 | 0.7614 | ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
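The card above reports multilabel precision/recall/F1 but gives no usage snippet, so the following is a minimal inference sketch. It assumes the checkpoint exposes a standard sequence-classification head and that multilabel predictions are obtained by applying a sigmoid with a 0.5 threshold; the label names come from the model config and the input sentence is a placeholder.

```python
# Sketch only: multilabel inference with a sigmoid over the logits and a 0.5 cutoff
# (the card does not state the label set or the intended decision threshold).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "ppsingh/IKI-Category-multilabel"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Example project description to categorise.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```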
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "sentence-transformers/all-mpnet-base-v2", "model-index": [{"name": "IKI-Category-multilabel", "results": []}]}
text-classification
ppsingh/IKI-Category-multilabel
[ "transformers", "tensorboard", "safetensors", "mpnet", "text-classification", "generated_from_trainer", "base_model:sentence-transformers/all-mpnet-base-v2", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T20:40:52+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #mpnet #text-classification #generated_from_trainer #base_model-sentence-transformers/all-mpnet-base-v2 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
IKI-Category-multilabel ======================= This model is a fine-tuned version of sentence-transformers/all-mpnet-base-v2 on the None dataset. It achieves the following results on the evaluation set: * Loss: 0.4712 * Precision-micro: 0.7365 * Precision-samples: 0.7665 * Precision-weighted: 0.7428 * Recall-micro: 0.7899 * Recall-samples: 0.7987 * Recall-weighted: 0.7899 * F1-micro: 0.7622 * F1-samples: 0.7537 * F1-weighted: 0.7614 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 5.86e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 2 * total\_train\_batch\_size: 16 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 200 * num\_epochs: 20 ### Training results ### Framework versions * Transformers 4.35.2 * Pytorch 2.1.0+cu121 * Datasets 2.17.0 * Tokenizers 0.15.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5.86e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 200\n* num\\_epochs: 20", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #mpnet #text-classification #generated_from_trainer #base_model-sentence-transformers/all-mpnet-base-v2 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5.86e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 200\n* num\\_epochs: 20", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ 77, 145, 4, 33 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #mpnet #text-classification #generated_from_trainer #base_model-sentence-transformers/all-mpnet-base-v2 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5.86e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 200\n* num\\_epochs: 20### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ -0.11495015025138855, 0.15706494450569153, -0.002509945072233677, 0.0645434558391571, 0.12168026715517044, 0.008515747264027596, 0.11904439330101013, 0.13272637128829956, -0.11751610785722733, 0.09737222641706467, 0.132240429520607, 0.09354675561189651, 0.04658868908882141, 0.14282426238059998, -0.021769538521766663, -0.28309527039527893, 0.014103545807301998, 0.00022772251395508647, -0.1708340048789978, 0.12917040288448334, 0.09462205320596695, -0.10197700560092926, 0.06529652327299118, 0.009217599406838417, -0.13557438552379608, -0.01565863937139511, -0.04129604995250702, -0.04939073324203491, 0.10721245408058167, 0.037127718329429626, 0.09358333051204681, 0.03063798137009144, 0.07496551424264908, -0.2333490550518036, 0.008545559830963612, 0.09071210771799088, 0.010170087218284607, 0.09881152957677841, 0.06979893893003464, -0.001191137358546257, 0.1518978327512741, -0.08986864984035492, 0.06664309650659561, 0.037611205130815506, -0.08061858266592026, -0.24004922807216644, -0.09150931984186172, 0.08658915013074875, 0.12893950939178467, 0.08950481563806534, -0.022027403116226196, 0.11848321557044983, -0.07878438383340836, 0.09463370591402054, 0.2430058866739273, -0.28791987895965576, -0.07952830940485, 0.05669751763343811, 0.06712563335895538, 0.058201052248477936, -0.1115824282169342, -0.024608660489320755, 0.04227657988667488, 0.01671653985977173, 0.11019432544708252, 0.01468093041330576, 0.059597574174404144, 0.0011880429228767753, -0.17243656516075134, -0.040561698377132416, 0.1456722766160965, 0.09083404392004013, -0.013842551968991756, -0.08983905613422394, -0.05748786777257919, -0.22840864956378937, -0.03216234967112541, -0.0064851404167711735, 0.0381464958190918, -0.043446317315101624, -0.08980453759431839, 0.014420835301280022, -0.0799824595451355, -0.07711321860551834, 0.01542198471724987, 0.12470876425504684, 0.05955030024051666, -0.023556213825941086, 0.011369789019227028, 0.12561805546283722, 0.04162825271487236, -0.1425594836473465, 0.00553198391571641, 0.016367103904485703, -0.06198234111070633, -0.024275874719023705, -0.02644512616097927, -0.020085202530026436, 0.02580924890935421, 0.1474171280860901, -0.036619506776332855, 0.059279993176460266, 0.045424822717905045, 0.046964533627033234, -0.08961806446313858, 0.13928097486495972, -0.08514425903558731, -0.06440470367670059, -0.03012247197329998, 0.10661838203668594, 0.024069391191005707, -0.015562481246888638, -0.09078172594308853, 0.052108217030763626, 0.12208434194326401, 0.040609464049339294, -0.010081861168146133, 0.050087809562683105, -0.04389740526676178, -0.028981365263462067, 0.05539415776729584, -0.0923318937420845, 0.03399941697716713, 0.02201511338353157, -0.08327463269233704, -0.030825713649392128, 0.022355295717716217, 0.019303422421216965, -0.014493155293166637, 0.13098743557929993, -0.10086569935083389, -0.01987246423959732, -0.08556843549013138, -0.10052845627069473, 0.03751356899738312, -0.0494314469397068, -0.011769264936447144, -0.08963542431592941, -0.10841434448957443, -0.0431659072637558, 0.06526770442724228, -0.03473949804902077, -0.08543595671653748, -0.05771429091691971, -0.11536601185798645, 0.043661389499902725, -0.013229123316705227, 0.14054365456104279, -0.060021739453077316, 0.10141556710004807, 0.00716404290869832, 0.04789688065648079, 0.05393003672361374, 0.04583263024687767, -0.054456111043691635, 0.055489715188741684, -0.17295221984386444, 0.050080012530088425, -0.10371314734220505, 0.06532712280750275, -0.12538745999336243, -0.11028748005628586, -0.007612817920744419, 
-0.007253963965922594, 0.06746834516525269, 0.11577354371547699, -0.15373361110687256, -0.09350291639566422, 0.18233469128608704, -0.08314727991819382, -0.12437102943658829, 0.11186143755912781, -0.0153714744374156, -0.03196541965007782, 0.0372331477701664, 0.14144447445869446, 0.10304651409387589, -0.0893731415271759, -0.01790754310786724, -0.0191528107970953, 0.10074343532323837, -0.01580585539340973, 0.10719111561775208, -0.016913071274757385, 0.047842562198638916, 0.0007075046887621284, -0.08275537192821503, 0.05381697788834572, -0.09597072750329971, -0.085270456969738, -0.021571576595306396, -0.08692105859518051, 0.0274602510035038, 0.057633016258478165, 0.05666746944189072, -0.08826755732297897, -0.1446482241153717, 0.03023679368197918, 0.12651361525058746, -0.08586503565311432, 0.026510467752814293, -0.06838783621788025, 0.06578269600868225, -0.03078792430460453, 0.0020530305337160826, -0.14945165812969208, -0.05528520047664642, 0.023571982979774475, -0.06072456017136574, 0.006338444538414478, -0.007071300875395536, 0.0608663484454155, 0.07308399677276611, -0.05912706255912781, -0.057016462087631226, -0.051601048558950424, -0.0010948117123916745, -0.09263328462839127, -0.26017385721206665, -0.06427637487649918, -0.020612196996808052, 0.17658469080924988, -0.2623749077320099, 0.04231833294034004, 0.018522242084145546, 0.14481939375400543, 0.040225811302661896, -0.046386461704969406, -0.027415109798312187, 0.04806177318096161, -0.04981246590614319, -0.07883691787719727, 0.03507322445511818, 0.007213471923023462, -0.1177629828453064, -0.01845894753932953, -0.12329071015119553, 0.14120890200138092, 0.12252028286457062, -0.00035128885065205395, -0.08682829886674881, -0.04171226546168327, -0.07917536795139313, -0.05450472608208656, -0.007916072383522987, 0.004072021227329969, 0.09154374152421951, 0.027625076472759247, 0.13483163714408875, -0.08323518931865692, -0.06190848723053932, 0.035410501062870026, -0.009088373742997646, -0.008501109667122364, 0.14169622957706451, 0.10159289836883545, -0.08243223279714584, 0.1424224078655243, 0.11427032202482224, -0.08114683628082275, 0.12446867674589157, -0.05754101276397705, -0.08538367599248886, -0.024538343772292137, 0.04392727091908455, 0.03072761371731758, 0.11360307037830353, -0.13334614038467407, -0.01061067171394825, 0.01101268082857132, 0.03133922815322876, 0.03238382190465927, -0.19153903424739838, -0.014638862572610378, 0.03262857347726822, -0.052492231130599976, 0.012584523297846317, -0.0334470197558403, -0.02430449053645134, 0.09580165147781372, 0.01625489816069603, -0.06991160660982132, 0.0045647164806723595, -0.011007078923285007, -0.08797833323478699, 0.21341753005981445, -0.10313290357589722, -0.1522108018398285, -0.13488714396953583, -0.0047721038572490215, -0.060043878853321075, -0.004109757021069527, 0.040563829243183136, -0.09759150445461273, -0.026192134246230125, -0.09284985065460205, 0.0162594523280859, -0.016921816393733025, 0.034152351319789886, 0.01070825569331646, 0.030581045895814896, 0.07719841599464417, -0.11081623286008835, 0.030655423179268837, -0.030804790556430817, -0.04185117408633232, 0.018314464017748833, 0.03295225277543068, 0.09024247527122498, 0.1866745948791504, 0.03408517688512802, 0.02019079215824604, -0.022250497713685036, 0.16756226122379303, -0.10571518540382385, -0.007263400126248598, 0.08921144902706146, 0.0050116912461817265, 0.053985122591257095, 0.13085004687309265, 0.048421964049339294, -0.07840169221162796, 0.0345010831952095, 0.05964001640677452, -0.018066609278321266, -0.2319001406431198, 
-0.0447189025580883, -0.044127993285655975, 0.017540493980050087, 0.11205533146858215, 0.056990671902894974, 0.020983872935175896, 0.042781006544828415, -0.018809860572218895, 0.00952677708119154, 0.01064478699117899, 0.07975292205810547, 0.043094292283058167, 0.0329192578792572, 0.13028667867183685, -0.03153768181800842, -0.033924300223588943, 0.02948812022805214, -0.0018098570872098207, 0.22480104863643646, -0.022981666028499603, 0.1477539837360382, 0.06223350390791893, 0.17889928817749023, 0.0030751246958971024, 0.055065639317035675, 0.00908602587878704, -0.03056846186518669, 0.010958252474665642, -0.059866245836019516, -0.015318159945309162, 0.052876751869916916, 0.019556894898414612, 0.04645673558115959, -0.1095510944724083, 0.0357556976377964, 0.04343718662858009, 0.3154299557209015, 0.06749365478754044, -0.31668907403945923, -0.09191422164440155, 0.0057648057118058205, -0.06860662251710892, -0.028985396027565002, 0.029576953500509262, 0.13151000440120697, -0.09340795874595642, 0.055130474269390106, -0.07250112295150757, 0.0908479243516922, -0.04970764368772507, -0.004233356099575758, 0.0864926353096962, 0.09256561845541, -0.005019691772758961, 0.07702657580375671, -0.2403205782175064, 0.2779272496700287, -0.012654968537390232, 0.051428232342004776, -0.04546063393354416, 0.03022860549390316, 0.028683632612228394, 0.053109634667634964, 0.08134873956441879, -0.002903042593970895, -0.03749805688858032, -0.1405748575925827, -0.11071822047233582, 0.007342171389609575, 0.11859609186649323, -0.09675420820713043, 0.10809925198554993, -0.023858224973082542, -0.024773741140961647, 0.06043868511915207, -0.03156052529811859, -0.06012344732880592, -0.10160037130117416, 0.01988668367266655, -0.011845995672047138, 0.025991519913077354, -0.10474962741136551, -0.12012472748756409, -0.10192190110683441, 0.20910309255123138, -0.13006611168384552, -0.02701021172106266, -0.11490002274513245, 0.07894277572631836, 0.12471570819616318, -0.08116471767425537, 0.04445227235555649, 0.0004329448565840721, 0.13746190071105957, 0.022942325100302696, -0.027776822447776794, 0.12394504249095917, -0.08636961877346039, -0.2294761687517166, -0.058796338737010956, 0.1611378788948059, 0.037635546177625656, 0.05356091260910034, -0.04000482335686684, 0.012197081930935383, -0.015546822920441628, -0.08699639141559601, 0.02904418669641018, 0.024559231474995613, 0.05320994183421135, 0.034381791949272156, -0.03365379944443703, 0.01864440180361271, -0.058986105024814606, -0.05227132514119148, 0.13562913239002228, 0.3082290291786194, -0.09332104027271271, 0.012920376844704151, 0.062288299202919006, -0.0400930717587471, -0.1722509115934372, 0.02836497686803341, 0.11264856904745102, 0.005593244452029467, 0.0050029451958835125, -0.1731136590242386, 0.09202396124601364, 0.1108202114701271, -0.037314582616090775, 0.12046368420124054, -0.31805408000946045, -0.13484860956668854, 0.09917096793651581, 0.10058514028787613, -0.02243122085928917, -0.17980913817882538, -0.06116833910346031, -0.01293210033327341, -0.1012125164270401, 0.10761237144470215, -0.0665854811668396, 0.109072744846344, 0.0038218609988689423, 0.015406684949994087, 0.0030022496357560158, -0.0659705176949501, 0.1378728151321411, 0.006649738643318415, 0.06945214420557022, -0.027944765985012054, 0.012255633249878883, 0.02620348148047924, -0.07300817966461182, 0.010996676050126553, -0.09960821270942688, 0.035524412989616394, -0.09693235158920288, -0.01474188081920147, -0.08928472548723221, 0.03545166179537773, -0.05847649648785591, -0.04443071410059929, -0.04887080937623978, 
0.056252021342515945, 0.06593014299869537, -0.0024258012417703867, 0.17476166784763336, -0.001577008981257677, 0.15749238431453705, 0.1418292373418808, 0.05247460678219795, -0.037998419255018234, -0.09368427097797394, -0.012098354287445545, -0.009440340101718903, 0.06497718393802643, -0.15000994503498077, 0.024274984374642372, 0.13492770493030548, 0.034710843116045, 0.13916127383708954, 0.05839744955301285, -0.0792304053902626, -0.020162072032690048, 0.07542381435632706, -0.11487537622451782, -0.09983422607183456, -0.015145310200750828, 0.01206812635064125, -0.14085730910301208, 0.06697064638137817, 0.10440845787525177, -0.07011587917804718, 0.0023456080816686153, 0.017309732735157013, 0.032680340111255646, -0.038325726985931396, 0.1977643370628357, 0.06250955909490585, 0.08930030465126038, -0.09541147947311401, 0.09615004807710648, 0.056657347828149796, -0.1333044171333313, 0.021639322862029076, 0.09424905478954315, -0.08138900995254517, -0.0211055688560009, 0.028411900624632835, 0.07824873179197311, -0.00848389882594347, -0.05138573795557022, -0.12288044393062592, -0.11780889332294464, 0.08076319098472595, 0.09365417808294296, 0.06783229112625122, 0.03482035547494888, -0.021616913378238678, 0.04651251435279846, -0.1234237551689148, 0.11391232162714005, 0.07104265689849854, 0.09187242388725281, -0.17347092926502228, 0.1361209899187088, 0.021199055016040802, 0.009777242317795753, -0.0071490067057311535, 0.027053581550717354, -0.11136291176080704, -0.001422106521204114, -0.07745175808668137, -0.039137598127126694, -0.06912552565336227, -0.007610756438225508, 0.001693588332273066, -0.047013502568006516, -0.060695625841617584, 0.01703178696334362, -0.10501152276992798, -0.048304811120033264, -0.005465096328407526, 0.0455065593123436, -0.1234847903251648, -0.008482549339532852, 0.031053174287080765, -0.11555575579404831, 0.08753133565187454, 0.02604280412197113, 0.045176900923252106, 0.026355888694524765, -0.08126819133758545, 0.042201898992061615, 0.04927336424589157, -0.008193264715373516, 0.037236958742141724, -0.14360389113426208, -0.009406295605003834, -0.03158969804644585, 0.028276175260543823, 0.00009063445759238675, 0.029098989441990852, -0.1393224000930786, -0.022269541397690773, -0.0482456348836422, -0.046899620443582535, -0.05818725377321243, 0.041912250220775604, 0.06363987177610397, 0.015537910163402557, 0.19296684861183167, -0.06155501306056976, 0.028731755912303925, -0.21546031534671783, 0.0021253558807075024, -0.003333902917802334, -0.07549455016851425, -0.05083492398262024, -0.004862564150243998, 0.0658179298043251, -0.07489307969808578, 0.10658984631299973, -0.04774032533168793, 0.01801275461912155, 0.042604733258485794, -0.07663747668266296, 0.03307412564754486, 0.05904454365372658, 0.17437884211540222, 0.04012222960591316, -0.01607516221702099, 0.04587416723370552, 0.013676189817488194, 0.07989230751991272, 0.041569631546735764, 0.1711719036102295, 0.13445347547531128, -0.00917122047394514, 0.10618335008621216, 0.055614173412323, -0.10766513645648956, -0.1669999063014984, 0.11364265531301498, -0.05847044661641121, 0.13435986638069153, -0.01569943130016327, 0.19152629375457764, 0.12985406816005707, -0.19153591990470886, 0.019130341708660126, -0.03451969847083092, -0.08735649287700653, -0.09678121656179428, -0.0811552032828331, -0.07670769095420837, -0.19494271278381348, 0.004567521158605814, -0.11356007307767868, 0.00560830207541585, 0.06839879602193832, 0.035452306270599365, 0.01566343754529953, 0.14151670038700104, 0.033804867416620255, 0.0023318787571042776, 
0.07484264671802521, 0.027614573016762733, -0.03332800418138504, -0.04115547984838486, -0.08775793015956879, 0.020137954503297806, -0.02448315918445587, 0.05706046521663666, -0.05389612913131714, -0.07574353367090225, 0.08622577041387558, 0.01935357227921486, -0.09501256048679352, 0.02103707194328308, -0.0016862208722159266, 0.05343851074576378, 0.07210848480463028, 0.00170545291621238, -0.01065285224467516, -0.029350044205784798, 0.2420814484357834, -0.10602010041475296, -0.0505882203578949, -0.1340397298336029, 0.23055793344974518, 0.012458688579499722, -0.005686833523213863, 0.014272934757173061, -0.08714623749256134, -0.019581317901611328, 0.1718081831932068, 0.19822506606578827, -0.03130514919757843, -0.02024914138019085, 0.03953511640429497, -0.01694582588970661, -0.05163036659359932, 0.07303464412689209, 0.14056557416915894, 0.07986512780189514, -0.06557317078113556, -0.03377566114068031, -0.039515066891908646, -0.037152472883462906, -0.024880798533558846, 0.06642864644527435, 0.04097985848784447, -0.005099382717162371, -0.02813085913658142, 0.08208343386650085, -0.05931435152888298, -0.14158155024051666, 0.04460303112864494, -0.18238800764083862, -0.18639764189720154, -0.03059692308306694, 0.08768457174301147, -0.0039022380951792, 0.059032414108514786, 0.013513526879251003, -0.03532771021127701, 0.07634293287992477, -0.010011368431150913, -0.04788912087678909, -0.12092726677656174, 0.07226423174142838, -0.07226664572954178, 0.21242879331111908, -0.04629220813512802, 0.017604151740670204, 0.13013628125190735, 0.03367399051785469, -0.08375052362680435, 0.02845386229455471, 0.08116339147090912, -0.10192488878965378, 0.03381403163075447, 0.13366246223449707, -0.032994747161865234, 0.1331944316625595, 0.05337746813893318, -0.1235184371471405, -0.007322425022721291, -0.10526405274868011, -0.044266797602176666, -0.07694225758314133, -0.0004878207400906831, -0.05039086937904358, 0.1380009651184082, 0.21343426406383514, -0.061595574021339417, -0.006194178946316242, -0.06551839411258698, 0.011871482245624065, 0.03211534395813942, 0.10021837800741196, -0.0018176025478169322, -0.2821711301803589, 0.03457779809832573, 0.006267037242650986, 0.0231146402657032, -0.25454995036125183, -0.08244301378726959, 0.027642911300063133, -0.06092638149857521, -0.0800480991601944, 0.09410346299409866, 0.07418159395456314, 0.04873276874423027, -0.06818170100450516, -0.06637730449438095, -0.06498032808303833, 0.16817814111709595, -0.17592929303646088, -0.08272360265254974 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Whisper-tiny-hy-erebuni-13

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the erebuni dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1063
- Wer: 21.5645

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3.75e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 2000

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.6736        | 0.61  | 100  | 0.6259          | 94.9886 |
| 0.2628        | 1.22  | 200  | 0.3047          | 63.6854 |
| 0.1711        | 1.83  | 300  | 0.1949          | 48.7081 |
| 0.0776        | 2.44  | 400  | 0.1534          | 40.5877 |
| 0.074         | 3.05  | 500  | 0.1348          | 36.2436 |
| 0.0465        | 3.66  | 600  | 0.1244          | 32.4248 |
| 0.0298        | 4.27  | 700  | 0.1164          | 29.7416 |
| 0.0372        | 4.88  | 800  | 0.1110          | 29.3441 |
| 0.0166        | 5.49  | 900  | 0.1136          | 29.2589 |
| 0.017         | 6.1   | 1000 | 0.1049          | 25.9796 |
| 0.0122        | 6.71  | 1100 | 0.1046          | 24.1482 |
| 0.0053        | 7.32  | 1200 | 0.1054          | 23.5662 |
| 0.0104        | 7.93  | 1300 | 0.1060          | 22.9131 |
| 0.0027        | 8.54  | 1400 | 0.1055          | 22.2317 |
| 0.0013        | 9.15  | 1500 | 0.1049          | 22.4020 |
| 0.0024        | 9.76  | 1600 | 0.1051          | 21.3089 |
| 0.0003        | 10.37 | 1700 | 0.1055          | 21.6354 |
| 0.0005        | 10.98 | 1800 | 0.1053          | 21.5503 |
| 0.0006        | 11.59 | 1900 | 0.1062          | 21.6212 |
| 0.0003        | 12.2  | 2000 | 0.1063          | 21.5645 |

### Framework versions

- Transformers 4.35.0
- Pytorch 2.0.0
- Datasets 2.17.1.dev0
- Tokenizers 0.14.1
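The card itself stops at the training summary and gives no inference snippet. As a rough, hedged sketch (not taken from the card), the checkpoint `lukarape/whisper-tiny-hy-erebuni-13` could be used for Armenian transcription with the standard `transformers` ASR pipeline; the audio path is a placeholder, and forcing the language may be redundant if the checkpoint's generation config already pins it.

```python
# Hypothetical usage sketch (not part of the original card): transcribe Armenian
# audio with the fine-tuned checkpoint via the standard transformers ASR pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="lukarape/whisper-tiny-hy-erebuni-13",
)

# "sample_hy.wav" is a placeholder path; decoding audio files requires ffmpeg.
# Pinning language/task may be unnecessary if the generation config already sets them.
result = asr(
    "sample_hy.wav",
    generate_kwargs={"language": "hy", "task": "transcribe"},
)
print(result["text"])
```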
{"language": ["hy"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["wer"], "base_model": "openai/whisper-tiny", "model-index": [{"name": "Whisper-tiny-hy-erebuni-13", "results": []}]}
automatic-speech-recognition
lukarape/whisper-tiny-hy-erebuni-13
[ "transformers", "tensorboard", "safetensors", "whisper", "automatic-speech-recognition", "generated_from_trainer", "hy", "base_model:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2024-02-13T20:51:26+00:00
[]
[ "hy" ]
TAGS #transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #hy #base_model-openai/whisper-tiny #license-apache-2.0 #endpoints_compatible #region-us
Whisper-tiny-hy-erebuni-13 ========================== This model is a fine-tuned version of openai/whisper-tiny on the erebuni dataset. It achieves the following results on the evaluation set: * Loss: 0.1063 * Wer: 21.5645 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 3.75e-05 * train\_batch\_size: 4 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 16 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 500 * training\_steps: 2000 ### Training results ### Framework versions * Transformers 4.35.0 * Pytorch 2.0.0 * Datasets 2.17.1.dev0 * Tokenizers 0.14.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3.75e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 2000", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.0.0\n* Datasets 2.17.1.dev0\n* Tokenizers 0.14.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #hy #base_model-openai/whisper-tiny #license-apache-2.0 #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3.75e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 2000", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.0.0\n* Datasets 2.17.1.dev0\n* Tokenizers 0.14.1" ]
[ 70, 144, 4, 33 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #hy #base_model-openai/whisper-tiny #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3.75e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 2000### Training results### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.0.0\n* Datasets 2.17.1.dev0\n* Tokenizers 0.14.1" ]
[ -0.14764222502708435, 0.11343313753604889, -0.0009530447423458099, 0.06082560867071152, 0.11634533107280731, -0.011388487182557583, 0.1032494306564331, 0.151529923081398, -0.11551154404878616, 0.06780864298343658, 0.10391558706760406, 0.08246880024671555, 0.04303893819451332, 0.15560169517993927, -0.041334595531225204, -0.2812199890613556, 0.009196208789944649, -0.0021633158903568983, -0.16118894517421722, 0.1130540519952774, 0.0994323343038559, -0.1284460574388504, 0.046730682253837585, 0.03233664110302925, -0.1423385590314865, 0.007979624904692173, -0.010264873504638672, -0.07426848262548447, 0.11325894296169281, 0.020526878535747528, 0.07232945412397385, 0.04556770250201225, 0.08387945592403412, -0.20042897760868073, 0.011471916921436787, 0.06891850382089615, 0.04548097401857376, 0.0858597680926323, 0.047059398144483566, -0.00984172523021698, 0.09748997539281845, -0.06058383733034134, 0.07468593865633011, 0.06631772220134735, -0.09068252146244049, -0.35690560936927795, -0.10134795308113098, 0.06917618215084076, 0.12760555744171143, 0.07447679340839386, -0.02457614243030548, 0.11719150841236115, -0.051867879927158356, 0.08019038289785385, 0.2567170560359955, -0.2628341317176819, -0.07329542934894562, -0.005378689616918564, 0.04362005740404129, 0.028080711141228676, -0.10724260658025742, 0.003505548695102334, 0.06564738601446152, 0.02520241215825081, 0.10149062424898148, 0.010571207851171494, 0.0008352053700946271, 0.015684228390455246, -0.13431598246097565, -0.03802720457315445, 0.15324001014232635, 0.06946440041065216, -0.05488191545009613, -0.09471751749515533, -0.031379591673612595, -0.18936264514923096, -0.04389304667711258, -0.007193692494183779, 0.04638278856873512, -0.0740479975938797, -0.13933812081813812, 0.007032882887870073, -0.08508690446615219, -0.10752131044864655, 0.030983522534370422, 0.19291940331459045, 0.05099044740200043, -0.0032293926924467087, -0.014717030338943005, 0.11825753003358841, 0.04679163545370102, -0.1649293452501297, -0.030943509191274643, 0.034996889531612396, -0.0627320185303688, -0.013462034054100513, -0.052045296877622604, -0.02065224014222622, 0.028361039236187935, 0.1622561514377594, -0.07996553182601929, 0.06126090884208679, 0.031060000881552696, 0.02231471799314022, -0.10755430161952972, 0.19242613017559052, -0.07293469458818436, -0.04106643423438072, -0.013864320702850819, 0.11841793358325958, 0.030319498851895332, -0.011663935147225857, -0.0804206132888794, 0.05214381217956543, 0.09080782532691956, 0.04917481541633606, -0.029665136709809303, 0.033135317265987396, -0.036626458168029785, -0.018884487450122833, 0.01599373109638691, -0.1076233759522438, 0.0017474791966378689, 0.008273215964436531, -0.10224904119968414, -0.06619324535131454, 0.008781763724982738, 0.02478759177029133, 0.011393599212169647, 0.11015056073665619, -0.08859638124704361, -0.019320709630846977, -0.08092189580202103, -0.0862676277756691, 0.008005674928426743, -0.02318919450044632, 0.0013809215743094683, -0.07235943526029587, -0.13693128526210785, -0.03163828328251839, 0.049529798328876495, -0.03440176695585251, -0.06520991772413254, -0.05085282400250435, -0.10457908362150192, 0.027372127398848534, -0.016160964965820312, 0.12032022327184677, -0.04930731654167175, 0.10635808110237122, 0.07252871990203857, 0.06102736294269562, 0.0359242744743824, 0.03888678178191185, -0.06868500262498856, 0.04568261653184891, -0.18077343702316284, 0.06099015846848488, -0.09579628705978394, 0.10539872199296951, -0.1157759577035904, -0.11804847419261932, 0.028760366141796112, 
0.0015168949030339718, 0.08818907290697098, 0.12493500858545303, -0.15650038421154022, -0.11434581875801086, 0.1833774894475937, -0.09118850529193878, -0.15829600393772125, 0.12338462471961975, -0.016492187976837158, 0.01521229650825262, 0.05220190808176994, 0.17496640980243683, 0.07511360198259354, -0.10007082670927048, -0.011766305193305016, -0.032381702214479446, 0.10678652673959732, 0.0016981434309855103, 0.0915232002735138, -0.041074272245168686, 0.029689887538552284, 0.02314871735870838, -0.06171424314379692, 0.029187550768256187, -0.10056696087121964, -0.09553729742765427, -0.03188392519950867, -0.11417827010154724, 0.03152066469192505, 0.06032910197973251, 0.07356560975313187, -0.11739731580018997, -0.1274656057357788, 0.047877367585897446, 0.13820748031139374, -0.0813739001750946, 0.023831037804484367, -0.10358607769012451, 0.0761364996433258, -0.028330782428383827, -0.01622573658823967, -0.14441567659378052, 0.003177795559167862, 0.02429630048573017, -0.05725228786468506, 0.008793122135102749, -0.037242431193590164, 0.08672793954610825, 0.060455549508333206, -0.07535190135240555, -0.06917613744735718, -0.0644778162240982, -0.005034739151597023, -0.06834151595830917, -0.25022459030151367, -0.07531041651964188, -0.02265019342303276, 0.18020716309547424, -0.20594021677970886, 0.03997855260968208, 0.04203054681420326, 0.1418783813714981, 0.04817643389105797, -0.03617740795016289, -0.006367512978613377, 0.0497761145234108, -0.019083764404058456, -0.06381421536207199, 0.027235770598053932, 0.0100393146276474, -0.13227403163909912, -0.01396765187382698, -0.121584951877594, 0.1459565907716751, 0.11346416920423508, 0.0062463125213980675, -0.07090316712856293, -0.010924779810011387, -0.08898546546697617, -0.043642595410346985, -0.009423426352441311, -0.015270596370100975, 0.12823662161827087, 0.012472163885831833, 0.1421356350183487, -0.1018151268362999, -0.08625325560569763, 0.032652366906404495, -0.0175032876431942, 0.0005456738290376961, 0.11187451332807541, 0.0018187176901847124, -0.05162286013364792, 0.11179880797863007, 0.08953934907913208, -0.08539236336946487, 0.16981269419193268, -0.08126075565814972, -0.07870384305715561, -0.01698640175163746, 0.025667371228337288, 0.03761839494109154, 0.15192408859729767, -0.1140032559633255, -0.023464296013116837, 0.007262288127094507, 0.002262069610878825, 0.029623517766594887, -0.20737324655056, -0.01482884306460619, 0.03140972554683685, -0.05557843670248985, -0.02883962169289589, -0.002063965192064643, -0.01768568344414234, 0.08688589930534363, 0.019572723656892776, -0.039924394339323044, 0.004059127997606993, -0.01825900748372078, -0.0694536417722702, 0.20965473353862762, -0.09219600260257721, -0.13600414991378784, -0.13019230961799622, -0.00265663955360651, -0.045033231377601624, -0.0069792005233466625, 0.04811164736747742, -0.13751141726970673, -0.01722075045108795, -0.06636804342269897, 0.031411781907081604, -0.019214151427149773, 0.043715499341487885, 0.04193549230694771, 0.03157490864396095, 0.11110856384038925, -0.12729677557945251, 0.02476738765835762, -0.04510020464658737, -0.045510247349739075, -0.004985918756574392, 0.043403323739767075, 0.0952269658446312, 0.16674497723579407, 0.022059720009565353, 0.042971398681402206, -0.03869595751166344, 0.17725549638271332, -0.11385685950517654, -0.04299343749880791, 0.13131822645664215, 0.007681495975703001, 0.04208805039525032, 0.13684441149234772, 0.058073341846466064, -0.0974460244178772, 0.01825215481221676, 0.043871860951185226, -0.02302910014986992, -0.22272081673145294, 
-0.02404441125690937, -0.03959375619888306, 0.01811598241329193, 0.0795883983373642, 0.039240218698978424, 0.060999296605587006, 0.025425372645258904, -0.024428891018033028, -0.0074694231152534485, -0.016361217945814133, 0.07395951449871063, 0.05187239497900009, 0.03757978975772858, 0.12667502462863922, -0.029916835948824883, -0.027595436200499535, 0.020874837413430214, -0.013211960904300213, 0.21450568735599518, -0.036990173161029816, 0.140633225440979, 0.07093774527311325, 0.1470974236726761, -0.0004938762867823243, 0.05393005535006523, -0.00011163926683366299, -0.04885429888963699, 0.028489435091614723, -0.06564317643642426, -0.010755754075944424, 0.056499093770980835, 0.004751322325319052, 0.08980075269937515, -0.13644874095916748, 0.013701969757676125, 0.057113904505968094, 0.3389938771724701, 0.08519627898931503, -0.3048360347747803, -0.13285136222839355, 0.010647941380739212, -0.07350289821624756, -0.017075803130865097, 0.029086720198392868, 0.12665235996246338, -0.08064188808202744, 0.07235029339790344, -0.07407855242490768, 0.08046778291463852, -0.016648510470986366, 0.017441963776946068, 0.0804625153541565, 0.10679644346237183, -0.014865979552268982, 0.05017198249697685, -0.2412482053041458, 0.3092750012874603, -0.0009251281735487282, 0.10493945330381393, -0.03262881562113762, 0.029459519311785698, 0.04321314021945, 0.024154985323548317, 0.07839643210172653, -0.017266638576984406, -0.07496464997529984, -0.16777029633522034, -0.053617581725120544, 0.027025286108255386, 0.12891675531864166, -0.06818235665559769, 0.11767113953828812, -0.02477959543466568, -0.023297108709812164, 0.05649986490607262, -0.0675264298915863, -0.07805041968822479, -0.06962527334690094, 0.02416037581861019, 0.0007960928487591445, 0.036466412246227264, -0.1294449120759964, -0.1105126291513443, -0.06576281040906906, 0.1491805911064148, -0.09566017240285873, -0.046211421489715576, -0.12386247515678406, 0.09011654555797577, 0.14795511960983276, -0.06929775327444077, 0.05564172565937042, 0.013226301409304142, 0.1214945912361145, 0.025502024218440056, -0.022479472681879997, 0.09781626611948013, -0.08821769803762436, -0.2494608461856842, -0.04205358773469925, 0.17637552320957184, 0.02599637769162655, 0.0550980269908905, -0.026218561455607414, 0.023732351139187813, -0.013325012288987637, -0.07998688519001007, 0.06590116769075394, -0.0046715643256902695, 0.026231953874230385, 0.016643373295664787, -0.007061588112264872, 0.025083083659410477, -0.06891702115535736, -0.053018175065517426, 0.10960913449525833, 0.32521694898605347, -0.09005432575941086, 0.029388515278697014, 0.07597948610782623, -0.024769818410277367, -0.1640872061252594, 0.0429539754986763, 0.10858089476823807, 0.02457553520798683, -0.007690321188420057, -0.18468470871448517, 0.08511731773614883, 0.07686501741409302, -0.054289355874061584, 0.09784981608390808, -0.28787025809288025, -0.15483024716377258, 0.10316920280456543, 0.10421685129404068, 0.010716275312006474, -0.1575702726840973, -0.0649528056383133, -0.02128920704126358, -0.08171307295560837, 0.042134080082178116, -0.10221700370311737, 0.10625704377889633, -0.003224659711122513, 0.039229173213243484, 0.01869155652821064, -0.05541154369711876, 0.14156082272529602, -0.030406739562749863, 0.08057498186826706, -0.006763208657503128, 0.017831427976489067, 0.06335782259702682, -0.06485012918710709, 0.0056063453666865826, -0.0967552587389946, 0.05266927182674408, -0.06641470640897751, -0.017811119556427002, -0.10729850083589554, 0.042212266474962234, -0.04569217190146446, -0.060106903314590454, 
-0.0275777205824852, 0.049792978912591934, 0.035912856459617615, -0.01119089126586914, 0.14446941018104553, -0.03655492886900902, 0.20466703176498413, 0.11682364344596863, 0.10036167502403259, -0.056244123727083206, -0.03832876682281494, 0.020908717066049576, -0.02246101386845112, 0.06288932263851166, -0.1802908033132553, 0.03394217789173126, 0.1222657710313797, 0.06900586932897568, 0.12361130118370056, 0.065000019967556, -0.08045592904090881, 0.018439842388033867, 0.08977193385362625, -0.10550574958324432, -0.13264153897762299, -0.034141238778829575, 0.013819241896271706, -0.1565231829881668, 0.10621335357427597, 0.09636730700731277, -0.06606794148683548, -0.009799351915717125, 0.004649038426578045, 0.008185897953808308, -0.04720287024974823, 0.23999516665935516, 0.06562243402004242, 0.10109589248895645, -0.10957597941160202, 0.0953555479645729, 0.009666898287832737, -0.11652203649282455, 0.02661229856312275, 0.08337653428316116, -0.061749331653118134, -0.007878179661929607, -0.004526561126112938, 0.07361244410276413, 0.00848401803523302, -0.05754528194665909, -0.1786838173866272, -0.1285599023103714, 0.06069666147232056, 0.15386773645877838, 0.062126275151968, 0.03090844489634037, -0.01695329323410988, 0.05556594207882881, -0.1241837739944458, 0.12755301594734192, 0.06815877556800842, 0.07165022939443588, -0.1657179892063141, 0.1777147799730301, 0.012683806009590626, 0.03201380744576454, -0.009638198651373386, 0.006146694999188185, -0.10377509891986847, 0.021640295162796974, -0.1379040777683258, -0.05773203447461128, -0.04641687497496605, -0.009131009690463543, 0.0062295133247971535, -0.05037945881485939, -0.07822202891111374, 0.047595854848623276, -0.13250583410263062, -0.037511181086301804, 0.015012536197900772, 0.03251666948199272, -0.10434776544570923, -0.004457274917513132, 0.03652726858854294, -0.11402901262044907, 0.09739663451910019, 0.04212778061628342, 0.023006442934274673, 0.05773194879293442, -0.06414278596639633, -0.00012085324851796031, 0.03398827463388443, -0.023872841149568558, 0.046983752399683, -0.12391430139541626, -0.027657950296998024, -0.03249695524573326, 0.040202461183071136, -0.004872813820838928, 0.03949475288391113, -0.12089909613132477, -0.021279651671648026, -0.005627516657114029, -0.030270816758275032, -0.062675341963768, 0.03569202497601509, 0.06873027235269547, 0.030104150995612144, 0.14684242010116577, -0.0817749947309494, 0.013609593734145164, -0.22838887572288513, -0.004083778243511915, -0.02391102723777294, -0.10362095385789871, -0.11247288435697556, -0.000984568614512682, 0.08279064297676086, -0.07565329968929291, 0.10187846422195435, -0.07107780873775482, 0.07081757485866547, 0.05073285847902298, -0.08985445648431778, 0.04343092441558838, 0.04861113801598549, 0.22514912486076355, 0.025086553767323494, -0.015493854880332947, 0.0675307884812355, 0.010893522761762142, 0.0576891265809536, 0.07898027449846268, 0.1658833920955658, 0.14421474933624268, 0.024866757914423943, 0.09884825348854065, 0.06760092824697495, -0.09530039876699448, -0.17377035319805145, 0.07470714300870895, -0.03456338122487068, 0.12171146273612976, -0.019139358773827553, 0.16358785331249237, 0.15947574377059937, -0.16928818821907043, 0.043540164828300476, -0.04117520526051521, -0.07275336235761642, -0.11239718645811081, -0.02585756592452526, -0.07229162752628326, -0.2116660475730896, 0.01730830781161785, -0.11989330500364304, 0.027894483879208565, 0.05540420487523079, 0.027907783165574074, 0.018209345638751984, 0.16774491965770721, 0.030649786815047264, 0.025912361219525337, 
0.09454267472028732, 0.00013965148536954075, -0.03573226556181908, -0.01796652004122734, -0.09811349213123322, 0.0355539545416832, -0.03412163257598877, 0.034511156380176544, -0.07754247635602951, -0.1371993124485016, 0.05844475328922272, 0.020832274109125137, -0.11077779531478882, 0.034333597868680954, 0.009263342246413231, 0.08502092212438583, 0.02786848694086075, 0.017970947548747063, -0.00247671059332788, -0.018956100568175316, 0.2714459002017975, -0.1342736780643463, -0.06734083592891693, -0.15465295314788818, 0.29657629132270813, 0.022353481501340866, -0.023373272269964218, 0.026380164548754692, -0.10366974025964737, -0.028067082166671753, 0.17812016606330872, 0.176982581615448, -0.004501702729612589, -0.016479767858982086, -0.004010239150375128, -0.018911395221948624, -0.09564413130283356, 0.07071560621261597, 0.10357576608657837, 0.06930352002382278, -0.05780990421772003, -0.04042666777968407, -0.016070788726210594, -0.07019414007663727, -0.007441381923854351, 0.10101678967475891, 0.019925376400351524, -0.017038622871041298, -0.031019892543554306, 0.08577966690063477, -0.049790333956480026, -0.1497511863708496, 0.053394805639982224, -0.20059308409690857, -0.17961527407169342, -0.03433359041810036, 0.06299279630184174, 0.02370414510369301, 0.06572093814611435, 0.011710929684340954, -0.021617576479911804, 0.08496279269456863, -0.0037058903835713863, -0.0407065823674202, -0.14833500981330872, 0.11868934333324432, -0.11435382813215256, 0.213250070810318, -0.0614868625998497, 0.01507959421724081, 0.13167835772037506, 0.0350036546587944, -0.08334415405988693, 0.03587142378091812, 0.07944292575120926, -0.15838836133480072, 0.018812458962202072, 0.2045632302761078, -0.029725907370448112, 0.1335413157939911, 0.02486705593764782, -0.14632771909236908, 0.0014721043407917023, -0.08051003515720367, -0.061496634036302567, -0.07839073985815048, -0.007693722378462553, -0.0431969128549099, 0.12162661552429199, 0.22174210846424103, -0.07651467621326447, 0.0069428314454853535, -0.058776725083589554, 0.029404062777757645, 0.06969176977872849, 0.0686408057808876, -0.025682803243398666, -0.30227401852607727, 0.0320303812623024, 0.03297380730509758, -0.030502870678901672, -0.2592831254005432, -0.07102712988853455, 0.04648700729012489, -0.05690392851829529, -0.04127245023846626, 0.09478116035461426, 0.1146719753742218, 0.05292554199695587, -0.04642018303275108, -0.05323956906795502, -0.06234227120876312, 0.1759393811225891, -0.1843591332435608, -0.06657944619655609 ]
null
null
null
<p align="center">
  <img style="width: 20%;" src="llasmol.png">
</p>

<h1 align="center"> LlaSMol </h1>
<h3 align="center"> LlaSMol-Mistral-7B </h3>

**Paper**: [LlaSMol: Advancing Large Language Models for Chemistry with a Large-Scale, Comprehensive, High-Quality Instruction Tuning Dataset](https://arxiv.org/abs/2402.09391)

**Page**: [https://osu-nlp-group.github.io/LlaSMol](https://osu-nlp-group.github.io/LlaSMol)

**Code**: [https://github.com/OSU-NLP-Group/LlaSMol](https://github.com/OSU-NLP-Group/LlaSMol)

**Models**: [https://huggingface.co/osunlp/LlaSMol](https://huggingface.co/osunlp/LlaSMol-Mistral-7B)

LlaSMol-Mistral-7B is an LLM for chemistry. It is based on [Mistral](https://huggingface.co/mistralai/Mistral-7B-v0.1) and tuned on our [SMolInstruct](https://huggingface.co/datasets/osunlp/SMolInstruct) dataset with LoRA. This repo contains the weights of the low-rank adapter.

## ⚔️ Usage

For instructions to run the model, please refer to our [repository](https://github.com/OSU-NLP-Group/LlaSMol).

## 🚨 Limitations

While the model is carefully trained, we do not guarantee its effectiveness. The model may output incorrect or inaccurate information. Please use it at your own risk.

Additionally, the model is not a mature product and is intended solely for research purposes. It may generate harmful or biased information. We emphatically urge all users to adhere to the highest ethical standards when using the model, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly **forbidden**.

## 📚 Citation

If our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.

```
@article{yu2024llasmol,
    title={LlaSMol: Advancing Large Language Models for Chemistry with a Large-Scale, Comprehensive, High-Quality Instruction Tuning Dataset},
    author={Botao Yu and Frazier N. Baker and Ziqi Chen and Xia Ning and Huan Sun},
    journal={arXiv preprint arXiv:2402.09391},
    year={2024}
}
```
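The card defers real usage instructions to the project repository, so the following is an illustrative sketch only, not the project's official loading code. It assumes the repo `osunlp/LlaSMol-Mistral-7B` hosts a standard PEFT/LoRA adapter that can be attached to the Mistral-7B base weights; the query string is a placeholder, since the actual prompt format is defined by SMolInstruct.

```python
# Illustrative sketch only -- official usage instructions live in the
# OSU-NLP-Group/LlaSMol repository. Assumes a standard PEFT/LoRA adapter.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

# Attach the low-rank adapter hosted in this repository on top of the base model.
model = PeftModel.from_pretrained(base, "osunlp/LlaSMol-Mistral-7B")

# Placeholder prompt; the real instruction format is defined by SMolInstruct.
prompt = "Query: Can you tell me the canonical SMILES of aspirin?\nResponse: "
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```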
{"language": ["en"], "license": "cc-by-4.0", "tags": ["instruction tuning", "chemistry", "molecule", "small molecule"]}
null
osunlp/LlaSMol-Mistral-7B
[ "instruction tuning", "chemistry", "molecule", "small molecule", "en", "arxiv:2402.09391", "license:cc-by-4.0", "region:us" ]
2024-02-13T20:51:34+00:00
[ "2402.09391" ]
[ "en" ]
TAGS #instruction tuning #chemistry #molecule #small molecule #en #arxiv-2402.09391 #license-cc-by-4.0 #region-us
<p align="center"> <img style="width: 20%;" src="URL"> </p> <h1 align="center"> LlaSMol </h1> <h3 align="center"> LlaSMol-Mistral-7B </h3> Paper: LlaSMol: Advancing Large Language Models for Chemistry with a Large-Scale, Comprehensive, High-Quality Instruction Tuning Dataset Page: URL Code: URL Models: URL LlaSMol-Mistral-7B is an LLM for chemistry. It is based on Mistral and tuned on our SMolInstruct dataset with LoRA. This repo contains the weight of the low-rank adapter. ## ️ Usage For instructions to run the model, please refer to our repository. ## Limitations While the model is carefully trained, we do not guarantee its effectiveness. The model may output incorrect or inaccurate information. Please use it at your own risk. Additionally, the model is built as a mature product but solely for research purpose. It may generate harmful or biased information. We emphatically urge all users to adhere to the highest ethical standards when using the model, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly forbidden. ## Citation If our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.
[ "## ️ Usage\n\nFor instructions to run the model, please refer to our repository.", "## Limitations\n\nWhile the model is carefully trained, we do not guarantee its effectiveness. The model may output incorrect or inaccurate information. Please use it at your own risk.\n\nAdditionally, the model is built as a mature product but solely for research purpose. It may generate harmful or biased information. We emphatically urge all users to adhere to the highest ethical standards when using the model, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly forbidden.", "## Citation\nIf our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries." ]
[ "TAGS\n#instruction tuning #chemistry #molecule #small molecule #en #arxiv-2402.09391 #license-cc-by-4.0 #region-us \n", "## ️ Usage\n\nFor instructions to run the model, please refer to our repository.", "## Limitations\n\nWhile the model is carefully trained, we do not guarantee its effectiveness. The model may output incorrect or inaccurate information. Please use it at your own risk.\n\nAdditionally, the model is built as a mature product but solely for research purpose. It may generate harmful or biased information. We emphatically urge all users to adhere to the highest ethical standards when using the model, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly forbidden.", "## Citation\nIf our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries." ]
[ 42, 20, 133, 35 ]
[ "passage: TAGS\n#instruction tuning #chemistry #molecule #small molecule #en #arxiv-2402.09391 #license-cc-by-4.0 #region-us \n## ️ Usage\n\nFor instructions to run the model, please refer to our repository.## Limitations\n\nWhile the model is carefully trained, we do not guarantee its effectiveness. The model may output incorrect or inaccurate information. Please use it at your own risk.\n\nAdditionally, the model is built as a mature product but solely for research purpose. It may generate harmful or biased information. We emphatically urge all users to adhere to the highest ethical standards when using the model, including maintaining fairness, transparency, and responsibility in their research. Any usage of the dataset that may lead to harm or pose a detriment to society is strictly forbidden.## Citation\nIf our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries." ]
[ -0.010147995315492153, 0.03891729563474655, -0.0034893369302153587, 0.02233525738120079, -0.04863188788294792, 0.005278392694890499, 0.1924600601196289, 0.05362652242183685, 0.1704735904932022, -0.0038594133220613003, 0.0744829848408699, -0.0005666232318617404, -0.09512129426002502, -0.01600741408765316, -0.0626668632030487, -0.11423155665397644, -0.050726063549518585, -0.008004202507436275, 0.07083533704280853, 0.07617372274398804, 0.07367971539497375, -0.09633862972259521, 0.15167254209518433, 0.020362338051199913, 0.018138999119400978, -0.04323769360780716, 0.1114995926618576, 0.002267467090860009, 0.1170068085193634, 0.04275898262858391, 0.044144582003355026, 0.11873235553503036, -0.0019987693522125483, -0.013595618307590485, 0.06132778897881508, -0.04439128562808037, -0.008739948272705078, 0.16561996936798096, 0.033300478011369705, 0.07207270711660385, 0.26161640882492065, -0.02422637678682804, -0.01771652325987816, 0.0976613312959671, -0.04588175565004349, -0.0010031101992353797, -0.0037476937286555767, -0.007192966062575579, 0.0016934761079028249, -0.015541632659733295, 0.004078051075339317, 0.24381320178508759, -0.06015845388174057, 0.009492039680480957, 0.14192558825016022, -0.08903845399618149, -0.003238838165998459, 0.21553851664066315, 0.06291434168815613, 0.1797269880771637, -0.007673555053770542, 0.09531523287296295, 0.01622040569782257, -0.017029281705617905, 0.07468974590301514, -0.12263684719800949, 0.057218484580516815, -0.05694214254617691, -0.07441070675849915, 0.049402978271245956, 0.2873885929584503, -0.0357944555580616, -0.132645383477211, -0.0013417575974017382, -0.05514741688966751, 0.1373872309923172, -0.012517793104052544, -0.1048588678240776, 0.07629402726888657, -0.016970811411738396, 0.10456208884716034, -0.027889076620340347, -0.08230983465909958, -0.08685817569494247, 0.01170096080750227, -0.04613765329122543, -0.05803641304373741, 0.060953110456466675, -0.09514710307121277, 0.0333331823348999, -0.2432549148797989, -0.0044866581447422504, -0.06018918380141258, -0.1768350601196289, 0.15155132114887238, -0.08270534127950668, -0.015251635573804379, 0.11245804280042648, 0.15525659918785095, 0.036492761224508286, -0.055543817579746246, -0.021159019321203232, 0.05882219225168228, 0.05391382798552513, 0.09570140391588211, -0.1460096538066864, -0.024086326360702515, 0.08592627197504044, 0.16271750628948212, 0.010520084761083126, 0.027798041701316833, 0.06157510727643967, -0.006903266068547964, -0.06963002681732178, -0.08148864656686783, 0.0569867342710495, 0.020926864817738533, -0.018141742795705795, 0.0061043729074299335, -0.002594852354377508, 0.20147311687469482, 0.05664007365703583, -0.0392177440226078, 0.07779742032289505, 0.001263220445252955, 0.03270198404788971, 0.0026084808632731438, -0.02883097529411316, 0.05726241320371628, 0.010345254093408585, -0.17788879573345184, -0.028708441182971, -0.12239876389503479, -0.034204691648483276, 0.031798046082258224, 0.06717019528150558, 0.08769775182008743, -0.10839097946882248, -0.028657011687755585, -0.016107529401779175, 0.011263452470302582, -0.0490538589656353, 0.04477836564183235, 0.12915034592151642, 0.03495251387357712, -0.004625262692570686, -0.025663748383522034, -0.02923351712524891, -0.06755850464105606, -0.0033599266316741705, -0.02394854463636875, 0.08056800067424774, -0.12273123860359192, -0.02367801032960415, -0.13275116682052612, 0.11510676145553589, -0.058045994490385056, -0.049149855971336365, -0.015360000543296337, -0.0203426294028759, -0.05065850913524628, -0.1179286316037178, 
-0.11898642033338547, -0.0640067309141159, 0.019034232944250107, 0.12319842725992203, -0.10090191662311554, -0.004154907539486885, -0.052734486758708954, -0.06699074059724808, -0.18369098007678986, 0.12844711542129517, -0.061845723539590836, 0.1539270430803299, 0.06271164864301682, 0.02595793828368187, 0.050825297832489014, -0.10427693277597427, -0.07594865560531616, 0.05745461583137512, -0.03238978609442711, -0.09028678387403488, 0.08239620178937912, 0.05516323074698448, -0.10783656686544418, -0.06004461273550987, -0.04677601158618927, -0.04054240882396698, -0.045628417283296585, -0.06510967016220093, -0.009542698971927166, -0.01961502432823181, -0.009107442572712898, -0.01427599135786295, -0.006490111351013184, -0.033824458718299866, 0.04703003168106079, 0.026258820667862892, 0.09428777545690536, 0.011806318536400795, -0.04831061139702797, 0.03863998502492905, 0.14236319065093994, -0.06382928788661957, -0.04459863528609276, -0.06462511420249939, 0.0912909284234047, 0.02535894699394703, 0.007096529006958008, 0.03762932866811752, 0.04819010943174362, -0.0017594435485079885, 0.062143027782440186, 0.018031202256679535, 0.08637095987796783, 0.02170552685856819, 0.015296011231839657, -0.1001410260796547, -0.1213383674621582, 0.06067967787384987, 0.012764157727360725, -0.00916876271367073, -0.26181405782699585, -0.017992688342928886, 0.09386942535638809, -0.1475866585969925, 0.01367010548710823, 0.08426953852176666, 0.06758160144090652, -0.08411133289337158, -0.03193962946534157, 0.02097880095243454, 0.08724460750818253, -0.07981351017951965, -0.05123044550418854, -0.10320428758859634, 0.015524957329034805, 0.09656009078025818, 0.048983413726091385, -0.11891276389360428, 0.014560079202055931, -0.2127445787191391, -0.04517270624637604, 0.0759672075510025, -0.0386183001101017, 0.0874982476234436, -0.022484883666038513, 0.011525177396833897, -0.05124640464782715, -0.16037625074386597, 0.047458190470933914, 0.00006392371142283082, -0.1576024293899536, 0.024065881967544556, 0.026560241356492043, 0.14980186522006989, -0.1545170098543167, 0.03526674211025238, 0.10386502742767334, -0.04448404535651207, 0.012523685581982136, 0.11197841167449951, -0.11305814236402512, 0.012219487689435482, -0.10808657109737396, -0.009620960801839828, 0.12246735394001007, 0.07542155683040619, 0.0877726674079895, 0.09744071960449219, -0.015332259237766266, -0.015412409789860249, -0.03530547022819519, -0.0029684831388294697, -0.04628363624215126, 0.029532255604863167, -0.19580279290676117, -0.01781466044485569, -0.05400719493627548, 0.17514875531196594, -0.012051272206008434, 0.021345827728509903, -0.05470888316631317, -0.005018818192183971, -0.11412175744771957, 0.21069872379302979, 0.033842138946056366, -0.046739716082811356, -0.12897136807441711, 0.09139206260442734, 0.0017543832072988153, 0.11509380489587784, 0.026837294921278954, 0.023723337799310684, -0.05437403544783592, -0.03146848455071449, -0.047836918383836746, 0.0675756186246872, -0.0885481908917427, -0.11336144059896469, 0.08856775611639023, 0.03936422988772392, -0.05103261023759842, -0.004655436612665653, -0.022888800129294395, -0.012986134737730026, 0.09508155286312103, -0.06413095444440842, 0.03960694000124931, 0.04237684607505798, -0.08421581983566284, -0.06067279726266861, -0.012722708284854889, 0.2345912754535675, -0.038609541952610016, 0.03803102299571037, 0.22540590167045593, 0.025775812566280365, 0.020280374214053154, 0.014102384448051453, 0.08811414241790771, -0.05479412153363228, 0.031863633543252945, -0.047878239303827286, -0.07109193503856659, 
-0.16562561690807343, -0.028171902522444725, -0.006085201632231474, -0.04900621250271797, -0.03827627748250961, 0.008853618055582047, 0.10456246882677078, 0.18615147471427917, -0.00749265355989337, -0.03993295133113861, -0.017202291637659073, 0.08623112738132477, 0.13255687057971954, -0.007551573682576418, 0.051240336149930954, -0.04741966351866722, 0.011926116421818733, 0.0994427353143692, -0.005548566114157438, 0.19805265963077545, 0.057416267693042755, 0.2004462033510208, 0.13090810179710388, 0.004441143479198217, 0.10735663026571274, 0.13845789432525635, -0.07618831098079681, 0.0059236702509224415, -0.06452497094869614, -0.12884639203548431, -0.10587645322084427, 0.07472329586744308, -0.19041140377521515, 0.05091920867562294, 0.006185390055179596, 0.07243156433105469, 0.07132084667682648, 0.02095855213701725, -0.0018010172061622143, -0.23171940445899963, -0.12943744659423828, 0.07447609305381775, 0.13262172043323517, -0.0573556013405323, 0.02390313148498535, 0.0654473751783371, 0.003730679862201214, -0.11985117942094803, 0.0022098440676927567, 0.05946596339344978, 0.02345820888876915, 0.09295911341905594, -0.11662457883358002, -0.018742991611361504, -0.09226979315280914, 0.12952229380607605, -0.2048969566822052, 0.16139139235019684, -0.00934003759175539, -0.02279592491686344, -0.07002218067646027, -0.06467805802822113, 0.039362937211990356, 0.22970160841941833, 0.14127010107040405, 0.05337926745414734, 0.0071487766690552235, -0.09212630242109299, -0.1953035295009613, 0.03292771056294441, -0.09439891576766968, 0.14138121902942657, 0.0645221397280693, -0.006021674256771803, -0.004144366830587387, -0.015149355866014957, -0.013845246285200119, -0.22133991122245789, -0.0020893567707389593, 0.04338068142533302, -0.041830115020275116, -0.11038418859243393, -0.013821424916386604, -0.014663861133158207, 0.20210221409797668, -0.003199261613190174, -0.1336703598499298, -0.14330103993415833, -0.07182876020669937, -0.11567211896181107, 0.14981429278850555, -0.0793229341506958, -0.017333628609776497, -0.017744896933436394, 0.027305297553539276, -0.008587910793721676, -0.09978403151035309, 0.04798712581396103, -0.1205044761300087, -0.1011573076248169, 0.012626315467059612, 0.024470165371894836, 0.08682569116353989, 0.11335697770118713, 0.08304635435342789, 0.046565912663936615, -0.1600525975227356, -0.12289418280124664, -0.033118925988674164, -0.033395349979400635, 0.27055349946022034, -0.07708929479122162, -0.07576169818639755, 0.12830030918121338, -0.027048423886299133, -0.02133222483098507, -0.019224831834435463, 0.18091775476932526, -0.024600274860858917, 0.050595030188560486, 0.2089191973209381, -0.13131490349769592, -0.048109021037817, 0.05898751690983772, -0.09214985370635986, -0.007703322451561689, 0.029797915369272232, -0.10744036734104156, 0.07674459367990494, 0.10157126933336258, -0.03942393884062767, 0.09499366581439972, -0.07720966637134552, -0.08239339292049408, 0.14799924194812775, 0.07629519701004028, 0.1864781230688095, -0.10891154408454895, -0.01375109888613224, -0.07689053565263748, -0.09430824220180511, 0.11553594470024109, -0.05358409881591797, 0.029815154150128365, -0.09193011373281479, 0.019026022404432297, 0.04293026775121689, -0.05760861560702324, 0.1862756460905075, -0.06590897589921951, 0.09566208720207214, -0.03032000921666622, -0.2205844223499298, -0.02813623659312725, 0.028907092288136482, 0.11651423573493958, 0.12398253381252289, 0.050802480429410934, -0.014427593909204006, -0.0582229420542717, -0.06941459327936172, 0.12718337774276733, -0.03222649544477463, 
-0.1610081046819687, -0.15217861533164978, 0.10678884387016296, -0.10702148079872131, 0.0014801844954490662, -0.0017333209980279207, -0.14512749016284943, -0.008323422633111477, 0.14389154314994812, 0.37668150663375854, -0.11808907240629196, 0.045027535408735275, 0.18875543773174286, -0.0032081815879791975, 0.03485151380300522, -0.12313187122344971, 0.001942516304552555, 0.11235152930021286, -0.028620127588510513, -0.002375895855948329, -0.004452676512300968, -0.00023532810155302286, 0.003890427527949214, 0.11655841767787933, -0.15880167484283447, -0.011036252602934837, -0.047332763671875, 0.03478304296731949, -0.11191794276237488, -0.006030527409166098, 0.08322294056415558, -0.14427532255649567, -0.03327203914523125, -0.011875421740114689, 0.07308521866798401, -0.01754363439977169, 0.08283516019582748, 0.10670950263738632, 0.016695767641067505, -0.008553286083042622, -0.02827446535229683, 0.0410190187394619, 0.0710025429725647, -0.06844179332256317, -0.09475135803222656, -0.06926579028367996, -0.04357677325606346, -0.08490916341543198, -0.07863181084394455, -0.17803311347961426, -0.06711924821138382, -0.05240025371313095, -0.0346379280090332, -0.10892240703105927, 0.0782618522644043, 0.10202439874410629, 0.015865540131926537, -0.041528526693582535, -0.025830959901213646, -0.0960547998547554, 0.020462842658162117, -0.12933091819286346, 0.010410255752503872, -0.05684426426887512, 0.06773373484611511, -0.08317169547080994, 0.02654392644762993, -0.10456503927707672, 0.04758141562342644, -0.1116822361946106, -0.07250920683145523, -0.20740054547786713, 0.06335107237100601, -0.13686349987983704, -0.030329355970025063, -0.023610590025782585, 0.015461244620382786, -0.05322812497615814, -0.0026145377196371555, -0.04261522740125656, 0.12610653042793274, -0.006432587746530771, 0.14750656485557556, -0.10765548795461655, 0.030644165351986885, 0.1218382939696312, 0.0740942656993866, 0.07971542328596115, 0.035664647817611694, -0.013557007536292076, -0.04552680253982544, -0.09289287030696869, 0.11962508410215378, 0.03504331409931183, 0.03718455508351326, 0.008620078675448895, -0.26866042613983154, 0.07386986166238785, 0.041024014353752136, 0.0022782410960644484, -0.024760669097304344, -0.013793433085083961, -0.05596389248967171, 0.04406160116195679, 0.04009971395134926, 0.0058711813762784, -0.01891271583735943, -0.04902120307087898, 0.0305307786911726, 0.01745300367474556, 0.03512609750032425, 0.03533129394054413, -0.06666863709688187, -0.10539433360099792, 0.028301013633608818, -0.050154201686382294, -0.045168787240982056, -0.11245119571685791, -0.08158561587333679, -0.010643914341926575, 0.015707215294241905, 0.26161524653434753, 0.06488511711359024, -0.03582547977566719, 0.01058835256844759, 0.06830090284347534, 0.13403397798538208, -0.022815629839897156, 0.11222582310438156, -0.016558878123760223, -0.02997562289237976, -0.0034058778546750546, 0.09368473291397095, -0.03746398538351059, -0.11579428613185883, 0.2658683955669403, 0.12263999879360199, -0.03981834277510643, -0.06154409423470497, 0.012195850722491741, -0.024694902822375298, 0.08138369768857956, -0.14766226708889008, 0.15579599142074585, -0.13044153153896332, -0.09280035644769669, 0.005035609472543001, 0.26374658942222595, -0.15089648962020874, 0.04823199287056923, -0.1202322319149971, -0.06673403829336166, -0.1919931322336197, -0.03050491027534008, -0.06296318769454956, -0.018974658101797104, -0.07701478898525238, -0.17627809941768646, -0.11311616748571396, 0.07017556577920914, -0.020227795466780663, -0.0957765057682991, 0.06938641518354416, 
-0.12297992408275604, 0.011791512370109558, -0.018058758229017258, 0.011699707247316837, -0.03822818398475647, -0.13272036612033844, -0.013138937763869762, 0.04847608506679535, -0.11400621384382248, -0.00786788109689951, -0.009843883104622364, 0.09059437364339828, -0.0708904042840004, -0.024302711710333824, -0.04730644449591637, -0.11128722876310349, 0.04953114688396454, 0.029008211567997932, 0.06656621396541595, 0.00824629794806242, -0.06032693013548851, 0.002048358554020524, 0.14949297904968262, -0.0029588702600449324, 0.06306008249521255, 0.02409263327717781, 0.32755202054977417, -0.0915667861700058, 0.025320731103420258, 0.037143949419260025, -0.0071157431229949, 0.10544741153717041, 0.18937230110168457, 0.2216527760028839, -0.13030578196048737, -0.042416345328092575, -0.08609573543071747, 0.046778757125139236, -0.011238360777497292, 0.1364511102437973, -0.03696286305785179, 0.15372096002101898, -0.11534848809242249, 0.10787124186754227, -0.0418066680431366, 0.1319100260734558, -0.09843740612268448, -0.00867479294538498, 0.10690341889858246, -0.007473514880985022, -0.042689427733421326, 0.030675694346427917, -0.0010594639461487532, -0.0192082691937685, -0.08193381875753403, -0.007195354904979467, 0.004742256831377745, 0.017195312306284904, -0.2911408841609955, -0.06691049039363861, 0.020858673378825188, -0.11044564843177795, -0.015308835543692112, -0.017397241666913033, 0.05861696973443031, -0.10937003791332245, 0.04189234599471092, 0.21471789479255676, 0.1615353375673294, 0.14520278573036194, 0.04585409164428711, 0.15430966019630432, 0.08404563367366791, -0.1237230971455574, -0.1418205052614212, 0.05222391337156296, -0.007808746304363012, -0.0895625650882721, 0.047769755125045776, -0.047759056091308594, 0.036448389291763306, -0.019845979288220406, 0.03418222442269325, 0.02209167368710041, 0.03119371086359024, 0.09468730539083481, 0.07209151238203049, -0.02908368967473507, 0.09201916307210922, -0.11797983944416046, 0.10872234404087067, 0.017702292650938034, -0.09918254613876343, -0.012370775453746319, -0.040832243859767914, 0.17377647757530212, 0.08733730763196945, -0.11025907844305038, 0.013582393527030945, -0.03254763409495354, 0.0482349693775177, 0.054594993591308594, -0.03808062896132469, -0.23477959632873535, 0.033871203660964966, -0.008031042292714119, 0.051398128271102905, -0.09203995019197464, -0.0335267037153244, -0.01662748120725155, 0.021905094385147095, -0.015624157153069973, -0.15324100852012634, -0.06816142797470093, 0.03172832727432251, -0.07505868375301361, -0.0686003565788269 ]
null
null
diffusers
<!-- This model card has been generated automatically according to the information the training script had access to. You should probably proofread and complete it, then remove this comment. --> # DreamBooth - awatterson/dogbooth This is a dreambooth model derived from stabilityai/stable-diffusion-2-1. The weights were trained on a photo of [v]dog using [DreamBooth](https://dreambooth.github.io/). You can find some example images in the following. DreamBooth for the text encoder was enabled: False.
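The card above describes awatterson/dogbooth as a DreamBooth fine-tune of stabilityai/stable-diffusion-2-1 keyed to the instance prompt "a photo of [v]dog", and the row's tags list diffusers:StableDiffusionPipeline. A minimal inference sketch under those assumptions follows; the prompt wording beyond "[v]dog", the fp16 dtype, and the CUDA placement are illustrative choices, not taken from the card.

```python
# Minimal sketch: generating an image from a DreamBooth checkpoint with diffusers.
# Assumes the repo holds full StableDiffusionPipeline weights, as its tags suggest.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "awatterson/dogbooth",
    torch_dtype=torch.float16,  # assumption: half precision on a CUDA GPU
).to("cuda")

# "[v]dog" is the instance token the card says the weights were trained on;
# the rest of the prompt is an illustrative example, not from the card.
image = pipe(
    "a photo of [v]dog sitting on a beach",
    num_inference_steps=50,
    guidance_scale=7.5,
).images[0]
image.save("dogbooth_sample.png")
```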
{"license": "creativeml-openrail-m", "library_name": "diffusers", "tags": ["text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers"], "inference": true, "base_model": "stabilityai/stable-diffusion-2-1", "instance_prompt": "a photo of [v]dog"}
text-to-image
awatterson/dogbooth
[ "diffusers", "tensorboard", "safetensors", "text-to-image", "dreambooth", "stable-diffusion", "stable-diffusion-diffusers", "base_model:stabilityai/stable-diffusion-2-1", "license:creativeml-openrail-m", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us" ]
2024-02-13T20:57:24+00:00
[]
[]
TAGS #diffusers #tensorboard #safetensors #text-to-image #dreambooth #stable-diffusion #stable-diffusion-diffusers #base_model-stabilityai/stable-diffusion-2-1 #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us
# DreamBooth - awatterson/dogbooth This is a dreambooth model derived from stabilityai/stable-diffusion-2-1. The weights were trained on a photo of [v]dog using DreamBooth. You can find some example images in the following. DreamBooth for the text encoder was enabled: False.
[ "# DreamBooth - awatterson/dogbooth\n\nThis is a dreambooth model derived from stabilityai/stable-diffusion-2-1. The weights were trained on a photo of [v]dog using DreamBooth.\nYou can find some example images in the following. \n\n\n\nDreamBooth for the text encoder was enabled: False." ]
[ "TAGS\n#diffusers #tensorboard #safetensors #text-to-image #dreambooth #stable-diffusion #stable-diffusion-diffusers #base_model-stabilityai/stable-diffusion-2-1 #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n", "# DreamBooth - awatterson/dogbooth\n\nThis is a dreambooth model derived from stabilityai/stable-diffusion-2-1. The weights were trained on a photo of [v]dog using DreamBooth.\nYou can find some example images in the following. \n\n\n\nDreamBooth for the text encoder was enabled: False." ]
[ 97, 80 ]
[ "passage: TAGS\n#diffusers #tensorboard #safetensors #text-to-image #dreambooth #stable-diffusion #stable-diffusion-diffusers #base_model-stabilityai/stable-diffusion-2-1 #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us \n# DreamBooth - awatterson/dogbooth\n\nThis is a dreambooth model derived from stabilityai/stable-diffusion-2-1. The weights were trained on a photo of [v]dog using DreamBooth.\nYou can find some example images in the following. \n\n\n\nDreamBooth for the text encoder was enabled: False." ]
[ -0.061487723141908646, 0.13258428871631622, -0.003917930647730827, 0.02640218660235405, 0.11215130239725113, -0.008417452685534954, 0.15123024582862854, 0.03553721308708191, 0.0017114831134676933, 0.054416924715042114, 0.1157161146402359, -0.0034992226865142584, -0.0025027880910784006, 0.09191115945577621, 0.05528142303228378, -0.1742471158504486, 0.03444173187017441, -0.010138211771845818, -0.020818594843149185, 0.023959076032042503, 0.03449968248605728, -0.09361111372709274, 0.0942903384566307, -0.007053041364997625, -0.17105498909950256, 0.03942415863275528, -0.03830242529511452, 0.009943372569978237, 0.07939954102039337, 0.010377597063779831, 0.10748548060655594, 0.04798062518239021, 0.0225591529160738, -0.15134969353675842, 0.025829417631030083, 0.04711999371647835, -0.030925381928682327, 0.03996087610721588, 0.026055175811052322, -0.052660636603832245, 0.0076773385517299175, -0.05154308304190636, 0.05914812535047531, 0.03683357685804367, -0.02942194603383541, 0.02471768669784069, 0.04752589762210846, 0.13601581752300262, 0.08556817471981049, 0.0814414918422699, -0.03617561236023903, 0.03414981812238693, 0.04637124016880989, 0.10030538588762283, 0.16866186261177063, -0.21289321780204773, -0.050454385578632355, 0.2890041172504425, -0.03332337364554405, -0.004610857926309109, -0.030808912590146065, 0.06032281368970871, 0.04626943916082382, 0.015248671174049377, 0.020552685484290123, -0.04887856915593147, 0.007998380810022354, -0.13074038922786713, -0.10202249884605408, 0.025916362181305885, 0.031588293612003326, -0.0055976202711462975, -0.06771886348724365, -0.1687200665473938, -0.0759054645895958, 0.14506453275680542, -0.02606136165559292, 0.00026337680174037814, -0.016780460253357887, 0.01918904297053814, -0.013394136913120747, -0.03417269513010979, -0.07738807797431946, -0.07056616246700287, -0.0002797639463096857, 0.06086321175098419, 0.0014587425393983722, 0.018332989886403084, -0.02465370111167431, 0.15063399076461792, -0.07109130173921585, -0.13535748422145844, 0.0645117238163948, -0.05116814002394676, -0.02292844094336033, 0.07551147043704987, -0.03199369087815285, -0.2397885024547577, 0.10557835549116135, -0.05737321451306343, 0.11142077296972275, -0.007856015115976334, -0.007823396474123001, 0.05614214763045311, -0.0001989134616451338, 0.029657503589987755, -0.03891090303659439, -0.023141881451010704, -0.005401157774031162, 0.016818605363368988, 0.04126104712486267, -0.024544425308704376, -0.11442214995622635, 0.018410516902804375, -0.03017965890467167, 0.03257317468523979, 0.010762694291770458, 0.025531737133860588, -0.07959378510713577, -0.025115661323070526, 0.06281162798404694, -0.03965839371085167, -0.010407637804746628, -0.022271832451224327, 0.023794058710336685, -0.03770133852958679, 0.14930729568004608, 0.020191865041851997, -0.04340806230902672, 0.07385647296905518, -0.07440739125013351, 0.011675415560603142, -0.014856440015137196, -0.09856981784105301, -0.015865720808506012, -0.18041671812534332, 0.038534246385097504, -0.15267109870910645, -0.11994780600070953, -0.001208397326990962, 0.015715382993221283, -0.010969993658363819, -0.014076903462409973, -0.09497778117656708, -0.10525388270616531, -0.043922215700149536, 0.06217992305755615, 0.01681816391646862, 0.018646802753210068, 0.026152268052101135, -0.05392823740839958, 0.08556753396987915, -0.04822938144207001, -0.041957441717386246, -0.11165957897901535, 0.0044923508539795876, -0.09071329236030579, 0.13358119130134583, -0.0466756708920002, 0.12327715754508972, -0.0411541722714901, 0.0013933104928582907, 
0.0194181427359581, 0.015021318569779396, 0.02177780494093895, 0.16708841919898987, -0.21318496763706207, -0.040353383868932724, 0.2087327092885971, -0.2032119184732437, -0.11929015815258026, 0.08528020232915878, -0.0048673865385353565, 0.12470757216215134, 0.08411242812871933, 0.12565144896507263, 0.09946340322494507, -0.2769841253757477, -0.024673085659742355, -0.05148645117878914, -0.08451187610626221, 0.04704098030924797, -0.0325796976685524, 0.047578707337379456, -0.020285625010728836, 0.022417176514863968, -0.0697312206029892, 0.10062340646982193, -0.03401856869459152, -0.02388160675764084, -0.04472634568810463, -0.08799321204423904, 0.040742889046669006, 0.0192253440618515, 0.03608867898583412, -0.014730342663824558, -0.03753861412405968, 0.059876833111047745, -0.004746389575302601, -0.04045775532722473, -0.010044927708804607, -0.04384413734078407, 0.010484776459634304, -0.028932929039001465, -0.0051655955612659454, -0.1009986400604248, -0.04636165872216225, 0.03210984170436859, 0.17456293106079102, 0.008677845820784569, 0.058178987354040146, 0.07679549604654312, 0.08301302045583725, -0.003301994176581502, -0.0816020742058754, 0.05697775259613991, 0.02757343463599682, -0.056844305247068405, -0.1341991275548935, 0.09613215178251266, -0.08652692288160324, -0.0014429293805733323, -0.18030299246311188, 0.08936057239770889, 0.07172419130802155, 0.25516611337661743, 0.09206375479698181, -0.05176892504096031, 0.07263346761465073, 0.028240026906132698, -0.029093941673636436, -0.08827336132526398, 0.011649723164737225, 0.02450578100979328, -0.13793331384658813, 0.13024066388607025, -0.12367037683725357, 0.14136053621768951, 0.11608937382698059, 0.0972093716263771, -0.052112873643636703, 0.02004760131239891, -0.021649932488799095, -0.003449835814535618, -0.08359222114086151, -0.011514466255903244, 0.14750833809375763, 0.019669558852910995, 0.15318159759044647, -0.018907589837908745, 0.04025718942284584, 0.06477618217468262, -0.05236639454960823, -0.0566772036254406, 0.09920956939458847, -0.16500632464885712, 0.010944894514977932, 0.03528164327144623, 0.025075638666749, 0.013867496512830257, 0.2088630646467209, -0.03502172604203224, 0.006269495934247971, -0.04019763320684433, 0.018724054098129272, 0.02830909751355648, 0.16199885308742523, -0.05072062090039253, -0.013982434757053852, -0.02364111691713333, -0.03420738875865936, 0.0069371736608445644, -0.15481853485107422, -0.0010920027270913124, 0.041825201362371445, -0.03264408931136131, 0.17633645236492157, 0.0198240764439106, -0.09402137249708176, 0.02508784644305706, -0.10483229905366898, -0.03787900134921074, 0.020076816901564598, -0.019489651545882225, -0.08943919092416763, 0.13601383566856384, -0.10123305767774582, -0.27362746000289917, -0.12736020982265472, 0.021272454410791397, -0.001683627488091588, 0.009404584765434265, 0.05359863117337227, -0.07859715819358826, -0.03178298473358154, -0.10544256120920181, 0.060349248349666595, 0.01547287032008171, 0.07766824960708618, 0.07516248524188995, 0.024389024823904037, 0.043909985572099686, -0.013880758546292782, 0.0048573315143585205, -0.05398465320467949, 0.031151525676250458, 0.05306341499090195, 0.014508347027003765, 0.09628375619649887, 0.11062537878751755, -0.008823881857097149, -0.016393465921282768, -0.00759811419993639, 0.25016486644744873, 0.007230840157717466, 0.05383430793881416, 0.12021937221288681, -0.003474707482382655, 0.07570957392454147, 0.176797553896904, 0.02791161648929119, -0.050506237894296646, 0.09180623292922974, -0.00028988055419176817, -0.10580950975418091, 
-0.04253498464822769, -0.0987946093082428, 0.022053822875022888, 0.03118979185819626, 0.08022614568471909, 0.0649581179022789, 0.05393851175904274, 0.12409260869026184, 0.06357071548700333, -0.02658044919371605, 0.01995674893260002, 0.09329839795827866, 0.059863727539777756, -0.08769824355840683, 0.030245894566178322, -0.06691054254770279, -0.060818541795015335, 0.03617320954799652, -0.03518707677721977, 0.07657467573881149, -0.07172571122646332, -0.1031995341181755, 0.05706822872161865, 0.024664223194122314, 0.07529246062040329, 0.0326046347618103, -0.06065591424703598, -0.0563826747238636, -0.00981572363525629, -0.10851209610700607, 0.04058929905295372, 0.12428674846887589, -0.03085605800151825, 0.027916813269257545, -0.02966662123799324, 0.14838923513889313, 0.0250532403588295, 0.03917815163731575, 0.12613564729690552, -0.21220016479492188, -0.0708397850394249, -0.03074897825717926, 0.0017342019127681851, -0.035929352045059204, -0.014443212188780308, 0.3254593014717102, -0.01963895373046398, 0.0433158315718174, -0.044550780206918716, 0.032586801797151566, 0.032815154641866684, 0.010290954262018204, -0.13177266716957092, 0.04784658923745155, -0.03646400570869446, -0.005404424853622913, -0.26355206966400146, 0.04458528012037277, -0.0033125528134405613, 0.07907790690660477, -0.00219073798507452, 0.03772573173046112, -0.0025149709545075893, 0.11868467181921005, 0.14049002528190613, 0.0027667409740388393, 0.04305756837129593, -0.03403463587164879, -0.13774600625038147, -0.016686834394931793, 0.018528908491134644, -0.03415823355317116, 0.03317326307296753, 0.10887960344552994, -0.010887657292187214, -0.002798578003421426, 0.016292616724967957, -0.17656287550926208, -0.07571518421173096, -0.045449934899806976, 0.11987597495317459, 0.06785077601671219, -0.0804765373468399, -0.07201842963695526, 0.07069368660449982, 0.08661691844463348, -0.23669147491455078, -0.11796187609434128, -0.07894158363342285, -0.009891631081700325, 0.0046412209048867226, -0.029550908133387566, 0.07046202570199966, -0.02845987305045128, 0.1320488303899765, -0.13515537977218628, -0.09790026396512985, 0.016922030597925186, -0.13945086300373077, -0.15729379653930664, -0.14047501981258392, 0.04731407389044762, 0.05058496072888374, -0.015272840857505798, 0.009022961370646954, -0.006724075879901648, -0.0014468296431005, -0.03590218722820282, 0.05259303003549576, 0.17965875566005707, -0.09620440751314163, 0.05438490957021713, 0.011088826693594456, -0.13497035205364227, -0.08010445535182953, 0.01604338362812996, 0.07692499458789825, 0.19849249720573425, -0.062067288905382156, 0.12401428073644638, 0.11666348576545715, -0.1091826930642128, -0.2308800369501114, -0.09039865434169769, 0.06577128171920776, 0.03366568312048912, 0.013173932209610939, -0.17966710031032562, 0.18388931453227997, -0.024647077545523643, -0.011215327307581902, 0.06330892443656921, -0.33868804574012756, -0.13316795229911804, 0.0464695505797863, 0.1825132817029953, 0.22115960717201233, -0.1006254032254219, -0.04875680431723595, 0.05923634022474289, -0.10508923977613449, 0.18038323521614075, -0.003888555569574237, 0.031722523272037506, 0.00780878821387887, 0.005505140870809555, 0.031925421208143234, -0.027676614001393318, 0.08008013665676117, 0.006844367831945419, 0.019115818664431572, -0.05420437082648277, 0.020791249349713326, 0.12779788672924042, -0.04542055353522301, 0.02091442421078682, -0.10537276417016983, 0.04932163655757904, -0.10159894078969955, -0.0009242816595360637, 0.00878006312996149, -0.00639199698343873, -0.03682517260313034, 
-0.14224353432655334, -0.04115433990955353, 0.029941264539957047, 0.08192399889230728, -0.004146529361605644, -0.09255820512771606, -0.021290631964802742, -0.039074987173080444, 0.20789490640163422, -0.03898725286126137, -0.04857451468706131, -0.11288531869649887, -0.02897154912352562, -0.06229810044169426, 0.12232720106840134, -0.09358446300029755, -0.018214626237750053, 0.1235627830028534, 0.06216316670179367, 0.08056686073541641, 0.024174047634005547, -0.11032366752624512, 0.027796432375907898, 0.0869022086262703, -0.14280274510383606, -0.08114486932754517, -0.013881489634513855, 0.02330404706299305, 0.05640358105301857, -0.04870661348104477, 0.16825903952121735, -0.10595126450061798, 0.033003564924001694, -0.00624000234529376, 0.025687916204333305, 0.00499731907621026, 0.08010206371545792, -0.016351470723748207, 0.05859055742621422, -0.04184646159410477, 0.07092860341072083, -0.026065779849886894, -0.10774708539247513, 0.05804082378745079, 0.010164815001189709, -0.10720829665660858, 0.010617614723742008, -0.050855010747909546, 0.21151529252529144, -0.08821490406990051, -0.053983304649591446, -0.10900650173425674, -0.11568087339401245, 0.015604401007294655, 0.12502293288707733, 0.0460657998919487, 0.054072149097919464, -0.02343655377626419, -0.04398190602660179, -0.052943602204322815, 0.08165168017148972, 0.06058875843882561, 0.05455304682254791, -0.24973061680793762, 0.010991089046001434, 0.038351595401763916, -0.026696886867284775, -0.06515327095985413, -0.030626434832811356, -0.08416157215833664, -0.01358033437281847, 0.023427562788128853, 0.15568594634532928, -0.05999782681465149, -0.044259462505578995, 0.0012310664169490337, -0.0006163702928461134, 0.03213253989815712, 0.042760998010635376, 0.00724730035290122, 0.012037948705255985, 0.003370030550286174, -0.008289461024105549, -0.045654069632291794, -0.07655703276395798, -0.027838390320539474, -0.05792110413312912, 0.058269601315259933, -0.051640499383211136, -0.10808588564395905, -0.011432732455432415, -0.2668137848377228, 0.04541283845901489, 0.1791999340057373, -0.038785528391599655, -0.04232368990778923, 0.024249928072094917, -0.04165200889110565, -0.029263650998473167, -0.00437549501657486, 0.009603560902178288, 0.11365827172994614, -0.08651460707187653, -0.04057018458843231, 0.03723937273025513, 0.07713504135608673, -0.07097519934177399, 0.042368773370981216, 0.11140233278274536, 0.08428308367729187, 0.13347522914409637, -0.19197876751422882, 0.09087952226400375, -0.08791191875934601, -0.01759895496070385, 0.021922409534454346, -0.03769078478217125, 0.027414629235863686, -0.021519195288419724, -0.02562946453690529, -0.0013258522376418114, 0.10848637670278549, 0.05271292105317116, -0.15455232560634613, 0.011986344121396542, -0.0281316377222538, 0.004676262382417917, 0.03756445273756981, 0.25715938210487366, -0.019950231537222862, 0.010950434021651745, -0.1214122548699379, 0.04940490424633026, 0.14828243851661682, 0.1616290658712387, 0.04613904282450676, 0.03694906085729599, 0.06434980779886246, 0.10301999747753143, 0.07078799605369568, 0.10748797655105591, -0.07025470584630966, 0.08963383734226227, -0.1296372413635254, 0.1548108607530594, -0.0742935910820961, -0.019228998571634293, 0.07931602001190186, -0.023474985733628273, -0.03760983794927597, 0.10586234927177429, -0.022708237171173096, -0.007764630485326052, -0.06997951120138168, -0.02137964777648449, -0.13200773298740387, 0.003748430637642741, -0.07210519909858704, -0.06018896400928497, 0.032892413437366486, 0.014388962648808956, 0.027286477386951447, 0.1405154913663864, 
0.05712669715285301, 0.004809935577213764, 0.1019260436296463, -0.02107219025492668, -0.0741705521941185, 0.05068780109286308, 0.015481643378734589, 0.06810500472784042, 0.10402896255254745, -0.03443341329693794, 0.0638127475976944, 0.06401742994785309, 0.016968239098787308, 0.027907563373446465, -0.05016041174530983, 0.010430368594825268, 0.0064874920062720776, -0.042156562209129333, 0.13841810822486877, 0.12838968634605408, -0.05113581195473671, -0.061429012566804886, 0.1460026651620865, -0.029917048290371895, -0.10159068554639816, -0.1336270123720169, 0.13094955682754517, -0.10981430113315582, 0.07592546939849854, -0.027772966772317886, -0.11562629789113998, -0.05039845407009125, 0.1193222627043724, 0.08996721357107162, -0.0016583942342549562, 0.018474459648132324, -0.0720442533493042, -0.007062193937599659, -0.06178976595401764, 0.06674215197563171, 0.0027086706832051277, 0.26609134674072266, -0.04117506742477417, 0.026158491149544716, -0.08214361220598221, -0.15320613980293274, -0.10671105235815048, -0.1747562289237976, 0.0303203072398901, -0.00296419276855886, -0.07311160117387772, 0.05694422125816345, -0.18729813396930695, -0.12428377568721771, 0.2193423956632614, -0.12802714109420776, -0.0063324496150016785, -0.06450286507606506, 0.1067982017993927, 0.03326466679573059, 0.05638796091079712, -0.059130240231752396, 0.05009376257658005, 0.10145528614521027, -0.054347485303878784, -0.07046888768672943, 0.047771938145160675, -0.10428522527217865, -0.28307920694351196, 0.19192926585674286, -0.029060209169983864, 0.020526928827166557, 0.03755174204707146, -0.012315863743424416, -0.11228228360414505, 0.023497631773352623, -0.06913761049509048, -0.003479823237285018, -0.08233754336833954, 0.15201765298843384, 0.016115374863147736, 0.056246958673000336, 0.023168373852968216, -0.12405716627836227, -0.01952211558818817, 0.02606482058763504, -0.03836448863148689, -0.09479398280382156, 0.04302215203642845, -0.03290941193699837, 0.06710419803857803, 0.05597975477576256, -0.040804021060466766, 0.05101136490702629, 0.0229609627276659, 0.005147357936948538, -0.01730109378695488, -0.01013228204101324, 0.0669274851679802, -0.09122266620397568, 0.018828265368938446, 0.04483867436647415, -0.0201016366481781, -0.2870188355445862, -0.11351446062326431, -0.1259414553642273, 0.016642563045024872, -0.01858353242278099, 0.07663609087467194, 0.20719976723194122, 0.04150402918457985, 0.009955822490155697, -0.14151644706726074, 0.007274589966982603, 0.07649216800928116, 0.05186416208744049, -0.1085299476981163 ]
null
null
null
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # tinyllama-colorist-lora This model is a fine-tuned version of [PY007/TinyLlama-1.1B-Chat-v0.3](https://huggingface.co/PY007/TinyLlama-1.1B-Chat-v0.3) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - training_steps: 200 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
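The hyperparameter list in the card above maps fairly directly onto transformers TrainingArguments as used by TRL's SFTTrainer. The sketch below only illustrates that mapping: the output directory is a placeholder, the dataset is unnamed in the card, and nothing here reproduces the original run.

```python
# Sketch of the training configuration described in the card, expressed as
# transformers TrainingArguments. Dataset and model setup are not shown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tinyllama-colorist-lora",  # placeholder path
    learning_rate=2e-4,                    # learning_rate: 0.0002
    per_device_train_batch_size=8,         # train_batch_size: 8
    per_device_eval_batch_size=8,          # eval_batch_size: 8
    gradient_accumulation_steps=4,         # 8 * 4 = 32 total train batch size
    seed=42,
    lr_scheduler_type="cosine",
    max_steps=200,                         # training_steps: 200
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                             # "Native AMP" mixed precision
)
# These arguments would then be passed to trl's SFTTrainer together with the
# base model PY007/TinyLlama-1.1B-Chat-v0.3 and the (unnamed) training dataset.
```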
{"license": "apache-2.0", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "PY007/TinyLlama-1.1B-Chat-v0.3", "model-index": [{"name": "tinyllama-colorist-lora", "results": []}]}
null
ibibek/tinyllama-colorist-lora
[ "tensorboard", "safetensors", "trl", "sft", "generated_from_trainer", "base_model:PY007/TinyLlama-1.1B-Chat-v0.3", "license:apache-2.0", "region:us" ]
2024-02-13T20:59:17+00:00
[]
[]
TAGS #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-PY007/TinyLlama-1.1B-Chat-v0.3 #license-apache-2.0 #region-us
# tinyllama-colorist-lora This model is a fine-tuned version of PY007/TinyLlama-1.1B-Chat-v0.3 on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - training_steps: 200 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
[ "# tinyllama-colorist-lora\n\nThis model is a fine-tuned version of PY007/TinyLlama-1.1B-Chat-v0.3 on the None dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- training_steps: 200\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ "TAGS\n#tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-PY007/TinyLlama-1.1B-Chat-v0.3 #license-apache-2.0 #region-us \n", "# tinyllama-colorist-lora\n\nThis model is a fine-tuned version of PY007/TinyLlama-1.1B-Chat-v0.3 on the None dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- training_steps: 200\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ 57, 43, 6, 12, 8, 3, 125, 4, 33 ]
[ "passage: TAGS\n#tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-PY007/TinyLlama-1.1B-Chat-v0.3 #license-apache-2.0 #region-us \n# tinyllama-colorist-lora\n\nThis model is a fine-tuned version of PY007/TinyLlama-1.1B-Chat-v0.3 on the None dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- training_steps: 200\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.35.2\n- Pytorch 2.1.0+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.1" ]
[ -0.09162449091672897, 0.08750587701797485, -0.004118462558835745, 0.0819387435913086, 0.12069046497344971, 0.012158194556832314, 0.1329593062400818, 0.1383032649755478, -0.08035019040107727, 0.06513556838035583, 0.01911940984427929, 0.06151379644870758, 0.05883807688951492, 0.1231413260102272, -0.04085353761911392, -0.20482751727104187, 0.004525299649685621, -0.017435211688280106, -0.0670468658208847, 0.1159663051366806, 0.11129280179738998, -0.10965857654809952, 0.04126819595694542, 0.023315340280532837, -0.13125616312026978, 0.0053040580824017525, -0.006525339093059301, -0.05317167565226555, 0.1114061176776886, -0.014861625619232655, 0.14343367516994476, 0.034638501703739166, 0.13241466879844666, -0.22082650661468506, 0.002078643999993801, 0.09879884868860245, 0.03982066363096237, 0.085658960044384, 0.07481855154037476, -0.018986936658620834, 0.06943511217832565, -0.1299302875995636, 0.11004873365163803, 0.006434911862015724, -0.10533525049686432, -0.20543065667152405, -0.11743098497390747, 0.04341084510087967, 0.1178910955786705, 0.09072043001651764, 0.006787795573472977, 0.14857827126979828, -0.09013151377439499, 0.06870315223932266, 0.25397300720214844, -0.24351945519447327, -0.05346228554844856, 0.06893841177225113, 0.03679715842008591, 0.06136064976453781, -0.08933361619710922, -0.027530740946531296, 0.030997926369309425, 0.015841661021113396, 0.08926887065172195, 0.0023941509425640106, -0.10962608456611633, -0.021296430379152298, -0.11241985112428665, -0.03020091913640499, 0.09799840301275253, 0.02771582454442978, -0.03453672677278519, -0.09337113797664642, -0.05978311598300934, -0.11993129551410675, -0.032827962189912796, -0.03933316841721535, 0.046136509627103806, -0.04248278588056564, -0.04474435746669769, -0.02733820490539074, -0.09342112392187119, -0.04502831771969795, 0.018046576529741287, 0.09731192141771317, 0.039625223726034164, 0.025089291855692863, -0.05007581785321236, 0.12293584644794464, 0.03239770606160164, -0.14437457919120789, 0.017173152416944504, 0.014600183814764023, -0.07013683021068573, -0.05111829191446304, -0.04475845396518707, -0.047447312623262405, -0.010102253407239914, 0.1323441118001938, -0.0638323724269867, 0.06791505217552185, -0.035678911954164505, 0.012827974744141102, -0.06159471347928047, 0.14098946750164032, -0.013861862011253834, -0.012659544125199318, 0.007576463278383017, 0.11793842911720276, 0.04231766611337662, -0.017847295850515366, -0.07077155262231827, -0.020286060869693756, 0.08552059531211853, 0.06322392076253891, -0.03886157646775246, 0.004434671718627214, -0.07460404932498932, -0.029814211651682854, 0.030563801527023315, -0.13574357330799103, 0.04746241867542267, 0.009592419490218163, -0.07050042599439621, -0.05347586050629616, 0.01761539839208126, 0.03546769171953201, -0.01816801354289055, 0.10900310426950455, -0.054431598633527756, 0.027292706072330475, -0.102786585688591, -0.05825759097933769, 0.016993548721075058, -0.044579681009054184, -0.027450362220406532, -0.04706934839487076, -0.1829054206609726, -0.04656876251101494, 0.0633508712053299, -0.0646529346704483, -0.033158741891384125, -0.01510742399841547, -0.06760407984256744, 0.019505120813846588, -0.013339055702090263, 0.137724369764328, -0.04810516908764839, 0.08278154581785202, -0.013548742979764938, 0.0394568145275116, 0.059023793786764145, 0.02702077478170395, -0.09700161218643188, 0.02749691717326641, -0.18878109753131866, 0.04500468820333481, -0.06070861965417862, 0.0127726374194026, -0.10292822867631912, -0.0891399011015892, -0.004683217499405146, 
-0.03505757823586464, 0.06249492987990379, 0.08798467367887497, -0.2173827886581421, -0.030289655551314354, 0.18478120863437653, -0.12349507957696915, -0.06163649260997772, 0.08690348267555237, -0.05781257525086403, 0.010684124194085598, 0.05738888680934906, 0.17891362309455872, 0.08488533645868301, -0.1546350121498108, 0.00045951767242513597, -0.0260507483035326, 0.07962919026613235, 0.04125701263546944, 0.053316112607717514, -0.01784568652510643, 0.0679916962981224, 0.0006994868163019419, -0.05396103858947754, -0.008969300426542759, -0.0698666200041771, -0.08696243166923523, -0.058807797729969025, -0.06197591871023178, 0.05579834431409836, 0.05062256380915642, 0.017778495326638222, -0.08661393821239471, -0.11342477798461914, 0.060440197587013245, 0.1283896118402481, -0.061900604516267776, 0.031553804874420166, -0.09017300605773926, 0.038308195769786835, 0.010812033899128437, -0.05570309981703758, -0.19396023452281952, -0.09214857220649719, 0.03415660560131073, -0.07961861789226532, -0.012414222583174706, 0.02092229574918747, 0.05647516995668411, 0.06814360618591309, -0.07244398444890976, -0.011360347270965576, -0.10991179198026657, -0.006258354987949133, -0.08447382599115372, -0.21034090220928192, -0.03559061139822006, -0.02701548859477043, 0.19931940734386444, -0.20835447311401367, 0.01575763151049614, 0.020809734240174294, 0.16429755091667175, 0.05200183391571045, -0.06092160940170288, -0.01607060618698597, 0.05255220830440521, 0.005236564204096794, -0.092415951192379, 0.04702245071530342, 0.006344550754874945, -0.055603671818971634, -0.02230808325111866, -0.15429779887199402, 0.0020004890393465757, 0.10388894379138947, 0.04759014770388603, -0.0826667994260788, -0.007232437375932932, -0.07068470865488052, -0.028251493349671364, -0.08296800404787064, 0.015975283458828926, 0.17482562363147736, 0.029392184689641, 0.12008926272392273, -0.08045881241559982, -0.07757985591888428, -0.007413593586534262, -0.01811741478741169, 0.02000187523663044, 0.07505906373262405, 0.05784350633621216, -0.06815037876367569, 0.07854513823986053, 0.10775525867938995, -0.041352614760398865, 0.10190331935882568, -0.06548290699720383, -0.07859382778406143, -0.03384529426693916, 0.003072492079809308, 0.003991129342466593, 0.14934702217578888, -0.04197050258517265, 0.03678782656788826, 0.02415608800947666, 0.015607480891048908, 0.021988969296216965, -0.2218107283115387, -0.02006620168685913, 0.02292116917669773, -0.051036201417446136, 0.02417997643351555, -0.03151478245854378, 0.05926493555307388, 0.09970476478338242, -0.007567971013486385, -0.03783716261386871, 0.004264220129698515, -0.03246275335550308, -0.08421238511800766, 0.1685454249382019, -0.10514070838689804, -0.17314596474170685, -0.06863582134246826, 0.03617622330784798, -0.0005627732607536018, -0.028695112094283104, 0.007337029557675123, -0.09982434660196304, -0.04808909818530083, -0.10771902650594711, -0.04014979302883148, 0.011849983595311642, -0.015667717903852463, 0.08722563832998276, 0.038399837911129, 0.09002920985221863, -0.12274300307035446, -0.00006605221278732643, -0.03782540187239647, -0.07544149458408356, 0.03192266449332237, 0.07478544861078262, 0.06390579044818878, 0.1552356332540512, -0.03415791317820549, 0.012852436862885952, -0.030777812004089355, 0.18676719069480896, -0.09756030142307281, -0.0028374618850648403, 0.15414807200431824, -0.005370937287807465, 0.0663892924785614, 0.1478428691625595, 0.041805583983659744, -0.08911151438951492, 0.02336662821471691, 0.0725867971777916, -0.023464029654860497, -0.2505546808242798, 
-0.05586618185043335, -0.014101173728704453, -0.06893324851989746, 0.08449657261371613, 0.032820846885442734, 0.009415303356945515, 0.02614162303507328, -0.01849633827805519, -0.00716719264164567, 0.027414238080382347, 0.06769555062055588, 0.08054926246404648, 0.04510246217250824, 0.11275459080934525, -0.01889921724796295, -0.031108956784009933, 0.055471014231443405, 0.011042389087378979, 0.2510300874710083, -0.026568006724119186, 0.0778365284204483, 0.04938920959830284, 0.13873425126075745, -0.003787140129134059, 0.01367399375885725, 0.005778483115136623, -0.025472553446888924, 0.00003775732693611644, -0.06582410633563995, -0.008633730001747608, 0.010056920349597931, -0.06324446201324463, 0.025653965771198273, -0.0651407465338707, 0.021108582615852356, 0.03248384967446327, 0.23477183282375336, 0.06933291256427765, -0.25700512528419495, -0.040001023560762405, 0.02231919765472412, -0.019702114164829254, -0.03485646843910217, -0.0073094842955470085, 0.11321648955345154, -0.12259776890277863, 0.07363726943731308, -0.08557506650686264, 0.07098963856697083, -0.050749339163303375, -0.005742188543081284, 0.06200789660215378, 0.07835392653942108, 0.004847847856581211, 0.06231443211436272, -0.2068425714969635, 0.2252006083726883, 0.005288388580083847, 0.09957730770111084, -0.05547692999243736, 0.033710919320583344, 0.0042778924107551575, 0.040473829954862595, 0.07953830808401108, -0.0029564551077783108, -0.08230549842119217, -0.1670791208744049, -0.09782266616821289, 0.027658723294734955, 0.11133243143558502, -0.03531920909881592, 0.08586673438549042, -0.04287131875753403, 0.0028028630185872316, 0.04844669997692108, -0.07076362520456314, -0.16248439252376556, -0.1180947795510292, 0.03156876564025879, -0.0012355954386293888, -0.052543576806783676, -0.1069202795624733, -0.10643879324197769, -0.03484366461634636, 0.14968256652355194, 0.03809447959065437, -0.0530732199549675, -0.1249295175075531, 0.07171401381492615, 0.1308606117963791, -0.03944137319922447, 0.017090648412704468, 0.0437726154923439, 0.1395547091960907, 0.01812553033232689, -0.06621403247117996, 0.04012289643287659, -0.06642689555883408, -0.20875856280326843, -0.08240842819213867, 0.15796485543251038, 0.05346154049038887, 0.0581844188272953, 0.009881607256829739, 0.03859053924679756, 0.02471655048429966, -0.07690680772066116, 0.025249328464269638, 0.06070076301693916, 0.09449104219675064, 0.020006248727440834, -0.06056353077292442, -0.0036122435703873634, -0.05774950236082077, -0.045307353138923645, 0.09574829787015915, 0.2306365966796875, -0.08732964098453522, 0.09909065067768097, 0.0189984068274498, -0.07418487221002579, -0.15301823616027832, 0.05420893430709839, 0.11803076416254044, 0.010718880221247673, 0.08990085124969482, -0.1589907854795456, 0.09923052042722702, 0.11634009331464767, -0.04283110424876213, 0.041052259504795074, -0.36374786496162415, -0.14297421276569366, 0.051452990621328354, 0.12056594341993332, 0.00792484637349844, -0.12926754355430603, -0.03087025322020054, -0.03104151226580143, -0.11665996164083481, 0.1305975764989853, -0.10506557673215866, 0.08375701308250427, -0.006177055649459362, 0.06359530240297318, 0.03577960655093193, -0.04461890086531639, 0.15335887670516968, 0.0005671714316122234, 0.09297412633895874, -0.050306808203458786, 0.01874854974448681, 0.10135670006275177, -0.05437758192420006, -0.0037732296623289585, -0.014230961911380291, 0.06050602346658707, -0.09951908886432648, -0.0044397106394171715, -0.07963193953037262, 0.0305187925696373, -0.062473949044942856, -0.05413365364074707, 
-0.0590849444270134, 0.06671874225139618, 0.06875656545162201, -0.03594868630170822, 0.0617951825261116, 0.02423267439007759, 0.13632887601852417, 0.09047836065292358, 0.07283748686313629, -0.000884118489921093, -0.09438671916723251, 0.0032265211921185255, -0.006231765728443861, 0.035420455038547516, -0.10524065792560577, 0.047369178384542465, 0.11388718336820602, 0.034653376787900925, 0.12933947145938873, 0.03040016070008278, -0.06278692185878754, 0.002183087170124054, 0.05808061361312866, -0.10784967243671417, -0.1014249175786972, 0.0230344757437706, 0.029540175572037697, -0.1113450825214386, -0.013671645894646645, 0.11307749897241592, -0.035150062292814255, -0.025896070525050163, -0.0008961631101556122, 0.02989424206316471, 0.0017726192018017173, 0.17122770845890045, 0.03495972976088524, 0.0713859274983406, -0.07948053628206253, 0.1063433438539505, 0.08980117738246918, -0.07200665026903152, 0.015620669350028038, 0.05898304283618927, -0.08861421793699265, -0.0040306877344846725, 0.07680825889110565, 0.14072422683238983, 0.01972772181034088, -0.04989730194211006, -0.12363311648368835, -0.0992799699306488, 0.03550223633646965, 0.11381011456251144, 0.05307646840810776, 0.0015773061895743012, -0.004198549780994654, 0.014857216738164425, -0.12755213677883148, 0.10198400169610977, 0.04558110982179642, 0.0624140202999115, -0.1275596171617508, 0.1296408772468567, 0.03304019570350647, 0.0037943068891763687, -0.006511481013149023, 0.034557782113552094, -0.06253128498792648, -0.004497711546719074, -0.09485011547803879, 0.01954055204987526, -0.0017260228050872684, -0.009152354672551155, -0.01276395469903946, -0.039391838014125824, -0.028407782316207886, 0.03680647909641266, -0.08227013051509857, -0.0469474233686924, -0.006683017127215862, 0.02951040491461754, -0.14019884169101715, -0.02409670688211918, 0.053029656410217285, -0.09508533775806427, 0.07353446632623672, 0.03490806370973587, 0.03886416181921959, 0.028201540932059288, -0.14211933314800262, -0.020339466631412506, 0.03699521720409393, 0.033805765211582184, 0.03593442961573601, -0.12626251578330994, -0.01840078830718994, -0.022795598953962326, 0.028574395924806595, 0.02611559070646763, 0.05896206200122833, -0.14154808223247528, -0.006003543268889189, -0.03699095919728279, -0.08698408305644989, -0.03785170242190361, 0.03269128501415253, 0.08151765912771225, 0.03781936317682266, 0.15534517168998718, -0.0846165269613266, 0.06606686860322952, -0.2318459004163742, -0.0345190167427063, 0.00023438599600922316, -0.0016732183285057545, -0.079831562936306, -0.011572098359465599, 0.07516396045684814, -0.05125917121767998, 0.0926864892244339, 0.030225902795791626, 0.090375617146492, 0.04317938908934593, -0.060552507638931274, -0.016230642795562744, 0.024733388796448708, 0.1468799114227295, 0.045782145112752914, -0.00949756521731615, 0.1240607500076294, -0.04430850222706795, 0.048046234995126724, 0.029450146481394768, 0.20235542953014374, 0.1799953281879425, 0.016031168401241302, 0.02061782404780388, 0.03600176423788071, -0.11565270274877548, -0.15648144483566284, 0.11654067784547806, -0.05971170589327812, 0.0876464769244194, -0.050340231508016586, 0.21525244414806366, 0.09410134702920914, -0.19920901954174042, 0.06359249353408813, -0.05749917030334473, -0.09022660553455353, -0.12681059539318085, -0.046908944845199585, -0.08875114470720291, -0.12735000252723694, 0.02465081214904785, -0.1161559596657753, 0.05496307834982872, 0.14203976094722748, 0.002878685714676976, 0.021332962438464165, 0.12195809930562973, -0.009151951409876347, 0.008085930719971657, 
0.0353681705892086, 0.04482196643948555, 0.015784060582518578, -0.07060612738132477, -0.056008391082286835, 0.0251421257853508, 0.017717277631163597, 0.06194131448864937, -0.040322788059711456, -0.0018282943638041615, 0.01708269491791725, 0.006476643495261669, -0.07632207870483398, 0.042232926934957504, -0.009867897257208824, 0.08036453276872635, 0.035183701664209366, 0.05317758768796921, 0.04775254800915718, -0.046729717403650284, 0.3105505108833313, -0.05535507947206497, -0.09730909764766693, -0.11199149489402771, 0.16550973057746887, 0.015771199017763138, -0.024274388328194618, 0.06943155080080032, -0.11604689061641693, -0.018406976014375687, 0.1360449641942978, 0.13413213193416595, -0.0612444169819355, -0.0026802930515259504, -0.023514170199632645, -0.017294520512223244, -0.057694002985954285, 0.10843975841999054, 0.10198233276605606, 0.04442499205470085, -0.07512842863798141, -0.021929841488599777, -0.014862711541354656, -0.01801726594567299, -0.08139453828334808, 0.068351149559021, -0.01435098610818386, 0.015240194275975227, -0.03943350166082382, 0.08121774345636368, 0.0622708797454834, -0.1818641871213913, 0.08878183364868164, -0.22962111234664917, -0.19108404219150543, 0.011315497569739819, 0.1271478831768036, -0.043258845806121826, 0.05351608991622925, -0.014956626109778881, -0.016414880752563477, 0.0966622605919838, -0.015041404403746128, -0.02588311955332756, -0.0995328277349472, 0.07560376822948456, -0.12082383781671524, 0.2508528232574463, -0.011978982016444206, 0.09411975741386414, 0.09121854603290558, 0.024783173575997353, -0.10806573927402496, 0.03965306282043457, 0.07054422795772552, -0.07656551152467728, 0.005862072110176086, 0.1635105311870575, -0.04347117617726326, 0.08696530014276505, 0.0788937658071518, -0.13396140933036804, 0.001547955209389329, -0.026939716190099716, -0.033812958747148514, -0.06535080075263977, -0.008497478440403938, -0.07387606799602509, 0.13236433267593384, 0.1992357075214386, -0.03789524361491203, 0.03031182661652565, -0.05200549215078354, 0.008507125079631805, 0.04111889749765396, 0.08872215449810028, -0.020371438935399055, -0.20980173349380493, 0.04376857355237007, 0.07287480682134628, 0.016937416046857834, -0.1972523182630539, -0.10241857171058655, 0.060918550938367844, -0.08582572638988495, -0.09248218685388565, 0.1051350012421608, 0.04160461947321892, 0.034534648060798645, -0.026831144466996193, -0.10274911671876907, -0.03802400827407837, 0.14672207832336426, -0.15779142081737518, -0.04538678377866745 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chat_500STEPS_1e7rate_SFT This model is a fine-tuned version of [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.4297 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-07 - train_batch_size: 4 - eval_batch_size: 1 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 8 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 100 - training_steps: 500 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 1.6169 | 0.1 | 50 | 1.6126 | | 1.5653 | 0.2 | 100 | 1.5784 | | 1.524 | 0.29 | 150 | 1.5257 | | 1.4813 | 0.39 | 200 | 1.4845 | | 1.4608 | 0.49 | 250 | 1.4560 | | 1.4351 | 0.59 | 300 | 1.4397 | | 1.4317 | 0.68 | 350 | 1.4319 | | 1.4269 | 0.78 | 400 | 1.4300 | | 1.4167 | 0.88 | 450 | 1.4297 | | 1.4284 | 0.98 | 500 | 1.4297 | ### Framework versions - Transformers 4.37.2 - Pytorch 2.0.0+cu117 - Datasets 2.17.0 - Tokenizers 0.15.2
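Since this row lists transformers as the library and text-generation as the pipeline tag, a minimal loading sketch follows. The chat prompt format is an assumption based on the Llama-2-7b-chat base model; the card does not document the template used during SFT.

```python
# Minimal generation sketch (assumes the repo contains full model weights and
# a tokenizer loadable through the standard Auto classes).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="tsavage68/chat_500STEPS_1e7rate_SFT",
    device_map="auto",  # assumption: let accelerate place the weights
)

# Llama-2-chat style prompt; treat this formatting as illustrative only.
prompt = "[INST] What is supervised fine-tuning? [/INST]"
print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```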
{"tags": ["trl", "sft", "generated_from_trainer"], "base_model": "meta-llama/Llama-2-7b-chat-hf", "model-index": [{"name": "chat_500STEPS_1e7rate_SFT", "results": []}]}
text-generation
tsavage68/chat_500STEPS_1e7rate_SFT
[ "transformers", "safetensors", "llama", "text-generation", "trl", "sft", "generated_from_trainer", "conversational", "base_model:meta-llama/Llama-2-7b-chat-hf", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:02:32+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #conversational #base_model-meta-llama/Llama-2-7b-chat-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
chat\_500STEPS\_1e7rate\_SFT ============================ This model is a fine-tuned version of meta-llama/Llama-2-7b-chat-hf on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 1.4297 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 1e-07 * train\_batch\_size: 4 * eval\_batch\_size: 1 * seed: 42 * gradient\_accumulation\_steps: 2 * total\_train\_batch\_size: 8 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: cosine * lr\_scheduler\_warmup\_steps: 100 * training\_steps: 500 ### Training results ### Framework versions * Transformers 4.37.2 * Pytorch 2.0.0+cu117 * Datasets 2.17.0 * Tokenizers 0.15.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-07\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* training\\_steps: 500", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #conversational #base_model-meta-llama/Llama-2-7b-chat-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-07\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* training\\_steps: 500", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ 84, 145, 4, 33 ]
[ "passage: TAGS\n#transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #conversational #base_model-meta-llama/Llama-2-7b-chat-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-07\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* training\\_steps: 500### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ -0.1264524757862091, 0.10048962384462357, -0.002988111926242709, 0.07021201401948929, 0.11775663495063782, 0.015688039362430573, 0.10355205088853836, 0.14844270050525665, -0.06577921658754349, 0.0907679870724678, 0.13912944495677948, 0.11874600499868393, 0.06253913789987564, 0.17900799214839935, -0.01958063431084156, -0.32588130235671997, -0.0033641981426626444, -0.020962556824088097, -0.15907752513885498, 0.13289013504981995, 0.08159270137548447, -0.11876463890075684, 0.054352302104234695, -0.033767323940992355, -0.09925711154937744, -0.034664493054151535, -0.036511603742837906, -0.04554397985339165, 0.13619154691696167, 0.0005286114756017923, 0.08217086642980576, 0.04410487785935402, 0.0977410152554512, -0.2490074634552002, 0.01332472637295723, 0.05861414223909378, 0.03058757446706295, 0.08727744221687317, 0.06425372511148453, -0.045484960079193115, 0.10879740864038467, -0.11148316413164139, 0.07761994749307632, 0.04077523574233055, -0.1150345653295517, -0.22232685983181, -0.08138809353113174, 0.037898555397987366, 0.15914009511470795, 0.07819047570228577, -0.02387550100684166, 0.06179753318428993, -0.08809037506580353, 0.08778182417154312, 0.26622962951660156, -0.26082199811935425, -0.0803927332162857, 0.04525091126561165, 0.06480256468057632, 0.07270687818527222, -0.13786669075489044, -0.00040268359589390457, 0.02666735090315342, 0.002402259735390544, 0.1457911729812622, 0.0011989828199148178, 0.08160808682441711, 0.019526291638612747, -0.14256912469863892, -0.0404927134513855, 0.09752488881349564, 0.07638733088970184, -0.02191729098558426, -0.117055244743824, -0.05150604620575905, -0.21924132108688354, -0.0469721220433712, 0.0048906016163527966, 0.024466773495078087, -0.05435978248715401, -0.0976991206407547, 0.0018299893708899617, -0.06536123901605606, -0.11036951094865799, 0.07136641442775726, 0.15899519622325897, 0.04423387721180916, -0.04154982417821884, 0.032085537910461426, 0.14945510029792786, 0.0841037780046463, -0.1592804342508316, -0.00438303267583251, 0.021879728883504868, -0.09395690262317657, -0.013251989148557186, -0.017977595329284668, 0.019099758937954903, 0.02160456031560898, 0.15283530950546265, -0.02273513376712799, 0.07240522652864456, 0.07283692806959152, 0.02470562793314457, -0.09540550410747528, 0.13851411640644073, -0.06190080940723419, -0.10252466797828674, -0.04786761477589607, 0.14619328081607819, 0.012581497430801392, -0.022934289649128914, -0.08331320434808731, 0.013802241533994675, 0.09810896217823029, 0.07666955143213272, -0.023147884756326675, 0.03883035108447075, -0.06901142746210098, -0.015698198229074478, 0.015411962755024433, -0.10639147460460663, 0.03053024224936962, 0.015266641043126583, -0.07017149776220322, -0.06110352277755737, -0.0017788425320759416, 0.007768351584672928, 0.0008055801154114306, 0.12106979638338089, -0.07360301911830902, -0.024884812533855438, -0.09259165078401566, -0.087873175740242, 0.0010621006367728114, -0.09489613026380539, -0.011355007067322731, -0.061767976731061935, -0.1408682018518448, -0.05417333543300629, 0.05835656821727753, -0.06295600533485413, -0.06870770454406738, -0.0916588306427002, -0.09978596121072769, 0.0329466350376606, -0.004799443297088146, 0.16095085442066193, -0.05204198136925697, 0.12203902751207352, -0.005801612511277199, 0.08953054994344711, 0.09444393217563629, 0.045202791690826416, -0.040215104818344116, 0.06523449718952179, -0.18047310411930084, 0.08360772579908371, -0.07990517467260361, 0.07331262528896332, -0.12418321520090103, -0.09523487836122513, -0.03442656993865967, 
-0.002882768167182803, 0.08645583689212799, 0.16967836022377014, -0.17538177967071533, -0.08429605513811111, 0.21079425513744354, -0.052949268370866776, -0.11479993909597397, 0.11311250180006027, -0.033573996275663376, 0.016947466880083084, 0.024285687133669853, 0.16144296526908875, 0.10068191587924957, -0.0690755769610405, 0.030635442584753036, -0.017938785254955292, 0.09590212255716324, 0.03716844320297241, 0.09047484397888184, -0.03809264302253723, 0.025487834587693214, -0.0010347223142161965, -0.059627920389175415, 0.045530930161476135, -0.09400080889463425, -0.08962943404912949, -0.004561227280646563, -0.08528867363929749, 0.07209974527359009, 0.04131527617573738, 0.03208664432168007, -0.09148811548948288, -0.11744840443134308, -0.026324521750211716, 0.10299796611070633, -0.08725423365831375, 0.017189880833029747, -0.03521981090307236, 0.03881141170859337, 0.005847202148288488, 0.008301934227347374, -0.1343899667263031, -0.037844039499759674, 0.03304402902722359, 0.013501378707587719, -0.01351541094481945, -0.022279951721429825, 0.08750966191291809, 0.06644240766763687, -0.07242284715175629, -0.08773952722549438, -0.05159381777048111, -0.00632308516651392, -0.10476066917181015, -0.2494734525680542, -0.06775138527154922, -0.03429614007472992, 0.20658056437969208, -0.23600825667381287, 0.04520190879702568, 0.009844743646681309, 0.12013832479715347, 0.029989946633577347, -0.04090619832277298, 0.012551939114928246, 0.044744838029146194, -0.025977151468396187, -0.0923406332731247, 0.041665203869342804, -0.013624212704598904, -0.14510773122310638, -0.00803033635020256, -0.13407449424266815, 0.11160066723823547, 0.09204533696174622, 0.015666387975215912, -0.12217529118061066, -0.09550535678863525, -0.06570889800786972, -0.04340104013681412, -0.02818954922258854, -0.0045694625005126, 0.13766928017139435, 0.03622426837682724, 0.11827421933412552, -0.07852553576231003, -0.0693659707903862, 0.02518627978861332, -0.0022009217645972967, 0.017270293086767197, 0.16328707337379456, 0.04823872447013855, -0.05786338821053505, 0.1190037950873375, 0.12248537689447403, -0.04014257714152336, 0.1569419950246811, -0.04814097657799721, -0.08410318195819855, -0.03186142444610596, 0.060219209641218185, 0.029809657484292984, 0.13182848691940308, -0.11535637825727463, -0.017171241343021393, 0.00870989728718996, 0.01621703803539276, 0.001986654242500663, -0.19592247903347015, -0.042907074093818665, 0.0514935702085495, -0.06659891456365585, 0.018219204619526863, -0.028966419398784637, -0.02055971324443817, 0.10257281363010406, 0.031415484845638275, -0.04975751414895058, -0.0028324301820248365, -0.02102285996079445, -0.08405378460884094, 0.23163138329982758, -0.09193634986877441, -0.13656127452850342, -0.09989508986473083, 0.03022884950041771, -0.008991995826363564, 0.008560371585190296, 0.023953324183821678, -0.1093045249581337, 0.0028526363894343376, -0.08143916726112366, 0.013067766092717648, -0.03568268567323685, 0.04014617204666138, -0.023486921563744545, 0.015437688678503036, 0.02980407327413559, -0.08130250126123428, 0.01873890496790409, -0.020356956869363785, -0.051731497049331665, 0.04704837501049042, 0.014759676530957222, 0.10359611362218857, 0.16517649590969086, 0.025393519550561905, 0.02915089763700962, -0.04454990103840828, 0.13666631281375885, -0.12888942658901215, 0.02475675940513611, 0.08689645677804947, 0.029865792021155357, 0.05739213153719902, 0.14888566732406616, 0.0378432422876358, -0.08345414698123932, 0.039487481117248535, 0.037368979305028915, -0.031428322196006775, -0.2056024819612503, 
-0.003562220837920904, -0.04943549633026123, 0.024994686245918274, 0.12262163311243057, 0.03955169767141342, 0.016170751303434372, 0.06119342893362045, -0.030053509399294853, -0.03358611837029457, 0.025928640738129616, 0.07067859917879105, -0.014014005661010742, 0.01958400383591652, 0.11741376668214798, -0.007257288321852684, -0.04772888496518135, 0.009173142723739147, 0.005429737735539675, 0.2312549650669098, -0.026668427512049675, 0.15935878455638885, 0.0357842817902565, 0.16306796669960022, -0.011897843331098557, 0.07918944954872131, 0.02851380966603756, -0.03975044563412666, -0.0012702513486146927, -0.05782580003142357, -0.03866851329803467, 0.0634814202785492, 0.042984794825315475, 0.05509291589260101, -0.11673591285943985, 0.028759846463799477, 0.04518486186861992, 0.3148028552532196, 0.07796385884284973, -0.2909800708293915, -0.07102268189191818, 0.02138400636613369, -0.048594336956739426, -0.03326205164194107, 0.02290809154510498, 0.14611534774303436, -0.12264592200517654, 0.052535053342580795, -0.08516771346330643, 0.07315164059400558, -0.06685903668403625, 0.0012527472572401166, 0.04085933044552803, 0.08228491246700287, -0.030561544001102448, 0.06427614390850067, -0.2779248058795929, 0.2983187437057495, -0.010820789262652397, 0.05888362228870392, -0.05639566108584404, 0.01879662089049816, 0.032014962285757065, 0.012261486612260342, 0.11737395823001862, -0.004056049510836601, -0.03580031171441078, -0.16951672732830048, -0.09978143125772476, 0.008337351493537426, 0.14744800329208374, -0.14391013979911804, 0.12442655861377716, -0.026789192110300064, -0.03741287812590599, 0.04399452358484268, -0.06704474985599518, -0.05688508227467537, -0.10099557787179947, 0.01082129031419754, -0.040668778121471405, 0.0703277438879013, -0.10596735775470734, -0.09491842985153198, -0.032644122838974, 0.15889543294906616, -0.11424092203378677, -0.03972864896059036, -0.15457560122013092, 0.06312797218561172, 0.13416950404644012, -0.07652433961629868, 0.054030559957027435, 0.015948275104165077, 0.10757214576005936, 0.005255698226392269, 0.004905437584966421, 0.12387686967849731, -0.08199344575405121, -0.23920415341854095, -0.06808719038963318, 0.1897614598274231, 0.04164566472172737, 0.06521455198526382, -0.02864019013941288, 0.017731890082359314, -0.010058529675006866, -0.08974400907754898, 0.0714719295501709, 0.031209731474518776, 0.04548890143632889, 0.047279153019189835, -0.05270976573228836, 0.07926690578460693, -0.06605184823274612, -0.05061493068933487, 0.14235615730285645, 0.3032793402671814, -0.10443402081727982, 0.050649479031562805, 0.05706313997507095, -0.03478868305683136, -0.1849895566701889, 0.007564503699541092, 0.10338275879621506, 0.041545622050762177, 0.007477108389139175, -0.20140528678894043, 0.027251064777374268, 0.08279865235090256, -0.024214407429099083, 0.09866368770599365, -0.3355318605899811, -0.12655878067016602, 0.0719076544046402, 0.11885429918766022, -0.006775163114070892, -0.16878642141819, -0.06298518925905228, -0.010961975902318954, -0.04757856950163841, 0.05001397803425789, -0.047301437705755234, 0.1267683058977127, -0.01756935939192772, 0.007401060312986374, 0.028621621429920197, -0.06225210800766945, 0.13478349149227142, -0.000814578088466078, 0.0745803490281105, -0.024625269696116447, 0.0006604759837500751, -0.0027950156945735216, -0.08207064121961594, 0.011472354643046856, -0.12392161041498184, 0.036843281239271164, -0.10267370939254761, -0.01863902620971203, -0.09524042904376984, 0.03220100328326225, -0.06034621223807335, -0.07382342219352722, 
-0.024321993812918663, 0.04226195812225342, 0.07935436815023422, 0.00009836581739364192, 0.11056114733219147, -0.03946496918797493, 0.16454409062862396, 0.09811536222696304, 0.11039624363183975, -0.009211239404976368, -0.07188668102025986, -0.007993271574378014, -0.006067971233278513, 0.04486655071377754, -0.1503353863954544, 0.008519568480551243, 0.13120704889297485, 0.06339335441589355, 0.13903272151947021, 0.06905659288167953, -0.05875542014837265, -0.01902279257774353, 0.07281529158353806, -0.10870549827814102, -0.13084547221660614, -0.020098386332392693, -0.003826491767540574, -0.1492009460926056, 0.0441342331469059, 0.09207116067409515, -0.05061947554349899, -0.005007689818739891, -0.0012279435759410262, 0.03051992692053318, -0.012759054079651833, 0.20591148734092712, 0.06234275922179222, 0.10542066395282745, -0.08737149834632874, 0.0749061182141304, 0.025646883994340897, -0.1080685406923294, 0.02677171304821968, 0.1196613758802414, -0.08999498188495636, -0.020318251103162766, 0.05730030685663223, 0.07230596244335175, 0.017678629606962204, -0.003955411724746227, -0.12843097746372223, -0.13819266855716705, 0.06377339363098145, 0.11117932200431824, 0.03811309114098549, 0.024431105703115463, -0.016472836956381798, 0.04830242320895195, -0.11737983673810959, 0.11705221235752106, 0.06270910054445267, 0.08284737169742584, -0.13104687631130219, 0.15387222170829773, -0.002920422935858369, 0.012669677846133709, -0.008299976587295532, 0.01733088120818138, -0.1206686943769455, 0.011233607307076454, -0.04353136196732521, -0.061579663306474686, -0.05935918912291527, -0.0241389200091362, -0.013574127107858658, -0.04478197917342186, -0.010168888606131077, -0.00511541124433279, -0.10724836587905884, -0.05742975324392319, -0.026248184964060783, 0.03668752312660217, -0.09966683387756348, -0.030564432963728905, 0.03289069980382919, -0.12863819301128387, 0.09844791144132614, 0.03259865194559097, 0.0342743806540966, 0.006146268453449011, -0.09766450524330139, 0.043547481298446655, 0.0238934438675642, -0.03021586500108242, 0.028945403173565865, -0.13365228474140167, -0.025647029280662537, -0.07564486563205719, 0.02007298544049263, 0.011713833548128605, 0.006755760405212641, -0.13852016627788544, 0.0296914204955101, -0.04601193591952324, -0.0626351609826088, -0.06955745816230774, 0.06071219593286514, 0.04603477194905281, -0.0034155480097979307, 0.13418233394622803, -0.07023580372333527, 0.06813608855009079, -0.2254142463207245, -0.0181319247931242, -0.008244994096457958, -0.07495411485433578, -0.06743098050355911, -0.0359279066324234, 0.09449242800474167, -0.058408867567777634, 0.06275948137044907, -0.042844176292419434, 0.03780468553304672, 0.020418912172317505, -0.10212749242782593, 0.09767741709947586, 0.05600544810295105, 0.18136772513389587, 0.05337219312787056, -0.046808335930109024, 0.057261183857917786, 0.0349358394742012, 0.06772661954164505, 0.06655082106590271, 0.17100869119167328, 0.1301729381084442, 0.024328580126166344, 0.0868804082274437, 0.033969346433877945, -0.12707965075969696, -0.14041730761528015, 0.0987575352191925, -0.04226620867848396, 0.09338034689426422, -0.026749256998300552, 0.21085087954998016, 0.13020774722099304, -0.20863084495067596, 0.03404485806822777, -0.018246470019221306, -0.10111624747514725, -0.09209571778774261, -0.06137613579630852, -0.07232701778411865, -0.1725062131881714, -0.002089977264404297, -0.10337737202644348, 0.023101167753338814, 0.0827316865324974, 0.029430728405714035, 0.04137645661830902, 0.17400121688842773, 0.09133601188659668, 0.016047894954681396, 
0.10471291840076447, 0.04506847634911537, 0.009628157131373882, -0.03881602734327316, -0.10856318473815918, 0.00883608590811491, -0.0617293119430542, 0.03249097615480423, -0.06788093596696854, -0.09996422380208969, 0.061488159000873566, 0.036025162786245346, -0.10352812707424164, 0.0254055242985487, -0.009503601118922234, 0.061414577066898346, 0.07690690457820892, 0.023267241194844246, -0.02781156823039055, -0.03636005520820618, 0.26609864830970764, -0.11236199736595154, -0.04101184755563736, -0.10371168702840805, 0.2522847354412079, 0.0184276532381773, 0.0010536195477470756, 0.012955655343830585, -0.08194886147975922, 0.009461174719035625, 0.1728598177433014, 0.17730505764484406, -0.04938556253910065, -0.005211350508034229, 0.014905614778399467, -0.014430050738155842, -0.017868679016828537, 0.07703423500061035, 0.1132737249135971, 0.04366019368171692, -0.07364598661661148, -0.005886469502002001, -0.011370088905096054, -0.07675722241401672, -0.05918729305267334, 0.08391674607992172, 0.04597990959882736, 0.007423634175211191, -0.034738689661026, 0.10481105744838715, -0.03827366977930069, -0.1373264342546463, 0.05774400755763054, -0.20503725111484528, -0.16852061450481415, -0.05631701275706291, 0.03979531675577164, 0.01602347008883953, 0.07517888396978378, 0.013826285488903522, -0.021031934767961502, 0.0865391194820404, 0.00958753377199173, -0.04548646882176399, -0.09958740323781967, 0.06718317419290543, -0.1002369374036789, 0.1925046741962433, -0.057273104786872864, -0.018113693222403526, 0.13021136820316315, 0.02177605777978897, -0.07569243758916855, 0.03933093696832657, 0.09936203807592392, -0.09079761803150177, 0.04527369514107704, 0.17175261676311493, -0.030083468183875084, 0.09995332360267639, 0.04724164307117462, -0.1263817697763443, 0.021756067872047424, -0.08516212552785873, -0.05817404389381409, -0.0723225399851799, 0.006589214783161879, -0.018889078870415688, 0.14656318724155426, 0.23006758093833923, -0.06971150636672974, 0.00930313766002655, -0.051045969128608704, -0.0016813738038763404, 0.048383355140686035, 0.10070318728685379, -0.02446082979440689, -0.25355494022369385, 0.006911217235028744, 0.04700903221964836, 0.005789392627775669, -0.2559746205806732, -0.09587877243757248, 0.025092020630836487, -0.04813125729560852, -0.07913971692323685, 0.09855103492736816, 0.04632273688912392, 0.05664074048399925, -0.038245901465415955, -0.09856355935335159, -0.048861101269721985, 0.19238485395908356, -0.17858664691448212, -0.05635223537683487 ]
null
null
null
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---

# LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-runway-minimal

These are LoRA adaptation weights for runwayml/stable-diffusion-v1-5. The weights were fine-tuned on the jlbaker361/spider-500-cropped dataset.

Training epochs = 100

num_train_timesteps = 50

Training run: https://wandb.ai/jlbaker361/text2image-fine-tune/runs/b0qdx1y4

You can find some example images below.

![img_0](./image_0.png)
![img_1](./image_1.png)
![img_2](./image_2.png)
![img_3](./image_3.png)
![img_4](./image_4.png)
![img_5](./image_5.png)
![img_6](./image_6.png)
![img_7](./image_7.png)
![img_8](./image_8.png)
![img_9](./image_9.png)
![img_10](./image_10.png)
![img_11](./image_11.png)
![img_12](./image_12.png)
![img_13](./image_13.png)
![img_14](./image_14.png)
![img_15](./image_15.png)
![img_16](./image_16.png)
![img_17](./image_17.png)
![img_18](./image_18.png)
![img_19](./image_19.png)
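The card itself ships no loading code; the following is a minimal usage sketch assuming the standard diffusers LoRA-loading API (`StableDiffusionPipeline.load_lora_weights`). The prompt string and output filename are illustrative placeholders, not taken from the card.

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the base model the LoRA was trained against.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Attach the LoRA adaptation weights from this repository.
pipe.load_lora_weights("jlbaker361/spider-lora-500-e100-runway-minimal")

# Generate an image (placeholder prompt for demonstration only).
image = pipe("a spider-themed comic book hero", num_inference_steps=50).images[0]
image.save("spider_lora_sample.png")
```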
{}
null
jlbaker361/spider-lora-500-e100-runway-minimal
[ "safetensors", "region:us" ]
2024-02-13T21:04:54+00:00
[]
[]
TAGS #safetensors #region-us
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-runway-minimal These are LoRA adaption weights for runwayml/stable-diffusion-v1-5. The weights were fine-tuned on the jlbaker361/spider-500-cropped dataset. Training epochs = 100 num_train_timesteps = 50 url: URL You can find some example images in the following. !img_0 !img_1 !img_2 !img_3 !img_4 !img_5 !img_6 !img_7 !img_8 !img_9 !img_10 !img_11 !img_12 !img_13 !img_14 !img_15 !img_16 !img_17 !img_18 !img_19
[ "# LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-runway-minimal\n These are LoRA adaption weights for runwayml/stable-diffusion-v1-5. The weights were fine-tuned on the jlbaker361/spider-500-cropped dataset. \n\n Training epochs = 100 \n\n num_train_timesteps = 50 \n\n url: URL\n You can find some example images in the following. \n\n !img_0\n!img_1\n!img_2\n!img_3\n!img_4\n!img_5\n!img_6\n!img_7\n!img_8\n!img_9\n!img_10\n!img_11\n!img_12\n!img_13\n!img_14\n!img_15\n!img_16\n!img_17\n!img_18\n!img_19" ]
[ "TAGS\n#safetensors #region-us \n", "# LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-runway-minimal\n These are LoRA adaption weights for runwayml/stable-diffusion-v1-5. The weights were fine-tuned on the jlbaker361/spider-500-cropped dataset. \n\n Training epochs = 100 \n\n num_train_timesteps = 50 \n\n url: URL\n You can find some example images in the following. \n\n !img_0\n!img_1\n!img_2\n!img_3\n!img_4\n!img_5\n!img_6\n!img_7\n!img_8\n!img_9\n!img_10\n!img_11\n!img_12\n!img_13\n!img_14\n!img_15\n!img_16\n!img_17\n!img_18\n!img_19" ]
[ 11, 210 ]
[ "passage: TAGS\n#safetensors #region-us \n# LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-runway-minimal\n These are LoRA adaption weights for runwayml/stable-diffusion-v1-5. The weights were fine-tuned on the jlbaker361/spider-500-cropped dataset. \n\n Training epochs = 100 \n\n num_train_timesteps = 50 \n\n url: URL\n You can find some example images in the following. \n\n !img_0\n!img_1\n!img_2\n!img_3\n!img_4\n!img_5\n!img_6\n!img_7\n!img_8\n!img_9\n!img_10\n!img_11\n!img_12\n!img_13\n!img_14\n!img_15\n!img_16\n!img_17\n!img_18\n!img_19" ]
[ -0.10659971088171005, -0.014300315640866756, -0.0011486784787848592, 0.1351010650396347, 0.09994176030158997, 0.012189105153083801, 0.12726351618766785, 0.1499784141778946, 0.018835516646504402, 0.05709904804825783, 0.07195918262004852, 0.10149788111448288, 0.003963027149438858, 0.1466466784477234, -0.02414216659963131, -0.16831672191619873, -0.005098814610391855, -0.07081548124551773, -0.14185267686843872, 0.07799544930458069, 0.10585561394691467, -0.04240447282791138, 0.10350188612937927, 0.012840522453188896, -0.03617606684565544, 0.048095837235450745, 0.03167916461825371, -0.08857898414134979, 0.06696231663227081, 0.007685667369514704, 0.01669858582317829, 0.010378452949225903, 0.09662409126758575, -0.17029646039009094, -0.001431277603842318, 0.07541914284229279, 0.040319718420505524, 0.03708993270993233, -0.015130824409425259, 0.024530163034796715, 0.1678975522518158, -0.07900472730398178, -0.030786124989390373, -0.00047873318544588983, -0.10787785053253174, -0.14900024235248566, -0.14642639458179474, -0.01418953575193882, 0.12629421055316925, 0.08954750001430511, -0.0017858062637969851, 0.1418030709028244, -0.10066504776477814, 0.05915429815649986, 0.4026571214199066, -0.2707531452178955, -0.014066982083022594, 0.1355581134557724, -0.01732202060520649, 0.03823300823569298, -0.08400581032037735, 0.021155787631869316, 0.10553895682096481, -0.047969359904527664, 0.05900648236274719, -0.008433475159108639, 0.03669307753443718, 0.030968206003308296, -0.14904147386550903, 0.04128174111247063, 0.19919133186340332, 0.032401151955127716, -0.0913834199309349, -0.10874742269515991, 0.014326155185699463, -0.03964956849813461, -0.04870004579424858, -0.032026857137680054, 0.02612403593957424, -0.02868816629052162, -0.00512561434879899, 0.0221833735704422, -0.024671057239174843, -0.08056662976741791, -0.004037628881633282, 0.2145853191614151, 0.03160426393151283, 0.009342356584966183, 0.023120269179344177, 0.10210714489221573, -0.10665223002433777, -0.08218812197446823, -0.041457194834947586, -0.013623198494315147, 0.0372236929833889, -0.04125964641571045, 0.06361685693264008, -0.04913029447197914, 0.01964656263589859, 0.13074891269207, 0.052367135882377625, 0.007783533073961735, 0.027926720678806305, 0.05783550441265106, -0.09914504736661911, 0.04375632107257843, -0.12545737624168396, -0.12988092005252838, 0.06359883397817612, 0.1334940642118454, 0.09697969257831573, 0.009178978390991688, -0.0388558991253376, -0.03599147871136665, -0.018709274008870125, 0.06641121953725815, -0.05528029799461365, 0.037533000111579895, -0.09645599871873856, 0.022904785349965096, 0.13451585173606873, -0.03775790333747864, 0.01645202748477459, -0.046669527888298035, -0.06752598285675049, -0.027196258306503296, 0.14724600315093994, 0.04891087859869003, 0.030453549697995186, 0.04368021711707115, -0.061262499541044235, 0.06462245434522629, -0.022318312898278236, -0.13219815492630005, -0.039327315986156464, 0.024979664012789726, -0.03852848336100578, -0.08378893882036209, -0.044793859124183655, -0.034145329147577286, 0.0074925497174263, -0.026005782186985016, 0.008032494224607944, -0.01576913893222809, -0.08047773689031601, -0.04123934358358383, 0.05282733961939812, 0.02415439672768116, -0.07400219142436981, 0.0553874596953392, 0.043110042810440063, 0.15650813281536102, 0.040364257991313934, 0.008661799132823944, -0.12609563767910004, 0.04533637315034866, -0.21590939164161682, 0.014745889231562614, -0.11481857299804688, 0.02242639660835266, -0.09463474154472351, -0.0431656651198864, -0.1214049905538559, 
0.014256434515118599, 0.09381990879774094, 0.1917283982038498, -0.23590198159217834, -0.02060452289879322, 0.10759936273097992, -0.09891684353351593, -0.11292850971221924, 0.058467231690883636, -0.012713995762169361, 0.06613385677337646, 0.04225081950426102, 0.11502489447593689, 0.04214392229914665, -0.13727250695228577, 0.02248665876686573, -0.012851347215473652, -0.042057767510414124, -0.04912898689508438, 0.0994841679930687, -0.047324907034635544, -0.08851369470357895, 0.005075766704976559, -0.05481081083416939, 0.08788079023361206, -0.028133083134889603, -0.01958266831934452, -0.020659111440181732, -0.06774663925170898, 0.02627681940793991, 0.029232563450932503, -0.026389773935079575, -0.10554729402065277, -0.04276859387755394, -0.03941554203629494, 0.1299406737089157, -0.015548383817076683, -0.01281297393143177, 0.0231834314763546, 0.04717329517006874, -0.13245521485805511, -0.04961371049284935, -0.06607894599437714, -0.005470348056405783, 0.0003176366735715419, 0.11802010238170624, 0.08697044849395752, 0.0022525938693434, 0.13001582026481628, 0.06162590533494949, -0.044499948620796204, 0.0024864906445145607, 0.0842859223484993, 0.02121138758957386, -0.15131475031375885, -0.16833561658859253, -0.04952366277575493, -0.07699607312679291, 0.14996124804019928, -0.282424658536911, 0.024538163095712662, -0.08465935289859772, 0.09404810518026352, 0.10283554345369339, -0.0503845177590847, 0.11806254088878632, -0.023229418322443962, -0.06515355408191681, -0.1293916553258896, 0.00263034226372838, -0.06476762145757675, -0.08393339067697525, 0.06089482083916664, -0.09881017357110977, 0.07486370205879211, 0.1171601191163063, 0.04737864062190056, -0.06751244515180588, -0.061071500182151794, -0.0037571187131106853, -0.0021303787361830473, -0.05041784048080444, 0.06395672261714935, -0.02905346266925335, 0.022698894143104553, 0.10414743423461914, -0.036846987903118134, -0.04008355364203453, -0.0670621246099472, -0.05494659021496773, -0.024553731083869934, 0.06483776122331619, 0.10786974430084229, -0.04759277403354645, 0.034422617405653, 0.1404893845319748, -0.04564535617828369, 0.09310289472341537, 0.03771873936057091, -0.06469134986400604, 0.007900076918303967, 0.07550498843193054, 0.07846584171056747, 0.07568471878767014, 0.07328027486801147, 0.03317359834909439, 0.03677996248006821, -0.041450776159763336, 0.035353899002075195, -0.16326804459095, -0.07158296555280685, 0.08302456885576248, -0.0696733221411705, 0.005919179879128933, 0.03755398094654083, -0.02312474511563778, 0.1360674947500229, -0.050820786505937576, -0.054482266306877136, -0.052017539739608765, 0.034980323165655136, -0.09295287728309631, 0.1826794445514679, 0.0023722860496491194, -0.03391708433628082, -0.08726714551448822, 0.029164426028728485, -0.062222424894571304, -0.012900150381028652, 0.028162680566310883, -0.12832830846309662, -0.04605164751410484, -0.1439182609319687, 0.04566306620836258, 0.007663956843316555, 0.035476889461278915, -0.006546171847730875, -0.020984044298529625, 0.06709958612918854, -0.1680993139743805, -0.006041841581463814, -0.018637128174304962, 0.08346293121576309, -0.009746822528541088, 0.007664022967219353, 0.1310986876487732, 0.05207673832774162, -0.023576753214001656, 0.02167925052344799, -0.013964243233203888, 0.22408363223075867, -0.017951346933841705, 0.057829346507787704, 0.15064503252506256, 0.07141193002462387, 0.04534253105521202, 0.07121732831001282, 0.014815880917012691, -0.09572076052427292, 0.014844708144664764, 0.059606604278087616, -0.08760818839073181, -0.13103480637073517, 
-0.02254457026720047, -0.048666369169950485, -0.03364536166191101, 0.06493375450372696, 0.0234037097543478, 0.004538377281278372, 0.0986315980553627, -0.011874371208250523, 0.1104913279414177, 0.02908935956656933, 0.05490109696984291, -0.10904619097709656, 0.0582113079726696, 0.11154747754335403, -0.05478798225522041, -0.07991189509630203, 0.08713051676750183, -0.048335231840610504, 0.1341797262430191, -0.06933458894491196, 0.014061303809285164, 0.04367603361606598, 0.052573516964912415, -0.004838911816477776, 0.17370471358299255, -0.0486331470310688, -0.06450830399990082, -0.030516931787133217, -0.10958181321620941, 0.049360666424036026, 0.05430389940738678, 0.029205620288848877, 0.03934222832322121, -0.03521740436553955, 0.09448155015707016, 0.059787455946207047, 0.03923526033759117, 0.12900368869304657, -0.2810105085372925, 0.03514721617102623, 0.03691403195261955, 0.05548299849033356, 0.0008713550050742924, 0.0368991382420063, 0.09056448191404343, -0.027632050216197968, 0.09963195025920868, -0.08092517405748367, 0.0485205315053463, -0.04268929362297058, -0.06174696609377861, 0.04106789827346802, 0.1852554976940155, -0.059337932616472244, 0.026456046849489212, -0.17629513144493103, 0.06872954964637756, -0.007347417995333672, 0.015302618034183979, -0.06956419348716736, -0.012969869188964367, 0.1284455507993698, 0.024555908516049385, 0.10848253220319748, 0.016468388959765434, -0.06510009616613388, -0.20321978628635406, -0.10218776017427444, -0.02316289208829403, 0.037142813205718994, -0.059223975986242294, 0.13755284249782562, 0.002024776302278042, -0.03674909844994545, 0.034443199634552, -0.027686621993780136, -0.25940319895744324, -0.08653953671455383, -0.012755668722093105, 0.08469022810459137, -0.035263173282146454, -0.0882854014635086, -0.060461994260549545, -0.12749852240085602, 0.1842864751815796, 0.12604795396327972, -0.035426586866378784, -0.10948369652032852, 0.12771159410476685, 0.13579365611076355, -0.02182474546134472, -0.04505825787782669, -0.0023504733107984066, 0.03291264548897743, -0.043619025498628616, -0.05521070584654808, 0.09203749895095825, -0.07838942855596542, -0.10595974326133728, -0.04412069916725159, 0.10883036255836487, -0.061174072325229645, 0.029454339295625687, -0.00492457952350378, 0.008361238986253738, 0.06787735223770142, -0.11548926681280136, 0.05437609180808067, -0.04311065003275871, -0.04488569498062134, 0.12777812778949738, -0.058928389102220535, -0.006514566019177437, -0.10106287151575089, -0.032912544906139374, 0.14393793046474457, 0.24028825759887695, -0.08868066221475601, -0.026384958997368813, -0.08185023069381714, -0.016480954363942146, -0.15408839285373688, 0.0606377013027668, -0.07717665284872055, -0.00857972539961338, 0.12746229767799377, -0.036475323140621185, 0.12316016107797623, 0.13687145709991455, 0.013203481212258339, 0.26649725437164307, -0.3148244619369507, -0.11415290087461472, 0.01842159777879715, 0.1701274812221527, 0.055240143090486526, -0.15253832936286926, -0.04935050383210182, -0.03577704355120659, -0.009787964634597301, -0.012787159532308578, -0.0719091147184372, 0.10250934213399887, -0.043975830078125, -0.09642593562602997, 0.037747547030448914, -0.0486907884478569, 0.15014860033988953, -0.02354937605559826, 0.14514108002185822, -0.05049765855073929, -0.11814984679222107, -0.06454909592866898, -0.0673455148935318, 0.17826125025749207, -0.148894265294075, 0.018432695418596268, -0.1606680303812027, -0.0002850173623301089, -0.005499205086380243, 0.03077918291091919, -0.00005536536264116876, -0.03795153275132179, 
-0.12330705672502518, 0.025759778916835785, -0.005500852130353451, 0.0038232901133596897, 0.13673093914985657, 0.035685282200574875, 0.0920870453119278, 0.001442519249394536, 0.008016761392354965, 0.05199749395251274, -0.023833582177758217, -0.00002758456685114652, 0.030930764973163605, 0.0388796441257, -0.18530088663101196, 0.015216252766549587, 0.05758613720536232, 0.07334808260202408, 0.03487824276089668, 0.05186600983142853, -0.07370150834321976, 0.00642716558650136, 0.12834522128105164, -0.04785173758864403, 0.0647016242146492, -0.015260501764714718, -0.03565382957458496, 0.008201541379094124, 0.05625609681010246, 0.07946892082691193, -0.09065893292427063, 0.019781343638896942, -0.046197064220905304, 0.003978682216256857, -0.06818981468677521, 0.146143838763237, 0.13497580587863922, 0.018697526305913925, -0.06594117730855942, 0.06857606768608093, -0.014646044000983238, 0.0210120752453804, -0.025555338710546494, 0.07686734199523926, -0.035783715546131134, -0.06099909171462059, 0.20740436017513275, 0.21889205276966095, 0.03428514301776886, -0.002152408007532358, -0.07634373009204865, -0.09729433804750443, 0.0345727875828743, -0.0340522825717926, 0.06635583937168121, -0.0709652230143547, -0.017203306779265404, -0.023404119536280632, -0.12496216595172882, 0.06765604764223099, 0.07439474016427994, 0.09509065747261047, -0.21782732009887695, 0.0346955768764019, -0.048451606184244156, -0.06317774206399918, -0.0029621319845318794, 0.03510817140340805, -0.08780549466609955, -0.02358977682888508, -0.007983054965734482, -0.017729703336954117, -0.12467213720083237, 0.012652833014726639, -0.02216995507478714, -0.010341228917241096, -0.02183634601533413, 0.03652208670973778, -0.09127822518348694, -0.0811804011464119, -0.0546245202422142, 0.043630775064229965, -0.0644133985042572, -0.05456414818763733, -0.000319372775265947, -0.09874090552330017, 0.021900730207562447, -0.01067408174276352, 0.036409758031368256, -0.06999965012073517, -0.06788258999586105, -0.032176900655031204, 0.13023918867111206, -0.054967254400253296, 0.04064951464533806, -0.11667454987764359, 0.06091314181685448, -0.06912224739789963, 0.07052173465490341, -0.013899925164878368, 0.021552152931690216, -0.08240611851215363, -0.09486116468906403, -0.07042801380157471, -0.04863720014691353, 0.00957003515213728, 0.02462548203766346, 0.21809521317481995, -0.035381514579057693, 0.2089933454990387, -0.03498825058341026, 0.0019215508364140987, -0.22935038805007935, -0.0013727633049711585, -0.023757152259349823, -0.08709226548671722, -0.031191708520054817, 0.027362771332263947, 0.05021686479449272, -0.011564218439161777, 0.05852280929684639, 0.059524886310100555, -0.05229678377509117, -0.023784657940268517, 0.01998542621731758, 0.09803519397974014, -0.027873948216438293, 0.13618163764476776, 0.021832449361681938, -0.010685381479561329, 0.004081394989043474, 0.06057769060134888, 0.17592822015285492, 0.037850480526685715, 0.11064339429140091, 0.14301735162734985, -0.021760666742920876, 0.1561841368675232, -0.04590819403529167, -0.1410038024187088, -0.07777687162160873, 0.07259459048509598, 0.040113888680934906, 0.03549950569868088, 0.009134665131568909, 0.009768147021532059, 0.17457188665866852, -0.15106172859668732, 0.012791688553988934, 0.005242341198027134, -0.00029322595219127834, -0.08047423511743546, -0.08379258215427399, -0.07062308490276337, -0.16042077541351318, 0.0197741761803627, -0.04616791009902954, 0.01410157885402441, 0.1362690031528473, 0.044116370379924774, 0.10144258290529251, 0.16088490188121796, -0.0895862951874733, 
-0.0513915978372097, 0.03054085373878479, -0.04262199252843857, -0.09445034712553024, 0.06512275338172913, 0.0003260039375163615, 0.05529855936765671, -0.16573528945446014, 0.01003174390643835, -0.02661116234958172, -0.032941341400146484, 0.07976174354553223, -0.004878525622189045, -0.1050296276807785, -0.03102242387831211, -0.02070043794810772, 0.08790735900402069, 0.19849324226379395, 0.07039467990398407, 0.010257082991302013, -0.020711436867713928, 0.1127399429678917, -0.07979834079742432, -0.07502218335866928, -0.13847577571868896, 0.1266561895608902, -0.02913420833647251, 0.029285404831171036, -0.11767866462469101, -0.10317832231521606, 0.048312366008758545, 0.1820848435163498, 0.12975722551345825, -0.0804448127746582, -0.009531483054161072, -0.04987376928329468, -0.03579350933432579, -0.058022402226924896, 0.03608570992946625, 0.10909166187047958, 0.005957404151558876, -0.05285811424255371, -0.010011422447860241, -0.09625618904829025, -0.07597197592258453, -0.07872296124696732, 0.17308497428894043, 0.05394090339541435, 0.01656648889183998, -0.05435745790600777, 0.1003175601363182, 0.12173143774271011, -0.06558145582675934, 0.08802685886621475, -0.12550731003284454, -0.11150024086236954, -0.03506157547235489, -0.03215411677956581, 0.040531158447265625, -0.011374065652489662, -0.034305933862924576, -0.02353251539170742, -0.10822424292564392, 0.006189923733472824, -0.1443088799715042, -0.08853130787611008, -0.02600240707397461, 0.08618749678134918, 0.1869642585515976, 0.015568377450108528, 0.02812975086271763, 0.09019003063440323, -0.00956389307975769, -0.1565401554107666, 0.11297670006752014, -0.014087689109146595, 0.027098994702100754, -0.010107535868883133, 0.1138828843832016, -0.048021864145994186, 0.07629827409982681, 0.005807400681078434, -0.1282539814710617, 0.05438147112727165, 0.10666490346193314, -0.09959474205970764, -0.1163857951760292, -0.025815332308411598, -0.069755919277668, 0.1284453123807907, 0.1265622079372406, -0.0027617947198450565, 0.042861077934503555, -0.0168903861194849, 0.05768800526857376, 0.04870566353201866, 0.04133900627493858, -0.024255109950900078, -0.08095657080411911, -0.00210593338124454, -0.006351040210574865, 0.005835304968059063, -0.18099090456962585, -0.049676015973091125, -0.041634172201156616, -0.09770216047763824, -0.03213297575712204, 0.0738736167550087, 0.24886147677898407, 0.024761775508522987, 0.00043652256135828793, -0.2638370990753174, 0.03862505033612251, 0.1489696204662323, -0.15375280380249023, -0.03781549632549286 ]
null
null
null
---
license: creativeml-openrail-m
base_model: stabilityai/stable-diffusion-2
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---

# LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-stable-minimal

These are LoRA adaptation weights for stabilityai/stable-diffusion-2. The weights were fine-tuned on the jlbaker361/spider-500 dataset.

Training epochs = 100

num_train_timesteps = 50

You can find some example images below.

![img_0](./image_0.png)
![img_1](./image_1.png)
![img_2](./image_2.png)
![img_3](./image_3.png)
![img_4](./image_4.png)
![img_5](./image_5.png)
![img_6](./image_6.png)
![img_7](./image_7.png)
![img_8](./image_8.png)
![img_9](./image_9.png)
![img_10](./image_10.png)
![img_11](./image_11.png)
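As with the SD-1.5 variant above, a hedged loading sketch using the standard diffusers API; the `scale` value and prompt are illustrative assumptions, not values from the card.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("jlbaker361/spider-lora-500-e100-stable-minimal")

# cross_attention_kwargs lets you dial the LoRA influence up or down.
image = pipe(
    "a spider hero swinging between skyscrapers",  # placeholder prompt
    num_inference_steps=50,
    cross_attention_kwargs={"scale": 0.8},
).images[0]
image.save("spider_lora_sd2_sample.png")
```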
{}
null
jlbaker361/spider-lora-500-e100-stable-minimal
[ "safetensors", "region:us" ]
2024-02-13T21:04:54+00:00
[]
[]
TAGS #safetensors #region-us
--- license: creativeml-openrail-m base_model: stabilityai/stable-diffusion-2 tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-stable-minimal These are LoRA adaption weights for stabilityai/stable-diffusion-2. The weights were fine-tuned on the jlbaker361/spider-500 dataset. Training epochs = 100 num_train_timesteps = 50 You can find some example images in the following. !img_0 !img_1 !img_2 !img_3 !img_4 !img_5 !img_6 !img_7 !img_8 !img_9 !img_10 !img_11
[ "# LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-stable-minimal\n These are LoRA adaption weights for stabilityai/stable-diffusion-2. The weights were fine-tuned on the jlbaker361/spider-500 dataset. \n\n Training epochs = 100 \n\n num_train_timesteps = 50 \n\n You can find some example images in the following. \n\n !img_0\n!img_1\n!img_2\n!img_3\n!img_4\n!img_5\n!img_6\n!img_7\n!img_8\n!img_9\n!img_10\n!img_11" ]
[ "TAGS\n#safetensors #region-us \n", "# LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-stable-minimal\n These are LoRA adaption weights for stabilityai/stable-diffusion-2. The weights were fine-tuned on the jlbaker361/spider-500 dataset. \n\n Training epochs = 100 \n\n num_train_timesteps = 50 \n\n You can find some example images in the following. \n\n !img_0\n!img_1\n!img_2\n!img_3\n!img_4\n!img_5\n!img_6\n!img_7\n!img_8\n!img_9\n!img_10\n!img_11" ]
[ 11, 160 ]
[ "passage: TAGS\n#safetensors #region-us \n# LoRA text2image fine-tuning - jlbaker361/spider-lora-500-e100-stable-minimal\n These are LoRA adaption weights for stabilityai/stable-diffusion-2. The weights were fine-tuned on the jlbaker361/spider-500 dataset. \n\n Training epochs = 100 \n\n num_train_timesteps = 50 \n\n You can find some example images in the following. \n\n !img_0\n!img_1\n!img_2\n!img_3\n!img_4\n!img_5\n!img_6\n!img_7\n!img_8\n!img_9\n!img_10\n!img_11" ]
[ -0.09726246446371078, -0.07385904341936111, -0.0013792960671707988, 0.061733365058898926, 0.11907105892896652, -0.000849090632982552, 0.18258029222488403, 0.12198125571012497, 0.12885545194149017, 0.07870177924633026, 0.060273244976997375, 0.07838229835033417, -0.015639912337064743, 0.16255612671375275, -0.05010133981704712, -0.20899902284145355, 0.03640762344002724, -0.047443896532058716, -0.13504455983638763, 0.046972934156656265, 0.11840589344501495, -0.0845518708229065, 0.09066691249608994, -0.04510724917054176, -0.03340849652886391, 0.03640330955386162, 0.03812042623758316, -0.0712621659040451, 0.09440683573484421, -0.006843179930001497, 0.032509755343198776, 0.06639993190765381, 0.069826140999794, -0.2127724587917328, 0.012603296898305416, 0.03061925433576107, 0.043126534670591354, 0.02911810390651226, -0.043609876185655594, 0.010790761560201645, 0.10868018120527267, -0.15403932332992554, -0.06576647609472275, -0.0329681932926178, -0.08393532782793045, -0.09104663878679276, -0.09650228172540665, -0.07556680589914322, 0.09646917879581451, 0.025151513516902924, -0.00037782557774335146, 0.1506749987602234, -0.0868178978562355, 0.023857399821281433, 0.3409694731235504, -0.30369746685028076, 0.003306586993858218, 0.10875874012708664, 0.011403135024011135, 0.1393936425447464, -0.07476624101400375, 0.0706208273768425, 0.10653502494096756, -0.061686739325523376, 0.050716083496809006, -0.03432483226060867, 0.01721253991127014, 0.004474493209272623, -0.12659546732902527, 0.07170441746711731, 0.245199054479599, 0.0008045291760936379, -0.10177344083786011, -0.11288967728614807, -0.010642490349709988, 0.08320527523756027, -0.05839572474360466, -0.03832380101084709, 0.027944428846240044, -0.03718714043498039, 0.03862939402461052, 0.0026464902330189943, -0.04727690666913986, -0.10152619332075119, 0.014005566947162151, 0.2557583451271057, 0.040905945003032684, 0.007829170674085617, 0.030970316380262375, 0.09201518446207047, -0.1825133115053177, -0.0849839299917221, -0.0033400740940123796, -0.002652056748047471, 0.0028153052553534508, -0.01804325357079506, -0.01607205718755722, -0.12042947113513947, 0.05412700027227402, 0.08754775673151016, 0.02531602792441845, 0.014599824324250221, 0.027412883937358856, 0.05134939029812813, -0.06206314638257027, 0.0372251495718956, -0.12100923806428909, -0.09413862228393555, 0.09556151181459427, 0.11737435311079025, 0.09585338830947876, 0.0017853041645139456, -0.042945459485054016, -0.09622969478368759, 0.054370585829019547, 0.029769351705908775, -0.09882981330156326, 0.029452817514538765, -0.07647080719470978, 0.01896851882338524, 0.014195204712450504, -0.03221786022186279, -0.01581837795674801, -0.05682515725493431, -0.0469115674495697, 0.061539098620414734, 0.08683370053768158, 0.020908910781145096, 0.03332780674099922, 0.030121950432658195, -0.0735599473118782, 0.023488875478506088, -0.035647325217723846, -0.12574829161167145, -0.03954490274190903, 0.004949381109327078, -0.0021542594768106937, -0.08631739765405655, -0.08512144535779953, -0.004782828036695719, -0.02437267079949379, -0.005455082282423973, 0.05014098063111305, -0.014781545847654343, -0.04679100215435028, -0.04484977573156357, 0.033853668719530106, -0.037255555391311646, -0.06495028734207153, 0.06925087422132492, 0.07871439307928085, 0.12312819063663483, -0.012026372365653515, -0.028477361425757408, -0.12185542285442352, 0.05274530127644539, -0.1777716875076294, 0.008007736876606941, -0.06840293854475021, -0.007370735052973032, -0.05466422066092491, -0.02773493155837059, -0.13020175695419312, 
0.04307705909013748, 0.07643140107393265, 0.2080465406179428, -0.2615400552749634, -0.05602492019534111, 0.15392757952213287, -0.16192986071109772, -0.09400411695241928, 0.08289197832345963, 0.012008176185190678, 0.0478762723505497, 0.08763531595468521, 0.09962368756532669, 0.0412323921918869, -0.12608332931995392, -0.03271154314279556, -0.07648357003927231, -0.009130971506237984, -0.06722813844680786, 0.0848463699221611, 0.04864831268787384, -0.06320667266845703, 0.05737709626555443, -0.12394934147596359, 0.07443753629922867, -0.07692120969295502, 0.026247017085552216, -0.029777009040117264, -0.08685200661420822, 0.015715302899479866, 0.013016091659665108, 0.03316356614232063, -0.06707882136106491, -0.016379881650209427, 0.06583811342716217, 0.11424810439348221, -0.0549849309027195, 0.013996915891766548, -0.0010643335990607738, 0.15082751214504242, -0.12457045167684555, -0.02881852351129055, -0.06796521693468094, -0.02954445406794548, 0.047503966838121414, 0.19110563397407532, 0.09280239790678024, 0.03735868260264397, 0.1478741317987442, 0.06726787984371185, -0.08377333730459213, -0.009091551415622234, 0.06343355774879456, -0.011982147581875324, -0.12941880524158478, -0.16038601100444794, -0.03247837349772453, -0.08246832340955734, 0.20763123035430908, -0.21947422623634338, 0.007464115973562002, -0.08765413612127304, 0.11416955292224884, 0.09081587940454483, -0.02149159088730812, 0.09137938171625137, 0.0062998696230351925, -0.06911872327327728, -0.0688825473189354, -0.0015540673630312085, -0.07626919448375702, -0.09866773337125778, 0.09023044258356094, -0.1308300793170929, 0.14748741686344147, 0.11138056963682175, 0.07609512656927109, 0.0071300845593214035, -0.1437346488237381, 0.00721377320587635, 0.025096209719777107, -0.0665164366364479, 0.042593978345394135, -0.03090895153582096, 0.023204607889056206, 0.09425784647464752, -0.026648452505469322, 0.03583158925175667, -0.04938879981637001, -0.09205971658229828, -0.02168102189898491, 0.021725401282310486, 0.020676443353295326, 0.01915564574301243, 0.011766931042075157, 0.10099402070045471, -0.07534996420145035, 0.05954572558403015, 0.013681144453585148, -0.09710390120744705, -0.001820469624362886, 0.07834676653146744, 0.08193126320838928, 0.09498827159404755, 0.08906060457229614, 0.012881116941571236, 0.011863618157804012, -0.05072157829999924, 0.04394347965717316, -0.11595802009105682, -0.04827020317316055, 0.03810557723045349, -0.06349111348390579, 0.03757111728191376, 0.037661537528038025, -0.010918284766376019, 0.16463477909564972, -0.06885461509227753, -0.05106176808476448, -0.07284080237150192, 0.020275700837373734, -0.0726582407951355, 0.17552265524864197, -0.02319512702524662, -0.047002192586660385, -0.07800797373056412, 0.07075154781341553, 0.013278579339385033, 0.021477244794368744, -0.017586447298526764, -0.1173023208975792, -0.06381469964981079, -0.10335178673267365, 0.07841823250055313, 0.09511827677488327, 0.0874895229935646, 0.00233399192802608, -0.022165220230817795, 0.04524125158786774, -0.08624296635389328, -0.025055984035134315, -0.07164984196424484, 0.08101078867912292, 0.044646091759204865, -0.004621605388820171, 0.11003574728965759, 0.08326127380132675, -0.0162829477339983, 0.017105501145124435, 0.000983374542556703, 0.10758133232593536, 0.01683037355542183, 0.05217849835753441, 0.19388552010059357, 0.05530586093664169, 0.018071496859192848, 0.03757273033261299, -0.01299951784312725, -0.10879196971654892, 0.054401054978370667, 0.05141366273164749, -0.1032293289899826, -0.08737605810165405, -0.05406976863741875, 
-0.06245436519384384, -0.07638978958129883, 0.007861040532588959, 0.017513340339064598, -0.01563032902777195, 0.09948588907718658, 0.017050011083483696, 0.03840631619095802, 0.053329676389694214, 0.06428852677345276, -0.09015841037034988, 0.0016643244307488203, 0.08988428115844727, -0.031973112374544144, -0.09207289665937424, 0.09054867923259735, -0.057534802705049515, 0.12631341814994812, -0.09754903614521027, 0.021645385771989822, 0.016439486294984818, 0.03628671169281006, 0.019481172785162926, 0.12742547690868378, -0.0484960563480854, -0.05012043938040733, -0.05531344935297966, -0.10715183615684509, -0.006166001781821251, 0.08677111566066742, 0.02845945954322815, 0.009760444052517414, -0.08974834531545639, 0.1252775639295578, 0.06734490394592285, 0.05203728750348091, 0.17488080263137817, -0.29986122250556946, -0.0005384896649047732, 0.06581932306289673, 0.060128651559352875, 0.003093227744102478, 0.012154730036854744, 0.11924746632575989, 0.01482479739934206, 0.041324179619550705, -0.09300678968429565, 0.036898452788591385, -0.03140357881784439, -0.04109074920415878, -0.08421523869037628, 0.20819728076457977, -0.0593620203435421, -0.019994385540485382, -0.17653656005859375, 0.08130688220262527, 0.004907066002488136, -0.0043220240622758865, -0.06026426702737808, -0.04147828742861748, 0.07321944832801819, 0.014730877242982388, 0.07227253168821335, 0.03182661160826683, -0.0539356954395771, -0.15179140865802765, -0.12903153896331787, 0.015053517185151577, 0.06709937751293182, -0.02054099552333355, 0.13789154589176178, -0.021036019548773766, -0.018303807824850082, 0.045881304889917374, -0.04166833683848381, -0.1871783286333084, -0.02764410339295864, 0.00324074225500226, 0.10046226531267166, -0.04131180793046951, -0.10667970031499863, -0.05972025915980339, -0.0769868716597557, 0.2007218897342682, 0.12796847522258759, -0.01569874957203865, -0.09527773410081863, 0.13928818702697754, 0.16052037477493286, -0.044477030634880066, -0.024113956838846207, 0.0015877113910391927, 0.04167643561959267, -0.04356209933757782, -0.0805179551243782, 0.06470506638288498, -0.04963279888033867, -0.09111316502094269, -0.04980399087071419, 0.12604078650474548, -0.025969356298446655, 0.007659358903765678, 0.016042552888393402, 0.005328289233148098, 0.0963435247540474, -0.06836742162704468, 0.026270119473338127, 0.03365094214677811, -0.027506263926625252, 0.1562495082616806, -0.0855497345328331, 0.028864195570349693, -0.08419272303581238, 0.002660658210515976, 0.11229731887578964, 0.2902054786682129, -0.035077668726444244, -0.022521665319800377, 0.0316111221909523, -0.00939929485321045, -0.14636766910552979, 0.01794142834842205, -0.08858472853899002, 0.04204554855823517, 0.07910654693841934, -0.039026178419589996, 0.11847427487373352, 0.09945467114448547, 0.01563735119998455, 0.19374851882457733, -0.3130386471748352, -0.12546171247959137, 0.05395204946398735, 0.18019497394561768, 0.18033228814601898, -0.15750400722026825, -0.05700312554836273, -0.04773785173892975, 0.028179803863167763, -0.003777657635509968, -0.08257493376731873, 0.0878419578075409, -0.06615342944860458, -0.09397493302822113, 0.06828898191452026, -0.05911751464009285, 0.15899907052516937, -0.04633767902851105, 0.08638675510883331, -0.04502005875110626, -0.08320137858390808, -0.013314907439053059, -0.04924333840608597, 0.10807513445615768, -0.16434405744075775, 0.007215377874672413, -0.09311331808567047, -0.021966615691781044, 0.008973428048193455, 0.0439041405916214, 0.042634692043066025, -0.04526342824101448, -0.11173152178525925, 
0.06538562476634979, -0.03208136931061745, 0.0032804713118821383, 0.14810407161712646, 0.012652621604502201, 0.05677574872970581, 0.03368082270026207, 0.027049971744418144, 0.040486060082912445, 0.09251885861158371, 0.02132706716656685, 0.029537726193666458, 0.024156169965863228, -0.07646182179450989, 0.02807001583278179, 0.09843768179416656, 0.07481582462787628, 0.07917144149541855, 0.03709542751312256, -0.06912116706371307, 0.02876763977110386, 0.1411551982164383, -0.08080048859119415, 0.016193026676774025, 0.006066340021789074, -0.011467736214399338, 0.02136155590415001, 0.05916152894496918, 0.1375601440668106, -0.06201321259140968, -0.007782215252518654, -0.057328470051288605, 0.026023417711257935, -0.05622288957238197, 0.16849766671657562, 0.09487630426883698, 0.015440881252288818, -0.05001407116651535, 0.08421718329191208, -0.044198740273714066, 0.037951212376356125, -0.02652411349117756, 0.028648899868130684, -0.06428591907024384, -0.030795643106102943, 0.1982712298631668, 0.10563290119171143, -0.04686407372355461, 0.002274326980113983, -0.1699596494436264, -0.11869173496961594, -0.007816179655492306, 0.04293706268072128, 0.07409443706274033, -0.030407898128032684, -0.007139011286199093, -0.021299293264746666, -0.12162086367607117, 0.03295259177684784, 0.10013706982135773, 0.09708999842405319, -0.23087547719478607, 0.034426115453243256, -0.06897225230932236, -0.06543044745922089, -0.042592357844114304, -0.005828352645039558, -0.10123305022716522, -0.00814145803451538, -0.07610978186130524, 0.016552986577153206, -0.13435149192810059, -0.05918256565928459, -0.03613435477018356, -0.06519759446382523, -0.025179363787174225, 0.01635373756289482, -0.062199078500270844, -0.027648478746414185, -0.043038588017225266, 0.016563203185796738, -0.08113367855548859, -0.10029877722263336, 0.018292156979441643, -0.09562885761260986, 0.0523083321750164, 0.07238584011793137, 0.037845950573682785, 0.02762749046087265, -0.12021157145500183, -0.0019896484445780516, 0.19456009566783905, -0.028842095285654068, 0.01621234230697155, -0.13139860332012177, 0.08111534267663956, -0.03352301940321922, 0.06653571873903275, -0.012643334455788136, 0.07176148891448975, -0.03364759311079979, -0.07250533252954483, -0.11682122200727463, -0.02013184130191803, -0.007501177955418825, 0.013450490310788155, 0.19676461815834045, 0.04730742424726486, 0.13294482231140137, -0.07242463529109955, -0.021345563232898712, -0.18211579322814941, 0.017858967185020447, -0.03522900864481926, -0.07634661346673965, -0.01660050079226494, 0.010081890039145947, 0.032086245715618134, 0.003707939526066184, 0.08372154086828232, -0.027251839637756348, -0.05619261786341667, -0.027907872572541237, 0.0009969115490093827, 0.12024078518152237, 0.024408487603068352, 0.20588961243629456, 0.03629589453339577, 0.010144468396902084, -0.028577439486980438, 0.04661635681986809, 0.15409459173679352, 0.0064087300561368465, 0.1570141315460205, 0.16249974071979523, -0.02508515678346157, 0.13997502624988556, -0.022560657933354378, -0.07959398627281189, -0.049156609922647476, 0.0378882996737957, -0.05834931135177612, -0.022478384897112846, 0.006188919302076101, 0.07941579818725586, 0.1771772801876068, -0.14280979335308075, 0.0011361786164343357, 0.007299761287868023, -0.02270716428756714, -0.0733361691236496, -0.08717815577983856, -0.09376009553670883, -0.1513872593641281, 0.020236752927303314, -0.08703000843524933, -0.03459978103637695, 0.052167560905218124, 0.014067655429244041, 0.08197516947984695, 0.1428821086883545, -0.020425306633114815, -0.04395882040262222, 
0.041674621403217316, -0.012424607761204243, -0.10199059545993805, 0.1412990540266037, -0.07618645578622818, 0.03771268203854561, -0.0963805690407753, -0.0242126677185297, -0.014853292144834995, 0.0055253333412110806, 0.0450216680765152, 0.006662092171609402, -0.08868195116519928, -0.045647475868463516, -0.016831250861287117, 0.0680973157286644, 0.13328683376312256, 0.07294265180826187, -0.07273674011230469, -0.025320643559098244, 0.18506267666816711, -0.053634658455848694, -0.031316064298152924, -0.12353593111038208, 0.19815106689929962, -0.005499729420989752, 0.019152389839291573, -0.0706639513373375, -0.08991330862045288, 0.048991866409778595, 0.1300819367170334, 0.11491351574659348, -0.113695427775383, 0.010792113840579987, -0.08947959542274475, -0.019341427832841873, -0.05435502901673317, 0.07054823637008667, 0.06944590061903, 0.06502029299736023, -0.03871491551399231, 0.006655991543084383, -0.09090723842382431, -0.058892544358968735, -0.06467233598232269, 0.10444743931293488, 0.015606571920216084, 0.016445504501461983, -0.07447320222854614, 0.0794050320982933, 0.07441455125808716, -0.12752613425254822, 0.07479275017976761, -0.15143203735351562, -0.08734284341335297, -0.06353884190320969, -0.10303881019353867, 0.019743170589208603, -0.00327677046880126, -0.09252479672431946, -0.03719087690114975, -0.140019953250885, 0.03920832276344299, -0.08918766677379608, -0.07681109011173248, -0.010724188759922981, 0.08176443725824356, 0.1245444193482399, 0.0017089119646698236, 0.016498876735568047, 0.06944336742162704, -0.009537591598927975, -0.11457855999469757, 0.11376211047172546, -0.039126139134168625, -0.02983890287578106, -0.03338993713259697, 0.15724720060825348, -0.06577520072460175, 0.11616946756839752, 0.024527471512556076, -0.10961561650037766, 0.056711889803409576, 0.08456423133611679, -0.11769416928291321, -0.1135142371058464, 0.000881704967468977, -0.08046223968267441, 0.11058619618415833, 0.07156547904014587, -0.014776268973946571, 0.04190511256456375, -0.03217892348766327, 0.09498082101345062, 0.07400067150592804, 0.05004730075597763, -0.014543939381837845, -0.09170961380004883, -0.014774181880056858, 0.02562526986002922, -0.0017121204873546958, -0.21112531423568726, -0.0718015655875206, -0.10386394709348679, -0.06431866437196732, -0.031218288466334343, 0.09724786877632141, 0.22523751854896545, 0.022283554077148438, 0.005267293192446232, -0.25205281376838684, 0.027624478563666344, 0.13772715628147125, -0.1729198396205902, -0.05656914785504341 ]
null
null
peft
# Model Card for Model ID

4-bit general-purpose text-to-SQL model. It takes 5677 MiB of GPU memory.

## Model Details

### Model Description

Provide the CREATE statement of the target table(s) in the context of your prompt and ask a question about your database. The model outputs a SQL query that answers the question.

Data used for fine-tuning: https://huggingface.co/datasets/b-mc2/sql-create-context

## Uses

This model can be coupled with a chat model such as llama2-chat to convert the output into a text answer.

### Direct Use

```python
from peft import AutoPeftModelForCausalLM  # not used below; importing it just checks that peft is available
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load the 4-bit model (device_map="auto" places it on the available GPU).
base_model = "GTimothee/sql-code-llama-4bits"
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    load_in_4bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
model.eval()

# Load the tokenizer of the base CodeLlama model.
tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-hf")

eval_prompt = """You are a powerful text-to-SQL model. Your job is to answer questions about a database. You are given a question and context regarding one or more tables.

You must output the SQL query that answers the question.
### Input:
Which Class has a Frequency MHz larger than 91.5, and a City of license of hyannis, nebraska?

### Context:
CREATE TABLE table_name_12 (class VARCHAR, frequency_mhz VARCHAR, city_of_license VARCHAR)

### Response:
"""

model_input = tokenizer(eval_prompt, return_tensors="pt").to("cuda")

with torch.no_grad():
    print(tokenizer.decode(model.generate(**model_input, max_new_tokens=100)[0], skip_special_tokens=True))
```

Outputs:

```python
### Response:
SELECT class FROM table_name_12 WHERE frequency_mhz > 91.5 AND city_of_license = "hyannis, nebraska"
```

## Bias, Risks, and Limitations

- Potential security issues in case of malicious use: if you blindly execute the SQL queries generated for end users, you could lose data, leak information, etc.
- The model may make mistakes depending on how the prompt is written.

### Recommendations

- Make sure that you check the generated SQL before applying it if the model is used by end users directly (see the sketch after this card).
- The model works well on simple tables and simple queries. If possible, break a complex query into multiple simple queries.
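The recommendation to vet generated SQL can be implemented in many ways; below is a minimal, illustrative guard that only lets single read-only SELECT statements through. The `is_read_only` helper and the hard-coded example query are assumptions for demonstration, not part of the model or its tooling.

```python
# Minimal illustrative guard: accept only single, read-only SELECT statements.
# This is a sketch under stated assumptions, not the model's own validation logic.
def is_read_only(generated_sql: str) -> bool:
    statement = generated_sql.strip().rstrip(";").lower()
    if ";" in statement:  # more than one statement -> reject
        return False
    forbidden = ("insert", "update", "delete", "drop", "alter", "truncate", "create", "grant")
    words = statement.split()
    return bool(words) and words[0] == "select" and not any(w in forbidden for w in words)


generated = 'SELECT class FROM table_name_12 WHERE frequency_mhz > 91.5 AND city_of_license = "hyannis, nebraska"'
if is_read_only(generated):
    print("query passed the read-only check")
else:
    print("refusing to execute the generated statement")
```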
{"library_name": "peft", "base_model": "codellama/CodeLlama-7b-hf"}
null
GTimothee/sql-code-llama-4bits
[ "peft", "safetensors", "base_model:codellama/CodeLlama-7b-hf", "region:us" ]
2024-02-13T21:04:59+00:00
[]
[]
TAGS #peft #safetensors #base_model-codellama/CodeLlama-7b-hf #region-us
# Model Card for Model ID 4 bit general purpose text-to-SQL model. Takes 5677MiB of GPU memory. ## Model Details ### Model Description Provide the CREATE statement of the target table(s) in the context of your prompt and ask a question to your database. The model outputs a query to answer the question. Data used for fine tuning: URL ## Uses This model can be coupled with a chat model like llama2-chat to convert the output into a text answer. ### Direct Use Outputs: ## Bias, Risks, and Limitations - potential security issues if there is a malicious use. If you execute blindly the SQL queries that are being generated by end users you could lose data, leak information etc. - may be mistaken depending on the way the prompt has been written. ### Recommendations - Make sure that you check the generated SQL before applying it if the model is used by end users directly. - The model works well when used on simple tables and simple queries. If possible, try to break a complex query into multiple simple queries.
[ "# Model Card for Model ID\n\n4 bit general purpose text-to-SQL model.\n\nTakes 5677MiB of GPU memory.", "## Model Details", "### Model Description\n\nProvide the CREATE statement of the target table(s) in the context of your prompt and ask a question to your database. The model outputs a query to answer the question.\n\nData used for fine tuning: URL", "## Uses\n\nThis model can be coupled with a chat model like llama2-chat to convert the output into a text answer.", "### Direct Use\n\n\n\nOutputs:", "## Bias, Risks, and Limitations\n\n- potential security issues if there is a malicious use. If you execute blindly the SQL queries that are being generated by end users you could lose data, leak information etc. \n- may be mistaken depending on the way the prompt has been written.", "### Recommendations\n\n- Make sure that you check the generated SQL before applying it if the model is used by end users directly.\n- The model works well when used on simple tables and simple queries. If possible, try to break a complex query into multiple simple queries." ]
[ "TAGS\n#peft #safetensors #base_model-codellama/CodeLlama-7b-hf #region-us \n", "# Model Card for Model ID\n\n4 bit general purpose text-to-SQL model.\n\nTakes 5677MiB of GPU memory.", "## Model Details", "### Model Description\n\nProvide the CREATE statement of the target table(s) in the context of your prompt and ask a question to your database. The model outputs a query to answer the question.\n\nData used for fine tuning: URL", "## Uses\n\nThis model can be coupled with a chat model like llama2-chat to convert the output into a text answer.", "### Direct Use\n\n\n\nOutputs:", "## Bias, Risks, and Limitations\n\n- potential security issues if there is a malicious use. If you execute blindly the SQL queries that are being generated by end users you could lose data, leak information etc. \n- may be mistaken depending on the way the prompt has been written.", "### Recommendations\n\n- Make sure that you check the generated SQL before applying it if the model is used by end users directly.\n- The model works well when used on simple tables and simple queries. If possible, try to break a complex query into multiple simple queries." ]
[ 31, 27, 3, 50, 26, 8, 67, 62 ]
[ "passage: TAGS\n#peft #safetensors #base_model-codellama/CodeLlama-7b-hf #region-us \n# Model Card for Model ID\n\n4 bit general purpose text-to-SQL model.\n\nTakes 5677MiB of GPU memory.## Model Details### Model Description\n\nProvide the CREATE statement of the target table(s) in the context of your prompt and ask a question to your database. The model outputs a query to answer the question.\n\nData used for fine tuning: URL## Uses\n\nThis model can be coupled with a chat model like llama2-chat to convert the output into a text answer.### Direct Use\n\n\n\nOutputs:## Bias, Risks, and Limitations\n\n- potential security issues if there is a malicious use. If you execute blindly the SQL queries that are being generated by end users you could lose data, leak information etc. \n- may be mistaken depending on the way the prompt has been written.### Recommendations\n\n- Make sure that you check the generated SQL before applying it if the model is used by end users directly.\n- The model works well when used on simple tables and simple queries. If possible, try to break a complex query into multiple simple queries." ]
[ -0.12832164764404297, 0.0321430042386055, -0.0013963081873953342, 0.07408720999956131, 0.11126609891653061, -0.023403169587254524, 0.015315604396164417, 0.08537214994430542, 0.20983745157718658, 0.10610173642635345, 0.07719088345766068, 0.0440671369433403, 0.061643052846193314, 0.14587488770484924, -0.0024133045226335526, -0.048649296164512634, 0.10060225427150726, -0.07029850780963898, 0.1607235074043274, 0.10088478773832321, 0.05137025564908981, -0.0894998237490654, 0.10648861527442932, -0.010296796448528767, -0.0016308808699250221, -0.014794097281992435, 0.030522916465997696, 0.059685513377189636, 0.09198397397994995, 0.07907707989215851, -0.009452289901673794, 0.01947575993835926, -0.002040023449808359, -0.1626499891281128, 0.05331214889883995, 0.07194722443819046, -0.03715480864048004, -0.005027486942708492, 0.0845533087849617, 0.01990821212530136, 0.062263596802949905, -0.04414970055222511, 0.013917690142989159, 0.0761096403002739, -0.08110904693603516, -0.002796913031488657, -0.14665326476097107, -0.03317871317267418, 0.1731487214565277, 0.04786721616983414, -0.023990441113710403, 0.2261936217546463, -0.07310033589601517, 0.08549667149782181, 0.17353251576423645, -0.18155105412006378, 0.00968648586422205, 0.14793357253074646, 0.015148601494729519, 0.032437533140182495, -0.02463226392865181, 0.019193964079022408, 0.061401333659887314, 0.0020954220090061426, -0.016510941088199615, -0.03565312549471855, 0.03183640539646149, -0.02876937761902809, -0.11011549085378647, -0.10799955576658249, 0.09294562041759491, 0.01271485723555088, -0.07599367201328278, -0.13126397132873535, -0.09590138494968414, 0.0020992206409573555, 0.010370882228016853, -0.004271120298653841, -0.006779505405575037, 0.018717924132943153, 0.13212111592292786, -0.04464518278837204, -0.07173411548137665, -0.1552872657775879, 0.018762636929750443, -0.07321993261575699, 0.04606472700834274, 0.030916107818484306, -0.024963997304439545, 0.1288338005542755, -0.09374900162220001, -0.031133247539401054, -0.13363736867904663, -0.07656390219926834, -0.142117440700531, -0.01258908212184906, -0.053657595068216324, -0.002039206214249134, 0.12386465817689896, 0.10813301801681519, 0.025777747854590416, 0.10966357588768005, 0.0219796821475029, -0.002207206329330802, 0.0904843732714653, -0.03135034069418907, -0.07044879347085953, 0.04341141879558563, 0.10400548577308655, 0.02948159910738468, 0.08113491535186768, -0.026998313143849373, -0.011429932899773121, -0.015808861702680588, 0.0824301466345787, 0.1300714612007141, 0.026067744940519333, 0.04871254041790962, -0.05965767428278923, 0.02074812725186348, 0.14259475469589233, -0.09079708158969879, -0.028168359771370888, 0.05460279434919357, -0.030847031623125076, -0.14430230855941772, 0.026872016489505768, -0.011451751925051212, -0.10601276159286499, -0.04034234583377838, -0.0665844976902008, -0.00046774069778621197, -0.08508796244859695, -0.059571731835603714, 0.03592044487595558, -0.04235688969492912, -0.023775113746523857, -0.10891979932785034, -0.32087159156799316, -0.07941092550754547, -0.025712955743074417, -0.06739164888858795, -0.002612931886687875, 0.054047536104917526, -0.044287100434303284, -0.05504881218075752, -0.029202017933130264, 0.03123173676431179, -0.08421166241168976, 0.030710509046912193, 0.005719232838600874, 0.015194042585790157, -0.003895177273079753, -0.04485570266842842, -0.12508681416511536, 0.07679769396781921, -0.08884262293577194, 0.14377442002296448, -0.021219970658421516, 0.01269652508199215, -0.0770796537399292, 0.01753992959856987, 0.004888400435447693, 
0.028549738228321075, 0.04076482355594635, 0.23454877734184265, -0.24451322853565216, 0.10618623346090317, 0.16602906584739685, -0.15542016923427582, -0.15331542491912842, 0.1801375150680542, -0.08872271329164505, 0.01634635217487812, 0.1215510442852974, 0.061999719589948654, 0.08313433825969696, -0.14909061789512634, -0.004978675860911608, 0.041606929153203964, -0.09703361988067627, 0.006035539787262678, 0.05958324670791626, -0.03298741579055786, -0.1437494158744812, -0.007563467603176832, -0.06202464550733566, -0.06998679786920547, 0.005629387218505144, -0.04702631011605263, -0.05063946917653084, -0.04437655955553055, -0.04387647286057472, -0.014202537946403027, -0.06388276070356369, -0.022519584745168686, -0.005270478315651417, -0.11759036034345627, 0.11433805525302887, -0.0017138301627710462, -0.030642395839095116, -0.10741402208805084, 0.15511180460453033, -0.05813232809305191, 0.008073735050857067, -0.15595023334026337, -0.144955113530159, 0.030380671843886375, -0.1148139089345932, -0.03588277846574783, 0.03568379208445549, -0.0011027716100215912, 0.0563613586127758, 0.01626696065068245, -0.012079943902790546, 0.1333806812763214, 0.015629513189196587, -0.09120246022939682, -0.06294827163219452, -0.01022292859852314, -0.0602911114692688, 0.1150781437754631, -0.20549993216991425, 0.017877643927931786, 0.0939355343580246, 0.027097420766949654, -0.017900550737977028, -0.02721387892961502, 0.0861862525343895, -0.03828474134206772, 0.020709097385406494, -0.04866554215550423, 0.04402991756796837, 0.05484291538596153, -0.05459613353013992, 0.0025763739831745625, -0.1979343444108963, -0.045513056218624115, 0.07018936425447464, 0.06059759855270386, -0.08207324147224426, -0.0820625051856041, 0.03387259691953659, -0.01899489387869835, -0.0762215107679367, -0.055196478962898254, 0.015530193224549294, 0.032445237040519714, 0.06508038938045502, -0.14282244443893433, -0.03729873523116112, 0.06201460212469101, -0.07341311126947403, 0.05176784470677376, 0.06336537003517151, 0.003304277779534459, -0.16746582090854645, 0.032466817647218704, -0.10183300077915192, 0.008122557774186134, 0.03246608376502991, 0.019839871674776077, -0.03876495733857155, -0.006552967242896557, 0.1332044154405594, -0.024016186594963074, 0.11499911546707153, -0.01011960580945015, 0.026346782222390175, 0.04009522125124931, 0.003149597439914942, 0.051929887384176254, -0.11167561262845993, 0.03504752367734909, -0.024186929687857628, -0.030417686328291893, -0.06504958868026733, 0.0415901280939579, -0.010244670324027538, 0.0757504552602768, -0.0007723739254288375, 0.06057105213403702, 0.054457925260066986, -0.08106136322021484, -0.17733453214168549, 0.25306758284568787, -0.017478089779615402, -0.208979532122612, -0.05091085657477379, 0.020882438868284225, -0.03755931556224823, 0.02404147759079933, 0.06858986616134644, -0.06551481783390045, -0.042500562965869904, -0.14849404990673065, -0.12325166165828705, 0.08255808800458908, -0.08899629861116409, -0.13125097751617432, -0.04408170282840729, 0.03632600978016853, -0.08024246245622635, -0.025805100798606873, -0.012675805017352104, -0.03965463861823082, 0.05213722586631775, -0.023725563660264015, 0.11478087306022644, 0.0805210992693901, 0.033873025327920914, 0.00832410529255867, -0.013136452063918114, 0.23407945036888123, -0.011583419516682625, -0.01718209497630596, 0.21709923446178436, -0.006755054462701082, 0.0672617256641388, 0.14294672012329102, -0.011062704026699066, -0.11927803605794907, 0.04888063669204712, 0.04815712198615074, -0.10288631916046143, -0.12129107117652893, 
-0.08685687929391861, -0.044976282864809036, -0.04138042777776718, 0.041555255651474, 0.05292494595050812, -0.025858646258711815, 0.10230561345815659, -0.09434272348880768, -0.11297322809696198, -0.006632915232330561, 0.08921968191862106, 0.004054003395140171, 0.012243770994246006, 0.012591753154993057, -0.06935610622167587, 0.06856372207403183, 0.09174313396215439, 0.03825325518846512, 0.13212107121944427, -0.0893852487206459, 0.08541136980056763, 0.08033044636249542, 0.08192030340433121, 0.011895024217665195, 0.077202208340168, -0.026989175006747246, -0.013161285780370235, -0.015266434289515018, -0.05135449767112732, -0.14192727208137512, 0.0546722337603569, -0.07282623648643494, 0.007800220511853695, -0.09309221804141998, 0.06896957010030746, 0.08280742168426514, 0.09613019227981567, 0.02951202169060707, -0.24663549661636353, -0.045264728367328644, 0.012112385593354702, 0.010200648568570614, -0.09422502666711807, 0.044757865369319916, 0.020860683172941208, -0.17918618023395538, 0.0123052429407835, -0.03237966448068619, 0.08705206215381622, -0.062366437166929245, 0.021993709728121758, -0.0733318105340004, 0.06664128601551056, -0.04202111065387726, 0.09284171462059021, -0.2370568960905075, 0.0565846711397171, 0.00350364507175982, 0.08579839766025543, -0.07285203039646149, -0.023789338767528534, 0.046686675399541855, -0.009776690043509007, 0.18044394254684448, 0.011697000823915005, -0.05658663436770439, -0.08902480453252792, -0.020022451877593994, 0.05890875682234764, -0.05365694686770439, 0.027633443474769592, 0.12056468427181244, 0.0031946455128490925, 0.025253072381019592, -0.05151568353176117, 0.04683660715818405, -0.12380141764879227, -0.11541728675365448, -0.0018711347365751863, -0.0913536325097084, 0.03499998897314072, -0.059560392051935196, 0.01616751030087471, 0.03367938473820686, 0.03362515568733215, -0.13406312465667725, -0.13697773218154907, -0.09198928624391556, -0.013918169774115086, 0.03277325630187988, -0.06172061339020729, 0.030789395794272423, -0.06034088134765625, 0.1009812131524086, -0.004691990092396736, -0.027636202052235603, 0.048974670469760895, -0.1530604064464569, -0.14965038001537323, -0.029891397804021835, 0.09536346793174744, 0.07292294502258301, 0.001766630564816296, 0.08879969269037247, -0.006625543348491192, -0.04489583522081375, -0.15167954564094543, -0.021179424598813057, 0.07070887088775635, 0.09741690009832382, 0.129145085811615, 0.03692960739135742, -0.0060694231651723385, -0.05549756437540054, 0.00885478314012289, 0.08826206624507904, 0.2245543748140335, -0.03718642517924309, 0.07770160585641861, 0.15182672441005707, -0.03545188903808594, -0.18021181225776672, -0.0324244387447834, -0.00718699162825942, -0.01543094776570797, -0.004474520683288574, -0.07648282498121262, 0.06653492152690887, 0.028601570054888725, -0.034316305071115494, -0.012586995027959347, -0.16319870948791504, -0.08206485211849213, 0.09132485836744308, 0.0725550577044487, 0.25298601388931274, -0.1514187455177307, -0.07319425046443939, -0.0740860104560852, -0.14846839010715485, 0.1325533241033554, -0.20143698155879974, 0.0436396524310112, 0.017351951450109482, -0.037265971302986145, 0.037815071642398834, -0.08203931897878647, 0.15993370115756989, -0.00894321221858263, 0.1258525848388672, -0.05013208091259003, 0.0263614933937788, -0.03725796937942505, -0.10589102655649185, 0.135211780667305, -0.03210316225886345, 0.0976618230342865, -0.0881982147693634, -0.060305312275886536, -0.09491656720638275, 0.0495334267616272, -0.04314188286662102, -0.02418624423444271, -0.03148188441991806, 
0.05289611965417862, 0.09526841342449188, -0.017885947600007057, -0.11058496683835983, -0.10270802676677704, 0.009826460853219032, 0.22599054872989655, 0.16737769544124603, -0.02681933343410492, -0.07411709427833557, 0.03341773897409439, -0.023742035031318665, 0.08792313188314438, -0.045494817197322845, 0.06631065160036087, 0.02361382357776165, 0.025520410388708115, 0.050113726407289505, 0.008295065723359585, -0.08768533170223236, 0.05118192359805107, 0.013965344056487083, -0.09605009853839874, -0.23601625859737396, 0.019081683829426765, 0.05981319770216942, -0.0617394745349884, 0.01573236845433712, 0.06690483540296555, 0.0026365243829786777, -0.01633627340197563, -0.017344823107123375, 0.059468213468790054, 0.026313088834285736, 0.10327041894197464, 0.04180190712213516, 0.08439622074365616, -0.11158064752817154, 0.04817180335521698, -0.03545958176255226, -0.08017091453075409, 0.028301719576120377, 0.04462972655892372, -0.1761014610528946, -0.07724211364984512, -0.08189402520656586, 0.1262233406305313, -0.08046939224004745, -0.13960468769073486, -0.0832936093211174, -0.048754747956991196, -0.012284797616302967, 0.1289615035057068, 0.08822360634803772, 0.03131083399057388, 0.07511802017688751, -0.03184589371085167, -0.044891659170389175, 0.06011892110109329, 0.09967772662639618, 0.004845419432967901, -0.13291499018669128, -0.024973679333925247, 0.01732950657606125, 0.09264203906059265, -0.04195309430360794, -0.0003510529058985412, -0.09326379001140594, 0.012817722745239735, -0.2664475440979004, 0.14036989212036133, -0.1074795350432396, 0.0067819757387042046, 0.005023677367717028, 0.07098972797393799, -0.00011718594760168344, 0.07091419398784637, -0.030901363119482994, 0.009308678098022938, -0.02493537776172161, 0.07703940570354462, -0.1047578975558281, 0.013923278078436852, 0.05231615528464317, -0.03207395225763321, 0.11469700932502747, 0.029802436009049416, -0.09730207175016403, 0.014726105146110058, -0.1261122077703476, -0.01854466274380684, -0.026351947337388992, 0.012892985716462135, -0.01912463642656803, -0.14734536409378052, 0.02505321614444256, 0.042756784707307816, -0.07022196799516678, -0.018123993650078773, 0.1368338018655777, -0.06852850317955017, 0.0391349159181118, 0.05563802644610405, 0.014225607737898827, -0.07205308228731155, 0.026790952309966087, 0.008904549293220043, 0.10238732397556305, 0.13531655073165894, -0.05172419548034668, 0.007575646508485079, -0.10715356469154358, -0.069776751101017, 0.011897572316229343, 0.04497393220663071, -0.11302606761455536, -0.017628073692321777, 0.07696016877889633, 0.010428844951093197, 0.25654512643814087, -0.028798436746001244, 0.014141202904284, -0.00989898294210434, 0.11865061521530151, 0.1476713865995407, -0.010359972715377808, 0.11572849005460739, -0.06289830058813095, 0.007669917307794094, 0.03678062558174133, 0.03731890022754669, -0.006187377963215113, 0.05074257776141167, 0.169316828250885, 0.1126786395907402, 0.13535954058170319, 0.029020754620432854, -0.03994792699813843, -0.052710309624671936, 0.09360463917255402, -0.06301622837781906, -0.03401217237114906, -0.015722645446658134, -0.0601285956799984, -0.02641606330871582, 0.2423735111951828, -0.06475113332271576, 0.030540894716978073, -0.04442165046930313, -0.07527004927396774, -0.12745371460914612, -0.16021594405174255, -0.045831818133592606, -0.050246886909008026, 0.0022355900146067142, -0.08769094944000244, 0.004585936665534973, 0.1920655369758606, -0.00950603373348713, -0.04411409795284271, 0.1651468724012375, -0.015782969072461128, -0.07010657340288162, 
-0.07865177094936371, -0.023925619199872017, 0.0410534106194973, 0.1426648199558258, 0.016324471682310104, 0.09283274412155151, -0.08996066451072693, 0.04771457612514496, 0.07209738343954086, 0.09763987362384796, 0.06841101497411728, -0.053752392530441284, -0.04202815145254135, -0.014598829671740532, 0.037290263921022415, 0.028962962329387665, 0.22764043509960175, 0.07638908177614212, -0.04477613419294357, 0.010335906408727169, 0.18068747222423553, -0.026193398982286453, -0.033833861351013184, -0.1343180239200592, 0.27826452255249023, -0.040570735931396484, 0.005484960041940212, 0.013515613041818142, -0.12732172012329102, 0.10905224829912186, 0.12920646369457245, 0.0914420560002327, -0.11913850903511047, 0.004939443897455931, -0.05491761118173599, 0.0020985098090022802, -0.007916659116744995, 0.11486315727233887, -0.027397288009524345, 0.2319364696741104, -0.08626708388328552, 0.1719689518213272, -0.00784794520586729, -0.046998754143714905, -0.1015203669667244, 0.08014897257089615, 0.0007955350447446108, 0.020847272127866745, -0.05717092752456665, 0.09567064046859741, -0.07711250334978104, -0.11387262493371964, -0.05089510604739189, 0.0710107609629631, -0.03802433982491493, 0.0273943729698658, -0.16146139800548553, 0.05049372464418411, 0.06962686032056808, -0.07604160159826279, 0.0583159439265728, 0.03455260023474693, -0.018489964306354523, -0.10142159461975098, -0.07850700616836548, 0.06693428754806519, 0.14694562554359436, 0.1639731079339981, 0.02574288845062256, 0.15514637529850006, 0.129154235124588, -0.08267483115196228, -0.1732650250196457, 0.09813783317804337, 0.05345771089196205, -0.05408734083175659, 0.04790593683719635, 0.015113092958927155, 0.02426493912935257, -0.03906512260437012, 0.028485892340540886, -0.05624616518616676, 0.04396915063261986, 0.028393413871526718, 0.017236758023500443, -0.18338625133037567, 0.10796298831701279, -0.09005632996559143, 0.13303683698177338, 0.051832426339387894, -0.07208794355392456, 0.058919213712215424, -0.014027420431375504, 0.10630206763744354, -0.011702011339366436, 0.00825201254338026, 0.021580012515187263, -0.17500612139701843, 0.1539643108844757, 0.11123482882976532, 0.0002333187439944595, -0.22844593226909637, -0.01492128986865282, -0.026253124698996544, -0.004089243710041046, 0.014065081253647804, 0.08786982297897339, 0.009931669570505619, 0.03250611573457718, -0.038413155823946, -0.12293989956378937, -0.025207726284861565, 0.12883614003658295, -0.015392152592539787, -0.14569921791553497 ]
null
null
transformers
## Model card for MUZ ## This is a model finetuned on data. What is Lorem Ipsum? Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum. Why do we use it? It is a long established fact that a reader will be distracted by the readable content of a page when looking at its layout. The point of using Lorem Ipsum is that it has a more-or-less normal distribution of letters, as opposed to using 'Content here, content here', making it look like readable English. Many desktop publishing packages and web page editors now use Lorem Ipsum as their default model text, and a search for 'lorem ipsum' will uncover many web sites still in their infancy. Various versions have evolved over the years, sometimes by accident, sometimes on purpose (injected humour and the like). Where does it come from? Contrary to popular belief, Lorem Ipsum is not simply random text. It has roots in a piece of classical Latin literature from 45 BC, making it over 2000 years old. Richard McClintock, a Latin professor at Hampden-Sydney College in Virginia, looked up one of the more obscure Latin words, consectetur, from a Lorem Ipsum passage, and going through the cites of the word in classical literature, discovered the undoubtable source. Lorem Ipsum comes from sections 1.10.32 and 1.10.33 of "de Finibus Bonorum et Malorum" (The Extremes of Good and Evil) by Cicero, written in 45 BC. This book is a treatise on the theory of ethics, very popular during the Renaissance. The first line of Lorem Ipsum, "Lorem ipsum dolor sit amet..", comes from a line in section 1.10.32.
{"license": "apache-2.0"}
text-generation
RaduGabriel/MUZ
[ "transformers", "safetensors", "mistral", "text-generation", "conversational", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:05:19+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #conversational #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
## Model card for MUZ ## This is a model finetuned on data. What is Lorem Ipsum? Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum. Why do we use it? It is a long established fact that a reader will be distracted by the readable content of a page when looking at its layout. The point of using Lorem Ipsum is that it has a more-or-less normal distribution of letters, as opposed to using 'Content here, content here', making it look like readable English. Many desktop publishing packages and web page editors now use Lorem Ipsum as their default model text, and a search for 'lorem ipsum' will uncover many web sites still in their infancy. Various versions have evolved over the years, sometimes by accident, sometimes on purpose (injected humour and the like). Where does it come from? Contrary to popular belief, Lorem Ipsum is not simply random text. It has roots in a piece of classical Latin literature from 45 BC, making it over 2000 years old. Richard McClintock, a Latin professor at Hampden-Sydney College in Virginia, looked up one of the more obscure Latin words, consectetur, from a Lorem Ipsum passage, and going through the cites of the word in classical literature, discovered the undoubtable source. Lorem Ipsum comes from sections 1.10.32 and 1.10.33 of "de Finibus Bonorum et Malorum" (The Extremes of Good and Evil) by Cicero, written in 45 BC. This book is a treatise on the theory of ethics, very popular during the Renaissance. The first line of Lorem Ipsum, "Lorem ipsum dolor sit amet..", comes from a line in section 1.10.32.
[ "## Model card for MUZ ##\n\nThis is a model finetuned on data.\nWhat is Lorem Ipsum?\nLorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.\n\nWhy do we use it?\nIt is a long established fact that a reader will be distracted by the readable content of a page when looking at its layout. The point of using Lorem Ipsum is that it has a more-or-less normal distribution of letters, as opposed to using 'Content here, content here', making it look like readable English. Many desktop publishing packages and web page editors now use Lorem Ipsum as their default model text, and a search for 'lorem ipsum' will uncover many web sites still in their infancy. Various versions have evolved over the years, sometimes by accident, sometimes on purpose (injected humour and the like).\n\n\nWhere does it come from?\nContrary to popular belief, Lorem Ipsum is not simply random text. It has roots in a piece of classical Latin literature from 45 BC, making it over 2000 years old. Richard McClintock, a Latin professor at Hampden-Sydney College in Virginia, looked up one of the more obscure Latin words, consectetur, from a Lorem Ipsum passage, and going through the cites of the word in classical literature, discovered the undoubtable source. Lorem Ipsum comes from sections 1.10.32 and 1.10.33 of \"de Finibus Bonorum et Malorum\" (The Extremes of Good and Evil) by Cicero, written in 45 BC. This book is a treatise on the theory of ethics, very popular during the Renaissance. The first line of Lorem Ipsum, \"Lorem ipsum dolor sit amet..\", comes from a line in section 1.10.32." ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #conversational #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## Model card for MUZ ##\n\nThis is a model finetuned on data.\nWhat is Lorem Ipsum?\nLorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.\n\nWhy do we use it?\nIt is a long established fact that a reader will be distracted by the readable content of a page when looking at its layout. The point of using Lorem Ipsum is that it has a more-or-less normal distribution of letters, as opposed to using 'Content here, content here', making it look like readable English. Many desktop publishing packages and web page editors now use Lorem Ipsum as their default model text, and a search for 'lorem ipsum' will uncover many web sites still in their infancy. Various versions have evolved over the years, sometimes by accident, sometimes on purpose (injected humour and the like).\n\n\nWhere does it come from?\nContrary to popular belief, Lorem Ipsum is not simply random text. It has roots in a piece of classical Latin literature from 45 BC, making it over 2000 years old. Richard McClintock, a Latin professor at Hampden-Sydney College in Virginia, looked up one of the more obscure Latin words, consectetur, from a Lorem Ipsum passage, and going through the cites of the word in classical literature, discovered the undoubtable source. Lorem Ipsum comes from sections 1.10.32 and 1.10.33 of \"de Finibus Bonorum et Malorum\" (The Extremes of Good and Evil) by Cicero, written in 45 BC. This book is a treatise on the theory of ethics, very popular during the Renaissance. The first line of Lorem Ipsum, \"Lorem ipsum dolor sit amet..\", comes from a line in section 1.10.32." ]
[ 59, 515 ]
[ "passage: TAGS\n#transformers #safetensors #mistral #text-generation #conversational #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.010252542793750763, 0.07352796941995621, -0.006568553391844034, 0.0001816342701204121, 0.0757177397608757, -0.025136319920420647, 0.18297138810157776, 0.10687468200922012, -0.018304111436009407, -0.03559499233961105, 0.1580429971218109, 0.16274356842041016, -0.018578359857201576, 0.06286827474832535, -0.1004268229007721, -0.15927527844905853, 0.12504154443740845, -0.030325422063469887, 0.019838258624076843, 0.07836394011974335, 0.12561240792274475, -0.02765655517578125, 0.06302490830421448, -0.037133511155843735, -0.0524265356361866, 0.015384755097329617, 0.05416877195239067, -0.12452121078968048, 0.09221465140581131, 0.047140415757894516, 0.07020571827888489, 0.05256413295865059, -0.040049273520708084, -0.2263445109128952, 0.02375885471701622, 0.015231466852128506, -0.0605887845158577, 0.03415777534246445, 0.03167411684989929, -0.04880571365356445, 0.040119487792253494, 0.030711200088262558, -0.015241919085383415, 0.08783470839262009, -0.09830082952976227, -0.06728467345237732, -0.07140665501356125, 0.005408733617514372, 0.11374121159315109, 0.08997190743684769, 0.001302481396123767, 0.1312408596277237, -0.046912096440792084, 0.08120984584093094, 0.08918444812297821, -0.3487792909145355, 0.004115980584174395, 0.07597554475069046, 0.04232953488826752, 0.05199183523654938, -0.039184167981147766, 0.06691788882017136, 0.07014147192239761, -0.009054316207766533, 0.03270167112350464, -0.05688630789518356, -0.11163307726383209, 0.030167872086167336, -0.057139717042446136, -0.037839166820049286, 0.2604406476020813, -0.012873785570263863, 0.01748671941459179, -0.061292920261621475, -0.08573693037033081, 0.05647439882159233, -0.02869628369808197, 0.02583964727818966, 0.0025097152683883905, 0.11952083557844162, 0.0644710510969162, -0.03758637234568596, -0.13979269564151764, -0.0023317444138228893, -0.1742590069770813, 0.0659179538488388, 0.009481769986450672, 0.043676503002643585, -0.15108686685562134, 0.02852519415318966, 0.023962324485182762, -0.13224577903747559, -0.013269432820379734, -0.07785214483737946, 0.08455437421798706, 0.02068314328789711, -0.05762026831507683, -0.043671831488609314, 0.18586686253547668, 0.21189670264720917, 0.000831006036605686, 0.03854392096400261, -0.10021555423736572, 0.08630210906267166, -0.01930639147758484, 0.018723033368587494, 0.05217520520091057, -0.055521152913570404, 0.10975517332553864, -0.09644588828086853, 0.1021677628159523, -0.03931199759244919, -0.13046129047870636, 0.017976872622966766, 0.007982809096574783, 0.13029702007770538, 0.04841647297143936, 0.08540529012680054, -0.034155648201704025, 0.06024540960788727, 0.11709117889404297, -0.07238145917654037, -0.013051922433078289, 0.01770484633743763, 0.0565100759267807, 0.035948481410741806, 0.03971082717180252, 0.06887484341859818, -0.04956711083650589, 0.008280693553388119, -0.04488208889961243, -0.04757337644696236, -0.03315996006131172, -0.03265184164047241, 0.07820938527584076, -0.013682170771062374, 0.030022503808140755, -0.16900722682476044, -0.19474279880523682, 0.01751401461660862, 0.04997485131025314, -0.009684477932751179, -0.04848022758960724, -0.02713644690811634, -0.03508581593632698, 0.04866521805524826, -0.08211714774370193, -0.05060480162501335, -0.09111718088388443, 0.039360616356134415, -0.06490273773670197, 0.03287268802523613, -0.1655266433954239, 0.037351373583078384, -0.12824484705924988, 0.008573218248784542, -0.03318792209029198, 0.010179937817156315, -0.07174979895353317, 0.20974396169185638, -0.05493735149502754, 0.04201517254114151, -0.0273604616522789, 
0.031650345772504807, -0.027704479172825813, 0.1834506243467331, -0.12796734273433685, -0.01703787036240101, 0.22635959088802338, -0.13110138475894928, -0.22994908690452576, 0.10837671160697937, 0.00984897930175066, 0.056181129068136215, 0.1101662814617157, 0.18136894702911377, 0.01146231684833765, -0.0397096611559391, 0.08047764748334885, 0.11389042437076569, -0.05726505443453789, -0.08307189494371414, 0.02664053440093994, -0.057468004524707794, -0.11592894047498703, 0.045964352786540985, 0.048171527683734894, 0.06545097380876541, 0.014102572575211525, -0.07255655527114868, -0.06880072504281998, -0.046264953911304474, -0.0048459055833518505, -0.055872876197099686, 0.03703989461064339, -0.11203555017709732, -0.009956411086022854, -0.03158671781420708, 0.015115656889975071, 0.005509060807526112, 0.036672983318567276, -0.06483431905508041, 0.02968791127204895, 0.022968364879488945, 0.07324444502592087, -0.08637925982475281, -0.07117557525634766, -0.0043877847492694855, 0.0357024185359478, -0.000137450682814233, 0.025243118405342102, 0.05910521373152733, -0.02524854615330696, -0.00418273126706481, 0.0015498799039050937, 0.18234853446483612, 0.05058462545275688, -0.037007879465818405, -0.1300317496061325, 0.100154809653759, -0.059592217206954956, 0.06863367557525635, -0.06489858031272888, 0.03587336838245392, 0.04726557433605194, 0.09334373474121094, -0.0013381000608205795, 0.07978472858667374, -0.014279251918196678, -0.04206135869026184, -0.08311929553747177, -0.007465486414730549, 0.09074488282203674, 0.04492030292749405, -0.1106984093785286, 0.2230384796857834, -0.1875034123659134, 0.20849530398845673, 0.18934805691242218, -0.15981917083263397, 0.06803763657808304, -0.0701257586479187, -0.0016351215308532119, -0.011016486212611198, 0.04784798249602318, -0.03754686191678047, 0.004144254140555859, 0.002242141403257847, 0.1473243087530136, -0.057670168578624725, -0.03426865488290787, -0.02469365857541561, -0.08547203242778778, -0.027972273528575897, 0.04244127869606018, 0.037040866911411285, -0.18323862552642822, 0.18811753392219543, 0.31410858035087585, -0.005404068157076836, 0.10921560227870941, -0.07903379201889038, 0.00723345996811986, 0.05158434435725212, 0.03903116658329964, -0.00235843937844038, -0.042227327823638916, -0.14175695180892944, 0.002232127822935581, 0.07850880175828934, 0.05014284700155258, 0.05595972016453743, -0.102883480489254, -0.038251589983701706, -0.0015708932187408209, -0.03354936093091965, 0.005786040332168341, 0.04485464468598366, -0.024124030023813248, 0.11276932060718536, -0.04731103032827377, -0.07435739040374756, 0.11313275247812271, -0.016731111332774162, -0.10337414592504501, 0.16342853009700775, -0.17755094170570374, -0.22182056307792664, -0.14328257739543915, -0.14246761798858643, -0.06028762832283974, 0.04169784113764763, 0.1600288599729538, -0.06313778460025787, -0.06890027225017548, -0.07590115070343018, -0.023084767162799835, 0.02997477725148201, 0.00048373191384598613, 0.00826412346214056, 0.05315007269382477, -0.009989236481487751, -0.12778261303901672, -0.03959066793322563, 0.03964610770344734, -0.05122041329741478, 0.09488074481487274, -0.0854000523686409, 0.09223271906375885, 0.12414287030696869, 0.05177394300699234, -0.001115516060963273, -0.038650769740343094, 0.1401432305574417, -0.03460853174328804, 0.016889991238713264, 0.19638393819332123, -0.05795614421367645, 0.06529393047094345, 0.18629983067512512, 0.001290708314627409, -0.09287770837545395, 0.06399395316839218, -0.044712573289871216, -0.06779327988624573, -0.26221001148223877, 
-0.10474235564470291, -0.09441300481557846, 0.08875038474798203, 0.00584576977416873, 0.07299592345952988, 0.15877602994441986, 0.06717812269926071, -0.05665771663188934, -0.029498334974050522, 0.10147713869810104, 0.09513623267412186, 0.22614948451519012, -0.030907414853572845, 0.1212201714515686, -0.1158846765756607, -0.07454964518547058, 0.10287195444107056, 0.08011835813522339, 0.11522674560546875, 0.13513067364692688, 0.13670147955417633, 0.06650472432374954, 0.09009866416454315, 0.07653361558914185, 0.119605652987957, 0.03796898201107979, -0.02495310828089714, -0.05234001204371452, -0.054827939718961716, -0.03724612295627594, 0.06567545980215073, -0.11422061175107956, -0.10659125447273254, -0.007576568052172661, -0.04273698106408119, 0.11534444242715836, 0.17863032221794128, 0.042575448751449585, -0.16113145649433136, 0.004613202530890703, 0.14196020364761353, -0.00682741217315197, -0.06116963177919388, 0.11843125522136688, 0.002108689397573471, -0.04339995235204697, 0.11472535878419876, -0.019697247073054314, 0.12856356799602509, 0.013386660255491734, 0.07068197429180145, -0.09359867125749588, -0.08220197260379791, 0.025981761515140533, 0.11521713435649872, -0.3159278333187103, 0.18978425860404968, -0.0010063379304483533, 0.007522616069763899, -0.07916681468486786, 0.0250090342015028, 0.04605873301625252, 0.18513470888137817, 0.1204175055027008, -0.017935210838913918, -0.17496685683727264, 0.027804601937532425, -0.05549866333603859, 0.04576382413506508, 0.0618387870490551, 0.0086124362424016, -0.02425481379032135, -0.07128308713436127, -0.008588594384491444, 0.027496295049786568, 0.015919078141450882, -0.09426803141832352, -0.17344002425670624, 0.03153754770755768, 0.1434376835823059, 0.07716933637857437, -0.06000714749097824, 0.03352149948477745, -0.11384015530347824, 0.17842407524585724, -0.10433799773454666, -0.056842345744371414, -0.10531237721443176, -0.16266579926013947, 0.017533237114548683, -0.028903184458613396, 0.05800329148769379, -0.07804932445287704, 0.05132719501852989, -0.07455950230360031, -0.1922626495361328, 0.11479321867227554, -0.14084671437740326, -0.03271150961518288, -0.043128255754709244, 0.12250269204378128, -0.08908601105213165, -0.013150866143405437, 0.05829966440796852, 0.031616490334272385, -0.08705606311559677, -0.11382154375314713, -0.010173951275646687, 0.030441153794527054, 0.02773318998515606, -0.027103176340460777, -0.11392561346292496, -0.14426855742931366, 0.005620115902274847, -0.06722567230463028, 0.24965260922908783, 0.24065491557121277, -0.051099177449941635, 0.16223986446857452, 0.23263408243656158, -0.09445136040449142, -0.34072408080101013, -0.1429499238729477, -0.18820294737815857, -0.08993430435657501, 0.0019450433319434524, -0.10736341029405594, 0.11139417439699173, 0.039152842015028, -0.08181363344192505, 0.07132827490568161, -0.19033069908618927, -0.09590533375740051, 0.1965101659297943, -0.002614920027554035, 0.3138757646083832, -0.1854335218667984, -0.10005801171064377, -0.1414281278848648, -0.14505444467067719, 0.12983320653438568, -0.20802143216133118, 0.044429171830415726, 0.048711176961660385, 0.021194545552134514, 0.0023561164271086454, -0.04099220409989357, 0.12449119240045547, -0.035172950476408005, 0.05134400725364685, -0.13652174174785614, 0.06063821539282799, 0.07575587928295135, -0.033752311021089554, 0.07389073073863983, -0.19837838411331177, 0.02906295470893383, -0.004439173731952906, -0.017692290246486664, -0.013459340669214725, 0.0832187831401825, 0.00434917351230979, -0.058690816164016724, -0.02447640523314476, 
-0.07824976742267609, 0.051526885479688644, 0.0027358676306903362, 0.27735230326652527, -0.023270899429917336, 0.13644006848335266, 0.17173177003860474, 0.1310095191001892, -0.1606997549533844, 0.10493151098489761, -0.0486031249165535, -0.09114189445972443, 0.06459162384271622, -0.16171306371688843, 0.07584212720394135, 0.055033277720212936, -0.0765332579612732, 0.08259367197751999, 0.061147578060626984, 0.006398675963282585, -0.026846852153539658, 0.12852883338928223, -0.17464923858642578, -0.07194581627845764, -0.005847286432981491, 0.16038480401039124, 0.05006495863199234, 0.09448280930519104, 0.17284736037254333, 0.005990549456328154, 0.006147334817796946, 0.004619710147380829, 0.06274044513702393, -0.05701812356710434, 0.03845622017979622, -0.0015891932416707277, 0.000038725545891793445, -0.12230079621076584, 0.14206519722938538, 0.0046218628995120525, -0.1403893381357193, 0.027399752289056778, 0.1070607379078865, -0.15013083815574646, -0.14388765394687653, -0.004938151221722364, 0.11717752367258072, -0.0892433151602745, -0.10706952214241028, -0.0264526829123497, -0.17594404518604279, 0.02679920196533203, 0.14002440869808197, 0.06996341049671173, 0.0786001905798912, 0.02167295292019844, -0.05489587038755417, 0.028369301930069923, 0.03966451436281204, -0.06352580338716507, 0.02372283861041069, -0.08215540647506714, -0.07329670339822769, -0.030063757672905922, 0.018145494163036346, -0.06406988948583603, -0.016746098175644875, -0.11123859137296677, 0.013318796642124653, -0.19077003002166748, 0.006555695552378893, -0.10075966268777847, -0.01318433228880167, 0.019079012796282768, -0.04687628522515297, -0.0385894849896431, -0.019272958859801292, -0.09192384779453278, -0.01489227544516325, -0.0430498942732811, 0.07605192810297012, -0.12049007415771484, -0.03518927842378616, 0.07230503112077713, -0.035131726413965225, 0.12461156398057938, 0.0738254114985466, -0.10226332396268845, 0.0875110998749733, -0.26571545004844666, -0.06337278336286545, 0.10762592405080795, 0.015992866829037666, 0.008291834965348244, 0.01836102269589901, -0.04116426035761833, 0.13994671404361725, 0.0015489797806367278, 0.030450500547885895, 0.023258233442902565, -0.10207445174455643, -0.01799721084535122, -0.021926358342170715, -0.11953842639923096, -0.008577907457947731, -0.1244194284081459, 0.10522677004337311, -0.020554382354021072, 0.1890440136194229, -0.06796601414680481, 0.0331474244594574, -0.03806212544441223, 0.03356713056564331, 0.011564678512513638, -0.1650460958480835, -0.1684970259666443, -0.052624598145484924, -0.024402474984526634, -0.01905425824224949, 0.24492231011390686, -0.033982958644628525, -0.06186120584607124, 0.09883400797843933, 0.04438910260796547, 0.031500764191150665, 0.04197186604142189, 0.2752436101436615, 0.07866504043340683, -0.02406279183924198, -0.14695976674556732, -0.013598430901765823, 0.03630317375063896, -0.10550573468208313, 0.05520932748913765, 0.12186451256275177, -0.006292752455919981, 0.11057532578706741, 0.0160667784512043, 0.018217485398054123, -0.030920671299099922, -0.06738458573818207, -0.018263312056660652, 0.06906884163618088, -0.022111793980002403, 0.10265366733074188, 0.21106970310211182, -0.018275398761034012, -0.01235189102590084, -0.07087722420692444, -0.012332356534898281, -0.17377395927906036, -0.1232280433177948, -0.1036277487874031, -0.12159759551286697, -0.011024012230336666, -0.08171845972537994, 0.0451091043651104, 0.055033523589372635, 0.053667567670345306, -0.045263927429914474, 0.07413837313652039, -0.023163223639130592, -0.046370603144168854, 
0.021719854325056076, -0.03851430490612984, 0.017562249675393105, 0.006720670498907566, -0.06866300851106644, -0.05166241526603699, -0.006518620532006025, -0.03926064446568489, 0.07923031598329544, 0.02352769486606121, 0.07027895748615265, -0.13117137551307678, -0.06956068426370621, -0.038647811859846115, 0.06411341577768326, -0.05166839435696602, 0.1413545161485672, 0.03510602191090584, -0.013611333444714546, 0.11243408173322678, 0.19237913191318512, -0.07333988696336746, -0.16224688291549683, -0.07671571522951126, 0.13803118467330933, 0.008099090307950974, 0.1202210858464241, -0.03981605917215347, -0.004803306423127651, -0.039539482444524765, 0.2960340678691864, 0.26755020022392273, -0.05800122395157814, 0.019821470603346825, -0.06597090512514114, 0.03274916112422943, 0.03206780180335045, 0.11216409504413605, 0.10619974136352539, 0.1947760134935379, -0.043736424297094345, 0.003951146267354488, -0.004657295066863298, -0.013440691865980625, -0.18110491335391998, 0.09129597991704941, -0.04484325647354126, -0.06248468905687332, 0.0010717905825003982, 0.11364062875509262, -0.11100497841835022, 0.08434328436851501, -0.1220983937382698, -0.0874624252319336, -0.01275616604834795, 0.007348387967795134, 0.2092914879322052, 0.00890094693750143, 0.03234400227665901, -0.018472179770469666, -0.0750458613038063, 0.08405411243438721, -0.025905011221766472, -0.15508121252059937, -0.015458901412785053, 0.037837207317352295, -0.05474290996789932, 0.12587103247642517, 0.016543468460440636, 0.03207947686314583, 0.08822518587112427, 0.029494429007172585, -0.09507623314857483, 0.12379293888807297, 0.03328826650977135, -0.045677971094846725, 0.04380908980965614, -0.07597766816616058, -0.035572703927755356, -0.007287365850061178, 0.0694204717874527, -0.10126548260450363, 0.05749470740556717, 0.05283349007368088, -0.1025528833270073, -0.023606281727552414, 0.03711671382188797, -0.08150213956832886, 0.06034255772829056, -0.00970538891851902, -0.0406995452940464, -0.00021240636124275625, -0.02470887079834938, -0.0012689005816355348, -0.012141534127295017, -0.1594473272562027, -0.03635278344154358, -0.09263020753860474, -0.040749482810497284, 0.11672408133745193, 0.04672009125351906, -0.19359426200389862, -0.013023761101067066, -0.1071794182062149, 0.05781279504299164, -0.17652110755443573, 0.03787307068705559, 0.16900908946990967, -0.004725546110421419, -0.01679854653775692, -0.13803765177726746, 0.04839393496513367, 0.06087784841656685, -0.015797562897205353, -0.1051703691482544 ]
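The MUZ card in the record above gives no usage snippet, so here is a minimal sketch of how a Mistral-based text-generation checkpoint like this is typically loaded with the 🤗 transformers pipeline API. The model id `RaduGabriel/MUZ` is taken from the record itself; the prompt and generation settings (`max_new_tokens`, `temperature`) are illustrative assumptions, not values documented by the card.

```python
# Minimal sketch: loading the MUZ checkpoint via the transformers pipeline API.
# Generation parameters below are illustrative assumptions, not card-documented values.
from transformers import pipeline

generator = pipeline("text-generation", model="RaduGabriel/MUZ")

prompt = "Explain in one sentence what Lorem Ipsum is."
outputs = generator(prompt, max_new_tokens=64, do_sample=True, temperature=0.7)
print(outputs[0]["generated_text"])
```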
null
null
transformers
# Aquabot DialoGPT Model
{"license": "mit", "tags": ["conversational"], "pipeline_tag": "conversational"}
text-generation
isaipd20/aquabot
[ "transformers", "safetensors", "gpt2", "text-generation", "conversational", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:13:10+00:00
[]
[]
TAGS #transformers #safetensors #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Aquabot DialoGPT Model
[]
[ "TAGS\n#transformers #safetensors #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #safetensors #gpt2 #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.00370166078209877, 0.058113399893045425, -0.005726831965148449, 0.00041574533679522574, 0.11711852997541428, -0.0027261883951723576, 0.1894751489162445, 0.11687009781599045, 0.001485298853367567, -0.02746708318591118, 0.16575123369693756, 0.18878696858882904, -0.00644758902490139, 0.0888151004910469, -0.1023545116186142, -0.19453029334545135, 0.0984351709485054, 0.006827972829341888, 0.03710964322090149, 0.09929671883583069, 0.1109648197889328, -0.03783723711967468, 0.08207117766141891, -0.03376813977956772, -0.12412987649440765, -0.002740254160016775, 0.08006298542022705, -0.13486473262310028, 0.12542445957660675, 0.0698254257440567, 0.06604217737913132, 0.0810602456331253, -0.042687054723501205, -0.19253458082675934, 0.024144446477293968, 0.015285002067685127, -0.0859275534749031, 0.04045885428786278, 0.04654781147837639, -0.04519061744213104, 0.07429322600364685, 0.06315530091524124, -0.018358934670686722, 0.09624989330768585, -0.16380010545253754, -0.07739976048469543, -0.05925000086426735, 0.006155381444841623, 0.0913405567407608, 0.07014055550098419, -0.01650148257613182, 0.12990930676460266, -0.0459679514169693, 0.08157939463853836, 0.08270861953496933, -0.3844844698905945, 0.01835748739540577, 0.13183298707008362, 0.06838122010231018, 0.04781093820929527, -0.05270428583025932, 0.0985790267586708, 0.06579546630382538, -0.01937243528664112, 0.021399622783064842, -0.06128493323922157, -0.05555502325296402, 0.046227339655160904, -0.06764121353626251, -0.05347953736782074, 0.2445850521326065, -0.032696016132831573, 0.008063322864472866, -0.09607599675655365, -0.0805419385433197, -0.0027975209522992373, -0.014021999202668667, -0.006514052860438824, -0.03046339564025402, 0.11688707023859024, 0.002387512009590864, -0.05759410932660103, -0.14752663671970367, -0.04365755245089531, -0.171499103307724, 0.1455312818288803, 0.009993652813136578, 0.043040286749601364, -0.1588061898946762, 0.06895066797733307, -0.020719770342111588, -0.08949673920869827, -0.031625162810087204, -0.09390083700418472, 0.06135254725813866, -0.014775299467146397, -0.027011269703507423, -0.04196041077375412, 0.1377611756324768, 0.1823466271162033, -0.04270222783088684, 0.016005247831344604, -0.08119373768568039, 0.08628563582897186, 0.005347784608602524, 0.03972025215625763, 0.04219209775328636, -0.00278591550886631, 0.07855170965194702, -0.1417720466852188, 0.05051734670996666, -0.0495157390832901, -0.15936774015426636, 0.002761280396953225, 0.01432530302554369, 0.12085903435945511, 0.007028698921203613, 0.1165710985660553, -0.050112515687942505, 0.05239015445113182, 0.10605474561452866, -0.04620394483208656, -0.011159718967974186, 0.006362666375935078, 0.06688780337572098, -0.005273662507534027, 0.007145931478589773, 0.04845060408115387, -0.05638347566127777, 0.017424466088414192, -0.06856895238161087, -0.053082261234521866, -0.017260532826185226, -0.05239599198102951, 0.052969757467508316, -0.02272939309477806, 0.020363276824355125, -0.1783939152956009, -0.18778832256793976, 0.02203318476676941, 0.004753640852868557, -0.02299903705716133, -0.03050386533141136, -0.034795962274074554, -0.024624351412057877, 0.03538608178496361, -0.0878477543592453, -0.09082554280757904, -0.08164539933204651, 0.09009713679552078, -0.032894328236579895, 0.04820772632956505, -0.16086432337760925, 0.03796354681253433, -0.12873722612857819, 0.008932240307331085, -0.04032081738114357, 0.043530140072107315, -0.038846198469400406, 0.13727881014347076, -0.026143722236156464, 0.015184978954494, -0.05185092240571976, 0.06218321621417999, 
-0.050346944481134415, 0.2015925794839859, -0.08660920709371567, -0.06022009626030922, 0.28819945454597473, -0.13204969465732574, -0.20991842448711395, 0.11172761023044586, 0.0036436524242162704, 0.08309412002563477, 0.1326369047164917, 0.21133218705654144, -0.018873417750000954, -0.05488474667072296, 0.0522371269762516, 0.09919089078903198, -0.125325009226799, -0.0683283805847168, 0.030691953375935555, -0.046820562332868576, -0.12280486524105072, 0.02664562128484249, 0.05233079940080643, 0.06373555958271027, -0.024399882182478905, -0.04156040772795677, -0.04253103584051132, -0.009027764201164246, 0.03916028514504433, -0.033454664051532745, 0.057525746524333954, -0.11219322681427002, -0.034079451113939285, -0.005373620428144932, -0.02458050660789013, -0.019017554819583893, 0.013170863501727581, -0.08848017454147339, 0.07124315947294235, -0.004152690060436726, 0.06033890321850777, -0.06954585760831833, -0.1196736991405487, -0.007213296368718147, 0.09695515781641006, 0.023161711171269417, 0.06971991062164307, 0.0566059909760952, 0.004127103369683027, -0.01628716289997101, -0.0038215548265725374, 0.1915264129638672, 0.02788371406495571, -0.04061819985508919, -0.08320030570030212, 0.1063087210059166, -0.05602407082915306, 0.03352828323841095, -0.09229867905378342, 0.032299354672431946, 0.09565258026123047, 0.07151535153388977, 0.008609551005065441, 0.03621355816721916, -0.01989591494202614, -0.02677340991795063, -0.07403300702571869, -0.02313285507261753, 0.09158599376678467, 0.029951298609375954, -0.0968412309885025, 0.24936765432357788, -0.21809343993663788, 0.24535049498081207, 0.1920655220746994, -0.18348783254623413, 0.0007136522908695042, -0.10972806066274643, -0.01869973912835121, 0.019364764913916588, 0.024001313373446465, -0.041037630289793015, 0.045287538319826126, -0.021555975079536438, 0.15596863627433777, -0.07240202277898788, -0.04182377830147743, -0.004692755173891783, -0.07883137464523315, -0.02905404008924961, 0.04306690767407417, 0.07753303647041321, -0.20774096250534058, 0.20891056954860687, 0.21469244360923767, 0.08291465789079666, 0.17677319049835205, -0.022882316261529922, 0.01733328402042389, 0.04131516441702843, 0.05755351111292839, 0.0007878703181631863, -0.023720376193523407, -0.1670266091823578, -0.008558706380426884, 0.06956667453050613, 0.04127154499292374, 0.07170762866735458, -0.1372966468334198, -0.06798294931650162, -0.006273340899497271, -0.05411488562822342, 0.0006834776722826064, 0.07436975836753845, -0.015590284951031208, 0.12420568615198135, -0.02877623774111271, -0.023503730073571205, 0.13284137845039368, 0.005127927288413048, -0.11423890292644501, 0.18187503516674042, -0.13320018351078033, -0.27373403310775757, -0.14904986321926117, -0.15681925415992737, -0.03422868996858597, 0.08793231099843979, 0.14551040530204773, -0.07918493449687958, -0.056499507278203964, -0.04230089113116264, 0.028309281915426254, -0.0215899795293808, 0.009524689987301826, -0.05553140118718147, 0.05094711109995842, -0.06415421515703201, -0.11842893809080124, -0.06399621814489365, 0.01581781730055809, -0.1027691438794136, 0.1345934420824051, -0.0862293690443039, 0.06861609220504761, 0.14992931485176086, 0.017805185168981552, 0.01666780561208725, -0.08248627930879593, 0.16543281078338623, -0.07893309742212296, -0.0041901711374521255, 0.18311825394630432, -0.03480110689997673, 0.05875697731971741, 0.16382312774658203, 0.00506521575152874, -0.11806951463222504, 0.05511675402522087, -0.054019488394260406, -0.09598488360643387, -0.24359595775604248, -0.10442109405994415, 
-0.0907326340675354, 0.12567412853240967, 0.02467276342213154, 0.07936084270477295, 0.17679154872894287, 0.08970751613378525, -0.054973501712083817, -0.011270984075963497, 0.10230827331542969, 0.1078873798251152, 0.23028835654258728, -0.04508858174085617, 0.13223713636398315, -0.08688385784626007, -0.11232585459947586, 0.09787604212760925, 0.07220958173274994, 0.0943717211484909, 0.1292417049407959, 0.11359691619873047, 0.0570191964507103, 0.06824063509702682, 0.12691569328308105, 0.0996527299284935, 0.04520047456026077, -0.04428580775856972, -0.03181300684809685, -0.04202405363321304, -0.040505994111299515, 0.06531773507595062, -0.05312110856175423, -0.16136448085308075, -0.012923025526106358, -0.07873379439115524, 0.10137371718883514, 0.0793534442782402, 0.06881686300039291, -0.20429489016532898, -0.009181199595332146, 0.12103624641895294, -0.004773665685206652, -0.10312357544898987, 0.10512793064117432, 0.043547622859478, -0.0867943987250328, 0.08337633311748505, -0.03348427265882492, 0.09208174794912338, -0.018475333228707314, 0.07308655232191086, -0.075604148209095, -0.09309040009975433, -0.013357874006032944, 0.12410591542720795, -0.3285680115222931, 0.2073790431022644, 0.010247119702398777, -0.00023866798437666148, -0.08731070160865784, 0.012001825496554375, -0.000695686845574528, 0.13803279399871826, 0.16291116178035736, -0.01613151840865612, -0.11882728338241577, -0.04845179617404938, -0.016112670302391052, 0.043505046516656876, 0.10187114030122757, -0.013410420157015324, -0.023369107395410538, -0.05993712693452835, 0.016179386526346207, 0.00924888625741005, -0.05336134508252144, -0.05263769254088402, -0.16678938269615173, 0.042703770101070404, 0.08707457780838013, 0.13890452682971954, -0.039400603622198105, 0.03154359757900238, -0.1371641904115677, 0.22006699442863464, -0.12090060859918594, -0.07881450653076172, -0.11061464250087738, -0.10858435183763504, -0.01964825950562954, -0.029839180409908295, 0.0569106787443161, -0.0654257982969284, 0.04652325436472893, -0.08602796494960785, -0.1787823885679245, 0.13997827470302582, -0.10263728350400925, -0.07169341295957565, -0.04710642993450165, 0.16438515484333038, -0.07670063525438309, -0.011084687896072865, 0.06464897841215134, 0.03926802799105644, -0.060199737548828125, -0.12197504192590714, 0.03077356517314911, -0.03279707208275795, 0.03849117085337639, -0.018213052302598953, -0.07198669761419296, -0.08547007292509079, -0.0007142624817788601, -0.06454448401927948, 0.27303633093833923, 0.25776636600494385, -0.050073184072971344, 0.17802761495113373, 0.1800641268491745, -0.0869753286242485, -0.341132253408432, -0.11039988696575165, -0.1895749866962433, -0.0738367885351181, -0.010355209931731224, -0.12109944224357605, 0.06330162286758423, 0.04862356558442116, -0.06238440051674843, 0.1098012626171112, -0.19193609058856964, -0.10602825880050659, 0.16146302223205566, 0.018748439848423004, 0.3412669897079468, -0.17443889379501343, -0.11722997575998306, -0.0991792306303978, -0.1405080258846283, 0.18041113018989563, -0.11673738807439804, 0.0837133377790451, 0.025182753801345825, 0.03623169660568237, 0.020855199545621872, -0.043816760182380676, 0.10943792760372162, -0.03583509102463722, 0.04603385180234909, -0.13454242050647736, 0.003122474765405059, 0.07753708213567734, 0.0006529542151838541, 0.05158000811934471, -0.12571902573108673, 0.024975502863526344, -0.022715561091899872, -0.04808405414223671, -0.03440580144524574, 0.0771109014749527, 0.02246399037539959, -0.0950007438659668, -0.042499151080846786, -0.05755544826388359, 
-0.005383144598454237, -0.013729914091527462, 0.2505801022052765, -0.06516275554895401, 0.18026471138000488, 0.12727190554141998, 0.13647587597370148, -0.14925265312194824, 0.09377408772706985, -0.045502468943595886, -0.09501782059669495, 0.06796164065599442, -0.1315586417913437, 0.06017604470252991, 0.06940518319606781, -0.05031799152493477, 0.11151720583438873, 0.09177681803703308, 0.005424549337476492, 0.011978315189480782, 0.14075779914855957, -0.22380101680755615, -0.10926716774702072, -0.033401813358068466, 0.06055975705385208, 0.10096584260463715, 0.10922275483608246, 0.17242607474327087, -0.0003155232989229262, -0.004948523361235857, -0.00019940733909606934, 0.04660478234291077, -0.05569828674197197, 0.03818376734852791, -0.007572783622890711, 0.009157156571745872, -0.1344262957572937, 0.09993170201778412, 0.0026450175791978836, -0.10549458861351013, 0.03256009891629219, 0.07933373749256134, -0.13212178647518158, -0.12653683125972748, -0.05099128559231758, 0.09329839795827866, -0.11829189956188202, -0.06974135339260101, -0.021061694249510765, -0.16564393043518066, 0.03164598345756531, 0.1488107293844223, 0.042070113122463226, 0.11886020004749298, 0.011524360626935959, -0.01983720436692238, -0.026283733546733856, 0.010050519369542599, -0.0785251334309578, 0.029475292190909386, -0.09294628351926804, 0.022508328780531883, -0.032545335590839386, 0.0387813076376915, -0.09077859669923782, -0.042899876832962036, -0.1687036156654358, 0.014338809065520763, -0.10854269564151764, -0.03222676366567612, -0.1205010637640953, -0.02999168448150158, 0.010461528785526752, -0.024631399661302567, -0.04056182876229286, -0.04049370810389519, -0.09666536003351212, 0.02645932324230671, -0.02273833565413952, 0.06406854093074799, -0.11131210625171661, -0.006115266587585211, 0.06434985250234604, -0.024442384019494057, 0.14938704669475555, 0.06949032843112946, -0.08689232915639877, 0.09234444051980972, -0.2735702097415924, -0.03754256293177605, 0.12251153588294983, -0.01985371857881546, 0.0028790689539164305, 0.0558735616505146, 0.0036000777035951614, 0.11537187546491623, -0.004618871491402388, 0.06518907845020294, -0.0006088330992497504, -0.10442513227462769, 0.03250933811068535, -0.03636102378368378, -0.10902689397335052, -0.021753259003162384, -0.07592785358428955, 0.06046270951628685, -0.03556171804666519, 0.15173427760601044, -0.08711842447519302, 0.030668087303638458, -0.06441586464643478, 0.028921857476234436, 0.02418350614607334, -0.1676749438047409, -0.12098317593336105, -0.06408178061246872, -0.00561936479061842, 0.008417123928666115, 0.2996408939361572, 0.012405221350491047, -0.08107692748308182, 0.07527506351470947, 0.06716226041316986, 0.06172863394021988, 0.01693962886929512, 0.2853914797306061, 0.0831795185804367, -0.04147869721055031, -0.1548536717891693, 0.04215165972709656, 0.01585688441991806, -0.1359698474407196, 0.08834924548864365, 0.05586819723248482, -0.08691450208425522, 0.0781945064663887, 0.021427787840366364, -0.009465333074331284, -0.052580900490283966, -0.08029603213071823, -0.031399160623550415, 0.04299604520201683, -0.031372543424367905, 0.08346191048622131, 0.20597301423549652, -0.0344107411801815, -0.0024325852282345295, -0.041624926030635834, -0.03742953762412071, -0.1844482421875, -0.16861478984355927, -0.0924687311053276, -0.14817778766155243, 0.03491974249482155, -0.09454930573701859, 0.05562431365251541, 0.05100851505994797, 0.06444071978330612, -0.05853406712412834, 0.08819910883903503, -0.010002760216593742, -0.07052138447761536, 0.013570589013397694, 
-0.03429213538765907, 0.058124955743551254, -0.026510506868362427, -0.06466706842184067, -0.055334944278001785, -0.0072351242415606976, -0.005969841964542866, 0.07250532507896423, 0.0171554833650589, 0.044738974422216415, -0.14242957532405853, -0.06317257136106491, -0.04173395037651062, 0.09880541265010834, -0.05691679194569588, 0.13219428062438965, 0.012742600403726101, -0.02016061171889305, 0.09261129796504974, 0.19775868952274323, -0.04737626761198044, -0.10806959122419357, -0.04253189265727997, 0.1967744529247284, 0.0180476363748312, 0.13991227746009827, -0.028591878712177277, -0.015121993608772755, -0.010500025004148483, 0.29444554448127747, 0.282385915517807, -0.037467535585165024, 0.024261130020022392, -0.050562724471092224, 0.034317657351493835, 0.0883466899394989, 0.1310003250837326, 0.05393659695982933, 0.25088950991630554, -0.04748554527759552, -0.01422672625631094, 0.015463791787624359, 0.0010174601338803768, -0.10145054757595062, 0.09850980341434479, -0.006349368952214718, -0.04531671106815338, -0.025142455473542213, 0.11167754977941513, -0.16480469703674316, 0.09828811138868332, -0.08768158406019211, -0.0969359502196312, -0.004879090003669262, 0.025937264785170555, 0.1371358036994934, -0.016788171604275703, 0.05495563521981239, -0.010802836157381535, -0.08196461945772171, 0.03120557591319084, 0.013422277756035328, -0.2160351276397705, 0.023035641759634018, 0.02540794014930725, -0.017516477033495903, 0.10856673121452332, -0.009316481649875641, 0.05006910488009453, 0.08370183408260345, 0.020813612267374992, -0.06217975169420242, 0.13340167701244354, 0.010893707163631916, -0.06947165727615356, 0.029665792360901833, -0.040835317224264145, 0.0038393130525946617, -0.029480691999197006, 0.06742697209119797, -0.14009709656238556, 0.06248565763235092, 0.014877045527100563, -0.09996398538351059, -0.04386210814118385, 0.02135022170841694, -0.07499545067548752, 0.07365463674068451, 0.0025425138883292675, -0.019646992906928062, 0.010146964341402054, -0.04524363949894905, 0.01617594063282013, 0.007634963374584913, -0.11558648943901062, -0.027188090607523918, -0.13450439274311066, -0.06002146378159523, 0.14395491778850555, 0.019062213599681854, -0.2317706048488617, 0.019241543486714363, -0.1125217080116272, 0.07768011838197708, -0.1797899305820465, 0.06509954482316971, 0.1601763367652893, 0.02099434845149517, -0.01597164198756218, -0.12230639904737473, 0.04862997308373451, 0.07791098952293396, -0.03803945705294609, -0.09531211853027344 ]
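The Aquabot card is a bare heading, so the following is only a sketch of how a DialoGPT-style GPT-2 conversational checkpoint is usually queried with transformers. The model id `isaipd20/aquabot` comes from the record; the single-turn chat format is an assumption based on the `gpt2`/`conversational` tags, not anything the card states.

```python
# Sketch of one DialoGPT-style chat turn; the prompt format is an assumption
# based on the gpt2/conversational tags, not documented by the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("isaipd20/aquabot")
model = AutoModelForCausalLM.from_pretrained("isaipd20/aquabot")

# Encode one user turn, terminated by the EOS token as in the DialoGPT recipe.
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token, return_tensors="pt")

# Generate a reply and decode only the newly produced tokens.
reply_ids = model.generate(input_ids, max_length=200, pad_token_id=tokenizer.eos_token_id)
reply = tokenizer.decode(reply_ids[0, input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)
```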
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
text2text-generation
Rahulrayudu/Flan-T5-finetuned-agri
[ "transformers", "safetensors", "t5", "text2text-generation", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:14:13+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 58, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.053328532725572586, 0.16120538115501404, -0.005120371468365192, 0.022602224722504616, 0.09686747193336487, 0.013199392706155777, 0.07261143624782562, 0.11177206039428711, -0.020693831145763397, 0.1128523200750351, 0.0323781855404377, 0.09778297692537308, 0.11381756514310837, 0.15530984103679657, -0.0018252237932756543, -0.23414164781570435, 0.051169246435165405, -0.12603329122066498, -0.039110470563173294, 0.11734651774168015, 0.14655858278274536, -0.10434788465499878, 0.07780920714139938, -0.029932111501693726, -0.010786613449454308, -0.030950399115681648, -0.06109464541077614, -0.04963193088769913, 0.05158040300011635, 0.07096312940120697, 0.06875279545783997, 0.009741154499351978, 0.09293358027935028, -0.2676756680011749, 0.021060682833194733, 0.07436702400445938, -0.0019205488497391343, 0.07644513249397278, 0.05394738167524338, -0.07786445319652557, 0.08801496773958206, -0.053122974932193756, 0.14802159368991852, 0.08166222274303436, -0.09144649654626846, -0.19256246089935303, -0.08630277216434479, 0.10201671719551086, 0.17971307039260864, 0.050409309566020966, -0.02338344417512417, 0.10295069962739944, -0.08843041211366653, 0.012706292793154716, 0.059160783886909485, -0.06515879184007645, -0.05482804775238037, 0.0630323737859726, 0.08173035830259323, 0.0787791833281517, -0.12468571215867996, -0.018215585500001907, 0.011311499401926994, 0.00691694812849164, 0.08102929592132568, 0.022060219198465347, 0.14176861941814423, 0.03922285884618759, -0.1292058527469635, -0.047744158655405045, 0.10315844416618347, 0.04381343349814415, -0.04969092458486557, -0.24839195609092712, -0.028692634776234627, -0.03409173712134361, -0.029329892247915268, -0.041139665991067886, 0.04428756237030029, -0.010770969092845917, 0.08322557806968689, -0.008045176975429058, -0.07979845255613327, -0.03690612316131592, 0.06324487924575806, 0.05645342543721199, 0.024454401805996895, -0.008984005078673363, 0.006743076257407665, 0.1175178587436676, 0.10636600106954575, -0.12631633877754211, -0.05289403349161148, -0.06528059393167496, -0.0853288322687149, -0.04429693520069122, 0.03338160738348961, 0.04351643845438957, 0.04334709793329239, 0.24920088052749634, 0.011966975405812263, 0.05556565150618553, 0.03878911957144737, 0.011687099933624268, 0.06360286474227905, 0.11270952969789505, -0.05845928564667702, -0.09383665025234222, -0.033332064747810364, 0.09301437437534332, 0.008503437042236328, -0.0402098223567009, -0.06047673895955086, 0.06078295037150383, 0.015703821554780006, 0.12211526930332184, 0.087046779692173, 0.002870776690542698, -0.07195370644330978, -0.06478150933980942, 0.19285908341407776, -0.15949691832065582, 0.047871991991996765, 0.03357849270105362, -0.040312062948942184, -0.0005020854296162724, 0.01165273692458868, 0.023987481370568275, -0.021567439660429955, 0.0924374982714653, -0.05500924214720726, -0.03761355206370354, -0.10879732668399811, -0.03591866046190262, 0.03197222575545311, 0.0022585385013371706, -0.02967100404202938, -0.033424828201532364, -0.08920473605394363, -0.0635172426700592, 0.09580977261066437, -0.07413128018379211, -0.05156254023313522, -0.016345804557204247, -0.0761859342455864, 0.026101797819137573, 0.01702207140624523, 0.08535456657409668, -0.0213642455637455, 0.037230201065540314, -0.05421315133571625, 0.06241346150636673, 0.10910454392433167, 0.0320611298084259, -0.053984515368938446, 0.06094928830862045, -0.2412392497062683, 0.10316064208745956, -0.07156267017126083, 0.05108866095542908, -0.15137021243572235, -0.025331947952508926, 0.04665522649884224, 
0.009590202011168003, -0.011478574015200138, 0.14007656276226044, -0.2198302298784256, -0.029333066195249557, 0.1640782356262207, -0.09730498492717743, -0.08055570721626282, 0.059064920991659164, -0.054139286279678345, 0.10999192297458649, 0.04003598168492317, -0.023768696933984756, 0.06297750771045685, -0.14250542223453522, -0.0039275879971683025, -0.041889119893312454, -0.01720282807946205, 0.16010744869709015, 0.07506491243839264, -0.06698185205459595, 0.077672079205513, 0.022212913259863853, -0.023321649059653282, -0.04393244534730911, -0.022494852542877197, -0.10826845467090607, 0.009565223939716816, -0.06269361078739166, 0.02424052357673645, -0.023944495245814323, -0.0903024971485138, -0.029575346037745476, -0.1770460456609726, -0.013402442447841167, 0.08679109811782837, -0.010982494801282883, -0.019886262714862823, -0.11693590134382248, 0.012033592909574509, 0.032231178134679794, 0.0004325093177612871, -0.13445010781288147, -0.05658498778939247, 0.0273329745978117, -0.16240260004997253, 0.031236927956342697, -0.05114622414112091, 0.04928715154528618, 0.03406677767634392, -0.03175085783004761, -0.031348153948783875, 0.01572313904762268, 0.006510823033750057, -0.013680041767656803, -0.24737438559532166, -0.02852414920926094, -0.022412575781345367, 0.16979394853115082, -0.2190135270357132, 0.04012007266283035, 0.07135825604200363, 0.15074580907821655, 0.006911954842507839, -0.03669405356049538, 0.005606858059763908, -0.0768459290266037, -0.03284264728426933, -0.0623927041888237, -0.008401541970670223, -0.03721899166703224, -0.054593876004219055, 0.051287684589624405, -0.16718235611915588, -0.031153932213783264, 0.1028679683804512, 0.06780845671892166, -0.13963541388511658, -0.01705223321914673, -0.04106766730546951, -0.043112557381391525, -0.05709490180015564, -0.05539087578654289, 0.11148729920387268, 0.05757083371281624, 0.04828811436891556, -0.06848311424255371, -0.0756818875670433, 0.006132613401859999, -0.0179264098405838, -0.021222935989499092, 0.0928845927119255, 0.07583390921354294, -0.12310270220041275, 0.09178637713193893, 0.10549022257328033, 0.0892157256603241, 0.10119049996137619, -0.02137933485209942, -0.08691582083702087, -0.04892461374402046, 0.0229446180164814, 0.016364475712180138, 0.13983985781669617, -0.016759416088461876, 0.05310053750872612, 0.04020100086927414, -0.012910815887153149, 0.011883769184350967, -0.09328193217515945, 0.02934250421822071, 0.03636814281344414, -0.019501443952322006, 0.040251899510622025, -0.03908125311136246, 0.020790016278624535, 0.08787564933300018, 0.04434992000460625, 0.03818633407354355, 0.013980780728161335, -0.04370194673538208, -0.11091572046279907, 0.17051653563976288, -0.12536633014678955, -0.239797443151474, -0.14147889614105225, 0.001731917611323297, 0.041165996342897415, -0.01159723661839962, 0.0031763319857418537, -0.06770002096891403, -0.11874829977750778, -0.09346967190504074, 0.015001182444393635, 0.04228860139846802, -0.080612413585186, -0.05524664744734764, 0.05777253210544586, 0.040611669421195984, -0.143319234251976, 0.020423002541065216, 0.04869217798113823, -0.08989228308200836, -0.00900039542466402, 0.08071441948413849, 0.06998268514871597, 0.17929090559482574, 0.009512054733932018, -0.020932139828801155, 0.03292093798518181, 0.2157505750656128, -0.13771237432956696, 0.11451084166765213, 0.14277678728103638, -0.0911637470126152, 0.08293474465608597, 0.1991184800863266, 0.03884927183389664, -0.10264625400304794, 0.03326369449496269, 0.022328944876790047, -0.028676386922597885, -0.2503291964530945, 
-0.06918580830097198, 0.0007976540364325047, -0.05238448083400726, 0.07527847588062286, 0.08888168632984161, 0.09494108706712723, 0.01729334332048893, -0.09416709095239639, -0.08025584369897842, 0.04901478812098503, 0.10409125685691833, 0.010409193113446236, -0.01156378723680973, 0.09060908854007721, -0.03323452174663544, 0.01843860000371933, 0.09313460439443588, 0.004041523206979036, 0.17060963809490204, 0.05550962686538696, 0.18336638808250427, 0.07643263041973114, 0.0721396952867508, 0.015671607106924057, 0.013079277239739895, 0.02304760180413723, 0.021578695625066757, -0.0033059304114431143, -0.0851421132683754, -0.009511260315775871, 0.11862117052078247, 0.06801546365022659, 0.020754681900143623, 0.009507957845926285, -0.033934496343135834, 0.08064714074134827, 0.17465052008628845, -0.0009437129483558238, -0.1870066076517105, -0.06896740943193436, 0.08026526123285294, -0.08972865343093872, -0.10345284640789032, -0.02900044620037079, 0.0354950949549675, -0.17372116446495056, 0.02448408491909504, -0.018045885488390923, 0.11108683049678802, -0.1356782615184784, -0.01890929788351059, 0.06319493800401688, 0.07008420675992966, -0.0016097982879728079, 0.06208989396691322, -0.16155508160591125, 0.10791012644767761, 0.01390943955630064, 0.06503470987081528, -0.09786296635866165, 0.10111832618713379, -0.006267238408327103, -0.007413685787469149, 0.14043578505516052, 0.009255880489945412, -0.07051325589418411, -0.08343593031167984, -0.0979004055261612, -0.010649190284311771, 0.12877127528190613, -0.14879846572875977, 0.08456916362047195, -0.0322830006480217, -0.04405250772833824, 0.005208021495491266, -0.10768675804138184, -0.12857580184936523, -0.18887875974178314, 0.05537694692611694, -0.13356289267539978, 0.033175256103277206, -0.1055491715669632, -0.0408647358417511, -0.02885887771844864, 0.19630752503871918, -0.22321896255016327, -0.0670507624745369, -0.15318840742111206, -0.09096445143222809, 0.14798617362976074, -0.049908362329006195, 0.08374498039484024, -0.005065108183771372, 0.18742504715919495, 0.01894373446702957, -0.024415504187345505, 0.1011786088347435, -0.09638315439224243, -0.19627197086811066, -0.08534666895866394, 0.15457913279533386, 0.13537167012691498, 0.0351712740957737, -0.004617651924490929, 0.03167666867375374, -0.0189940445125103, -0.12101218104362488, 0.022920187562704086, 0.17696480453014374, 0.07036592066287994, 0.024736741557717323, -0.02639835514128208, -0.11453131586313248, -0.06600044667720795, -0.032452553510665894, 0.02982977218925953, 0.18294402956962585, -0.07586611062288284, 0.18679921329021454, 0.13732017576694489, -0.05770440772175789, -0.1956426501274109, 0.01923983357846737, 0.04058924317359924, 0.00837375782430172, 0.032165057957172394, -0.20239581167697906, 0.08806682378053665, 0.0007347199134528637, -0.05074144899845123, 0.13624143600463867, -0.17552010715007782, -0.15046143531799316, 0.06929060816764832, 0.03642011433839798, -0.19279520213603973, -0.12030941992998123, -0.08865538984537125, -0.05107492581009865, -0.17776648700237274, 0.10758756101131439, 0.02193085290491581, 0.00676411809399724, 0.033654287457466125, 0.026140762493014336, 0.014790141955018044, -0.0396585576236248, 0.19431912899017334, -0.02348872646689415, 0.030807901173830032, -0.08293910324573517, -0.07001609355211258, 0.05941145867109299, -0.05705835670232773, 0.0775861069560051, -0.022215960547327995, 0.013414059765636921, -0.10643109679222107, -0.04425564035773277, -0.03175993636250496, 0.015691282227635384, -0.09722420573234558, -0.08909335732460022, -0.050057362765073776, 
0.09262266010046005, 0.0974174216389656, -0.035089656710624695, -0.03564268350601196, -0.07118509709835052, 0.039714183658361435, 0.18831974267959595, 0.17605267465114594, 0.046182651072740555, -0.08030564337968826, -0.004098092205822468, -0.011694483458995819, 0.042484745383262634, -0.21906526386737823, 0.062426332384347916, 0.05058585852384567, 0.014059843495488167, 0.1173645630478859, -0.01779606007039547, -0.15810294449329376, -0.06761486083269119, 0.05993710458278656, -0.06326820701360703, -0.19225671887397766, 0.0032602818682789803, 0.055388111621141434, -0.16711848974227905, -0.04538320377469063, 0.0430813767015934, -0.005750913172960281, -0.039257556200027466, 0.01613711006939411, 0.08359149098396301, 0.0031580389477312565, 0.07040093839168549, 0.05520293489098549, 0.086640864610672, -0.10250966250896454, 0.07937785238027573, 0.08386688679456711, -0.08347215503454208, 0.028158824890851974, 0.09330378472805023, -0.06144890934228897, -0.029910072684288025, 0.032212331891059875, 0.08255140483379364, 0.012964491732418537, -0.04401125758886337, 0.008184057660400867, -0.10146338492631912, 0.0627170279622078, 0.09755739569664001, 0.03206513822078705, 0.011901181191205978, 0.03383762761950493, 0.04645882546901703, -0.07481352984905243, 0.11842621862888336, 0.025973208248615265, 0.01822328381240368, -0.04273592680692673, -0.04516541585326195, 0.027133917436003685, -0.02340707741677761, -0.007566304877400398, -0.03583317995071411, -0.06988023966550827, -0.01722576655447483, -0.16493180394172668, -0.01076561864465475, -0.044063083827495575, 0.008020744659006596, 0.026847293600440025, -0.0369400717318058, 0.008594665676355362, 0.009077225811779499, -0.07577309012413025, -0.06240518018603325, -0.02245018258690834, 0.0914878100156784, -0.16343435645103455, 0.023352261632680893, 0.08310231566429138, -0.12098916620016098, 0.09322582185268402, 0.018653366714715958, -0.0019369579385966063, 0.02680385299026966, -0.15561461448669434, 0.0368269607424736, -0.027320701628923416, 0.014671673998236656, 0.045705173164606094, -0.21818207204341888, -0.0014451020397245884, -0.03558654710650444, -0.059982262551784515, -0.010693925432860851, -0.037350837141275406, -0.11245633661746979, 0.10088492184877396, 0.012412267737090588, -0.08672942966222763, -0.03157110512256622, 0.03652326017618179, 0.08053763210773468, -0.02631879225373268, 0.15205731987953186, -0.0010786735219880939, 0.07447176426649094, -0.1738860309123993, -0.0210786834359169, -0.0090115275233984, 0.02177848480641842, -0.016872623935341835, -0.01564885675907135, 0.042430613189935684, -0.026671668514609337, 0.18584245443344116, -0.027355844154953957, 0.03733034059405327, 0.06316441297531128, 0.01770097203552723, -0.021354418247938156, 0.10755398869514465, 0.06012963131070137, 0.02173144742846489, 0.019801700487732887, 0.0075409491546452045, -0.041807159781455994, -0.018543899059295654, -0.19347810745239258, 0.07164526730775833, 0.14044208824634552, 0.08769161999225616, -0.012164209969341755, 0.08067302405834198, -0.10084949433803558, -0.11743459850549698, 0.11121641099452972, -0.059808436781167984, -0.0022669173777103424, -0.06652101874351501, 0.13155525922775269, 0.14582572877407074, -0.19254228472709656, 0.07050827890634537, -0.06511960923671722, -0.05269601568579674, -0.11906112730503082, -0.1953776627779007, -0.05703132599592209, -0.054343048483133316, -0.015079263597726822, -0.05059242993593216, 0.07498416304588318, 0.05622640252113342, 0.010858895257115364, 0.0015552249969914556, 0.06971994787454605, -0.019759170711040497, 0.001521410304121673, 
0.032095473259687424, 0.06417544931173325, 0.014362066984176636, -0.03133942559361458, 0.018592869862914085, -0.008470231667160988, 0.03991629183292389, 0.0633486732840538, 0.04155107960104942, -0.028110865503549576, 0.01659207232296467, -0.0337030366063118, -0.10854189842939377, 0.04278707876801491, -0.028698457404971123, -0.08063279837369919, 0.13984808325767517, 0.025403661653399467, 0.009562181308865547, -0.022226108238101006, 0.241981640458107, -0.07480388879776001, -0.09265431761741638, -0.14692139625549316, 0.1055137887597084, -0.04348868504166603, 0.06415078788995743, 0.045384783297777176, -0.10421041399240494, 0.012057800777256489, 0.12658540904521942, 0.1625804305076599, -0.0438871793448925, 0.019560009241104126, 0.03037482313811779, 0.00398933095857501, -0.03853052854537964, 0.05252939090132713, 0.06827457249164581, 0.14848913252353668, -0.050116557627916336, 0.09223522990942001, 0.0050886585377156734, -0.09908851981163025, -0.034064266830682755, 0.11810369789600372, -0.019035303965210915, 0.019260596483945847, -0.05601469427347183, 0.11788773536682129, -0.06368034332990646, -0.233087420463562, 0.06406685709953308, -0.07426205277442932, -0.14131881296634674, -0.024826664477586746, 0.07676053047180176, -0.014309047721326351, 0.027850469574332237, 0.0722186341881752, -0.07654546946287155, 0.19937579333782196, 0.03671684116125107, -0.058611851185560226, -0.05623113736510277, 0.07896319031715393, -0.11419995129108429, 0.27488458156585693, 0.015893742442131042, 0.045155949890613556, 0.1038452610373497, -0.013412448577582836, -0.13435201346874237, 0.01833420805633068, 0.09638454020023346, -0.08846497535705566, 0.04018587991595268, 0.20595665276050568, -0.0028567397966980934, 0.11962885409593582, 0.07707620412111282, -0.08087631314992905, 0.049051105976104736, -0.09828304499387741, -0.07230360060930252, -0.08931835740804672, 0.09120666980743408, -0.07232820242643356, 0.14308606088161469, 0.1311190128326416, -0.05265164002776146, 0.00968363881111145, -0.029376711696386337, 0.045510269701480865, 0.004632700700312853, 0.10403459519147873, 0.008749093860387802, -0.1797543615102768, 0.02403045818209648, 0.01841445453464985, 0.10992073267698288, -0.1701374351978302, -0.09734909981489182, 0.043629229068756104, -0.0012522460892796516, -0.06121290475130081, 0.1290796846151352, 0.05957380682229996, 0.05011506378650665, -0.043520737439394, -0.0211784765124321, -0.008504665456712246, 0.14072857797145844, -0.10404830425977707, -0.00016830587992444634 ]
null
null
transformers
===== Solstice-11B-v1 ===== A model trained with the sole goal of NSFW. That is it. Results are to be as expected. Finetuned off several instruct datasets that are NSFW. Example subset below; the other modified instruct datasets are private for now. Outputs were manually verified by me and two good friends. [Lewd-Assistant-v1](https://huggingface.co/datasets/Himitsui/Lewd-Assistant-v1) ---> Used a combination of Claude 2.0, GPT-4-Turbo and WinterGoddess-1.4x to reformat instead of a small 10B model [Fimbulvetr-v1] like the example dataset shown, which resulted in fewer errors and better answers. Private for now. Ruled Out names: <br>Solarslut <br>Solascivious <br>Sultry <br>Sundress <br>Scorch *** Prompt Format: Alpaca There are several issues with the model, but this is an experimental one so :shrug: <br>----> May speak as {{user}} sometimes. I know what causes it. I kinda like it that way, though. <br>----> May ramble on or give small outputs. Adjust sampler settings. <br>----> May be a little inconsistent at times. Yeah, it's inevitable due to the nature of the data. <br>----> Steers towards NSFW --> As Expected. *** GGUF: https://huggingface.co/Sao10K/Solstice-11B-v1-GGUF
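The card names Alpaca as the prompt format but does not reproduce the template. The snippet below is a minimal sketch of a generic Alpaca-style prompt for reference only; the exact preamble, spacing, and any system line this particular model was trained with are assumptions, not details confirmed by the card.

```python
# Generic Alpaca-style prompt template (illustrative; the exact variant used
# for Solstice-11B-v1 is assumed, not confirmed by the card).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

# Build a prompt from a user instruction and print it for inspection.
prompt = ALPACA_TEMPLATE.format(instruction="Describe the scene in two sentences.")
print(prompt)
```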
{"language": ["en"], "license": "cc-by-nc-4.0", "datasets": ["Himitsui/Lewd-Assistant-v1"]}
text-generation
LoneStriker/Solstice-11B-v1-AWQ
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "en", "dataset:Himitsui/Lewd-Assistant-v1", "license:cc-by-nc-4.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "4-bit", "region:us" ]
2024-02-13T21:15:53+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #safetensors #llama #text-generation #en #dataset-Himitsui/Lewd-Assistant-v1 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us
===== Solstice-11B-v1 ===== A model trained with the sole goal of NSFW. That is it. Results are to be as expected. Finetuned off several instruct datasets that are NSFW. Example subset below; the other modified instruct datasets are private for now. Outputs were manually verified by me and two good friends. Lewd-Assistant-v1 ---> Used a combination of Claude 2.0, GPT-4-Turbo and WinterGoddess-1.4x to reformat instead of a small 10B model [Fimbulvetr-v1] like the example dataset shown, which resulted in fewer errors and better answers. Private for now. Ruled Out names: <br>Solarslut <br>Solascivious <br>Sultry <br>Sundress <br>Scorch * Prompt Format: Alpaca There are several issues with the model, but this is an experimental one so :shrug: <br>----> May speak as {{user}} sometimes. I know what causes it. I kinda like it that way, though. <br>----> May ramble on or give small outputs. Adjust sampler settings. <br>----> May be a little inconsistent at times. Yeah, it's inevitable due to the nature of the data. <br>----> Steers towards NSFW --> As Expected. * GGUF: URL
[]
[ "TAGS\n#transformers #pytorch #safetensors #llama #text-generation #en #dataset-Himitsui/Lewd-Assistant-v1 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n" ]
[ 85 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #llama #text-generation #en #dataset-Himitsui/Lewd-Assistant-v1 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n" ]
[ -0.06761788576841354, 0.06937704980373383, -0.005667154677212238, 0.03993748128414154, 0.09608259797096252, 0.0015243743546307087, 0.1558346003293991, 0.09004928171634674, -0.04419463500380516, -0.019849544391036034, 0.17299272119998932, 0.19932672381401062, -0.016625305637717247, 0.05546605587005615, -0.08373744040727615, -0.13255436718463898, 0.05665615200996399, 0.042433686554431915, 0.011234105564653873, 0.12450406700372696, 0.0874817967414856, -0.07846406102180481, 0.06387058645486832, -0.06745894253253937, -0.11215730011463165, 0.015703756362199783, 0.03912702202796936, -0.11296284943819046, 0.09958536177873611, 0.03458554670214653, 0.09541165083646774, 0.10289844870567322, -0.013105147518217564, -0.1514524668455124, 0.024060970172286034, 0.02471703104674816, -0.08513861894607544, 0.05638488382101059, 0.03446224331855774, -0.0217461995780468, 0.054954733699560165, -0.015814045444130898, -0.04607922211289406, 0.03507917746901512, -0.07911276817321777, -0.06432969123125076, -0.04883264750242233, 0.08876381814479828, 0.06454847007989883, 0.09526578336954117, 0.01305905357003212, 0.1647387444972992, -0.08331124484539032, 0.10960683971643448, 0.0893922820687294, -0.3024989664554596, 0.020083850249648094, 0.07302301377058029, 0.057843729853630066, 0.10397709906101227, -0.02369416132569313, 0.04969964548945427, 0.05916759371757507, 0.004715851973742247, 0.059103600680828094, -0.06016315519809723, -0.08062192052602768, 0.050677407532930374, -0.08504517376422882, -0.05073770135641098, 0.27290335297584534, -0.035203173756599426, 0.01636457070708275, -0.031050652265548706, -0.06259047240018845, -0.010390790179371834, -0.008502432145178318, 0.05590541288256645, -0.019490793347358704, 0.04311394691467285, -0.016294440254569054, -0.0076815467327833176, -0.1442287117242813, -0.01936723105609417, -0.14764216542243958, 0.10149533301591873, 0.024279195815324783, 0.05212061479687691, -0.10224185883998871, 0.0745876133441925, 0.08555300533771515, -0.08810970932245255, 0.013272405602037907, -0.06299150735139847, 0.12135033309459686, 0.023036520928144455, -0.031422823667526245, 0.03736516460776329, 0.1364370584487915, 0.088732048869133, -0.043273817747831345, -0.006694428157061338, -0.08435571193695068, 0.09215462952852249, -0.025777585804462433, -0.04529252275824547, -0.04284609854221344, 0.029434911906719208, 0.11734955757856369, -0.011502583511173725, 0.0883171409368515, -0.02938431315124035, -0.1317175328731537, -0.015905091539025307, 0.0517861433327198, 0.1332320272922516, 0.03725501894950867, 0.08144061267375946, -0.030581319704651833, 0.024516507983207703, 0.13729918003082275, -0.07557546347379684, -0.0022474753204733133, 0.02191636525094509, 0.011177252046763897, 0.03178227320313454, 0.015289360657334328, 0.010281965136528015, -0.09977341443300247, 0.06984078139066696, -0.06729312986135483, -0.033611372113227844, -0.010498881340026855, -0.08397173136472702, 0.08499502390623093, -0.09053467959165573, 0.031107096001505852, -0.18286098539829254, -0.1778816431760788, 0.041452180594205856, -0.013351951725780964, -0.0038001108914613724, -0.05284158140420914, -0.023533156141638756, -0.06835680454969406, 0.016828682273626328, -0.09610819071531296, -0.008807389996945858, -0.08904368430376053, 0.13521070778369904, -0.03186538442969322, 0.04632285609841347, -0.14932367205619812, 0.01932157203555107, -0.09759507328271866, -0.006734470836818218, 0.014608887024223804, 0.016554245725274086, -0.05120043456554413, 0.10815471410751343, -0.03274725750088692, -0.04759214073419571, -0.02312895469367504, 
0.019000260159373283, -0.0033287284895777702, 0.19400811195373535, -0.09620776027441025, -0.06777733564376831, 0.18684490025043488, -0.13140317797660828, -0.19916826486587524, 0.12480625510215759, 0.015042934566736221, -0.027027303352952003, 0.09505538642406464, 0.1217871904373169, 0.04888242110610008, -0.08572426438331604, -0.03835752233862877, 0.11566852033138275, -0.0635233148932457, -0.18089671432971954, 0.021964600309729576, 0.040507327765226364, -0.1081453412771225, 0.030156629160046577, 0.01651628687977791, 0.05160805210471153, -0.053274448961019516, -0.05658629909157753, -0.055736981332302094, -0.061704687774181366, 0.01998491957783699, -0.005877560470253229, 0.03849548473954201, -0.10155114531517029, -0.020262496545910835, -0.021615732461214066, 0.04260551929473877, -0.03935932740569115, 0.05518830195069313, -0.08994629979133606, 0.12191684544086456, -0.03708183020353317, 0.0348842591047287, -0.10926112532615662, -0.024014869704842567, -0.010145984590053558, 0.1532411277294159, -0.0029831964056938887, -0.006664159707725048, 0.041884854435920715, 0.01570298708975315, -0.035604093223810196, -0.028698880225419998, 0.15075628459453583, 0.0008723707287572324, -0.056349288672208786, -0.13175524771213531, 0.06867969036102295, -0.02386927418410778, 0.06310078501701355, -0.10245433449745178, 0.016393449157476425, 0.09294252842664719, 0.0894390195608139, -0.03175456076860428, 0.07289142161607742, -0.0163250919431448, 0.04918704554438591, -0.06409864872694016, 0.03644054755568504, 0.10442894697189331, 0.024337327107787132, -0.11165955662727356, 0.1844383031129837, -0.12337937206029892, 0.2615654170513153, 0.22552533447742462, -0.21282313764095306, 0.07991170138120651, -0.06526751816272736, -0.003521741833537817, -0.01432556752115488, 0.030980903655290604, -0.040200915187597275, -0.026732733473181725, -0.019198520109057426, 0.163655087351799, -0.09013015031814575, -0.01008285116404295, 0.020031018182635307, -0.04547417536377907, -0.037369899451732635, 0.09080586582422256, 0.09858645498752594, -0.10258953273296356, 0.15906046330928802, 0.21809470653533936, -0.02239677682518959, 0.12021581083536148, -0.05723697319626808, -0.014058014377951622, 0.037864845246076584, 0.01618107780814171, 0.003287645522505045, -0.000850222771987319, -0.11344780772924423, 0.005784135311841965, 0.06867828965187073, -0.002581497887149453, 0.05120521038770676, -0.16407625377178192, -0.056917015463113785, -0.04063975438475609, -0.058828841894865036, -0.04603268951177597, 0.03872503712773323, 0.0015452842926606536, 0.1303955465555191, -0.07413258403539658, -0.07054713368415833, 0.11696341633796692, -0.007646295242011547, -0.10338445752859116, 0.16830755770206451, -0.15070787072181702, -0.30436280369758606, -0.1736915111541748, -0.14961709082126617, -0.06765256077051163, 0.019892418757081032, 0.10870539397001266, -0.06236870959401131, -0.04172416776418686, -0.07653211802244186, -0.04152647778391838, -0.038805194199085236, 0.024048347026109695, -0.01961175724864006, 0.061092715710401535, -0.02001660130918026, -0.11847686767578125, -0.016223490238189697, 0.00453368853777647, -0.06664715707302094, 0.15847669541835785, -0.06424397230148315, 0.08535922318696976, 0.11391377449035645, 0.022272799164056778, 0.005930216982960701, -0.05869762599468231, 0.1333877444267273, -0.010288418270647526, -0.00855825375765562, 0.2162056863307953, -0.01644062250852585, 0.06651076674461365, 0.10779786109924316, 0.025884848088026047, -0.06580262631177902, 0.020669495686888695, -0.06167194992303848, -0.07756611704826355, -0.2105773687362671, 
-0.12037568539381027, -0.06776135414838791, 0.10833544284105301, 0.03979687765240669, 0.07515480369329453, 0.06780221313238144, 0.08984103053808212, -0.04987424239516258, 0.029621059074997902, 0.030026087537407875, 0.05481823533773422, 0.23814725875854492, -0.03789299353957176, 0.13325366377830505, -0.0854392722249031, -0.052697330713272095, 0.0972985029220581, 0.0783345177769661, 0.08931809663772583, 0.025021111592650414, 0.07466913759708405, 0.06061850115656853, 0.1841631829738617, 0.11960475146770477, 0.12599505484104156, 0.01425999030470848, -0.0074385832995176315, 0.006648913025856018, -0.053957171738147736, -0.027041180059313774, 0.0046234820038080215, -0.04557359218597412, -0.09095802903175354, -0.027953049167990685, -0.053159210830926895, 0.08004968613386154, 0.09089922159910202, 0.05429355800151825, -0.24543985724449158, 0.0031166512053459883, 0.07011236995458603, 0.009110572747886181, -0.05798695608973503, 0.0872129499912262, 0.04322222247719765, -0.00910605676472187, 0.06506418436765671, -0.0318010114133358, 0.08817596733570099, -0.05553513392806053, 0.04626850783824921, -0.057720281183719635, -0.047892842441797256, 0.007460447959601879, 0.1032397523522377, -0.3136799931526184, 0.1681155264377594, 0.011952587403357029, -0.0019547150004655123, -0.08738207072019577, -0.023326145485043526, 0.005035393871366978, 0.18115633726119995, 0.1267625391483307, -0.009525636211037636, 0.027992842718958855, -0.07143168151378632, -0.10127688199281693, 0.04622814059257507, 0.0773744136095047, 0.031964536756277084, 0.009688306599855423, -0.019037742167711258, -0.002047067740932107, 0.008852221071720123, -0.02329699881374836, -0.0347917340695858, -0.1317235380411148, 0.029519887641072273, 0.1661970317363739, 0.05710494518280029, -0.048714205622673035, -0.04375256225466728, -0.14570719003677368, 0.13762614130973816, -0.18265897035598755, -0.09036645293235779, -0.0671577900648117, -0.07382836192846298, 0.046648599207401276, -0.042725902050733566, 0.036469463258981705, -0.03085406683385372, 0.0014550716150552034, -0.0609227679669857, -0.15047362446784973, 0.06284961104393005, -0.11820387840270996, -0.07747466117143631, -0.02987769991159439, 0.1740485578775406, -0.07102998346090317, 0.006472419481724501, 0.006475089117884636, 0.00601585116237402, -0.07353684306144714, -0.10102100670337677, 0.007176347076892853, 0.0006759434472769499, 0.09403514862060547, 0.04496988654136658, -0.10923942923545837, -0.05252082273364067, -0.005022240336984396, -0.07201192528009415, 0.2089317888021469, 0.24464954435825348, -0.044491302222013474, 0.129873126745224, 0.23805266618728638, -0.08045750856399536, -0.3571317195892334, -0.13346250355243683, -0.1513713300228119, -0.04193921759724617, -0.055660948157310486, -0.1262519657611847, 0.06949771195650101, 0.057556211948394775, -0.04520675539970398, 0.1253916472196579, -0.23558352887630463, -0.08480708301067352, 0.12430319935083389, 0.06262815743684769, 0.29731523990631104, -0.16111017763614655, -0.07768171280622482, -0.08243885636329651, -0.10420001298189163, 0.15620069205760956, -0.10071901232004166, 0.07838049530982971, -0.030521349981427193, 0.02278629131615162, -0.007264291401952505, -0.060333143919706345, 0.09391449391841888, -0.031394924968481064, 0.04936037212610245, -0.11185186356306076, 0.03063579834997654, 0.11695081740617752, -0.005270701367408037, 0.03711304813623428, -0.18760181963443756, 0.02827521413564682, -0.06893106549978256, -0.01915328949689865, -0.04806731268763542, 0.05779118463397026, -0.009530428797006607, -0.0535484254360199, -0.023772265762090683, 
-0.04008518531918526, 0.004912707023322582, -0.019072506576776505, 0.15592199563980103, -0.020333586260676384, 0.07982747256755829, 0.19717790186405182, 0.14819300174713135, -0.14560078084468842, 0.06807394325733185, -0.045570213347673416, -0.0950000062584877, 0.0687120333313942, -0.10764807462692261, 0.04394230246543884, 0.10764477401971817, -0.022961150854825974, 0.0656055361032486, 0.08194828778505325, 0.020305665209889412, 0.0030913613736629486, 0.15634500980377197, -0.2256101369857788, 0.008830618113279343, -0.0281990934163332, -0.010971952229738235, 0.026163017377257347, 0.09477671980857849, 0.16214901208877563, -0.029754217714071274, 0.002965564141049981, -0.004278023727238178, 0.042441125959157944, -0.040640752762556076, 0.0963859111070633, 0.05180814117193222, 0.028705965727567673, -0.13166195154190063, 0.10894915461540222, 0.003199551487341523, -0.13003221154212952, -0.0013791699893772602, 0.09797167778015137, -0.15748387575149536, -0.12405624985694885, -0.0614512600004673, 0.020758507773280144, -0.10895738750696182, -0.11120042949914932, -0.07060246169567108, -0.13899783790111542, 0.07212802022695541, 0.15904411673545837, 0.049433205276727676, 0.0851295068860054, 0.013706463389098644, -0.05582836642861366, -0.09943033009767532, 0.06264248490333557, -0.033751532435417175, 0.023286886513233185, -0.1292669177055359, 0.03876335173845291, -0.013888093642890453, 0.08524066209793091, -0.06493550539016724, 0.008764744736254215, -0.1151595488190651, 0.04133845865726471, -0.16194845736026764, 0.017711181193590164, -0.06352037191390991, -0.024239417165517807, -0.010941997170448303, -0.0098018329590559, -0.02572065033018589, 0.002313147997483611, -0.06346793472766876, 0.02138102427124977, -0.001737220212817192, 0.04297000542283058, -0.11490950733423233, -0.053014520555734634, 0.03517568111419678, -0.03539583086967468, 0.13469944894313812, 0.08196555078029633, -0.11375666409730911, 0.09135476499795914, -0.14657315611839294, -0.04239552468061447, 0.11731795221567154, 0.03581798076629639, 0.003571770153939724, 0.03708679601550102, 0.01707015186548233, 0.133083313703537, -0.030054643750190735, 0.06926431506872177, 0.040901895612478256, -0.08533411473035812, 0.021836165338754654, -0.05408666282892227, -0.07330769300460815, -0.05588981509208679, -0.035549987107515335, 0.11357011646032333, 0.014639823697507381, 0.17359457910060883, -0.0670638158917427, 0.011182072572410107, -0.06560006737709045, 0.01797482557594776, -0.000039873499190434813, -0.18046949803829193, -0.13159218430519104, -0.045221466571092606, 0.02624053694307804, -0.01333921030163765, 0.2404760718345642, -0.009649713523685932, -0.09845778346061707, 0.04155353084206581, 0.021870901808142662, -0.005284113343805075, 0.015379700809717178, 0.2975419759750366, 0.05004633590579033, -0.029361970722675323, -0.11886295676231384, 0.004944915417581797, 0.04905995354056358, -0.00010670035408111289, 0.026016442105174065, 0.06585714966058731, 0.028300058096647263, 0.07328574359416962, 0.07113276422023773, -0.0035832617431879044, -0.03284073621034622, -0.07004905492067337, -0.056369878351688385, 0.07761183381080627, 0.0002996858675032854, 0.11336079984903336, 0.1529252827167511, -0.018158676102757454, -0.03555905446410179, -0.031617190688848495, -0.05884285271167755, -0.144126296043396, -0.15285591781139374, -0.09054205566644669, -0.1238408014178276, 0.02264987677335739, -0.1024065762758255, -0.0016187281580641866, 0.06939642876386642, 0.05638134479522705, -0.047391582280397415, 0.039299387484788895, 0.0473281554877758, -0.036694932729005814, 
0.05989053100347519, -0.02380303665995598, 0.001113586826249957, -0.004041010048240423, -0.055829793214797974, -0.019929848611354828, -0.05823856592178345, -0.019769495353102684, 0.061692975461483, 0.02206406183540821, 0.07767954468727112, -0.12179504334926605, -0.08128818869590759, -0.039167094975709915, 0.054606758058071136, 0.03252193331718445, 0.19584275782108307, 0.02521129697561264, -0.02880268171429634, 0.06253069639205933, 0.1628674566745758, -0.043568000197410583, -0.14149029552936554, -0.012399706989526749, 0.17002438008785248, -0.0009299395023845136, 0.05227821692824364, 0.02784591354429722, -0.006559289060533047, -0.002711971988901496, 0.3147435486316681, 0.27779507637023926, -0.07447782903909683, 0.04237842559814453, -0.03887811303138733, 0.027552006766200066, 0.043930284678936005, 0.12752600014209747, 0.13494008779525757, 0.20628832280635834, -0.055346664041280746, -0.040682751685380936, -0.06173788011074066, 0.035347793251276016, -0.11799366772174835, 0.05291552469134331, -0.023187076672911644, -0.07850872725248337, -0.053715791553258896, 0.07020068168640137, -0.13393592834472656, 0.04754556342959404, 0.004086533561348915, -0.08421020209789276, 0.002952704904600978, 0.000792243517935276, 0.12456277012825012, 0.004949855152517557, 0.014374093152582645, -0.04951506480574608, -0.017228098586201668, 0.018193280324339867, -0.020658094435930252, -0.17676451802253723, 0.038087643682956696, 0.03576314449310303, -0.01258299220353365, 0.06709366291761398, 0.007615501061081886, 0.12849879264831543, 0.07163262367248535, 0.032745447009801865, -0.08777619153261185, 0.13863511383533478, 0.004455615300685167, -0.06959307193756104, 0.02968328446149826, -0.0370727963745594, 0.0010689134942367673, 0.012282193638384342, 0.03451087325811386, -0.024861952289938927, 0.03725828975439072, 0.0183072742074728, -0.06819691509008408, -0.04316079989075661, 0.008508010767400265, -0.05292486771941185, 0.07829330116510391, 0.020800990983843803, -0.031918466091156006, -0.010165954940021038, -0.06369049847126007, -0.012380455620586872, -0.02614472433924675, -0.13905061781406403, 0.019025009125471115, -0.13721148669719696, -0.056188683956861496, 0.12912330031394958, 0.03979631885886192, -0.23963266611099243, 0.019031425938010216, -0.09719961881637573, 0.011248872615396976, -0.18313005566596985, 0.04815872013568878, 0.13250483572483063, 0.004272429272532463, -0.024230673909187317, -0.005433162208646536, 0.02667142264544964, 0.050979990512132645, -0.08280682563781738, -0.10063239932060242 ]
null
null
transformers
Final Layer of language model
{}
text-classification
ManojAlexender/Final_layer_LM
[ "transformers", "safetensors", "roberta", "text-classification", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T21:24:12+00:00
[]
[]
TAGS #transformers #safetensors #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us
Final Layer of language model
[]
[ "TAGS\n#transformers #safetensors #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 38 ]
[ "passage: TAGS\n#transformers #safetensors #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.01254188735038042, 0.04932502284646034, -0.007006458472460508, 0.003189654555171728, 0.18244215846061707, 0.0013143597170710564, 0.1169193834066391, 0.08502434194087982, 0.006821845192462206, -0.0005061998963356018, 0.12406975775957108, 0.19994735717773438, -0.050712328404188156, 0.16625741124153137, -0.15378855168819427, -0.22403189539909363, 0.1144980937242508, -0.0039152707904577255, -0.0048158722929656506, 0.09282375872135162, 0.10143786668777466, -0.07303066551685333, 0.07010231912136078, -0.06555245816707611, -0.11781618744134903, 0.04699474945664406, 0.07575174421072006, -0.15539658069610596, 0.0900331363081932, 0.061877306550741196, 0.17633122205734253, 0.04615776613354683, -0.052197184413671494, -0.18187153339385986, 0.03486737608909607, 0.018801460042595863, -0.0889405906200409, 0.017653118818998337, 0.056707292795181274, -0.1367085725069046, -0.02604415826499462, 0.0038470986764878035, 0.03364947438240051, 0.07664389163255692, -0.15555402636528015, -0.05174184963107109, -0.01861695386469364, -0.02990095317363739, 0.1113235130906105, 0.05682392790913582, -0.015598166733980179, 0.14998400211334229, -0.10918935388326645, 0.13022591173648834, 0.08060936629772186, -0.2885364592075348, -0.016323363408446312, 0.10165884345769882, 0.04258476570248604, 0.050326019525527954, -0.055333398282527924, 0.0786801129579544, 0.06348196417093277, -0.029746532440185547, 0.010256463661789894, -0.07020429521799088, -0.11646713316440582, 0.018947435542941093, -0.06397873163223267, -0.019477006047964096, 0.19781537353992462, -0.04268163815140724, 0.03541755676269531, -0.06497997045516968, -0.10141821950674057, -0.026809794828295708, -0.02280704490840435, 0.007532485295087099, -0.030648918822407722, 0.07816115021705627, 0.008139525540173054, 0.03919908031821251, -0.10318174958229065, 0.006727985572069883, -0.19146105647087097, 0.24809755384922028, -0.010980257764458656, 0.0317610464990139, -0.16905155777931213, 0.008260820060968399, 0.000444586796220392, -0.10854922980070114, 0.030539073050022125, -0.10714860260486603, -0.003815345000475645, -0.05400091037154198, -0.060558103024959564, -0.0881996750831604, 0.14568375051021576, 0.192643940448761, 0.04607626795768738, 0.07854727655649185, -0.08727390319108963, 0.06072840839624405, 0.025679871439933777, 0.10086184740066528, 0.05033521354198456, -0.06150684505701065, 0.07584486901760101, -0.11977572739124298, 0.03557157889008522, -0.0660666823387146, -0.12583407759666443, -0.004266063217073679, 0.07322974503040314, 0.11594806611537933, -0.00008466981671517715, 0.0985836610198021, -0.057702917605638504, 0.0372590534389019, 0.08048275858163834, -0.08533237129449844, 0.0074585010297596455, 0.008861647918820381, 0.06090855598449707, 0.009311417117714882, -0.024445436894893646, 0.02089284360408783, -0.026285437867045403, 0.10338398069143295, -0.04940469563007355, -0.03487023711204529, -0.018497496843338013, -0.07787740230560303, 0.03633662313222885, -0.10981973260641098, 0.05039240047335625, -0.21355971693992615, -0.1227220818400383, -0.01011607889086008, 0.012447265908122063, -0.007823587395250797, 0.018872780725359917, -0.04820196330547333, -0.015349684283137321, 0.05571436882019043, -0.05794868990778923, -0.12802079319953918, -0.0778493657708168, 0.06419924646615982, -0.0037106084637343884, 0.07546669244766235, -0.10481604933738708, 0.03494281321763992, -0.11177600175142288, 0.002357146702706814, -0.15593557059764862, 0.04365875571966171, -0.06506406515836716, 0.23667709529399872, 0.0029743860941380262, 0.020650114864110947, -0.06720510870218277, 
0.07617649435997009, -0.06298615038394928, 0.18509939312934875, -0.07963411509990692, -0.06494385749101639, 0.2424948513507843, -0.1312667727470398, -0.16598030924797058, 0.08952795714139938, -0.02270018681883812, 0.029941927641630173, 0.12230859696865082, 0.2104354202747345, 0.08144437521696091, -0.0006680534570477903, 0.06209303066134453, 0.0831838846206665, -0.1120694950222969, -0.08302938938140869, -0.021768243983387947, -0.0050077782943844795, -0.149734228849411, 0.037072401493787766, 0.11389040946960449, 0.06549268960952759, -0.03192567080259323, -0.03894880414009094, -0.03079073503613472, -0.031369611620903015, 0.102027527987957, 0.022814366966485977, 0.07964156568050385, -0.12739986181259155, -0.001733205164782703, -0.04866455867886543, -0.02247270755469799, 0.010889296419918537, 0.0029665359761565924, -0.0960722416639328, 0.06908705085515976, -0.003745842492207885, 0.0396786592900753, -0.1852719485759735, -0.13646742701530457, -0.012436293065547943, 0.13115520775318146, -0.03555452823638916, 0.062401384115219116, 0.06255318224430084, -0.04098264127969742, -0.017564790323376656, -0.03514985740184784, 0.2274853140115738, 0.05467784404754639, -0.045438818633556366, -0.07438584417104721, 0.11948368698358536, -0.09389247000217438, 0.02138362266123295, -0.11848361045122147, 0.02797878347337246, 0.09712482988834381, 0.11111597716808319, 0.05698578059673309, 0.0661223828792572, -0.012240523472428322, 0.034425389021635056, -0.08790574967861176, -0.006493826862424612, 0.07268377393484116, 0.00010250118793919683, -0.08115597069263458, 0.15307484567165375, -0.20454077422618866, 0.32951074838638306, 0.18848255276679993, -0.21407517790794373, -0.03598983213305473, -0.010835207998752594, 0.02935412898659706, 0.03790993243455887, 0.005570809822529554, 0.006509591359645128, -0.011258008889853954, -0.020471028983592987, 0.18366116285324097, -0.03151395171880722, -0.02456611394882202, 0.009060459211468697, -0.0818958729505539, -0.04566380754113197, 0.06832007318735123, -0.0338406041264534, -0.24035176634788513, 0.19198669493198395, 0.21297204494476318, 0.03211754932999611, 0.14452914893627167, -0.012035982683300972, 0.06346448510885239, 0.07291626930236816, 0.009760154411196709, 0.018843475729227066, -0.07187549024820328, -0.10813641548156738, -0.04170412942767143, 0.0465334951877594, 0.042486824095249176, 0.04140021651983261, -0.08768202364444733, -0.040712907910346985, 0.016372675076127052, 0.01801615208387375, 0.01282760314643383, 0.07202916592359543, 0.028847433626651764, 0.11330016702413559, -0.00518242921680212, -0.07519952207803726, 0.11572913825511932, -0.02439412660896778, -0.08809425681829453, 0.1956680417060852, -0.13240642845630646, -0.31414490938186646, -0.1637018769979477, -0.20230461657047272, -0.011842126958072186, 0.10290822386741638, 0.10393712669610977, -0.13141152262687683, -0.07580111175775528, -0.01916954107582569, -0.019693931564688683, -0.0007516316254623234, 0.05523395538330078, -0.028887970373034477, 0.10558106750249863, -0.022316690534353256, -0.05841365456581116, -0.057797715067863464, -0.006265678443014622, -0.01346888393163681, 0.1545335203409195, -0.12292144447565079, 0.11388137191534042, 0.14236071705818176, -0.012161964550614357, 0.026334263384342194, -0.050725534558296204, 0.1359081119298935, -0.09664026647806168, -0.022418690845370293, 0.1610455960035324, -0.09625454992055893, 0.052469026297330856, 0.20765578746795654, -0.0059596761129796505, -0.10531338304281235, 0.0749388262629509, -0.03806108981370926, -0.10195251554250717, -0.2193378210067749, 
-0.1269853711128235, -0.08221524953842163, 0.07478796690702438, 0.04727979004383087, 0.07754610478878021, 0.1457219123840332, 0.08947831392288208, 0.014073943719267845, -0.01115041971206665, 0.06514241546392441, 0.09777452796697617, 0.19081218540668488, 0.017443154007196426, 0.1547803431749344, -0.09305305778980255, -0.1491154283285141, 0.073776014149189, 0.000030012357456143945, 0.09036179631948471, 0.13208657503128052, 0.012021169997751713, 0.0004394423449411988, 0.018394632264971733, 0.17216958105564117, 0.1360793113708496, 0.06899947673082352, -0.049573659896850586, -0.011268620379269123, 0.004186907783150673, -0.07690033316612244, 0.03320625424385071, -0.03856270760297775, -0.13035649061203003, -0.05575080215930939, -0.08037421852350235, 0.12746688723564148, 0.08324217051267624, 0.06775025278329849, -0.23583151400089264, 0.00933094508945942, 0.14200031757354736, -0.019177308306097984, -0.10113538056612015, 0.10089515149593353, -0.011706633493304253, -0.08966535329818726, 0.12594760954380035, -0.027938079088926315, 0.11172942072153091, -0.05513806641101837, 0.07482371479272842, -0.09686270356178284, -0.11192840337753296, -0.013837315142154694, 0.09404060244560242, -0.26530224084854126, 0.2156963348388672, 0.02325119823217392, -0.015834137797355652, -0.05878595635294914, -0.013714953325688839, 0.03977113217115402, 0.22499410808086395, 0.12553216516971588, -0.02211090363562107, -0.20793874561786652, -0.16347330808639526, -0.015588384121656418, 0.015393316745758057, 0.13727232813835144, -0.009163939394056797, 0.012540830299258232, -0.06062226742506027, -0.029159847646951675, -0.010550783947110176, -0.08707597106695175, -0.01038098894059658, -0.15597331523895264, 0.009272357448935509, 0.08051829040050507, 0.11042498052120209, -0.018404075875878334, 0.015562273561954498, -0.11297079175710678, 0.19027812778949738, -0.08764854818582535, -0.03211408108472824, -0.1262277215719223, -0.1023973748087883, -0.004564125090837479, -0.05850803107023239, 0.07510846108198166, -0.07354667037725449, 0.0610165074467659, -0.06554762274026871, -0.18299010396003723, 0.143818661570549, -0.1213504895567894, -0.03822363540530205, -0.07458259165287018, 0.1021244078874588, -0.084508016705513, -0.02909291349351406, 0.05686478316783905, 0.05387892574071884, -0.046079378575086594, -0.07309788465499878, -0.0010158733930438757, 0.001909456099383533, 0.03351389616727829, 0.08221030235290527, -0.08241637796163559, -0.16901566088199615, -0.018894154578447342, -0.008290726691484451, 0.24717599153518677, 0.22731563448905945, -0.06348404288291931, 0.11680494248867035, 0.15808947384357452, -0.059772610664367676, -0.36571449041366577, -0.06256610155105591, -0.16242213547229767, -0.04154045134782791, -0.02315020002424717, -0.07617900520563126, 0.1324867606163025, 0.006189401727169752, -0.042887430638074875, 0.05610458180308342, -0.09400513768196106, -0.099453404545784, 0.2053203582763672, -0.004327296279370785, 0.4137114882469177, -0.15348438918590546, -0.09388810396194458, -0.12928755581378937, -0.06995803862810135, 0.0942000076174736, -0.11759138107299805, 0.05480913817882538, 0.03709033131599426, -0.031828876584768295, 0.044782429933547974, -0.04335608333349228, 0.10831436514854431, -0.030656669288873672, 0.07102964073419571, -0.14889055490493774, -0.06429320573806763, 0.06022678688168526, -0.046778883785009384, 0.0015717737842351198, -0.035033270716667175, 0.022702768445014954, -0.07514240592718124, -0.04485613480210304, -0.011645467020571232, 0.054373808205127716, 0.04802318289875984, -0.0309903621673584, 
-0.016224835067987442, -0.027730893343687057, 0.010578032582998276, -0.007792531047016382, 0.3115534782409668, -0.07154344022274017, 0.2001889944076538, 0.12602345645427704, 0.15560594201087952, -0.11745034158229828, 0.14738689363002777, -0.014374572783708572, -0.08430425077676773, 0.057979464530944824, -0.1142604798078537, 0.10284692049026489, 0.07284888625144958, -0.07918781042098999, 0.07984147220849991, 0.10348936170339584, 0.06468349695205688, -0.0050907679833471775, 0.19576506316661835, -0.20471985638141632, -0.013899031095206738, -0.027942141517996788, -0.004320491570979357, 0.05331546440720558, 0.09610199928283691, 0.15129223465919495, 0.04886668547987938, -0.012441675178706646, -0.015183180570602417, 0.01484730839729309, -0.0063045998103916645, 0.05759065970778465, 0.058624267578125, 0.03816797584295273, -0.11419862508773804, 0.0794171392917633, 0.03487493470311165, -0.1422940343618393, 0.014453641138970852, 0.08677410334348679, -0.17091143131256104, -0.11997586488723755, 0.013183344155550003, 0.17444513738155365, -0.08296605199575424, -0.08437260240316391, -0.0781538188457489, -0.15008363127708435, 0.02995387092232704, 0.2657928168773651, 0.10066299140453339, 0.09301384538412094, 0.012649200856685638, -0.037863537669181824, -0.022016659379005432, 0.009461639449000359, -0.005320260301232338, 0.04488332197070122, -0.15128681063652039, -0.020893123000860214, -0.051920704543590546, 0.0814724862575531, -0.1084047481417656, -0.03899355232715607, -0.1980859935283661, 0.03447233885526657, -0.10465482622385025, 0.005821552127599716, -0.08987561613321304, -0.006561096291989088, 0.00018816122610587627, -0.03713909909129143, -0.05161541327834129, -0.0699007511138916, -0.09376106411218643, 0.043280228972435, -0.020844070240855217, 0.03042660467326641, -0.08192291855812073, -0.05376067012548447, 0.0821814313530922, -0.04176601395010948, 0.0940200686454773, 0.07314243167638779, -0.086429163813591, 0.08736944198608398, -0.2280653715133667, -0.0876983255147934, 0.14775118231773376, -0.030463797971606255, 0.052245497703552246, 0.0660732164978981, 0.023374781012535095, 0.08623737096786499, 0.012666551396250725, 0.07095315307378769, 0.04433911666274071, -0.0958750918507576, 0.08057132363319397, -0.052326202392578125, -0.16524532437324524, -0.034027405083179474, -0.06012742966413498, 0.12509693205356598, -0.05067409202456474, 0.18189705908298492, -0.08895798772573471, 0.06640196591615677, -0.02055211551487446, 0.013882435858249664, -0.007519107311964035, -0.21962086856365204, -0.08865846693515778, -0.06255663931369781, 0.012163497507572174, -0.011242670938372612, 0.26108863949775696, 0.039457160979509354, 0.032364822924137115, 0.07740871608257294, 0.03224507346749306, 0.029092641547322273, 0.05990388244390488, 0.20631027221679688, 0.09965192526578903, -0.06638208776712418, -0.097160704433918, 0.0352749302983284, 0.05010462552309036, -0.08187506347894669, 0.08119434118270874, 0.10410276800394058, -0.09969078749418259, 0.0997195839881897, -0.03483065962791443, 0.03910502791404724, -0.04954201355576515, -0.11677473783493042, -0.09139906615018845, 0.032786983996629715, 0.038142744451761246, -0.027009036391973495, 0.16882430016994476, -0.010973501950502396, 0.0024828093592077494, -0.0740455761551857, -0.03884507343173027, -0.20765399932861328, -0.07270259410142899, -0.14337727427482605, -0.10889790952205658, 0.02123532071709633, -0.08070003241300583, -0.01221873052418232, 0.07617192715406418, 0.04494219645857811, -0.021970374509692192, 0.12841269373893738, -0.015816310420632362, -0.020578226074576378, 
0.07109574228525162, -0.046122897416353226, 0.018734872341156006, 0.04839349910616875, -0.06171103194355965, -0.11855938285589218, -0.022743036970496178, -0.057444583624601364, 0.051923394203186035, -0.02892129495739937, 0.04808187484741211, -0.16005639731884003, -0.10824339836835861, -0.018769484013319016, 0.09592486917972565, -0.07900094240903854, 0.07882664352655411, 0.025954999029636383, -0.01950482279062271, 0.0766785517334938, 0.22729748487472534, -0.05030373856425285, -0.08015944808721542, -0.07235722988843918, 0.18592742085456848, 0.061569344252347946, 0.16179314255714417, -0.05848094820976257, -0.03544497489929199, -0.032697200775146484, 0.27801451086997986, 0.25569674372673035, -0.02154744416475296, 0.06144910678267479, -0.044227395206689835, 0.03240944445133209, 0.11516135931015015, 0.12885577976703644, 0.0611368753015995, 0.18913504481315613, -0.03165801241993904, -0.03456106781959534, 0.018527165055274963, -0.02754475362598896, -0.1272626668214798, 0.0868145301938057, 0.02008986659348011, -0.011557817459106445, -0.07126893103122711, 0.13051298260688782, -0.15975184738636017, 0.13342703878879547, 0.01282237097620964, -0.17443503439426422, -0.04537435621023178, -0.020204469561576843, 0.13790035247802734, -0.025889180600643158, 0.0683685690164566, -0.0038477033376693726, -0.1249990239739418, -0.03748948127031326, 0.011738501489162445, -0.16907136142253876, -0.011550881899893284, 0.001235250849276781, -0.011939246207475662, 0.0693843737244606, -0.01853594370186329, -0.023026464506983757, 0.09959103167057037, 0.01203946303576231, -0.04134592413902283, 0.11364856362342834, -0.0037089756224304438, -0.03196559101343155, 0.0504600815474987, 0.018632059916853905, -0.023207170888781548, -0.027210602536797523, 0.08311432600021362, -0.1720953732728958, 0.06448908150196075, -0.06856304407119751, -0.08522795140743256, -0.03378521651029587, 0.04190380126237869, -0.06526170670986176, 0.06958337873220444, 0.06416885554790497, -0.00999132078140974, 0.024103539064526558, -0.009630563668906689, 0.0033856211230158806, -0.005399757996201515, -0.1168380156159401, -0.097812220454216, -0.12556034326553345, -0.07163894176483154, 0.17094185948371887, 0.019049368798732758, -0.22946704924106598, 0.007774499244987965, -0.12376958131790161, 0.0671824961900711, -0.16806286573410034, 0.09603556990623474, 0.11476022750139236, 0.022616807371377945, -0.033984389156103134, -0.13245418667793274, 0.05669108405709267, 0.10228246450424194, -0.05878631770610809, -0.12315772473812103 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-azahead-v1.0 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the azaheadhealth dataset. It achieves the following results on the evaluation set: - Loss: 0.7204 - Accuracy: 0.7083 - F1: 0.4615 - Precision: 0.5 - Recall: 0.4286 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:| | 0.5889 | 1.0 | 10 | 0.5438 | 0.625 | 0.0 | 0.0 | 0.0 | | 0.4926 | 2.0 | 20 | 0.4309 | 0.75 | 0.5714 | 0.5714 | 0.5714 | | 0.3613 | 3.0 | 30 | 0.4260 | 0.75 | 0.5714 | 0.5714 | 0.5714 | | 0.2628 | 4.0 | 40 | 0.4989 | 0.75 | 0.5714 | 0.5714 | 0.5714 | | 0.1658 | 5.0 | 50 | 0.5883 | 0.7083 | 0.4615 | 0.5 | 0.4286 | | 0.1153 | 6.0 | 60 | 0.6374 | 0.6667 | 0.3333 | 0.4 | 0.2857 | | 0.074 | 7.0 | 70 | 0.6709 | 0.6667 | 0.3333 | 0.4 | 0.2857 | | 0.0548 | 8.0 | 80 | 0.6848 | 0.7083 | 0.4615 | 0.5 | 0.4286 | | 0.0456 | 9.0 | 90 | 0.7322 | 0.7083 | 0.4615 | 0.5 | 0.4286 | | 0.0439 | 10.0 | 100 | 0.7204 | 0.7083 | 0.4615 | 0.5 | 0.4286 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.2.0+cu121 - Datasets 2.16.1 - Tokenizers 0.13.2
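The card above reports evaluation metrics but stops short of a usage snippet. Below is a minimal, hedged sketch of how a Trainer-produced checkpoint like this one is typically loaded for inference with the standard `transformers` pipeline API; the Hub id is taken from this record's `id` field, and the generic `LABEL_0`/`LABEL_1` label names mentioned in the comments are an assumption, since the card does not document its class names.

```python
from transformers import pipeline

# Hedged usage sketch: load the fine-tuned checkpoint referenced by this record
# into a text-classification pipeline (a binary task, judging by the single
# precision/recall/F1 values reported in the card).
classifier = pipeline("text-classification", model="zwellington/bert-azahead-v1.0")

# Unless the config was edited after training, Trainer checkpoints usually
# expose generic labels such as LABEL_0 / LABEL_1 rather than readable names.
print(classifier("Example sentence from the azaheadhealth domain."))
```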
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["azaheadhealth"], "metrics": ["accuracy", "f1", "precision", "recall"], "base_model": "bert-base-uncased", "model-index": [{"name": "bert-azahead-v1.0", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "azaheadhealth", "type": "azaheadhealth", "config": "small", "split": "test", "args": "small"}, "metrics": [{"type": "accuracy", "value": 0.7083333333333334, "name": "Accuracy"}, {"type": "f1", "value": 0.46153846153846156, "name": "F1"}, {"type": "precision", "value": 0.5, "name": "Precision"}, {"type": "recall", "value": 0.42857142857142855, "name": "Recall"}]}]}]}
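The `metadata` field above stores the card's model-index as a JSON string. The short sketch below shows one way to pull the reported metrics out of that structure; the JSON literal is trimmed to the accuracy entry for brevity, but the keys match the record above.

```python
import json

# Trimmed copy of this record's model-index metadata (accuracy entry only).
raw = ('{"model-index": [{"name": "bert-azahead-v1.0", "results": [{"metrics": '
       '[{"type": "accuracy", "value": 0.7083333333333334, "name": "Accuracy"}]}]}]}')

meta = json.loads(raw)
for result in meta["model-index"][0]["results"]:
    for metric in result["metrics"]:
        print(f'{metric["name"]}: {metric["value"]:.4f}')
```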
text-classification
zwellington/bert-azahead-v1.0
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "dataset:azaheadhealth", "base_model:bert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T21:28:23+00:00
[]
[]
TAGS #transformers #pytorch #bert #text-classification #generated_from_trainer #dataset-azaheadhealth #base_model-bert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
bert-azahead-v1.0 ================= This model is a fine-tuned version of bert-base-uncased on the azaheadhealth dataset. It achieves the following results on the evaluation set: * Loss: 0.7204 * Accuracy: 0.7083 * F1: 0.4615 * Precision: 0.5 * Recall: 0.4286 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 2 * total\_train\_batch\_size: 16 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 10 ### Training results ### Framework versions * Transformers 4.31.0 * Pytorch 2.2.0+cu121 * Datasets 2.16.1 * Tokenizers 0.13.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.31.0\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.13.2" ]
[ "TAGS\n#transformers #pytorch #bert #text-classification #generated_from_trainer #dataset-azaheadhealth #base_model-bert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.31.0\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.13.2" ]
[ 74, 126, 4, 33 ]
[ "passage: TAGS\n#transformers #pytorch #bert #text-classification #generated_from_trainer #dataset-azaheadhealth #base_model-bert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10### Training results### Framework versions\n\n\n* Transformers 4.31.0\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.13.2" ]
[ -0.13921870291233063, 0.18508315086364746, -0.002243270166218281, 0.10482655465602875, 0.15909737348556519, 0.040857382118701935, 0.09685206413269043, 0.16126884520053864, -0.06993351131677628, 0.07895315438508987, 0.11726761609315872, 0.11490252614021301, 0.04857656732201576, 0.17372620105743408, -0.034124504774808884, -0.26458677649497986, 0.005244869738817215, 0.014380332082509995, -0.06501124054193497, 0.14438021183013916, 0.08860651403665543, -0.1207006499171257, 0.0711255669593811, -0.006219291128218174, -0.15039047598838806, -0.043220918625593185, -0.018348345533013344, -0.039215534925460815, 0.12723027169704437, -0.006171469110995531, 0.09878881275653839, 0.042961522936820984, 0.1052461713552475, -0.19752709567546844, 0.0026281208265572786, 0.04575058072805405, 0.014525645412504673, 0.11502145975828171, 0.061496831476688385, -0.029738470911979675, 0.0875454917550087, -0.10453581809997559, 0.05163572356104851, 0.02135186269879341, -0.10951504856348038, -0.21109607815742493, -0.1204729899764061, 0.09102250635623932, 0.10488356649875641, 0.08071158826351166, -0.0110210245475173, 0.07976113259792328, -0.10644293576478958, 0.06677642464637756, 0.24548214673995972, -0.2977456748485565, -0.05392713099718094, 0.03921588137745857, 0.010070410557091236, 0.055191393941640854, -0.10699401795864105, -0.048704471439123154, 0.0292967539280653, 0.04166935756802559, 0.12574560940265656, -0.0014989334158599377, -0.026649679988622665, 0.010593129321932793, -0.1510871946811676, -0.05823642760515213, 0.12040628492832184, 0.05018364638090134, -0.026850372552871704, -0.06948316097259521, -0.057312652468681335, -0.21261419355869293, -0.02005075477063656, 0.008900538086891174, 0.03816778212785721, -0.04708213359117508, -0.07941737025976181, 0.0015910164220258594, -0.07543522119522095, -0.06946033239364624, -0.019371308386325836, 0.05900060012936592, 0.046390876173973083, 0.004242789465934038, 0.027977202087640762, 0.14116354286670685, 0.02853085659444332, -0.1422155648469925, 0.016583558171987534, 0.01016427855938673, -0.034784406423568726, -0.027156980708241463, -0.03274320811033249, 0.02668244205415249, 0.018211711198091507, 0.13120925426483154, -0.05770009756088257, 0.014766819775104523, 0.03720313683152199, 0.03726464882493019, -0.07424865663051605, 0.1551366001367569, -0.09015112370252609, -0.07400402426719666, -0.014813900925219059, 0.12866239249706268, 0.018313394859433174, -0.011788775213062763, -0.11900603771209717, 0.006424168590456247, 0.12147896736860275, 0.04808783903717995, -0.032727502286434174, 0.03364895284175873, -0.07341636717319489, -0.05199670419096947, 0.0687769204378128, -0.09534430503845215, 0.025708317756652832, 0.014566384255886078, -0.0878419280052185, -0.037203624844551086, 0.013448799028992653, 0.01657196320593357, -0.032381821423769, 0.12327050417661667, -0.09074718505144119, 0.013734730891883373, -0.07967938482761383, -0.09772287309169769, 0.027140893042087555, -0.1273506134748459, -0.014145244844257832, -0.06604712456464767, -0.15620602667331696, -0.04246510565280914, 0.04589546471834183, -0.05478903651237488, -0.0976085439324379, -0.0799684152007103, -0.0646495670080185, 0.023782527074217796, -0.02531871199607849, 0.11761002242565155, -0.061256200075149536, 0.12938620150089264, -0.010037878528237343, 0.06983797252178192, 0.00014534243382513523, 0.06769081950187683, -0.08934468030929565, 0.038488421589136124, -0.15489926934242249, 0.0686182826757431, -0.044457707554101944, 0.03528101369738579, -0.12482135742902756, -0.11563124507665634, 0.022719336673617363, 
-0.044488247483968735, 0.09416353702545166, 0.12183134257793427, -0.16040542721748352, -0.07414013147354126, 0.18425485491752625, -0.0693698450922966, -0.12742426991462708, 0.11276545375585556, -0.03993003070354462, 0.0186062790453434, 0.03571022301912308, 0.17493779957294464, 0.06894953548908234, -0.06164543330669403, -0.02607259526848793, -0.034796614199876785, 0.08454478532075882, -0.05641520023345947, 0.11572357267141342, -0.02406475692987442, 0.04441066458821297, 0.014686371199786663, -0.07574599981307983, 0.053401343524456024, -0.10912497341632843, -0.09443026036024094, -0.015819424763321877, -0.09904877841472626, 0.08595743030309677, 0.05467430129647255, 0.057729609310626984, -0.06656008958816528, -0.09336240589618683, 0.03750963881611824, 0.11727584153413773, -0.07129346579313278, 0.02577706053853035, -0.05381268262863159, 0.10175862908363342, -0.0622439980506897, -0.02569434978067875, -0.18767981231212616, -0.03438389301300049, 0.02384522557258606, 0.001276151044294238, -0.009795010089874268, -0.018164947628974915, 0.04133237898349762, 0.0742698460817337, -0.059677816927433014, -0.04763641208410263, -0.053990717977285385, -0.01771100051701069, -0.1142648458480835, -0.22681500017642975, -0.07414572685956955, -0.01755177043378353, 0.19055771827697754, -0.20496846735477448, 0.031214894726872444, -0.02007162757217884, 0.10898075252771378, 0.01775948517024517, -0.028050405904650688, -0.007323760539293289, 0.07920774817466736, -0.033101219683885574, -0.06173565611243248, 0.08619148284196854, -0.004007513169199228, -0.09349685907363892, -0.015446594916284084, -0.0721568837761879, 0.11649076640605927, 0.10981861501932144, -0.0035052199382334948, -0.07262169569730759, -0.06243253871798515, -0.08168784528970718, -0.03833281993865967, -0.04944019392132759, 0.027644416317343712, 0.12874636054039001, 0.032301682978868484, 0.1477377563714981, -0.07918929308652878, -0.04188692197203636, 0.02603095956146717, -0.00557835353538394, 0.020643746480345726, 0.12790429592132568, 0.13259074091911316, -0.08225615322589874, 0.14543528854846954, 0.12934724986553192, -0.03272728994488716, 0.11754228174686432, -0.039808109402656555, -0.0955347865819931, -0.03549911826848984, -0.003949787002056837, 0.00021604239009320736, 0.11725348234176636, -0.12486787885427475, -0.015723401680588722, 0.042750369757413864, 0.031032903119921684, 0.020166223868727684, -0.20163489878177643, -0.027654696255922318, 0.01725943572819233, -0.06285075843334198, -0.037359341979026794, -0.015159380622208118, 0.014834841713309288, 0.11757884919643402, 0.03132827952504158, -0.06603451818227768, 0.029905404895544052, -0.00239358632825315, -0.09086712449789047, 0.18357497453689575, -0.08594779670238495, -0.17020496726036072, -0.11514385044574738, -0.04837331175804138, -0.03619096055626869, -0.011254175566136837, 0.0399140901863575, -0.06879536062479019, -0.001282613491639495, -0.06534084677696228, 0.004144188482314348, 0.00317848427221179, 0.02048276551067829, -0.0039055098313838243, 0.01959526352584362, 0.05155298858880997, -0.09083974361419678, 0.0036724754609167576, -0.048140425235033035, -0.053363118320703506, 0.04648972302675247, 0.016866248100996017, 0.10982175916433334, 0.1750115305185318, 0.006158630363643169, 0.02240300551056862, -0.04808395728468895, 0.18388523161411285, -0.07178688794374466, -0.01880452409386635, 0.11912795901298523, 0.00287402281537652, 0.07442020624876022, 0.1604449450969696, 0.04926512390375137, -0.07157367467880249, 0.0038038433995097876, 0.02649620920419693, -0.019352437928318977, -0.23518411815166473, 
-0.057220470160245895, -0.05490763485431671, -0.006812490988522768, 0.12034664303064346, 0.023524312302470207, -0.0032697170972824097, 0.05501972883939743, 0.0068338834680616856, 0.02960025519132614, -0.045014120638370514, 0.06462956219911575, 0.06983741372823715, 0.03649793937802315, 0.13308334350585938, -0.02202717587351799, -0.019793691113591194, 0.04995097219944, 0.009519286453723907, 0.24752867221832275, -0.05390104278922081, 0.1433858424425125, 0.0663260892033577, 0.2143091857433319, 0.007293880917131901, 0.06750499457120895, -0.0058991932310163975, 0.0026393430307507515, -0.012703760527074337, -0.04094080254435539, -0.06686989963054657, 0.01590058207511902, 0.007212739437818527, 0.05540463700890541, -0.15052202343940735, 0.013689013198018074, 0.05436842888593674, 0.312860906124115, 0.10255306959152222, -0.31454598903656006, -0.09969381988048553, -0.0033991546370089054, -0.02263227477669716, -0.031687695533037186, 0.001322916359640658, 0.11505285650491714, -0.11791522800922394, 0.021870508790016174, -0.07003478705883026, 0.09307927638292313, -0.06950841099023819, 0.0153504628688097, 0.08029209822416306, 0.05548764392733574, 0.006911837495863438, 0.08600876480340958, -0.248220294713974, 0.2788905203342438, 0.0008760684286244214, 0.03524560108780861, -0.06819978356361389, 0.001629422651603818, 0.03195219114422798, 0.04906808212399483, 0.09441277384757996, 0.011967089027166367, -0.013252245262265205, -0.19115176796913147, -0.12619295716285706, 0.015217781998217106, 0.09036778658628464, -0.10009574145078659, 0.10720453411340714, -0.024236563593149185, -0.010293812490999699, 0.02194344811141491, -0.01695757545530796, -0.030105087906122208, -0.10426325350999832, 0.026791712269186974, -0.024951623752713203, 0.014026416465640068, -0.07549995183944702, -0.12180155515670776, -0.06623384356498718, 0.1772271990776062, -0.08002375811338425, -0.059416837990283966, -0.12293222546577454, 0.11873485893011093, 0.12625616788864136, -0.09773992002010345, 0.03611377626657486, 0.003709738375619054, 0.08329811692237854, 0.01380401011556387, -0.04712928459048271, 0.08718718588352203, -0.05278902128338814, -0.22537320852279663, -0.07118260860443115, 0.1584794968366623, 0.034841082990169525, 0.0803804099559784, -0.024989807978272438, 0.020890437066555023, -0.014607501216232777, -0.0797286406159401, 0.03753900155425072, 0.043224576860666275, 0.10683919489383698, 0.04293975234031677, -0.07416127622127533, 0.00974627397954464, -0.06502842158079147, -0.040785569697618484, 0.1908472180366516, 0.24630925059318542, -0.11098698526620865, 0.07775343954563141, 0.062057673931121826, -0.05693008005619049, -0.18670764565467834, 0.0044972398318350315, 0.11257801204919815, 0.031760379672050476, 0.014135903678834438, -0.1997600793838501, 0.0724397674202919, 0.1134733334183693, -0.02764209359884262, 0.10163169354200363, -0.3420564830303192, -0.10916990041732788, 0.06202513352036476, 0.09973780810832977, 0.051446765661239624, -0.14586788415908813, -0.05067078769207001, 0.014423332177102566, -0.12915228307247162, 0.11862755566835403, -0.040773339569568634, 0.1260969638824463, -0.03714579716324806, 0.06748412549495697, 0.014705277048051357, -0.05729721486568451, 0.12632796168327332, 0.031442634761333466, 0.04696996137499809, -0.03648918494582176, -0.06461837142705917, 0.06222931295633316, -0.07083367556333542, 0.024538177996873856, -0.053437162190675735, 0.0452665239572525, -0.1354648917913437, -0.017482943832874298, -0.10009487718343735, 0.009075715206563473, -0.047985635697841644, -0.062391020357608795, -0.033898141235113144, 
0.0752202570438385, 0.0758376196026802, 0.0011262556072324514, 0.11256971955299377, 0.0019279270200058818, 0.1388130635023117, 0.09499935060739517, 0.057079214602708817, -0.05153781175613403, -0.06621614843606949, -0.03426751494407654, -0.02678413689136505, 0.037208717316389084, -0.1544644832611084, 0.024059327319264412, 0.1590331643819809, 0.040204089134931564, 0.14660212397575378, 0.06614339351654053, -0.04732208698987961, 0.012294548563659191, 0.06984843313694, -0.13385692238807678, -0.06417787075042725, -0.004049885552376509, -0.03207327798008919, -0.17697036266326904, 0.030241910368204117, 0.10099291056394577, -0.047580305486917496, -0.03544732555747032, -0.0076186093501746655, 0.03436581417918205, -0.03209672123193741, 0.1909458190202713, 0.060598861426115036, 0.08198584616184235, -0.10775988548994064, 0.07066038250923157, 0.08839553594589233, -0.09797515720129013, 0.006582947447896004, 0.07534019649028778, -0.11065108329057693, -0.033335041254758835, 0.02415665239095688, 0.1179870218038559, -0.031236061826348305, -0.03902002051472664, -0.11932666599750519, -0.09967155754566193, 0.0887540727853775, 0.10592764616012573, 0.08625645935535431, 0.03869669511914253, -0.027980666607618332, -0.015695959329605103, -0.12102840840816498, 0.10409994423389435, 0.05018109455704689, 0.0597568042576313, -0.12216181308031082, 0.14640432596206665, 0.0062837060540914536, 0.07120107114315033, -0.011680267751216888, 0.024408506229519844, -0.10211503505706787, -0.009064637124538422, -0.10130289196968079, -0.016237283125519753, -0.055194392800331116, -0.012457123026251793, -0.01465903501957655, -0.050051409751176834, -0.03903692960739136, 0.009716521948575974, -0.10656105726957321, -0.05127081647515297, -0.0068208216689527035, 0.05559702590107918, -0.12400414049625397, -0.029262254014611244, 0.019532175734639168, -0.0845768004655838, 0.11459255963563919, 0.03550395369529724, 0.04305151477456093, 0.016622936353087425, -0.06441808491945267, -0.008699358440935612, 0.0335681177675724, 0.006486409343779087, 0.06742546707391739, -0.13929052650928497, -0.000024904766178224236, -0.023943277075886726, 0.0075170486234128475, 0.03181523457169533, 0.07768839597702026, -0.14231297373771667, 0.00750677241012454, -0.031743645668029785, -0.04281138256192207, -0.05759338289499283, 0.05270396173000336, 0.06487062573432922, 0.03469512611627579, 0.17139144241809845, -0.06051573157310486, 0.06221450865268707, -0.22918978333473206, -0.016115358099341393, -0.03483286499977112, -0.05844610184431076, -0.10231897234916687, -0.04104965180158615, 0.0699479803442955, -0.04983486235141754, 0.10821381211280823, -0.003361530601978302, 0.05831886827945709, 0.01368839293718338, -0.020542455837130547, 0.013746724463999271, 0.02091282606124878, 0.14705894887447357, 0.03881572186946869, -0.0399656780064106, 0.07714615762233734, 0.03793443366885185, 0.06002020463347435, 0.11087268590927124, 0.21047216653823853, 0.115610271692276, 0.0803760513663292, 0.05758970230817795, 0.0372602716088295, -0.09508788585662842, -0.18383103609085083, 0.06945374608039856, -0.02488430216908455, 0.09562558680772781, -0.012673270888626575, 0.22203080356121063, 0.08041969686746597, -0.2096233069896698, 0.05004662275314331, -0.030267097055912018, -0.08666305989027023, -0.09512084722518921, -0.05832970142364502, -0.07710352540016174, -0.13991983234882355, -0.008543593809008598, -0.11844254285097122, 0.002364607760682702, 0.12543198466300964, 0.0032637938857078552, 0.004720437340438366, 0.07739168405532837, 0.02865087427198887, 0.026730535551905632, 
0.07194648683071136, 0.03527044877409935, 0.009594527073204517, -0.07092561572790146, -0.06501802057027817, 0.014000559225678444, 0.00012341677211225033, 0.06812265515327454, -0.06200827285647392, -0.04442961886525154, 0.0451233871281147, -0.004859639797359705, -0.10453305393457413, 0.013268515467643738, 0.0033196185249835253, 0.07136803865432739, 0.08988572657108307, 0.023401305079460144, 0.016610531136393547, -0.020984290167689323, 0.24697013199329376, -0.07261330634355545, -0.059696014970541, -0.13553421199321747, 0.25831669569015503, 0.034904517233371735, -0.04124355688691139, 0.05048779770731926, -0.09459979087114334, -0.001203623483888805, 0.19131441414356232, 0.21432772278785706, -0.08968737721443176, -0.02475512959063053, 0.034852221608161926, -0.009138394147157669, -0.010248608887195587, 0.08908098936080933, 0.13177937269210815, 0.08159764856100082, -0.0994899645447731, -0.03908936306834221, -0.0637514665722847, -0.028889110311865807, -0.02116505615413189, 0.06854084879159927, 0.02702694945037365, 0.006956425495445728, -0.041898343712091446, 0.06436143815517426, -0.056834712624549866, -0.090567946434021, 0.06612572073936462, -0.24072374403476715, -0.20172302424907684, -0.024066586047410965, 0.053641512989997864, 0.012781200930476189, 0.054485250264406204, -0.0074635581113398075, -0.0012099662562832236, 0.12577122449874878, -0.020041409879922867, -0.07887023687362671, -0.08040914684534073, 0.09293722361326218, -0.07045373320579529, 0.20886921882629395, -0.039215222001075745, 0.033465299755334854, 0.1282019019126892, 0.0655919536948204, -0.10468311607837677, 0.036526069045066833, 0.059599149972200394, -0.03631975129246712, 0.0358760729432106, 0.11619261652231216, -0.026609037071466446, 0.06928674876689911, 0.05569480359554291, -0.12807154655456543, 0.0010191404726356268, -0.06751436740159988, -0.05236814171075821, -0.05813189223408699, -0.0076556336134672165, -0.028282538056373596, 0.1371109038591385, 0.22210712730884552, -0.057784710079431534, 0.0009534207638353109, -0.06656487286090851, 0.004491706378757954, 0.03589824587106705, 0.0712064877152443, -0.04264117404818535, -0.21806086599826813, 0.02623739093542099, 0.01594693399965763, 0.002062045270577073, -0.25294268131256104, -0.0837215855717659, 0.023262949660420418, -0.06504999101161957, -0.1126651018857956, 0.11255036294460297, 0.012756455689668655, 0.049101535230875015, -0.049437016248703, -0.03034740686416626, -0.06729938834905624, 0.15535767376422882, -0.1751621812582016, -0.08258108049631119 ]
null
null
null
- 100 million trainable parameters
- GPT model definition from https://github.com/mistralai/mistral-src/blob/main/one_file_ref.py
- Trained for 3 hours on 4xA100 80GB.
- Repo at https://github.com/ratisbonrobotics/gpt
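A brief sketch of inspecting the training data named in this record is shown below. The dataset id comes from the record's metadata (`markusheimerl/socratic_dialogs`); the record does not document the dataset's splits or columns, so the sketch only loads it and prints what is actually there rather than assuming a schema.

```python
from datasets import load_dataset

# Hedged sketch: inspect the dataset this record lists as its training data.
# Loading without a split returns a DatasetDict, which reveals the actual
# split names and column names.
ds = load_dataset("markusheimerl/socratic_dialogs")
print(ds)                      # available splits and their columns

first_split = next(iter(ds))   # whichever split is listed first
print(ds[first_split][0])      # one raw example from that split
```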
{"language": ["en"], "license": "apache-2.0", "datasets": ["markusheimerl/socratic_dialogs"]}
null
markusheimerl/gpt_100m
[ "en", "dataset:markusheimerl/socratic_dialogs", "license:apache-2.0", "region:us" ]
2024-02-13T21:28:23+00:00
[]
[ "en" ]
TAGS #en #dataset-markusheimerl/socratic_dialogs #license-apache-2.0 #region-us
- 100 million trainable parameters - GPT model definition from URL - Trained for 3 hours on 4xA100 80GB. - Repo at URL
[]
[ "TAGS\n#en #dataset-markusheimerl/socratic_dialogs #license-apache-2.0 #region-us \n" ]
[ 33 ]
[ "passage: TAGS\n#en #dataset-markusheimerl/socratic_dialogs #license-apache-2.0 #region-us \n" ]
[ -0.06534311175346375, 0.258933424949646, -0.005976614076644182, 0.03859489783644676, -0.06460468471050262, -0.05544065311551094, 0.21638184785842896, 0.10229083150625229, 0.12753184139728546, -0.10321910679340363, 0.16019320487976074, 0.08914182335138321, 0.008633168414235115, 0.05461468547582626, -0.06295832246541977, -0.06833279877901077, 0.08620097488164902, -0.0377567782998085, -0.10007335245609283, 0.02702910825610161, 0.10432498902082443, -0.04213057458400726, 0.03635556250810623, -0.016829075291752815, 0.011221364140510559, 0.024646934121847153, 0.0017022764077410102, -0.04472491145133972, 0.06339918076992035, 0.005472882185131311, -0.035427238792181015, 0.07555948942899704, -0.043439704924821854, -0.1474585086107254, 0.045698270201683044, -0.056648917496204376, -0.05270854011178017, 0.06983540207147598, 0.021601682528853416, -0.0025089355185627937, 0.057795654982328415, 0.022907670587301254, -0.07128924876451492, 0.06454341113567352, -0.15600229799747467, -0.2024306058883667, -0.19658084213733673, 0.016528349369764328, 0.06647411733865738, 0.032174620777368546, 0.012493074871599674, 0.1657111644744873, -0.09359841793775558, -0.004465947858989239, 0.04525991529226303, -0.29019519686698914, 0.009027479216456413, 0.08155850321054459, -0.04783441871404648, 0.03194800019264221, 0.03337356820702553, 0.10417792946100235, 0.10474185645580292, -0.06645958125591278, -0.07644537836313248, -0.0624220110476017, -0.058411773294210434, 0.1028267964720726, -0.04353250935673714, -0.0682295486330986, 0.47458556294441223, 0.09286558628082275, 0.024645721539855003, 0.07600048184394836, -0.013170590624213219, 0.16319003701210022, 0.0705123245716095, 0.033187732100486755, 0.04618782922625542, 0.14635346829891205, 0.1225833147764206, -0.09957370907068253, -0.14792922139167786, 0.04043196514248848, -0.15444812178611755, 0.0342458114027977, -0.014579026959836483, 0.10164910554885864, -0.17241057753562927, -0.01500293705612421, 0.005836158525198698, -0.11886541545391083, -0.015234914608299732, -0.07364565879106522, 0.08239102363586426, 0.05451611056923866, -0.06857448816299438, 0.13460534811019897, 0.1587369441986084, 0.2417801022529602, 0.06659223139286041, -0.015758613124489784, -0.017612777650356293, 0.1569233387708664, 0.09293749183416367, -0.024811599403619766, 0.030357960611581802, -0.0110330143943429, 0.10964836925268173, -0.09933217614889145, 0.12071430683135986, -0.0115862637758255, -0.08909650892019272, -0.00825376808643341, -0.15034960210323334, 0.1391046941280365, 0.1647564172744751, -0.0590544193983078, -0.09583106637001038, 0.02311721444129944, 0.031769536435604095, -0.02166871167719364, -0.0025047636590898037, -0.01700190082192421, -0.02160676196217537, 0.007317694369703531, -0.0035284156911075115, 0.0729881003499031, 0.013082324527204037, -0.00444083521142602, -0.06501109153032303, -0.024678394198417664, -0.006701696198433638, 0.08958027511835098, 0.15159529447555542, -0.007098241243511438, 0.06353819370269775, -0.03037969395518303, -0.1219303160905838, 0.023542627692222595, 0.07411644607782364, 0.02068254165351391, -0.05331951752305031, 0.022355658933520317, 0.02888232283294201, -0.0714016705751419, -0.08798759430646896, -0.05795685201883316, -0.09167248755693436, 0.011391726322472095, -0.05689971521496773, -0.022899340838193893, -0.15861830115318298, -0.00026886779232881963, -0.11997106671333313, 0.06067907437682152, 0.046212442219257355, -0.03549114614725113, -0.10508633404970169, 0.12328918278217316, -0.03182975575327873, 0.04068627208471298, -0.024234546348452568, 
-0.003137287450954318, -0.004920809529721737, 0.2000793218612671, -0.1601647585630417, 0.0004278072447050363, 0.1184040904045105, -0.12099473178386688, -0.23267099261283875, 0.09278850257396698, -0.023534968495368958, 0.023919347673654556, 0.08477157354354858, 0.2839738130569458, -0.08269856870174408, -0.08325868844985962, 0.003988579381257296, 0.1192338764667511, -0.08510057628154755, -0.1555376499891281, 0.1415640413761139, -0.1314280778169632, -0.09620372951030731, -0.00567377544939518, -0.07218458503484726, 0.03659876435995102, -0.0182332843542099, -0.12233541160821915, -0.03354475274682045, -0.0607382170855999, 0.011607435531914234, 0.0011515965452417731, -0.01965160295367241, -0.12452524900436401, 0.05524957552552223, -0.04493064805865288, 0.0792633518576622, 0.1361209750175476, 0.017234334722161293, -0.08122559636831284, 0.02985050156712532, -0.05879727005958557, 0.027036406099796295, -0.09300491958856583, -0.030925048515200615, -0.04362044483423233, -0.03763928636908531, 0.15394268929958344, 0.2638188898563385, 0.028760839253664017, -0.13342653214931488, -0.014640253968536854, 0.04849996790289879, 0.014168829657137394, 0.04555400460958481, 0.019271481782197952, -0.1799839735031128, 0.05137774720788002, -0.05185855180025101, 0.16365337371826172, 0.057384930551052094, -0.00641807122156024, 0.06431793421506882, 0.06477631628513336, -0.03832995891571045, 0.0877133384346962, 0.0414130762219429, -0.07798712700605392, 0.03651212155818939, 0.005128978285938501, 0.0898255780339241, 0.06051083654165268, -0.09297896176576614, 0.0681689977645874, 0.024716980755329132, 0.17735238373279572, 0.17503736913204193, -0.04258597269654274, 0.12480262666940689, -0.05564168468117714, -0.0023850949946790934, 0.0033299403730779886, 0.055156514048576355, 0.04752758890390396, -0.02796793356537819, 0.02168411575257778, -0.015844007954001427, -0.04918449744582176, -0.03591164946556091, -0.07418482005596161, -0.11829321086406708, -0.07411344349384308, 0.0319448783993721, 0.1835399717092514, -0.20192064344882965, 0.19234958291053772, 0.42726296186447144, 0.07455199211835861, 0.09851060807704926, -0.1482548862695694, -0.029896046966314316, -0.07565173506736755, -0.04924129322171211, -0.05742082744836807, 0.172848641872406, -0.12046308815479279, 0.04894173517823219, 0.09605702757835388, 0.030013786628842354, 0.03142200782895088, -0.16336703300476074, -0.1016482263803482, 0.007822024635970592, -0.04707936942577362, -0.09703649580478668, 0.022796429693698883, -0.10986960679292679, 0.0803917646408081, 0.0022189237643033266, -0.04627804085612297, 0.13572488725185394, -0.022861719131469727, -0.07141073048114777, 0.04797850921750069, -0.20072132349014282, -0.106968954205513, -0.03690916672348976, 0.01568998210132122, -0.09344550222158432, 0.022421643137931824, 0.07639380544424057, -0.04549732804298401, -0.06341861933469772, 0.03745349496603012, -0.07550367712974548, -0.06675457954406738, -0.013085906393826008, 0.14655572175979614, 0.06515765935182571, 0.0066505298018455505, -0.1303420215845108, -0.05570428445935249, -0.007815239019691944, 0.04994754493236542, 0.010170691646635532, -0.013163755647838116, 0.09647019952535629, 0.05956953763961792, 0.05402836948633194, 0.03351684287190437, -0.015664536505937576, 0.17597121000289917, -0.05207354947924614, -0.04208870232105255, 0.17005544900894165, -0.020221855491399765, 0.04516787454485893, 0.14167585968971252, 0.0366545170545578, -0.07703039050102234, -0.01138878520578146, -0.05132802948355675, -0.06344761699438095, -0.31562721729278564, -0.07595322281122208, 
-0.1036052405834198, 0.18868762254714966, -0.049773987382650375, 0.009791986085474491, 0.06509236991405487, 0.07683074474334717, -0.009826380759477615, -0.12197432667016983, -0.011074868962168694, 0.030754821375012398, 0.24401184916496277, -0.0803951844573021, -0.00878173392266035, -0.17519882321357727, 0.03974504768848419, 0.1900469958782196, 0.09611820429563522, 0.07289601117372513, 0.20494960248470306, 0.05258742719888687, 0.1280669867992401, 0.2077053189277649, -0.015468946658074856, -0.013224820606410503, 0.09211589395999908, 0.004981651436537504, -0.058232951909303665, -0.02906348928809166, -0.07232507318258286, 0.08222746849060059, -0.002513497369363904, -0.16439618170261383, 0.03850829228758812, -0.08161263912916183, 0.10316783934831619, 0.1442004293203354, 0.10842619836330414, 0.007411630358546972, -0.020341506227850914, 0.10894814133644104, 0.08430028706789017, 0.07706917077302933, 0.09335170686244965, -0.05762319266796112, -0.06990063190460205, 0.12371137738227844, -0.008533693850040436, 0.09452414512634277, 0.04678887501358986, 0.035400621592998505, -0.16611801087856293, -0.1029287651181221, 0.048926107585430145, 0.1456998884677887, -0.2528286278247833, 0.28364813327789307, 0.019216889515519142, -0.08937395364046097, -0.11900942027568817, -0.0202920101583004, 0.065919429063797, 0.14055556058883667, 0.10795659571886063, 0.03690384328365326, -0.24379582703113556, 0.16600248217582703, -0.04290109872817993, 0.048983920365571976, -0.01388216856867075, 0.030032850801944733, -0.13862241804599762, -0.04407011345028877, 0.0625419095158577, 0.028220361098647118, 0.09620379656553268, -0.08524295687675476, -0.08960625529289246, 0.03829837590456009, 0.1260748952627182, -0.058771662414073944, -0.04631487652659416, 0.030867308378219604, -0.04693913459777832, 0.16922172904014587, 0.02156408689916134, -0.023904014378786087, -0.05440004914999008, -0.09883425384759903, 0.050178054720163345, -0.04106806218624115, -0.018957097083330154, -0.0386185459792614, -0.028558721765875816, -0.08532679080963135, -0.17875055968761444, 0.10161683708429337, -0.14896376430988312, -0.008691969327628613, -0.023510629311203957, 0.13969433307647705, -0.031030161306262016, 0.03529207780957222, 0.0037255878560245037, 0.018407637253403664, -0.08590977638959885, -0.135772243142128, 0.08136462420225143, -0.06054918095469475, 0.04786785691976547, 0.0176246277987957, 0.053311534225940704, -0.10270378738641739, 0.04544414207339287, -0.08957294374704361, 0.11713739484548569, 0.37946799397468567, -0.06324724107980728, 0.19588938355445862, 0.30566585063934326, -0.06694526970386505, -0.22931601107120514, -0.07967470586299896, -0.27454766631126404, -0.12216551601886749, 0.13578033447265625, -0.17120195925235748, 0.09646692872047424, 0.14359720051288605, -0.1547129899263382, 0.0568305142223835, -0.17524580657482147, -0.043556876480579376, 0.17965401709079742, -0.08524047583341599, 0.382836252450943, -0.09875407069921494, -0.09347479045391083, -0.05156690627336502, -0.21806608140468597, 0.1398683488368988, -0.19806157052516937, -0.011540807783603668, 0.07594600319862366, -0.002471582731232047, -0.020126059651374817, 0.0006265084375627339, 0.25095632672309875, 0.008716669864952564, 0.05503596365451813, -0.11683584004640579, -0.058827631175518036, 0.19719207286834717, -0.020174644887447357, 0.028187844902276993, -0.11957237124443054, 0.037878163158893585, -0.11500291526317596, 0.013118249364197254, -0.08066926896572113, 0.06808624416589737, -0.0106124859303236, -0.11501776427030563, -0.13650372624397278, -0.009874078445136547, 
-0.03119577281177044, -0.024352025240659714, 0.24322117865085602, 0.024734502658247948, -0.023623183369636536, 0.09201359748840332, 0.09951517730951309, -0.15840287506580353, 0.0702463835477829, -0.07424401491880417, -0.05873895809054375, 0.028405865654349327, -0.16362765431404114, -0.010660377331078053, 0.09841840714216232, -0.0184921994805336, 0.07511846721172333, -0.013399830088019371, -0.047943972051143646, 0.01919778622686863, 0.08936338871717453, -0.14783930778503418, -0.18465082347393036, 0.04207555577158928, 0.072032630443573, 0.05329904705286026, 0.10097189247608185, 0.11464021354913712, 0.06454932689666748, -0.022221561521291733, 0.014794833026826382, 0.07255982607603073, -0.13705070316791534, -0.050340376794338226, 0.06324364244937897, -0.027085954323410988, -0.13503648340702057, 0.26553258299827576, -0.009959638118743896, -0.0978575348854065, 0.0074023958295583725, 0.021211642771959305, -0.060571663081645966, -0.07218383997678757, -0.11671857535839081, 0.07728025317192078, -0.14682742953300476, -0.13458487391471863, -0.05004633218050003, -0.09022767841815948, -0.0034010345116257668, -0.01608959399163723, 0.09047235548496246, 0.030278779566287994, 0.027498099952936172, -0.036704860627651215, 0.09097125381231308, -0.03434967249631882, -0.012792937457561493, -0.08281084150075912, -0.034059472382068634, -0.16052967309951782, -0.026616724207997322, 0.09559386968612671, -0.0174112506210804, -0.05976947396993637, -0.06787284463644028, 0.053972650319337845, -0.12727512419223785, -0.052150145173072815, -0.1601422131061554, -0.009957672096788883, 0.04307034984230995, -0.1776108741760254, -0.027523936703801155, 0.005924133118242025, -0.1477767527103424, 0.024042904376983643, 0.04401126503944397, 0.10959053039550781, -0.15943960845470428, -0.07204720377922058, 0.13248489797115326, 0.05608222261071205, 0.14726214110851288, 0.09467252343893051, -0.051127102226018906, 0.07892963290214539, -0.17351055145263672, -0.0986933559179306, 0.0929342731833458, 0.06878308951854706, 0.017978504300117493, -0.06391598284244537, -0.1006011962890625, 0.11271288245916367, 0.024721713736653328, 0.0401834212243557, -0.060203563421964645, -0.0783926323056221, -0.037857599556446075, 0.05038987472653389, -0.10809185355901718, 0.024286724627017975, -0.11468370258808136, 0.1847686469554901, 0.034262921661138535, 0.14323098957538605, 0.060239121317863464, -0.002462545409798622, -0.03246866539120674, 0.03844894468784332, -0.013412405736744404, -0.13390012085437775, -0.16898386180400848, -0.04958261176943779, -0.03467854857444763, -0.03485298156738281, 0.26288676261901855, -0.04947367683053017, -0.14122821390628815, 0.10506091266870499, 0.05468914285302162, 0.01242650393396616, 0.03503746539354324, 0.1637238711118698, -0.03664649277925491, -0.003407115815207362, -0.10675332695245743, 0.0101338354870677, 0.000014182482118485495, -0.1199285164475441, 0.15505121648311615, 0.1259940266609192, 0.13051211833953857, 0.03402373194694519, 0.03211387246847153, -0.00445371400564909, -0.010534573346376419, -0.07983756810426712, 0.08415567874908447, 0.04562358185648918, 0.005540091544389725, 0.09438703209161758, 0.10975876450538635, -0.06889256834983826, 0.010558201000094414, -0.07151391357183456, 0.02241828665137291, -0.111845001578331, -0.14659017324447632, -0.007616767194122076, -0.07488340139389038, 0.010068153962492943, -0.07382453978061676, 0.0579741895198822, 0.14813446998596191, 0.0725850835442543, -0.12539054453372955, -0.03936713561415672, -0.1283121556043625, -0.03768758103251457, -0.02843240462243557, 
-0.007251386530697346, -0.012189434841275215, -0.11866931617259979, -0.10251815617084503, -0.044574074447155, 0.014180589467287064, -0.05889620631933212, 0.03782757371664047, -0.03692816197872162, -0.0208697859197855, -0.11091826111078262, -0.0442427434027195, -0.07651344686746597, 0.05912601575255394, 0.03875913843512535, 0.19815276563167572, 0.023824917152523994, 0.059873342514038086, 0.14248690009117126, 0.16586299240589142, -0.04973990097641945, -0.12209517508745193, -0.12408263236284256, 0.004566282033920288, -0.11617367714643478, 0.030952639877796173, 0.025835266336798668, -0.02603444829583168, 0.011414216831326485, 0.1476919800043106, 0.26866966485977173, -0.10302074253559113, 0.015773452818393707, -0.14758944511413574, 0.02669081650674343, -0.004060589708387852, 0.029586920514702797, 0.001889052800834179, 0.053274720907211304, -0.08351307362318039, 0.05046140030026436, 0.021182173863053322, 0.029232153668999672, -0.11596515774726868, -0.034772831946611404, -0.05816101282835007, -0.13583533465862274, 0.010161579586565495, 0.1423405259847641, -0.03644406795501709, -0.05287126079201698, -0.027603307738900185, -0.0982171818614006, -0.04982290789484978, 0.01928129605948925, 0.12150128185749054, 0.06066295877099037, 0.07108724117279053, -0.08627189695835114, -0.04584791511297226, 0.08118075132369995, 0.03220153599977493, -0.22111980617046356, -0.15886259078979492, 0.18116135895252228, 0.002994418842718005, 0.1538999080657959, 0.01631843112409115, 0.11058134585618973, 0.07531048357486725, 0.00924472976475954, -0.155775249004364, 0.04011130332946777, 0.1196504533290863, -0.013411667197942734, -0.13225598633289337, -0.20925477147102356, -0.02431904897093773, -0.0000674853436066769, 0.08540984988212585, 0.04004191234707832, 0.021231824532151222, 0.13010846078395844, -0.039411526173353195, -0.030900629237294197, 0.09009408950805664, -0.13576838374137878, 0.08465135097503662, -0.039226215332746506, -0.06281118094921112, -0.07175834476947784, -0.0021335273049771786, -0.0141829252243042, 0.018074383959174156, -0.21353726089000702, -0.048242196440696716, 0.16047047078609467, 0.01701585203409195, 0.08857555687427521, 0.03143562003970146, 0.010590692982077599, -0.0548165999352932, -0.08204378932714462, 0.04419366642832756, -0.04374684765934944, 0.03290433809161186, 0.08591058850288391, -0.04571925103664398, 0.030624995008111, -0.09386010468006134, 0.04025426134467125, -0.040655769407749176, 0.0031274408102035522, -0.03102106600999832 ]
null
null
transformers
# SMILES2IUPAC-canonical-small

SMILES2IUPAC-canonical-small was designed to accurately translate SMILES strings into standard IUPAC chemical names.

## Model Details

### Model Description

SMILES2IUPAC-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder.

- **Developed by:** Knowledgator Engineering
- **Model type:** Encoder-Decoder with attention mechanism
- **Language(s) (NLP):** SMILES, IUPAC (English)
- **License:** Apache License 2.0

### Model Sources

- **Paper:** coming soon
- **Demo:** [ChemicalConverters](https://huggingface.co/spaces/knowledgator/ChemicalConverters)

## Quickstart

Firstly, install the library:
```commandline
pip install chemical-converters
```

### SMILES to IUPAC

#### ! Preferred IUPAC style
To choose the preferred IUPAC style, place style tokens before your SMILES sequence.

| Style Token | Description |
|-------------|----------------------------------------------------------------------------------------------------|
| `<BASE>` | The most widely known name of the substance, sometimes a mixture of the traditional and systematic styles |
| `<SYST>` | The totally systematic style without trivial names |
| `<TRAD>` | The style is based on trivial names of the parts of substances |

#### To perform simple translation, follow the example:
```python
from chemicalconverters import NamesConverter

converter = NamesConverter(model_name="knowledgator/SMILES2IUPAC-canonical-small")
print(converter.smiles_to_iupac('CCO'))
print(converter.smiles_to_iupac(['<SYST>CCO', '<TRAD>CCO', '<BASE>CCO']))
```
```text
['ethanol']
['ethanol', 'ethanol', 'ethanol']
```
#### Processing in batches:
```python
from chemicalconverters import NamesConverter

converter = NamesConverter(model_name="knowledgator/SMILES2IUPAC-canonical-small")
print(converter.smiles_to_iupac(["<BASE>C=CC=C" for _ in range(10)], num_beams=1, process_in_batch=True, batch_size=1000))
```
```text
['buta-1,3-diene', 'buta-1,3-diene'...]
```
#### Validating SMILES to IUPAC translations
It's possible to validate the translations by reverse translation back into SMILES and calculating the Tanimoto similarity of the two molecules' fingerprints.
````python
from chemicalconverters import NamesConverter

converter = NamesConverter(model_name="knowledgator/SMILES2IUPAC-canonical-small")
print(converter.smiles_to_iupac('CCO', validate=True))
````
````text
['ethanol'] 1.0
````
The larger the Tanimoto similarity, the higher the probability that the prediction is correct.

You can also run the validation manually:
```python
from chemicalconverters import NamesConverter

validation_model = NamesConverter(model_name="knowledgator/IUPAC2SMILES-canonical-base")
print(NamesConverter.validate_iupac(input_sequence='CCO', predicted_sequence='CCO', validation_model=validation_model))
```
```text
1.0
```

## Bias, Risks, and Limitations

This model has limited accuracy when processing large molecules and currently doesn't support isomeric or isotopic SMILES.

### Training Procedure

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

The model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch_size=1024 for 2 epochs.
## Evaluation

| Model                               | Accuracy | BLEU-4 score | Size (MB) |
|-------------------------------------|----------|--------------|-----------|
| SMILES2IUPAC-canonical-small        | 75%      | 0.93         | 23        |
| SMILES2IUPAC-canonical-base         | 86.9%    | 0.964        | 180       |
| STOUT V2.0\*                        | 66.65%   | 0.92         | 128       |
| STOUT V2.0 (according to our tests) |          | 0.89         | 128       |

\*According to the original paper https://jcheminf.biomedcentral.com/articles/10.1186/s13321-021-00512-4

## Citation

Coming soon.

## Model Card Authors

[Mykhailo Shtopko](https://huggingface.co/BioMike)

## Model Card Contact

[email protected]
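The validation step described in this card compares molecular fingerprints by Tanimoto similarity. The card's own `validate_iupac` helper handles that internally; the sketch below is not that implementation, just an illustration of the underlying fingerprint comparison using RDKit (an assumed dependency the card does not name), with Morgan fingerprint parameters chosen arbitrarily.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def tanimoto(smiles_a: str, smiles_b: str) -> float:
    """Tanimoto similarity between Morgan fingerprints of two SMILES strings."""
    mol_a = Chem.MolFromSmiles(smiles_a)
    mol_b = Chem.MolFromSmiles(smiles_b)
    fp_a = AllChem.GetMorganFingerprintAsBitVect(mol_a, 2, nBits=2048)
    fp_b = AllChem.GetMorganFingerprintAsBitVect(mol_b, 2, nBits=2048)
    return DataStructs.TanimotoSimilarity(fp_a, fp_b)

# Identical molecules give 1.0, matching the card's validation example for 'CCO'.
print(tanimoto("CCO", "CCO"))
```

A score near 1.0 means the round-tripped structure matches the original, which is the same signal the card's `validate=True` flag reports.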
{"license": "apache-2.0", "tags": ["chemistry", "biology", "medical", "smiles", "iupac", "text-generation-inference"], "metrics": ["accuracy", "bleu"], "pipeline_tag": "text2text-generation", "widget": [{"text": "CCO", "example_title": "ethanol"}]}
text2text-generation
knowledgator/SMILES2IUPAC-canonical-small
[ "transformers", "safetensors", "mt5", "text2text-generation", "chemistry", "biology", "medical", "smiles", "iupac", "text-generation-inference", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T21:28:25+00:00
[]
[]
TAGS #transformers #safetensors #mt5 #text2text-generation #chemistry #biology #medical #smiles #iupac #text-generation-inference #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
SMILES2IUPAC-canonical-small ============================ SMILES2IUPAC-canonical-small was designed to accurately translate SMILES chemical names to IUPAC standards. Model Details ------------- ### Model Description SMILES2IUPAC-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder. * Developed by: Knowladgator Engineering * Model type: Encoder-Decoder with attention mechanism * Language(s) (NLP): SMILES, IUPAC (English) * License: Apache License 2.0 ### Model Sources * Paper: coming soon * Demo: ChemicalConverters Quickstart ---------- Firstly, install the library: ### SMILES to IUPAC #### ! Preferred IUPAC style To choose the preferred IUPAC style, place style tokens before your SMILES sequence. #### To perform simple translation, follow the example: #### Processing in batches: #### Validation SMILES to IUPAC translations It's possible to validate the translations by reverse translation into IUPAC and calculating Tanimoto similarity of two molecules fingerprints. ' ' The larger is Tanimoto similarity, the larger is probability, that the prediction was correct. You can also process validation manually: Bias, Risks, and Limitations ---------------------------- This model has limited accuracy in processing large molecules and currently, doesn't support isomeric and isotopic SMILES. ### Training Procedure The model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch\_size=1024 for 2 epochs. Evaluation ---------- Coming soon. Model Card Authors ------------------ Mykhailo Shtopko Model Card Contact ------------------ info@URL
[ "### Model Description\n\n\nSMILES2IUPAC-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder.\n\n\n* Developed by: Knowladgator Engineering\n* Model type: Encoder-Decoder with attention mechanism\n* Language(s) (NLP): SMILES, IUPAC (English)\n* License: Apache License 2.0", "### Model Sources\n\n\n* Paper: coming soon\n* Demo: ChemicalConverters\n\n\nQuickstart\n----------\n\n\nFirstly, install the library:", "### SMILES to IUPAC", "#### ! Preferred IUPAC style\n\n\nTo choose the preferred IUPAC style, place style tokens before\nyour SMILES sequence.", "#### To perform simple translation, follow the example:", "#### Processing in batches:", "#### Validation SMILES to IUPAC translations\n\n\nIt's possible to validate the translations by reverse translation into IUPAC\nand calculating Tanimoto similarity of two molecules fingerprints.\n'\n'\nThe larger is Tanimoto similarity, the larger is probability, that the prediction was correct.\n\n\nYou can also process validation manually:\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nThis model has limited accuracy in processing large molecules and currently, doesn't support isomeric and isotopic SMILES.", "### Training Procedure\n\n\nThe model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch\\_size=1024 for 2 epochs.\n\n\nEvaluation\n----------\n\n\n\nComing soon.\n\n\nModel Card Authors\n------------------\n\n\nMykhailo Shtopko\n\n\nModel Card Contact\n------------------\n\n\ninfo@URL" ]
[ "TAGS\n#transformers #safetensors #mt5 #text2text-generation #chemistry #biology #medical #smiles #iupac #text-generation-inference #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Model Description\n\n\nSMILES2IUPAC-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder.\n\n\n* Developed by: Knowladgator Engineering\n* Model type: Encoder-Decoder with attention mechanism\n* Language(s) (NLP): SMILES, IUPAC (English)\n* License: Apache License 2.0", "### Model Sources\n\n\n* Paper: coming soon\n* Demo: ChemicalConverters\n\n\nQuickstart\n----------\n\n\nFirstly, install the library:", "### SMILES to IUPAC", "#### ! Preferred IUPAC style\n\n\nTo choose the preferred IUPAC style, place style tokens before\nyour SMILES sequence.", "#### To perform simple translation, follow the example:", "#### Processing in batches:", "#### Validation SMILES to IUPAC translations\n\n\nIt's possible to validate the translations by reverse translation into IUPAC\nand calculating Tanimoto similarity of two molecules fingerprints.\n'\n'\nThe larger is Tanimoto similarity, the larger is probability, that the prediction was correct.\n\n\nYou can also process validation manually:\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nThis model has limited accuracy in processing large molecules and currently, doesn't support isomeric and isotopic SMILES.", "### Training Procedure\n\n\nThe model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch\\_size=1024 for 2 epochs.\n\n\nEvaluation\n----------\n\n\n\nComing soon.\n\n\nModel Card Authors\n------------------\n\n\nMykhailo Shtopko\n\n\nModel Card Contact\n------------------\n\n\ninfo@URL" ]
[ 75, 94, 30, 9, 32, 11, 8, 121, 74 ]
[ "passage: TAGS\n#transformers #safetensors #mt5 #text2text-generation #chemistry #biology #medical #smiles #iupac #text-generation-inference #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Model Description\n\n\nSMILES2IUPAC-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder.\n\n\n* Developed by: Knowladgator Engineering\n* Model type: Encoder-Decoder with attention mechanism\n* Language(s) (NLP): SMILES, IUPAC (English)\n* License: Apache License 2.0### Model Sources\n\n\n* Paper: coming soon\n* Demo: ChemicalConverters\n\n\nQuickstart\n----------\n\n\nFirstly, install the library:### SMILES to IUPAC#### ! Preferred IUPAC style\n\n\nTo choose the preferred IUPAC style, place style tokens before\nyour SMILES sequence.#### To perform simple translation, follow the example:#### Processing in batches:#### Validation SMILES to IUPAC translations\n\n\nIt's possible to validate the translations by reverse translation into IUPAC\nand calculating Tanimoto similarity of two molecules fingerprints.\n'\n'\nThe larger is Tanimoto similarity, the larger is probability, that the prediction was correct.\n\n\nYou can also process validation manually:\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nThis model has limited accuracy in processing large molecules and currently, doesn't support isomeric and isotopic SMILES.### Training Procedure\n\n\nThe model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch\\_size=1024 for 2 epochs.\n\n\nEvaluation\n----------\n\n\n\nComing soon.\n\n\nModel Card Authors\n------------------\n\n\nMykhailo Shtopko\n\n\nModel Card Contact\n------------------\n\n\ninfo@URL" ]
[ -0.01816508360207081, 0.19243282079696655, -0.006063899490982294, -0.051325444132089615, 0.01114952377974987, 0.016341712325811386, 0.10177384316921234, 0.12987172603607178, -0.006672416348010302, 0.1395346075296402, -0.05691524222493172, 0.004705336410552263, 0.15924644470214844, 0.07220668345689774, 0.04362326115369797, -0.22498291730880737, 0.097966268658638, -0.07446373254060745, -0.006525044795125723, 0.09154544025659561, 0.11607064306735992, -0.04467419162392616, 0.05957138538360596, 0.005489766132086515, 0.013978322967886925, -0.009536554105579853, -0.00045528163900598884, -0.07352519780397415, 0.03985186666250229, 0.034066203981637955, 0.025699017569422722, 0.08272606134414673, 0.021893952041864395, -0.2982316017150879, 0.012022378854453564, 0.05056589096784592, 0.030023163184523582, 0.04275748133659363, 0.18691694736480713, -0.020507769659161568, 0.11904805153608322, -0.10223156958818436, 0.04722869023680687, 0.048813074827194214, -0.09766469150781631, -0.15333744883537292, -0.0912301167845726, 0.06640177965164185, 0.043863777071237564, -0.031139792874455452, -0.026315748691558838, 0.10042383521795273, -0.0010314758401364088, 0.024980146437883377, 0.1716662049293518, -0.21713662147521973, -0.0020784130319952965, 0.016796568408608437, 0.07899976521730423, 0.07222314924001694, -0.01605822890996933, 0.020154399797320366, -0.012905780225992203, -0.03713361173868179, 0.036980096250772476, -0.032329387962818146, 0.08610643446445465, -0.03229039907455444, -0.12116137146949768, -0.07529260963201523, 0.10859041661024094, -0.02336227148771286, -0.10517697781324387, -0.14573925733566284, -0.06094075366854668, 0.08595985174179077, -0.046001799404621124, -0.11499869078397751, 0.06662420183420181, 0.02014913223683834, 0.11196854710578918, 0.00917065143585205, -0.09092883765697479, -0.01866554468870163, 0.01007848046720028, 0.0982208400964737, 0.043434664607048035, 0.0023925958666950464, 0.06734461337327957, 0.038555338978767395, -0.04098183289170265, -0.09065267443656921, -0.046946823596954346, -0.06645041704177856, -0.10188513249158859, -0.06724148988723755, -0.00892975740134716, -0.061452291905879974, 0.061457641422748566, 0.11938019841909409, -0.027997802942991257, 0.07620886713266373, 0.004623623099178076, -0.030025048181414604, 0.04685128852725029, 0.14572973549365997, -0.09766346216201782, -0.06944340467453003, -0.05013999342918396, 0.010637038387358189, -0.021646179258823395, 0.0378166027367115, 0.026185667142271996, -0.058480482548475266, 0.14417025446891785, 0.08537036925554276, 0.029022326692938805, 0.013777642510831356, -0.048318468034267426, -0.08002914488315582, 0.19656015932559967, -0.1129167228937149, 0.03744888678193092, 0.03482344001531601, -0.04401519149541855, 0.12939760088920593, 0.014113000594079494, -0.03203120082616806, -0.10302073508501053, 0.02310205064713955, -0.040839605033397675, 0.01907094568014145, -0.09035639464855194, -0.10693894326686859, 0.0249672569334507, 0.05646772310137749, -0.09129185974597931, -0.08774102479219437, -0.05685943737626076, -0.07526423037052155, 0.04548740014433861, -0.06626700609922409, 0.03493203967809677, 0.032428544014692307, 0.025770684704184532, 0.029939567670226097, 0.004995348397642374, -0.025211619213223457, -0.03330960497260094, 0.008921068161725998, -0.027694856747984886, 0.12306917458772659, 0.17132768034934998, 0.029056036844849586, -0.1165027841925621, 0.06745585054159164, -0.2207307517528534, 0.051083046942949295, -0.08974267542362213, -0.018117006868124008, -0.16740508377552032, -0.08450670540332794, -0.01416225265711546, 
-0.027691662311553955, 0.002106965286657214, 0.10765343904495239, -0.14389118552207947, -0.01921454817056656, 0.2056884765625, -0.13023681938648224, -0.025706086307764053, 0.09642595797777176, 0.03676728904247284, 0.013339533470571041, 0.07565131783485413, 0.10708600282669067, 0.09569112211465836, -0.1325949728488922, -0.0466826893389225, -0.041602157056331635, -0.03528107702732086, 0.1279740333557129, 0.0961289256811142, -0.01486958097666502, 0.013775203377008438, 0.041804395616054535, -0.0003309565654490143, -0.05430125072598457, -0.013768273405730724, -0.027419503778219223, 0.0020610445644706488, -0.008506213314831257, -0.02871575765311718, -0.023575425148010254, -0.047018859535455704, -0.043886374682188034, -0.05940105766057968, -0.02421543002128601, 0.07140308618545532, -0.015030594542622566, 0.022612787783145905, -0.08083849400281906, 0.05333774536848068, -0.010229429230093956, 0.004047759342938662, -0.14170756936073303, 0.008627825416624546, 0.11417924612760544, -0.18590594828128815, 0.011658716946840286, -0.07096976041793823, 0.004206751007586718, 0.06368979811668396, -0.04043295606970787, 0.0036106023471802473, 0.07678330689668655, -0.021417973563075066, -0.01145744789391756, -0.09318199008703232, -0.006824019830673933, -0.029157325625419617, 0.10938302427530289, -0.0771186575293541, 0.004008506890386343, 0.04930884391069412, 0.1115124374628067, 0.03779364004731178, -0.00692896731197834, 0.08179984986782074, -0.02355607971549034, -0.00295942067168653, -0.05632385239005089, 0.015050196088850498, -0.03239480406045914, 0.00981625821441412, 0.1562168449163437, -0.19606490433216095, -0.12397172302007675, 0.07591412961483002, 0.06867169588804245, -0.06283898651599884, -0.10230410099029541, -0.009844924323260784, -0.016784900799393654, -0.030659127980470657, -0.005753074306994677, -0.01740088127553463, 0.04076904058456421, 0.0790315493941307, -0.11982164531946182, -0.006666994653642178, -0.03473178669810295, -0.05532430112361908, -0.12066850811243057, 0.03119080327451229, 0.04623882472515106, -0.20559118688106537, 0.015999160706996918, -0.0018286589765921235, 0.0848783552646637, 0.06331174820661545, 0.04329944774508476, -0.11424180120229721, -0.013017388060688972, 0.04646431282162666, 0.06138208508491516, 0.02041913941502571, 0.039003271609544754, 0.042588796466588974, 0.09474863111972809, 0.02248973585665226, 0.012401534244418144, -0.07327954471111298, 0.11091875284910202, 0.04403550550341606, -0.06291338801383972, 0.015904905274510384, -0.032087843865156174, 0.037181831896305084, 0.0929301530122757, -0.0022971059661358595, 0.09438613802194595, -0.10828086733818054, -0.040578443557024, -0.11307904869318008, 0.17833344638347626, -0.09548165649175644, -0.2513602375984192, -0.1636650562286377, 0.008818358182907104, 0.01890500634908676, 0.03781978413462639, -0.0018200170015916228, 0.005499923601746559, -0.08852478861808777, -0.0951675996184349, 0.0257523525506258, 0.03511887043714523, -0.04184472933411598, -0.1245461106300354, 0.04743365943431854, 0.06885385513305664, -0.11061044037342072, -0.0037447912618517876, 0.04875363036990166, -0.004294475540518761, 0.09544814378023148, -0.053028617054224014, 0.0705946832895279, 0.10697022825479507, -0.002028967486694455, -0.05375460162758827, -0.004549847450107336, 0.1487073004245758, -0.057194191962480545, 0.13690976798534393, 0.17207710444927216, 0.04110752418637276, 0.11375938355922699, 0.1383862942457199, -0.03365376964211464, -0.0667906180024147, 0.05072379484772682, -0.025270545855164528, -0.008624321781098843, -0.2070808857679367, 
-0.06553643196821213, -0.018418803811073303, 0.015183726325631142, 0.06309515237808228, 0.024950560182332993, 0.02113480679690838, 0.08632457256317139, -0.09622648358345032, -0.026722870767116547, 0.054609376937150955, 0.10677406191825867, 0.07443127781152725, 0.04696027562022209, 0.07130684703588486, -0.025522245094180107, -0.00033585671917535365, 0.10839316993951797, 0.036638349294662476, 0.13721761107444763, -0.011544485576450825, 0.19717924296855927, 0.06505243480205536, 0.04393962770700455, 0.06608884036540985, 0.09507029503583908, -0.0028356541879475117, 0.059775460511446, -0.0431213453412056, -0.08045165985822678, -0.01593657024204731, 0.075921431183815, 0.0387549102306366, -0.026040444150567055, 0.03104480728507042, 0.03321323171257973, 0.05685838311910629, 0.20880191028118134, 0.05287715420126915, -0.19828613102436066, -0.018808793276548386, 0.0794353261590004, -0.06968584656715393, -0.08443781733512878, -0.04015623405575752, 0.026796484366059303, -0.11162631213665009, 0.09979654848575592, -0.028380976989865303, 0.0869908258318901, -0.07424893230199814, -0.025073295459151268, 0.0382012240588665, 0.08531893044710159, -0.022216742858290672, 0.11554960161447525, -0.16752085089683533, 0.08348765224218369, 0.013010821305215359, 0.04045124724507332, -0.05532962828874588, 0.045560307800769806, 0.00016097171464934945, 0.05363709107041359, 0.08475601673126221, 0.02642456255853176, -0.20013253390789032, -0.07128344476222992, -0.11629970371723175, -0.0387299619615078, 0.11694543808698654, 0.017329735681414604, 0.13662642240524292, -0.022621050477027893, -0.036671653389930725, -0.07851676642894745, -0.01968785747885704, -0.16698910295963287, -0.12886621057987213, 0.08910893648862839, -0.11921930313110352, 0.08221980184316635, -0.06896425038576126, 0.013486329466104507, -0.0720803365111351, 0.12251246720552444, -0.04702253267168999, -0.0841246098279953, -0.1491008698940277, -0.06972100585699081, 0.18456508219242096, -0.07397989183664322, -0.008269988000392914, -0.032123807817697525, 0.17296092212200165, -0.027631374076008797, -0.07685238122940063, 0.06076901778578758, -0.1103210374712944, -0.1587161272764206, -0.056260690093040466, 0.14399588108062744, 0.009290539659559727, 0.06963106244802475, 0.05189649015665054, 0.032727040350437164, -0.003161685774102807, -0.08152651786804199, 0.038286104798316956, 0.10523116588592529, 0.045315902680158615, 0.03547993674874306, -0.08706807345151901, -0.1433112621307373, -0.12287488579750061, -0.03504627197980881, -0.0035629665944725275, 0.25803595781326294, -0.04371635988354683, 0.13444454967975616, 0.193353071808815, -0.120638906955719, -0.14011111855506897, -0.045731764286756516, 0.03915732353925705, 0.012764827348291874, -0.018814461305737495, -0.19008886814117432, 0.09433423727750778, 0.03308744356036186, -0.007067120634019375, 0.01346112322062254, -0.19338560104370117, -0.14278899133205414, -0.00802027340978384, 0.0484066978096962, -0.09004431217908859, -0.12034888565540314, -0.08262068033218384, -0.10255151987075806, -0.14014486968517303, 0.06783870607614517, 0.0395476371049881, 0.0583171620965004, -0.002817561849951744, -0.003928289748728275, 0.030281372368335724, -0.03338440880179405, 0.15576042234897614, -0.004481064155697823, 0.04488592967391014, -0.05018264427781105, 0.05796470493078232, 0.004852592945098877, 0.01347858551889658, 0.12392064929008484, 0.06330309808254242, 0.01476363930851221, -0.04019494354724884, -0.06994011998176575, -0.05207476764917374, 0.006837088149040937, -0.04160319268703461, -0.05170951783657074, -0.08779177069664001, 
0.09846329689025879, 0.11856727302074432, -0.04578060284256935, -0.14952373504638672, -0.05720873549580574, -0.011072601191699505, 0.1878853738307953, 0.14942492544651031, -0.015720544382929802, 0.027872366830706596, 0.02700919844210148, 0.015608183108270168, 0.012298187240958214, 0.019510610029101372, 0.07038313150405884, 0.06886731833219528, -0.003282164689153433, 0.11054662615060806, -0.04656343162059784, -0.1390209197998047, -0.06387490034103394, 0.0816505029797554, -0.0782972052693367, -0.16303415596485138, 0.015648821368813515, 0.11108289659023285, -0.06090663745999336, -0.14384876191616058, 0.08594177663326263, -0.0027218456380069256, -0.07191893458366394, -0.0015769366873428226, 0.08032199740409851, 0.062104858458042145, 0.1522744596004486, 0.01986842229962349, 0.04884706065058708, -0.04351842775940895, 0.1240835189819336, 0.11609738320112228, -0.1602884829044342, -0.052520498633384705, 0.09517235308885574, -0.11378210783004761, -0.029756151139736176, 0.02212618663907051, 0.030363254249095917, 0.021309416741132736, -0.06776157021522522, -0.06213989108800888, -0.09857835620641708, 0.04325305297970772, 0.09054728597402573, 0.044039346277713776, 0.0021680830977857113, 0.025171179324388504, 0.007565530948340893, -0.042659692466259, 0.09038100391626358, 0.06618046760559082, 0.07257227599620819, -0.04212149232625961, 0.12064757198095322, -0.0457407683134079, -0.046222612261772156, -0.023374788463115692, -0.005771011114120483, -0.057443249970674515, -0.052319932729005814, -0.04812771826982498, 0.06827428936958313, -0.15812736749649048, -0.03582654520869255, 0.008112307637929916, 0.0035229327622801065, 0.002034231787547469, -0.012421619147062302, -0.09104500710964203, -0.05125093087553978, -0.11584363877773285, 0.11098860949277878, -0.10513407737016678, -0.001638342859223485, 0.1258619725704193, -0.08557502925395966, 0.059548232704401016, 0.0021676383912563324, -0.015568014234304428, -0.021965760737657547, -0.09739670157432556, -0.0545816570520401, 0.021155769005417824, 0.07807955145835876, -0.04377729818224907, -0.2570057213306427, 0.04615362733602524, 0.010849910788238049, -0.0641225278377533, -0.02924034558236599, 0.009168488904833794, -0.08963967859745026, -0.0005422959220595658, -0.07793904095888138, -0.05644345283508301, -0.05742967873811722, 0.043038129806518555, 0.10136958211660385, -0.04975389316678047, 0.11383135616779327, -0.07310359925031662, 0.0001436006568837911, -0.14273253083229065, -0.011957610957324505, 0.0009974028216674924, -0.04021152853965759, -0.018236801028251648, -0.03604784235358238, 0.0613747164607048, 0.04964005947113037, 0.1359216272830963, 0.008986618369817734, 0.027238259091973305, 0.04800326004624367, -0.03221147879958153, -0.033321257680654526, 0.05441749840974808, 0.05719789117574692, 0.035871103405952454, -0.0115595031529665, 0.03682854771614075, -0.0018542585894465446, -0.03129132091999054, 0.011209340766072273, 0.10223890095949173, 0.24199195206165314, 0.10172881931066513, 0.04173091799020767, -0.04037651792168617, -0.04799211397767067, -0.07069281488656998, 0.024393970146775246, -0.07545366138219833, 0.027143249288201332, -0.012860117480158806, 0.11010761559009552, 0.13079799711704254, -0.20610038936138153, 0.12265574187040329, -0.04612456262111664, -0.048987988382577896, -0.09756040573120117, -0.12704645097255707, -0.09599486738443375, -0.047438088804483414, -0.03739556297659874, -0.09660425037145615, 0.058605823665857315, 0.06528772413730621, 0.0295767430216074, 0.015146163292229176, 0.044353727251291275, -0.1158788800239563, -0.10109375417232513, 
0.08283461630344391, 0.008557498455047607, 0.02060622349381447, 0.11343662440776825, 0.003775546560063958, 0.0354950875043869, 0.02263708971440792, 0.07864478975534439, 0.05942229554057121, 0.03972221538424492, 0.0002707852690946311, -0.06963608413934708, -0.06859740614891052, 0.02106659486889839, -0.05793704837560654, -0.017087630927562714, 0.05755017325282097, 0.0468231700360775, -0.011382077820599079, -0.002160044852644205, 0.17241603136062622, -0.036737289279699326, -0.12718971073627472, -0.1417166143655777, 0.14591358602046967, 0.023589707911014557, 0.0672057643532753, 0.049653299152851105, -0.07528817653656006, -0.03384413197636604, 0.1204313412308693, 0.03845347464084625, 0.009085956029593945, -0.024473998695611954, 0.02474389784038067, 0.005148342344909906, 0.038583531975746155, 0.06740883737802505, 0.043173227459192276, 0.10873354226350784, -0.04583238065242767, 0.09448329359292984, -0.03756764158606529, -0.055308662354946136, -0.12945541739463806, 0.16762320697307587, -0.01541490200906992, 0.01944475620985031, -0.04110043868422508, 0.07864145189523697, 0.03885554149746895, -0.16359172761440277, -0.02041940577328205, -0.0006037302664481103, -0.09803853929042816, -0.016221990808844566, -0.022874854505062103, 0.04197598248720169, 0.029583469033241272, 0.02584962546825409, -0.014422014355659485, 0.1357085406780243, 0.06867533177137375, -0.07884686440229416, -0.005371816921979189, 0.06685221940279007, -0.05625704675912857, 0.14484363794326782, 0.026463041082024574, 0.07155707478523254, 0.09595482796430588, -0.05115742236375809, -0.13371515274047852, 0.05409929156303406, 0.043647557497024536, -0.0633193776011467, 0.10097111016511917, 0.15871615707874298, 0.03749155253171921, 0.012572373263537884, 0.053130533546209335, -0.054432161152362823, 0.05761351436376572, -0.0308562982827425, -0.04167192801833153, -0.07338441908359528, 0.08280482143163681, -0.09416339546442032, 0.07624708116054535, 0.17600299417972565, -0.02725042775273323, 0.04333360120654106, -0.06439817696809769, 0.0657457560300827, 0.0022042840719223022, 0.063792385160923, -0.011158265173435211, -0.09734101593494415, -0.0016741639701649547, -0.07890079915523529, 0.11520156264305115, -0.15354393422603607, -0.06759793311357498, 0.009355523623526096, -0.015518450178205967, -0.06911616772413254, 0.13427738845348358, 0.0298785287886858, 0.020736539736390114, -0.038157571107149124, -0.21939560770988464, 0.024449627846479416, 0.11148770153522491, -0.13159990310668945, -0.028526319190859795 ]
null
null
transformers
===== Solstice-11B-v1 =====

A model trained with the sole goal of NSFW. That is it. Results are to be as expected.

Finetuned off several instruct datasets that are NSFW. Example subset below; the other modified instruct datasets are private for now. Outputs were manually verified by me and two good friends.

[Lewd-Assistant-v1](https://huggingface.co/datasets/Himitsui/Lewd-Assistant-v1) ---> Used a combination of Claude 2.0, GPT-4-Turbo and WinterGoddess-1.4x to reformat instead of a small 10B model [Fimbulvetr-v1] like the example dataset shown, which resulted in fewer errors and better answers. Private for now.

Ruled Out names:
<br>Solarslut
<br>Solascivious
<br>Sultry
<br>Sundress
<br>Scorch

***

Prompt Format: Alpaca (a reference template is sketched below)

There are several issues with the model, but this is an experimental one so :shrug:
<br>----> May speak as {{user}} sometimes. I know what causes it. I kinda like it though that way.
<br>----> May ramble on or give short outputs; tweak your sampler settings.
<br>----> May be a little inconsistent at times. Yeah, it's inevitable due to the nature of the data.
<br>----> Steers towards NSFW --> As Expected.

***

GGUF: https://huggingface.co/Sao10K/Solstice-11B-v1-GGUF
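The card only names the prompt format, so here is a minimal sketch of the widely used Alpaca template for reference; treat the exact wording as an assumption rather than an official template from the author.

```python
# A common Alpaca-style template (assumption: the card only says "Alpaca").
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

prompt = ALPACA_TEMPLATE.format(instruction="Write a short scene set at sunset.")
print(prompt)
```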
{"language": ["en"], "license": "cc-by-nc-4.0", "datasets": ["Himitsui/Lewd-Assistant-v1"]}
text-generation
LoneStriker/Solstice-11B-v1-GPTQ
[ "transformers", "pytorch", "llama", "text-generation", "en", "dataset:Himitsui/Lewd-Assistant-v1", "license:cc-by-nc-4.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:29:09+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #llama #text-generation #en #dataset-Himitsui/Lewd-Assistant-v1 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
===== Solstice-11B-v1 =====

A model trained with the sole goal of NSFW. That is it. Results are to be as expected.

Finetuned off several instruct datasets that are NSFW. Example subset below; the other modified instruct datasets are private for now. Outputs were manually verified by me and two good friends.

Lewd-Assistant-v1 ---> Used a combination of Claude 2.0, GPT-4-Turbo and WinterGoddess-1.4x to reformat instead of a small 10B model [Fimbulvetr-v1] like the example dataset shown, which resulted in fewer errors and better answers. Private for now.

Ruled Out names:
<br>Solarslut
<br>Solascivious
<br>Sultry
<br>Sundress
<br>Scorch

*

Prompt Format: Alpaca

There are several issues with the model, but this is an experimental one so :shrug:
<br>----> May speak as {{user}} sometimes. I know what causes it. I kinda like it though that way.
<br>----> May ramble on or give short outputs; tweak your sampler settings.
<br>----> May be a little inconsistent at times. Yeah, it's inevitable due to the nature of the data.
<br>----> Steers towards NSFW --> As Expected.

*

GGUF: URL
[]
[ "TAGS\n#transformers #pytorch #llama #text-generation #en #dataset-Himitsui/Lewd-Assistant-v1 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 77 ]
[ "passage: TAGS\n#transformers #pytorch #llama #text-generation #en #dataset-Himitsui/Lewd-Assistant-v1 #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.049034226685762405, 0.10144619643688202, -0.005310742650181055, 0.037241458892822266, 0.1303713023662567, 0.017335545271635056, 0.14252999424934387, 0.11379635334014893, -0.005623956210911274, -0.03997872397303581, 0.15055686235427856, 0.22561697661876678, -0.005716316867619753, 0.025737494230270386, -0.07812640070915222, -0.19685706496238708, 0.031945742666721344, 0.07199884951114655, 0.0009695991757325828, 0.10655628144741058, 0.10498841851949692, -0.06860098987817764, 0.09109021723270416, -0.02456873282790184, -0.1569415032863617, 0.02397669479250908, 0.030383840203285217, -0.11827446520328522, 0.10542871803045273, 0.056915976107120514, 0.09415589272975922, 0.07842440903186798, -0.0075564272701740265, -0.15251457691192627, 0.02114257775247097, -0.005708994809538126, -0.07609643787145615, 0.06579328328371048, 0.05390128493309021, -0.027117885649204254, 0.10850529372692108, 0.0403880812227726, -0.041287485510110855, 0.0467577800154686, -0.10149915516376495, -0.0192161425948143, -0.06091657280921936, 0.0460011251270771, 0.05683813989162445, 0.09331124275922775, 0.024533160030841827, 0.13188046216964722, -0.09107843041419983, 0.08752235770225525, 0.09963053464889526, -0.31331998109817505, 0.01319095864892006, 0.09934073686599731, 0.03383893519639969, 0.04641084745526314, -0.021290304139256477, 0.06774309277534485, 0.04784923046827316, 0.0007684126612730324, 0.03598456457257271, -0.0797121524810791, -0.09438449889421463, 0.06742703169584274, -0.06590547412633896, -0.03863084688782692, 0.2922324538230896, -0.030988743528723717, 0.04172538220882416, -0.008086903020739555, -0.050714027136564255, 0.011418753303587437, -0.015883255749940872, 0.03745931759476662, -0.008732186630368233, 0.05612645298242569, 0.03042249009013176, -0.0401790477335453, -0.13913358747959137, -0.020970935001969337, -0.18500429391860962, 0.07730595767498016, 0.012230283580720425, 0.049645330756902695, -0.12579062581062317, 0.06767009198665619, 0.034521471709012985, -0.0844985693693161, 0.0035924408584833145, -0.053609706461429596, 0.07973714172840118, -0.0033421244006603956, -0.03879298269748688, -0.022027965635061264, 0.10066821426153183, 0.07918766885995865, 0.017121922224760056, -0.020639244467020035, -0.08546588569879532, 0.10821904987096786, -0.013696074485778809, 0.030209684744477272, -0.03432465344667435, 0.005157887935638428, 0.08075081557035446, -0.07926378399133682, 0.04890313372015953, -0.0285712368786335, -0.1669217199087143, -0.047651249915361404, -0.021985750645399094, 0.10965735465288162, 0.028758613392710686, 0.0803452581167221, -0.03757987916469574, -0.013860568404197693, 0.0998191386461258, -0.0681782215833664, -0.005897911731153727, -0.01723933219909668, -0.00974948424845934, 0.09400562942028046, 0.02334989234805107, 0.02616862952709198, -0.08904144912958145, 0.06954825669527054, -0.07460135221481323, -0.006143215112388134, -0.03521576523780823, -0.05373920500278473, 0.07148505002260208, -0.0974171832203865, 0.030807875096797943, -0.1752423644065857, -0.20109885931015015, 0.024895215407013893, 0.0106661356985569, -0.015678955242037773, -0.07431206107139587, -0.038309354335069656, -0.03380206599831581, 0.018048031255602837, -0.08405132591724396, -0.0024438153486698866, -0.08034411817789078, 0.10252384841442108, -0.05322210118174553, 0.04925856739282608, -0.15059894323349, 0.0595194548368454, -0.08959881961345673, -0.012827512808144093, -0.007095616310834885, 0.050098419189453125, -0.03298885375261307, 0.105813167989254, -0.05119280889630318, -0.025412462651729584, -0.024753475561738014, 
0.03582635521888733, -0.016930731013417244, 0.18637792766094208, -0.13419866561889648, -0.07243777066469193, 0.15280477702617645, -0.08325513452291489, -0.18884451687335968, 0.09167393296957016, 0.00922262854874134, 0.04589606076478958, 0.08483385294675827, 0.14735817909240723, 0.0350952111184597, -0.08198394626379013, 0.018709788098931313, 0.09923137724399567, -0.06979144364595413, -0.20308828353881836, 0.021961500868201256, 0.007903855293989182, -0.09841631352901459, 0.04337438941001892, 0.0199807807803154, 0.06771830469369888, -0.043807487934827805, -0.06553755700588226, -0.04720250144600868, -0.05599060654640198, -0.018855394795536995, 0.0029671976808458567, 0.08480874449014664, -0.051144134253263474, 0.002517574466764927, -0.005882262717932463, 0.031557876616716385, -0.023341625928878784, 0.056014902889728546, -0.06204771623015404, 0.11400096863508224, -0.03765048086643219, 0.03347593918442726, -0.13520418107509613, -0.03351351618766785, -0.01180564146488905, 0.14856234192848206, 0.025070972740650177, 0.04837543144822121, 0.01079084724187851, -0.026632050052285194, -0.025501301512122154, 0.00958278588950634, 0.15494011342525482, -0.0007252903888002038, -0.0532318577170372, -0.10751572251319885, 0.06776221841573715, -0.02253667823970318, 0.016523046419024467, -0.09919705986976624, 0.006435787305235863, 0.04849201440811157, 0.09319677948951721, -0.028797855600714684, 0.08913536369800568, -0.00775644788518548, 0.04732397571206093, -0.07237828522920609, 0.03538600727915764, 0.1164293885231018, 0.016884498298168182, -0.09675242006778717, 0.21493247151374817, -0.09469964355230331, 0.22045156359672546, 0.20328935980796814, -0.2403641641139984, 0.05252329632639885, -0.07172700762748718, -0.01727456972002983, -0.0074929846450686455, 0.025716116651892662, -0.009235285222530365, 0.03171088919043541, -0.008625861257314682, 0.1834697276353836, -0.06712543964385986, -0.0054125408641994, 0.002795885084196925, -0.042803213000297546, -0.03943811357021332, 0.08297725766897202, 0.19316521286964417, -0.12646368145942688, 0.16178403794765472, 0.23159843683242798, -0.013848419301211834, 0.1402653455734253, -0.03274041414260864, -0.019341804087162018, 0.03710115700960159, -0.02327699586749077, -0.012436605989933014, -0.010511710308492184, -0.10541673004627228, -0.002799295587465167, 0.08416956663131714, -0.00018182213534601033, 0.06753597408533096, -0.1491035521030426, -0.07189694792032242, -0.0310774315148592, -0.03873962536454201, -0.03537937253713608, 0.08782068639993668, 0.01753067784011364, 0.128334641456604, -0.051640938967466354, -0.06278031319379807, 0.11246023327112198, 0.005560848396271467, -0.0898054838180542, 0.1569569706916809, -0.13892027735710144, -0.2924811542034149, -0.16305863857269287, -0.16571520268917084, -0.07089906185865402, 0.023396149277687073, 0.09480155259370804, -0.05249262973666191, -0.04799347370862961, -0.030985508114099503, -0.011247217655181885, -0.05897943302989006, -0.0034085328225046396, -0.0424722284078598, 0.06421343982219696, -0.06029471755027771, -0.11193588376045227, -0.043454427272081375, -0.0044265370815992355, -0.0640472024679184, 0.12870556116104126, -0.09717836230993271, 0.09882897138595581, 0.1328934282064438, 0.025340624153614044, 0.026841068640351295, -0.04431796446442604, 0.13088175654411316, -0.04853203520178795, -0.009431427344679832, 0.22270165383815765, 0.0011693616397678852, 0.05880896374583244, 0.13138096034526825, 0.038107480853796005, -0.07699986547231674, 0.0045023816637694836, -0.04538979381322861, -0.0798133835196495, -0.23326468467712402, 
-0.1373303234577179, -0.11261934041976929, 0.05847621709108353, 0.05341200903058052, 0.06106415018439293, 0.11219381541013718, 0.08764012902975082, -0.014922751113772392, 0.058201052248477936, -0.021582402288913727, 0.07762761414051056, 0.2839027941226959, -0.01894594356417656, 0.13017886877059937, -0.08824797719717026, -0.05806112289428711, 0.09288623183965683, 0.10236862301826477, 0.12053664028644562, 0.059395331889390945, 0.08671223372220993, 0.06011790409684181, 0.13369910418987274, 0.13008637726306915, 0.09064308553934097, 0.04012376442551613, -0.004751825239509344, -0.01217054482549429, -0.044730979949235916, -0.035446617752313614, 0.035599954426288605, 0.013642698526382446, -0.13429448008537292, -0.02092130482196808, -0.07859308272600174, 0.05072641372680664, 0.119310662150383, 0.025424810126423836, -0.23313206434249878, 0.025882529094815254, 0.05987260863184929, -0.0031277218367904425, -0.07122143357992172, 0.08518851548433304, -0.016570409759879112, -0.08306598663330078, 0.07057943195104599, -0.027958616614341736, 0.12143699079751968, -0.07391038537025452, 0.0519358366727829, -0.03370404988527298, -0.05230972543358803, 0.03313904255628586, 0.1089467853307724, -0.3237506151199341, 0.22024185955524445, 0.018825465813279152, -0.026497717946767807, -0.09279364347457886, -0.025149978697299957, 0.012764468789100647, 0.1749781221151352, 0.12539535760879517, -0.007683464791625738, -0.015535303391516209, -0.05810444429516792, -0.05734163895249367, 0.035665884613990784, 0.06896401941776276, 0.0010584074771031737, -0.0003624292730819434, -0.024091387167572975, -0.005279526114463806, -0.0014077178202569485, -0.00490220682695508, -0.043236907571554184, -0.17163509130477905, 0.05557248741388321, 0.14406868815422058, 0.05307428911328316, -0.004156381823122501, -0.039885297417640686, -0.1416422724723816, 0.17220140993595123, -0.13633225858211517, -0.07610277086496353, -0.10404983907938004, -0.062380265444517136, 0.05498787760734558, -0.05518300458788872, 0.03755813464522362, -0.06245018541812897, -0.002284383401274681, -0.06032416224479675, -0.17134849727153778, 0.06986992806196213, -0.0996970683336258, -0.015461762435734272, -0.03423244133591652, 0.15149866044521332, -0.08292442560195923, 0.017603974789381027, 0.02323228307068348, 0.01411656104028225, -0.07490243017673492, -0.09193401783704758, -0.005053247790783644, 0.03293938934803009, 0.09035728871822357, 0.01723586395382881, -0.14129149913787842, -0.0095018669962883, -0.014963537454605103, -0.08424284309148788, 0.26092803478240967, 0.19374249875545502, -0.05883967876434326, 0.1770647019147873, 0.15347951650619507, -0.11490184813737869, -0.32271862030029297, -0.1000501811504364, -0.11443756520748138, -0.028265872970223427, -0.04509029909968376, -0.18660955131053925, 0.077033631503582, 0.049923256039619446, -0.031645506620407104, 0.10615070909261703, -0.23199057579040527, -0.09402638673782349, 0.12149857729673386, 0.018926985561847687, 0.2996464669704437, -0.12258071452379227, -0.08332320302724838, -0.08195407688617706, -0.13797886669635773, 0.19211558997631073, 0.00286630867049098, 0.10877324640750885, -0.04681301862001419, 0.09396523237228394, 0.007644746918231249, -0.03690613806247711, 0.10031165927648544, 0.006828084588050842, 0.025079969316720963, -0.10434320569038391, -0.0027483508456498384, 0.08884637802839279, 0.002070714021101594, 0.041003402322530746, -0.11694075912237167, 0.008402612991631031, -0.1172478049993515, -0.027270840480923653, -0.05345173180103302, 0.07041863352060318, 0.009682005271315575, -0.05904176086187363, 
-0.022814305499196053, -0.019129959866404533, 0.023440003395080566, -0.015210927464067936, 0.18577711284160614, -0.03642501309514046, 0.07289411127567291, 0.120790034532547, 0.137875497341156, -0.13844045996665955, 0.029775775969028473, -0.07663135230541229, -0.07252693921327591, 0.06818535923957825, -0.13475310802459717, 0.03976571559906006, 0.133937805891037, -0.029848946258425713, 0.0664096474647522, 0.0765647441148758, 0.025761689990758896, -0.00638764351606369, 0.14227253198623657, -0.19646349549293518, 0.019226932898163795, -0.04374184459447861, 0.006416141055524349, 0.06395085155963898, 0.04797811433672905, 0.16513222455978394, -0.025768468156456947, -0.009783856570720673, 0.015844149515032768, 0.014720029197633266, -0.04439609870314598, 0.0675467774271965, 0.04722321033477783, 0.0031884554773569107, -0.1300448477268219, 0.10301768779754639, 0.025875011458992958, -0.11881901323795319, -0.002760943491011858, 0.102492555975914, -0.13203735649585724, -0.12005016207695007, -0.05686751753091812, 0.06437907367944717, -0.2058214694261551, -0.08213391900062561, -0.05728176608681679, -0.11623122543096542, 0.08587322384119034, 0.1539333164691925, 0.05562883988022804, 0.06769925355911255, -0.03056035190820694, -0.08275887370109558, -0.0737461969256401, 0.020974867045879364, -0.03823217377066612, 0.02078600414097309, -0.10290080308914185, 0.015325134620070457, -0.022673455998301506, 0.1308482140302658, -0.05975550785660744, -0.0260691586881876, -0.10978298634290695, 0.042345207184553146, -0.14944908022880554, -0.012575010769069195, -0.08017249405384064, -0.0310711357742548, -0.00464948546141386, -0.015980064868927002, -0.06127501279115677, -0.016248362138867378, -0.10641258955001831, 0.02068985439836979, -0.020699268206954002, 0.07433059066534042, -0.0841195210814476, -0.04886775091290474, 0.05056420713663101, -0.015597312711179256, 0.10742154717445374, 0.06851629167795181, -0.09662238508462906, 0.08528278768062592, -0.1458529531955719, -0.05586700141429901, 0.09775988012552261, 0.0555204413831234, 0.04613138735294342, 0.02461850643157959, 0.01933644339442253, 0.12628474831581116, -0.014499308541417122, 0.048842575401067734, 0.019805999472737312, -0.10847801715135574, -0.006426680367439985, -0.05144473537802696, -0.09860483556985855, -0.06442166119813919, -0.016366492956876755, 0.08798839896917343, 0.04380229115486145, 0.15357235074043274, -0.04235014319419861, 0.04793012514710426, -0.042971670627593994, 0.020120149478316307, -0.008167668245732784, -0.16375763714313507, -0.08338074386119843, -0.08619426190853119, 0.01709349825978279, -0.0072896163910627365, 0.256022572517395, 0.031082479283213615, -0.055167291313409805, 0.04080076143145561, 0.06237408146262169, -0.0060380566865205765, -0.007072872016578913, 0.25718989968299866, 0.08115017414093018, -0.0018656767206266522, -0.0709218680858612, 0.055914513766765594, 0.016445070505142212, 0.0564819797873497, 0.07317715883255005, 0.05913015827536583, 0.027723869308829308, 0.07779153436422348, 0.060071516782045364, -0.007243873085826635, -0.04913724586367607, -0.06347004324197769, -0.027562761679291725, 0.08634694665670395, -0.0175641942769289, 0.09104092419147491, 0.12690387666225433, -0.052363429218530655, 0.006516739260405302, -0.03274639695882797, -0.05228272080421448, -0.15372596681118011, -0.19265799224376678, -0.08410533517599106, -0.11393267661333084, 0.03020910546183586, -0.09558413177728653, 0.040890172123909, 0.06267426162958145, 0.05259932950139046, -0.050175171345472336, 0.003746095346286893, -0.010680551640689373, -0.06471575796604156, 
0.07236453890800476, -0.028445163741707802, 0.02719707041978836, -0.07552506774663925, -0.014066273346543312, -0.063225157558918, -0.0467803031206131, -0.036711398512125015, 0.059050049632787704, 0.020925790071487427, 0.0454951673746109, -0.133080393075943, -0.07453975081443787, -0.04458501189947128, 0.054712191224098206, 0.02359694242477417, 0.16524924337863922, 0.014143720269203186, -0.011845545843243599, 0.05693808197975159, 0.172040656208992, -0.038605257868766785, -0.10545060783624649, -0.043073464184999466, 0.16965802013874054, 0.023394731804728508, 0.04107775166630745, 0.006772690452635288, 0.015023903921246529, -0.027153026312589645, 0.3453980088233948, 0.288048654794693, -0.10212592035531998, 0.018246658146381378, -0.021862352266907692, 0.03034358285367489, 0.06844350695610046, 0.12959334254264832, 0.10881296545267105, 0.20990362763404846, -0.07219986617565155, -0.06748899072408676, -0.07748356461524963, 0.021169332787394524, -0.10375670343637466, 0.08185678720474243, 0.005325254052877426, -0.09320597350597382, -0.03272281587123871, 0.09537962079048157, -0.17274652421474457, 0.08563049882650375, -0.004398995079100132, -0.12151198089122772, -0.03424680978059769, -0.002985055325552821, 0.07231775671243668, -0.0008496428490616381, 0.024526016786694527, -0.06182459369301796, -0.041415877640247345, 0.05475268512964249, -0.0044367010705173016, -0.2126384824514389, 0.02068512886762619, 0.06696335226297379, -0.0013985815457999706, 0.03933676332235336, -0.007371969521045685, 0.11203435063362122, 0.06820465624332428, 0.06483098119497299, -0.0719994381070137, 0.08872124552726746, 0.014060961082577705, -0.06090587005019188, 0.03618868812918663, -0.022428318858146667, -0.0032652681693434715, -0.004855544771999121, 0.04206673428416252, -0.037646062672138214, 0.06020224466919899, -0.0012370774056762457, -0.056301265954971313, -0.029712067916989326, 0.020235275849699974, -0.07321474701166153, 0.09561844915151596, 0.05176868662238121, -0.010201070457696915, -0.031797103583812714, -0.07183147221803665, -0.00331803853623569, -0.003993404563516378, -0.15203137695789337, -0.03916352614760399, -0.0990368202328682, -0.07603249698877335, 0.10195887833833694, 0.017836207523941994, -0.16887257993221283, -0.005666637793183327, -0.08880164474248886, 0.007904051803052425, -0.16425450146198273, 0.06891398131847382, 0.11502542346715927, -0.0008546813041903079, -0.020225094631314278, -0.05280805379152298, 0.042419176548719406, 0.03357136249542236, -0.09693200886249542, -0.09446734189987183 ]
null
null
transformers
# IUPAC2SMILES-canonical-small

IUPAC2SMILES-canonical-small was designed to accurately translate IUPAC chemical names to SMILES.

## Model Details

### Model Description

IUPAC2SMILES-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder.

- **Developed by:** Knowladgator Engineering
- **Model type:** Encoder-Decoder with attention mechanism
- **Language(s) (NLP):** SMILES, IUPAC (English)
- **License:** Apache License 2.0

### Model Sources

- **Paper:** coming soon
- **Demo:** [ChemicalConverters](https://huggingface.co/spaces/knowledgator/ChemicalConverters)

## Quickstart

Firstly, install the library:
```commandline
pip install chemical-converters
```

### IUPAC to SMILES

#### To perform simple translation, follow the example:
```python
from chemicalconverters import NamesConverter

converter = NamesConverter(model_name="knowledgator/IUPAC2SMILES-canonical-small")
print(converter.iupac_to_smiles('ethanol'))
print(converter.iupac_to_smiles(['ethanol', 'ethanol', 'ethanol']))
```
```text
['CCO']
['CCO', 'CCO', 'CCO']
```

#### Processing in batches:
```python
from chemicalconverters import NamesConverter

converter = NamesConverter(model_name="knowledgator/IUPAC2SMILES-canonical-small")
print(converter.iupac_to_smiles(["buta-1,3-diene" for _ in range(10)], num_beams=1, process_in_batch=True, batch_size=1000))
```
```text
['<SYST>C=CC=C', '<SYST>C=CC=C'...]
```

Our models also predict IUPAC styles from the table (a minimal sketch of splitting this token off the output is shown after this card):

| Style Token | Description                                                                                     |
|-------------|-------------------------------------------------------------------------------------------------|
| `<BASE>`    | The best-known name of the substance; sometimes a mixture of traditional and systematic styles   |
| `<SYST>`    | The fully systematic style without trivial names                                                 |
| `<TRAD>`    | The style based on trivial names of the parts of the substance                                   |

## Bias, Risks, and Limitations

This model has limited accuracy in processing large molecules and currently doesn't support isomeric and isotopic SMILES.

### Training Procedure

The model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch_size=1024 for 2 epochs.

## Evaluation

| Model                        | Accuracy | BLEU-4 score | Size (MB) |
|------------------------------|----------|--------------|-----------|
| IUPAC2SMILES-canonical-small | 88.9%    | 0.966        | 23        |
| IUPAC2SMILES-canonical-base  | 93.7%    | 0.974        | 180       |
| STOUT V2.0\*                 | 68.47%   | 0.92         | 128       |

*According to the original paper https://jcheminf.biomedcentral.com/articles/10.1186/s13321-021-00512-4

## Citation

Coming soon.

## Model Card Authors

[Mykhailo Shtopko](https://huggingface.co/BioMike)

## Model Card Contact

[email protected]
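The batch example in the card returns SMILES strings prefixed with a predicted style token (for example `<SYST>`). Below is a minimal sketch of separating that token from the SMILES; the token list mirrors the style table, and the helper name is purely illustrative rather than part of the chemical-converters API.

```python
# Style tokens from the table above; the model prefixes its SMILES output
# with one of them, as in the batch example.
STYLE_TOKENS = ("<BASE>", "<SYST>", "<TRAD>")

def split_style(prediction: str) -> tuple[str, str]:
    """Split a prediction like '<SYST>C=CC=C' into (style_token, smiles)."""
    for token in STYLE_TOKENS:
        if prediction.startswith(token):
            return token, prediction[len(token):]
    return "", prediction  # no recognized style token

print(split_style("<SYST>C=CC=C"))  # ('<SYST>', 'C=CC=C')
```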
{"license": "apache-2.0", "tags": ["chemistry", "biology", "medical", "smiles", "iupac", "text-generation-inference"], "metrics": ["accuracy", "bleu"], "pipeline_tag": "text2text-generation", "widget": [{"text": "ethanol", "example_title": "CCO"}]}
text2text-generation
knowledgator/IUPAC2SMILES-canonical-small
[ "transformers", "safetensors", "mt5", "text2text-generation", "chemistry", "biology", "medical", "smiles", "iupac", "text-generation-inference", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us", "has_space" ]
2024-02-13T21:34:45+00:00
[]
[]
TAGS #transformers #safetensors #mt5 #text2text-generation #chemistry #biology #medical #smiles #iupac #text-generation-inference #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us #has_space
IUPAC2SMILES-canonical-small ============================ IUPAC2SMILES-canonical-small was designed to accurately translate IUPAC chemical names to SMILES. Model Details ------------- ### Model Description IUPAC2SMILES-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder. * Developed by: Knowladgator Engineering * Model type: Encoder-Decoder with attention mechanism * Language(s) (NLP): SMILES, IUPAC (English) * License: Apache License 2.0 ### Model Sources * Paper: coming soon * Demo: ChemicalConverters Quickstart ---------- Firstly, install the library: ### IUPAC to SMILES #### To perform simple translation, follow the example: #### Processing in batches: Our models also predict IUPAC styles from the table: Bias, Risks, and Limitations ---------------------------- This model has limited accuracy in processing large molecules and currently, doesn't support isomeric and isotopic SMILES. ### Training Procedure The model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch\_size=1024 for 2 epochs. Evaluation ---------- Coming soon. Model Card Authors ------------------ Mykhailo Shtopko Model Card Contact ------------------ info@URL
[ "### Model Description\n\n\nIUPAC2SMILES-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder.\n\n\n* Developed by: Knowladgator Engineering\n* Model type: Encoder-Decoder with attention mechanism\n* Language(s) (NLP): SMILES, IUPAC (English)\n* License: Apache License 2.0", "### Model Sources\n\n\n* Paper: coming soon\n* Demo: ChemicalConverters\n\n\nQuickstart\n----------\n\n\nFirstly, install the library:", "### IUPAC to SMILES", "#### To perform simple translation, follow the example:", "#### Processing in batches:\n\n\nOur models also predict IUPAC styles from the table:\n\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nThis model has limited accuracy in processing large molecules and currently, doesn't support isomeric and isotopic SMILES.", "### Training Procedure\n\n\nThe model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch\\_size=1024 for 2 epochs.\n\n\nEvaluation\n----------\n\n\n\nComing soon.\n\n\nModel Card Authors\n------------------\n\n\nMykhailo Shtopko\n\n\nModel Card Contact\n------------------\n\n\ninfo@URL" ]
[ "TAGS\n#transformers #safetensors #mt5 #text2text-generation #chemistry #biology #medical #smiles #iupac #text-generation-inference #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us #has_space \n", "### Model Description\n\n\nIUPAC2SMILES-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder.\n\n\n* Developed by: Knowladgator Engineering\n* Model type: Encoder-Decoder with attention mechanism\n* Language(s) (NLP): SMILES, IUPAC (English)\n* License: Apache License 2.0", "### Model Sources\n\n\n* Paper: coming soon\n* Demo: ChemicalConverters\n\n\nQuickstart\n----------\n\n\nFirstly, install the library:", "### IUPAC to SMILES", "#### To perform simple translation, follow the example:", "#### Processing in batches:\n\n\nOur models also predict IUPAC styles from the table:\n\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nThis model has limited accuracy in processing large molecules and currently, doesn't support isomeric and isotopic SMILES.", "### Training Procedure\n\n\nThe model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch\\_size=1024 for 2 epochs.\n\n\nEvaluation\n----------\n\n\n\nComing soon.\n\n\nModel Card Authors\n------------------\n\n\nMykhailo Shtopko\n\n\nModel Card Contact\n------------------\n\n\ninfo@URL" ]
[ 79, 95, 30, 9, 11, 63, 74 ]
[ "passage: TAGS\n#transformers #safetensors #mt5 #text2text-generation #chemistry #biology #medical #smiles #iupac #text-generation-inference #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us #has_space \n### Model Description\n\n\nIUPAC2SMILES-canonical-small is based on the MT5 model with optimizations in implementing different tokenizers for the encoder and decoder.\n\n\n* Developed by: Knowladgator Engineering\n* Model type: Encoder-Decoder with attention mechanism\n* Language(s) (NLP): SMILES, IUPAC (English)\n* License: Apache License 2.0### Model Sources\n\n\n* Paper: coming soon\n* Demo: ChemicalConverters\n\n\nQuickstart\n----------\n\n\nFirstly, install the library:### IUPAC to SMILES#### To perform simple translation, follow the example:#### Processing in batches:\n\n\nOur models also predict IUPAC styles from the table:\n\n\n\nBias, Risks, and Limitations\n----------------------------\n\n\nThis model has limited accuracy in processing large molecules and currently, doesn't support isomeric and isotopic SMILES.### Training Procedure\n\n\nThe model was trained on 100M examples of SMILES-IUPAC pairs with lr=0.0003, batch\\_size=1024 for 2 epochs.\n\n\nEvaluation\n----------\n\n\n\nComing soon.\n\n\nModel Card Authors\n------------------\n\n\nMykhailo Shtopko\n\n\nModel Card Contact\n------------------\n\n\ninfo@URL" ]
[ -0.07072722911834717, 0.13918642699718475, -0.00275039323605597, -0.021253231912851334, 0.049189772456884384, 0.013972537592053413, 0.12198697775602341, 0.12781813740730286, 0.02985719032585621, 0.15019004046916962, 0.061505112797021866, 0.045572128146886826, 0.09345753490924835, 0.1740005910396576, 0.023521805182099342, -0.22405888140201569, 0.08358147740364075, -0.11863776296377182, -0.06573158502578735, 0.09068199247121811, 0.11199480295181274, -0.0672433152794838, 0.07726442813873291, -0.018634695559740067, -0.014765857718884945, -0.018148550763726234, -0.06686302274465561, -0.08497246354818344, 0.048557475209236145, 0.02623578906059265, 0.040373917669057846, 0.03913174569606781, 0.0700116977095604, -0.2927497923374176, 0.01060453336685896, 0.03867319971323013, 0.05780305340886116, 0.056742824614048004, 0.11858747154474258, -0.047989219427108765, 0.13058754801750183, -0.15312150120735168, 0.06879855692386627, 0.05925348028540611, -0.07580238580703735, -0.08555500209331512, -0.08216588944196701, 0.08989457041025162, 0.10605046153068542, 0.023609915748238564, -0.027278609573841095, 0.10252448171377182, 0.0008176160044968128, 0.0350327230989933, 0.16288627684116364, -0.1896783858537674, -0.012797363102436066, 0.0683695450425148, 0.06531988084316254, 0.09462765604257584, -0.014312738552689552, 0.04223332926630974, 0.026684226468205452, -0.01721460185945034, 0.033248305320739746, -0.012640819884836674, 0.11877164989709854, -0.009107165969908237, -0.1534772664308548, -0.08204561471939087, 0.13686151802539825, 0.0019695800729095936, -0.07948106527328491, -0.18232351541519165, -0.040698543190956116, -0.03464154899120331, -0.06974818557500839, -0.08433128893375397, 0.06453140079975128, 0.005899031646549702, 0.09058540314435959, -0.032027408480644226, -0.0731048658490181, -0.041202347725629807, 0.05738252028822899, 0.06315567344427109, 0.043806109577417374, 0.001930949161760509, 0.02896532602608204, 0.06610107421875, -0.03250431269407272, -0.08802936971187592, -0.07189290225505829, -0.08490458875894547, -0.06855428218841553, -0.032697033137083054, -0.003720920067280531, -0.03171141445636749, 0.07952680438756943, 0.12635186314582825, -0.06708262860774994, 0.063454270362854, 0.052253127098083496, 0.010990363545715809, 0.04688176512718201, 0.09449896961450577, -0.03982813283801079, -0.07443166524171829, -0.04566057771444321, 0.06821662187576294, -0.01771385781466961, 0.01304340735077858, -0.016580592840909958, 0.02113494649529457, 0.06457135826349258, 0.08287928253412247, 0.004344310611486435, 0.0270138718187809, -0.051423050463199615, -0.03682081773877144, 0.10530354827642441, -0.13228756189346313, 0.008780467323958874, 0.03597889840602875, -0.029835816472768784, 0.08748773485422134, 0.02460660971701145, 0.0006456106784753501, -0.058669399470090866, 0.01874716579914093, -0.06801823526620865, -0.03427216783165932, -0.11041845381259918, -0.09092923253774643, 0.012262881733477116, 0.03342941030859947, -0.06176247075200081, -0.090667724609375, -0.13691934943199158, -0.06565691530704498, 0.05508734658360481, -0.07024365663528442, 0.02163388393819332, -0.04410358890891075, -0.022575773298740387, 0.04815125837922096, 0.031719762831926346, 0.030609291046857834, -0.05140463635325432, -0.007822301238775253, -0.036713190376758575, 0.12090721726417542, 0.12747211754322052, 0.037330903112888336, -0.11740478873252869, 0.08466215431690216, -0.18116562068462372, 0.098396435379982, -0.1317596286535263, 0.010539977811276913, -0.17094101011753082, -0.07577542960643768, -0.011851462535560131, 0.015779979526996613, 
0.025582455098628998, 0.18838536739349365, -0.16832324862480164, -0.0467875637114048, 0.19811131060123444, -0.1052258312702179, -0.06764986366033554, 0.08179447054862976, 0.005203988868743181, 0.0642707347869873, 0.03870179131627083, 0.11045995354652405, 0.04023341089487076, -0.14079996943473816, -0.03454174846410751, 0.0020272161345928907, 0.017163259908556938, 0.09460354596376419, 0.09853136539459229, -0.06712384521961212, 0.022139154374599457, 0.02913808822631836, -0.0846126452088356, -0.024941062554717064, -0.05541796237230301, -0.0617198646068573, 0.0004778947331942618, -0.05404350534081459, -0.02189035527408123, -0.0452827587723732, 0.009898912161588669, -0.045177776366472244, -0.09831486642360687, 0.020635128021240234, 0.10399016737937927, -0.047238655388355255, -0.004177541937679052, -0.06580358743667603, 0.0718650221824646, -0.009389624930918217, -0.0023434374015778303, -0.1314404457807541, -0.018303554505109787, 0.06646939367055893, -0.17484290897846222, 0.019815590232610703, -0.055935706943273544, 0.022765764966607094, 0.08785887807607651, 0.0217912420630455, -0.017775775864720345, 0.03491869196295738, 0.02647089585661888, -0.061452507972717285, -0.17933912575244904, -0.017232442274689674, -0.06494859606027603, 0.15078437328338623, -0.1904415786266327, 0.02919895201921463, 0.025256820023059845, 0.09043247997760773, 0.014759224839508533, -0.03404088318347931, 0.07484766095876694, -0.011182880960404873, -0.01699782907962799, -0.08552327007055283, 0.05320966988801956, -0.039886489510536194, -0.022332454100251198, 0.07665379345417023, -0.14722514152526855, -0.16551408171653748, 0.0918256863951683, 0.06168655678629875, -0.08997644484043121, -0.09550526738166809, -0.040730684995651245, -0.02682344615459442, -0.013047401793301105, -0.019925372675061226, 0.04103618860244751, 0.04693203046917915, 0.09448227286338806, -0.08436652272939682, -0.01600893773138523, -0.012522011063992977, -0.05750862509012222, -0.10585149377584457, 0.0955401286482811, 0.07787784188985825, -0.14714807271957397, 0.06879530102014542, 0.09805775433778763, 0.0719619169831276, 0.15140345692634583, 0.0238481517881155, -0.13100752234458923, 0.0013205244904384017, 0.028301075100898743, 0.036109354346990585, 0.061263807117938995, -0.05065178871154785, 0.0075385780073702335, 0.06215333566069603, 0.005485487170517445, 0.04204494133591652, -0.07805126905441284, 0.06297996640205383, 0.045821648091077805, -0.04219650477170944, 0.046533290296792984, -0.02344796434044838, -0.0052444711327552795, 0.07912849634885788, 0.02705869823694229, 0.02019377611577511, -0.05409548059105873, -0.03949727118015289, -0.1296902596950531, 0.206238254904747, -0.12094037979841232, -0.22264857590198517, -0.1545518934726715, 0.009196136146783829, 0.023669645190238953, 0.010204659774899483, 0.00400620698928833, -0.06473896652460098, -0.0901121199131012, -0.12501543760299683, 0.008175944909453392, 0.005539467092603445, -0.036132991313934326, -0.006854562554508448, 0.029547277837991714, 0.021613173186779022, -0.13753849267959595, -0.019952721893787384, 0.029734497889876366, -0.02276180312037468, 0.04380187764763832, -0.046189166605472565, 0.07793118804693222, 0.18356628715991974, 0.024285342544317245, 0.006549329496920109, -0.004895460791885853, 0.20461519062519073, -0.06403974443674088, 0.06457680463790894, 0.1296776831150055, 0.01553130429238081, 0.07688652724027634, 0.11191762983798981, 0.007982471957802773, -0.08441401273012161, 0.057879868894815445, -0.023825859650969505, -0.03817615285515785, -0.22307300567626953, -0.09910242259502411, 
-0.017090551555156708, 0.019328471273183823, 0.05351534113287926, 0.038081809878349304, 0.04137616232037544, 0.06805732101202011, -0.026629090309143066, -0.03295409306883812, -0.0024456719402223825, 0.1259344071149826, 0.04313618689775467, 0.010297897271811962, 0.06330596655607224, -0.02240959368646145, 0.016877466812729836, 0.0924704521894455, 0.00031043868511915207, 0.18783599138259888, 0.02680380456149578, 0.18281403183937073, 0.0976463034749031, 0.04057617485523224, 0.05109451338648796, 0.07183203846216202, -0.0031475902069360018, 0.04258748143911362, -0.04965324327349663, -0.1106676533818245, -0.015541408210992813, 0.1262759566307068, -0.02443194016814232, -0.02625509724020958, -0.015079876407980919, 0.12933455407619476, 0.048829805105924606, 0.24630126357078552, 0.06844372302293777, -0.24073243141174316, -0.04059579595923424, 0.0829702615737915, -0.02446146123111248, -0.05607302486896515, 0.007549747359007597, 0.04311232268810272, -0.1261359304189682, 0.019537396728992462, -0.03760634735226631, 0.08021224290132523, -0.043946970254182816, 0.00003926332647097297, 0.027386048808693886, 0.09293843805789948, -0.029421398416161537, 0.09434903413057327, -0.18269746005535126, 0.17299728095531464, 0.0012089069932699203, 0.06596168875694275, -0.07862133532762527, 0.06900493055582047, 0.032601773738861084, 0.020046548917889595, 0.16248628497123718, 0.03059380315244198, -0.1728878766298294, -0.09845074266195297, -0.13702815771102905, -0.02539496123790741, 0.11452649533748627, -0.004247426521033049, 0.12633858621120453, -0.021546559408307076, -0.038934215903282166, -0.03491513803601265, -0.043867554515600204, -0.17930357158184052, -0.14468219876289368, 0.08046859502792358, -0.10093531012535095, 0.05762923136353493, -0.08387120813131332, -0.032835952937603, -0.047895122319459915, 0.1603831946849823, -0.10711859911680222, -0.09529833495616913, -0.15230268239974976, -0.022200526669621468, 0.1531612128019333, -0.07082722336053848, 0.0401831790804863, -0.010784507729113102, 0.12938334047794342, -0.02884417399764061, -0.07496686279773712, 0.09620821475982666, -0.142366424202919, -0.20887506008148193, -0.058463819324970245, 0.11366099864244461, 0.04674112796783447, 0.06535810232162476, 0.026633530855178833, 0.05507737398147583, 0.0011861263774335384, -0.10616373270750046, 0.04167627543210983, 0.14434555172920227, 0.06019099801778793, 0.06089840084314346, -0.08411411941051483, -0.14962802827358246, -0.1001545786857605, -0.0706174448132515, 0.09029816091060638, 0.2726401686668396, -0.04834405705332756, 0.14472909271717072, 0.18304909765720367, -0.13138312101364136, -0.1473333090543747, -0.022211920469999313, 0.06021681800484657, 0.011044714599847794, 0.03547823429107666, -0.18833887577056885, 0.09249535202980042, 0.07593188434839249, -0.02860148623585701, 0.03488774225115776, -0.31287264823913574, -0.14445894956588745, 0.08090676367282867, 0.06620606780052185, -0.04241340607404709, -0.14833849668502808, -0.07764244079589844, -0.05541154369711876, -0.14432725310325623, 0.06918933242559433, 0.02629687450826168, 0.057408303022384644, 0.008232776075601578, 0.014040056616067886, 0.022748256102204323, -0.06400489062070847, 0.15893436968326569, 0.003472391050308943, 0.015814799815416336, -0.04326235130429268, -0.018161727115511894, 0.016093093901872635, -0.029587719589471817, 0.12651747465133667, -0.0036168161313980818, 0.03129149600863457, -0.126102477312088, -0.05007601156830788, -0.027604155242443085, 0.03919646516442299, -0.05761795490980148, -0.08807148784399033, -0.06690233200788498, 0.10118293762207031, 
0.10722751170396805, -0.02725731022655964, -0.06488452851772308, -0.06733362376689911, 0.0013461688067764044, 0.1897769719362259, 0.12182994186878204, 0.03754463419318199, -0.03733876347541809, 0.0284540057182312, -0.0015357747906818986, 0.011109931394457817, -0.11206342279911041, 0.025336502119898796, 0.07950861006975174, 0.021792981773614883, 0.14019496738910675, -0.005453390534967184, -0.12146827578544617, -0.020195571705698967, 0.06355547904968262, -0.10067227482795715, -0.13954801857471466, 0.011860892176628113, 0.09571711719036102, -0.060997847467660904, -0.09321362525224686, 0.09718825668096542, -0.06925936043262482, -0.02499110996723175, -0.031880032271146774, 0.1102156788110733, 0.012823006138205528, 0.13937507569789886, 0.044575247913599014, 0.0760297179222107, -0.037924349308013916, 0.08977457880973816, 0.07388068735599518, -0.11315920948982239, 0.03789984807372093, 0.12351783365011215, -0.10331181436777115, -0.03662361577153206, 0.031157957389950752, 0.09949935972690582, -0.03899893909692764, -0.062067706137895584, -0.03704807162284851, -0.08488501608371735, 0.05957075580954552, 0.09549690783023834, 0.036763742566108704, -0.011491441167891026, -0.02333850786089897, 0.005127124488353729, -0.05148802325129509, 0.11263970285654068, 0.0397004671394825, 0.03534742072224617, -0.050985366106033325, 0.10534951090812683, -0.025361720472574234, -0.02130637690424919, -0.03813864290714264, -0.01635657623410225, -0.0922500416636467, -0.031745199114084244, -0.14500057697296143, 0.042343445122241974, -0.13881750404834747, -0.05376267805695534, -0.010354558005928993, 0.012447844259440899, 0.0007943677483126521, 0.004128929693251848, -0.07940199971199036, -0.0436955988407135, -0.07744081318378448, 0.12598398327827454, -0.11732256412506104, 0.008836794644594193, 0.08154288679361343, -0.08782099187374115, 0.06708469986915588, -0.0009816844249144197, -0.012007255107164383, -0.029369279742240906, -0.11887507140636444, 0.016164183616638184, 0.02047063410282135, 0.05423349142074585, -0.0036680346820503473, -0.18605230748653412, -0.01126027014106512, -0.02520165592432022, -0.017929578199982643, -0.01659969426691532, 0.048420269042253494, -0.09219607710838318, 0.0371055044233799, -0.03778634965419769, -0.015917474403977394, -0.05428726226091385, 0.042552024126052856, 0.09688437730073929, -0.022857151925563812, 0.110436350107193, -0.07311031967401505, 0.03298554569482803, -0.11841008812189102, -0.014230295084416866, 0.016723517328500748, -0.005822285078465939, -0.012682066299021244, -0.04139178618788719, 0.06234733387827873, -0.013051167130470276, 0.21154458820819855, -0.01625959575176239, 0.0016737972619011998, 0.06321533769369125, -0.06972754746675491, 0.018404776230454445, 0.06299057602882385, 0.11192753911018372, 0.051811374723911285, 0.022065062075853348, -0.0015148873208090663, 0.009555631317198277, 0.02704545482993126, -0.03752676397562027, 0.15254396200180054, 0.23470580577850342, 0.07869089394807816, 0.08705181628465652, 0.0010026784148067236, -0.09290151298046112, -0.11041712015867233, 0.02562086656689644, -0.027691438794136047, 0.09748568385839462, -0.05906502157449722, 0.16937121748924255, 0.13931410014629364, -0.20044372975826263, 0.07944552600383759, -0.06494756042957306, -0.08182242512702942, -0.10671521723270416, -0.11935526877641678, -0.06468440592288971, -0.08144830167293549, -0.0185061264783144, -0.10439325124025345, 0.05515943095088005, 0.08359388262033463, 0.05557475611567497, 0.023438338190317154, 0.08119763433933258, -0.10583449900150299, -0.03883712366223335, 0.07397483289241791, 
0.0010951913427561522, 0.019540933892130852, -0.008964666165411472, -0.0384807288646698, 0.020532945170998573, 0.003478850470855832, 0.05264357477426529, 0.0388154573738575, -0.007537175435572863, -0.0027068022172898054, -0.013993574306368828, -0.07921414077281952, -0.00035225224564783275, -0.01228558924049139, -0.02389444224536419, 0.07090383768081665, 0.06568581610918045, -0.038949549198150635, -0.01392701081931591, 0.1541188657283783, -0.07529640942811966, -0.13530410826206207, -0.1349160373210907, 0.13152512907981873, -0.016139402985572815, 0.06772910058498383, 0.012997861951589584, -0.11016196757555008, -0.04710399731993675, 0.16836532950401306, 0.1631019413471222, -0.061044272035360336, -0.034612011164426804, -0.0030351956374943256, 0.0013947892002761364, -0.011271205730736256, 0.08262022584676743, 0.0706724226474762, 0.16184306144714355, -0.07600529491901398, 0.06875359266996384, -0.04206031560897827, -0.081444151699543, -0.14313526451587677, 0.12679922580718994, 0.029755068942904472, -0.005183465778827667, 0.0008164866594597697, 0.08019149303436279, -0.06503123790025711, -0.19171829521656036, -0.005231016781181097, -0.04647350683808327, -0.12229088693857193, -0.02493877336382866, -0.02639891393482685, 0.0557747557759285, 0.03200671076774597, 0.00855155847966671, -0.02936657704412937, 0.09822750836610794, 0.0391574390232563, -0.06297366321086884, -0.03396451100707054, 0.06633629649877548, -0.1057644709944725, 0.20212697982788086, 0.03576372191309929, 0.06865203380584717, 0.10221247375011444, -0.03704121708869934, -0.13050836324691772, 0.03847717493772507, 0.07274070382118225, -0.04172869026660919, 0.07585544139146805, 0.14431077241897583, -0.011717615649104118, 0.06457753479480743, 0.052693720906972885, -0.09856036305427551, 0.045405033975839615, 0.01344978716224432, -0.059953074902296066, -0.0717010572552681, 0.019621942192316055, -0.09756549447774887, 0.13194580376148224, 0.1729554384946823, -0.03945846110582352, 0.01866845041513443, -0.03132929280400276, 0.10257194936275482, -0.01795610971748829, 0.05382981523871422, -0.02969237044453621, -0.18130803108215332, 0.007560345344245434, -0.029950257390737534, 0.09173505008220673, -0.22643040120601654, -0.07378469407558441, -0.034751176834106445, 0.03216780349612236, -0.05471085011959076, 0.12212277203798294, 0.14594964683055878, 0.0177445150911808, -0.04854704812169075, -0.12467727810144424, 0.00027192654670216143, 0.14123624563217163, -0.11989950388669968, -0.03629092127084732 ]
null
null
null
The model is a 50/50 checkpoint merge of Dreamlike Photoreal 2.0 - Primary model (A), Dreamshaper5 - Secondary model (B), and Realistic Vision 1.3 - Tertiary model (C). I use this model mainly for watercolors; I get the best results using (watercolor on textured paper:1.3) in the prompt.
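As a rough illustration of what a checkpoint merge like this involves (the card lists three parent models, so the exact recipe is not fully specified), the sketch below averages two checkpoints pairwise at equal weight. It is a minimal sketch, not the author's actual procedure: the file names are placeholders, and it assumes the checkpoints are plain `safetensors` state dicts with matching keys.

```python
from safetensors.torch import load_file, save_file

# Minimal sketch of a pairwise 50/50 weight merge; the paths are hypothetical.
a = load_file("model_a.safetensors")  # e.g. the primary checkpoint
b = load_file("model_b.safetensors")  # e.g. the secondary checkpoint

merged = {}
for key, tensor_a in a.items():
    tensor_b = b.get(key)
    if tensor_b is not None and tensor_b.shape == tensor_a.shape:
        merged[key] = 0.5 * tensor_a + 0.5 * tensor_b  # equal-weight average
    else:
        merged[key] = tensor_a  # keep A's tensor where keys or shapes differ

save_file(merged, "merged_5050.safetensors")
```

In practice this kind of merge is usually done with a web UI's checkpoint-merger tool rather than by hand, but the underlying operation is this element-wise average of the weights.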
{"license": "mit"}
null
cbo305/Dreamlike_Shaper_RealisticV13_5050
[ "license:mit", "region:us" ]
2024-02-13T21:38:58+00:00
[]
[]
TAGS #license-mit #region-us
The model is a 50/50 checkpoint merge of Dreamlike Photoreal 2.0 - Primary model (A), Dreamshaper5 - Secondary model (B), and Realistic Vision 1.3 - Tertiary model (C). I use this model mainly for watercolors; I get the best results using (watercolor on textured paper:1.3) in the prompt.
[]
[ "TAGS\n#license-mit #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-mit #region-us \n" ]
[ 0.026221778243780136, -0.033018264919519424, -0.008281232789158821, -0.05295303836464882, 0.052470896393060684, 0.06768012046813965, 0.1598525494337082, 0.04655371606349945, 0.23683255910873413, -0.05407243221998215, 0.11752297729253769, 0.08923697471618652, 0.004284696187824011, -0.0009730930323712528, 0.014216204173862934, -0.17134642601013184, 0.04864625632762909, -0.02878100797533989, 0.08764812350273132, 0.032233644276857376, -0.006205103360116482, -0.03845774009823799, -0.0022142508532851934, -0.03178790956735611, -0.057939812541007996, 0.03869890421628952, 0.045729056000709534, -0.02754949778318405, 0.14189864695072174, -0.021783310920000076, 0.13335508108139038, 0.046146418899297714, -0.011738095432519913, -0.2486042082309723, 0.008575023151934147, -0.07252951711416245, -0.11333522200584412, 0.016201216727495193, 0.035761721432209015, -0.010069100186228752, 0.032174937427043915, 0.11049123108386993, -0.011680051684379578, 0.06288356333971024, -0.2015703022480011, -0.20486389100551605, -0.07508610188961029, -0.07555478066205978, 0.0589042492210865, 0.030872387811541557, 0.05628744140267372, 0.1426718831062317, -0.18022038042545319, -0.0018841808196157217, 0.04129622131586075, -0.3510737717151642, 0.09011197835206985, 0.19666501879692078, 0.06407395005226135, 0.07872317731380463, -0.04774639382958412, 0.06726468354463577, 0.07745297998189926, -0.02402484230697155, -0.10679105669260025, -0.06142130121588707, 0.040939174592494965, 0.15604156255722046, -0.03852643445134163, -0.10356393456459045, 0.2591084837913513, -0.023262828588485718, -0.04234466329216957, 0.08201269060373306, -0.02980397455394268, -0.040379155427217484, 0.04404358193278313, 0.044016025960445404, 0.036236923187971115, 0.182089164853096, 0.1260262131690979, -0.03375067934393883, -0.16269677877426147, -0.030629513785243034, -0.2528207004070282, 0.07418664544820786, -0.003647059667855501, 0.10666298121213913, -0.20037521421909332, 0.03286786004900932, -0.15483668446540833, -0.009493621066212654, -0.02952384203672409, -0.059835705906152725, 0.05229754373431206, -0.0237403754144907, -0.04600388556718826, 0.07238677144050598, 0.08390641957521439, 0.2046167105436325, 0.023024363443255424, 0.016697337850928307, -0.10405295342206955, 0.15052515268325806, 0.019140364602208138, 0.024860305711627007, 0.179348424077034, 0.07677878439426422, -0.04891882464289665, -0.2251969277858734, 0.027894439175724983, -0.03671982139348984, -0.1441805064678192, 0.015881337225437164, -0.1542915552854538, 0.1736440360546112, -0.04078168794512749, -0.06919530034065247, -0.08578147739171982, 0.09790384024381638, 0.07768166810274124, -0.021921472623944283, -0.023105677217245102, -0.01381723117083311, 0.03522264584898949, -0.048196230083703995, -0.11687057465314865, 0.018241960555315018, 0.11869648098945618, 0.12573401629924774, -0.1483907401561737, -0.008189842104911804, -0.017200417816638947, 0.019065292552113533, 0.09696817398071289, -0.112403005361557, 0.028845038264989853, -0.09672309458255768, -0.13033071160316467, 0.036653537303209305, 0.017736904323101044, -0.019008556380867958, 0.1340927630662918, 0.061849117279052734, 0.056560322642326355, -0.011025321669876575, -0.07250872999429703, -0.14035539329051971, -0.08679798245429993, 0.1058693379163742, -0.046787332743406296, 0.010320915840566158, -0.24556252360343933, -0.014234079979360104, -0.14995723962783813, 0.059662189334630966, -0.0037668521981686354, -0.08819212019443512, -0.07740068435668945, 0.21408265829086304, 0.0018596589798107743, 0.04301392287015915, -0.1078512966632843, 
0.054903753101825714, -0.06764797121286392, 0.10065380483865738, -0.12895582616329193, -0.06441528350114822, 0.1613781899213791, -0.13135331869125366, -0.14002031087875366, 0.0033312994055449963, -0.009472889825701714, 0.12053907662630081, 0.0802001804113388, 0.44566696882247925, -0.058881040662527084, -0.16201181709766388, 0.1270403116941452, 0.17969723045825958, -0.13685379922389984, -0.25928929448127747, 0.12393020838499069, -0.1636963188648224, -0.16647985577583313, 0.0040023741312325, -0.006962866988033056, 0.08049977570772171, -0.03446655720472336, -0.056274134665727615, 0.042339932173490524, 0.024350708350539207, 0.029094615951180458, 0.01740112341940403, 0.07037191838026047, -0.1023021712899208, 0.08444856107234955, 0.058610700070858, -0.014111426658928394, 0.15077349543571472, 0.011494536884129047, -0.05393160134553909, 0.014761670492589474, 0.044013332575559616, -0.015627963468432426, -0.05899091437458992, -0.09661509096622467, 0.019826244562864304, -0.031149597838521004, 0.08229395002126694, 0.1699674129486084, 0.023824702948331833, -0.02797185815870762, 0.028922779485583305, 0.028606392443180084, 0.1009954959154129, 0.06960704177618027, 0.03099375218153, -0.04839283227920532, 0.04952205345034599, -0.0417071171104908, -0.11430390179157257, -0.004862460307776928, -0.011735930107533932, 0.11975742131471634, -0.08906009048223495, -0.01223952230066061, 0.05951591953635216, -0.04513183981180191, 0.0019881438929587603, 0.0428374819457531, 0.0035966038703918457, 0.1388600617647171, 0.004440935328602791, -0.04352007433772087, 0.17440910637378693, -0.05288633331656456, 0.15533447265625, 0.1715822070837021, -0.07049662619829178, 0.015605369582772255, -0.1273636519908905, 0.003230511210858822, -0.014480113983154297, 0.05292887985706329, -0.05400136485695839, -0.05201306566596031, -0.01274962443858385, 0.014292534440755844, -0.03134604170918465, 0.01711403578519821, -0.06057267636060715, -0.08167021721601486, -0.10849859565496445, 0.018649224191904068, 0.20683221518993378, -0.22544461488723755, 0.1609548032283783, 0.40251004695892334, 0.15190774202346802, 0.21155193448066711, -0.12478897720575333, -0.002471078187227249, -0.06630261242389679, 0.026115071028470993, -0.024814706295728683, 0.13782677054405212, -0.13174867630004883, -0.01413064356893301, 0.03880728408694267, 0.0454997681081295, 0.0661163181066513, -0.17195898294448853, -0.15260353684425354, -0.0034879595041275024, -0.020591814070940018, -0.1749730259180069, 0.04874620959162712, -0.07595308125019073, 0.02181261032819748, 0.018216799944639206, -0.10832522064447403, 0.16837291419506073, -0.033566512167453766, -0.06695768237113953, 0.052613962441682816, -0.20581911504268646, -0.07900715619325638, -0.17772749066352844, -0.18375012278556824, 0.06050071492791176, 0.05760138854384422, 0.07903145253658295, -0.05951719731092453, -0.01922747679054737, 0.061719246208667755, -0.009363299235701561, -0.13802112638950348, -0.04235544428229332, -0.06993678212165833, 0.08744155615568161, -0.09474305808544159, -0.07518411427736282, -0.07833878695964813, -0.046996138989925385, -0.020961694419384003, 0.08125963062047958, -0.1039251759648323, 0.08903530240058899, 0.1493726521730423, 0.03651920333504677, 0.05440247058868408, -0.08271230012178421, 0.12693379819393158, -0.037743739783763885, -0.09459595382213593, 0.07307634502649307, 0.004350725095719099, 0.04920351505279541, 0.24039287865161896, 0.08962162584066391, -0.10578162968158722, -0.01780811697244644, -0.0968487411737442, -0.16405464708805084, -0.2553846538066864, -0.06823288649320602, 
-0.08744750916957855, 0.14417944848537445, 0.014636521227657795, 0.10712126642465591, 0.14313316345214844, 0.01343101728707552, 0.10255914181470871, -0.08983208239078522, -0.018939344212412834, 0.031209396198391914, 0.2135104089975357, -0.05208220332860947, 0.00838248711079359, -0.13684824109077454, -0.0256142970174551, 0.14601100981235504, 0.13798639178276062, 0.14503207802772522, 0.31421369314193726, 0.15292863547801971, 0.13410434126853943, 0.13474710285663605, 0.12333164364099503, 0.07403261214494705, 0.03444362059235573, -0.015304201282560825, -0.06035377085208893, -0.003846159903332591, 0.02816268615424633, 0.05421729013323784, 0.06724072247743607, -0.22906480729579926, 0.041139665991067886, -0.2661744952201843, 0.03544611483812332, -0.0854712724685669, 0.1161833181977272, -0.028890252113342285, 0.11051984131336212, 0.11386284977197647, 0.05553818494081497, -0.023278791457414627, 0.16036942601203918, 0.032686375081539154, -0.07703183591365814, 0.020292721688747406, 0.024695809930562973, 0.06633034348487854, 0.08606193959712982, 0.09550496190786362, -0.020778406411409378, -0.1831783503293991, 0.025963841006159782, 0.12212833017110825, -0.20747940242290497, 0.289523184299469, 0.013651901856064796, -0.0743619054555893, -0.01690039224922657, -0.06958060711622238, 0.008433517068624496, 0.12829731404781342, 0.10406835377216339, 0.05508929491043091, -0.2613787055015564, -0.13299626111984253, 0.046764206141233444, -0.00873907096683979, 0.11356569826602936, -0.0052223424427211285, -0.14201195538043976, -0.06640999764204025, 0.05814211815595627, -0.006591420155018568, 0.13023322820663452, -0.018290361389517784, -0.08173255622386932, -0.010230090469121933, 0.055564697831869125, -0.001312803477048874, -0.04580084979534149, 0.07523149996995926, 0.009008137509226799, 0.02259289287030697, -0.08178020268678665, 0.03887253627181053, -0.08071476966142654, -0.25375792384147644, 0.019298138096928596, -0.04987313598394394, 0.004092312417924404, -0.04684043675661087, -0.15448936820030212, -0.1129264086484909, -0.15445278584957123, 0.13100723922252655, -0.03675999864935875, 0.091565802693367, -0.0817658007144928, 0.13736046850681305, -0.08521489799022675, 0.05375019088387489, 0.00614814180880785, 0.03918716683983803, -0.017955513671040535, -0.1031481996178627, 0.09334362298250198, -0.1874227225780487, 0.023863423615694046, 0.010427716188132763, -0.056847453117370605, -0.01354232057929039, 0.03918023407459259, -0.08763083070516586, 0.21879427134990692, 0.3331502079963684, -0.011948764324188232, 0.22546616196632385, 0.35863226652145386, -0.13763751089572906, -0.23258967697620392, -0.1205512136220932, -0.3263251483440399, -0.09005610644817352, 0.17321562767028809, -0.18057219684123993, 0.04850830137729645, 0.16150830686092377, -0.10868281871080399, 0.22499866783618927, -0.22723928093910217, -0.04793389141559601, 0.1823979914188385, -0.038322996348142624, 0.4527989625930786, -0.1144307404756546, -0.1784561723470688, -0.03637253865599632, -0.16285361349582672, 0.12426037341356277, -0.026553882285952568, 0.06700495630502701, 0.02416347898542881, -0.011372359469532967, -0.009014161303639412, -0.04529716446995735, 0.2216065675020218, 0.0522729866206646, 0.10468899458646774, -0.09159468114376068, -0.17199653387069702, 0.1907423883676529, -0.0004908236442133784, -0.003372655250132084, -0.05411549657583237, -0.04850282520055771, -0.06871756166219711, 0.033092137426137924, -0.0334564633667469, 0.06195882335305214, 0.03364093229174614, -0.11903523653745651, -0.10248823463916779, 0.034111104905605316, 
-0.13155671954154968, -0.054850947111845016, 0.26421889662742615, -0.02080743946135044, 0.09609334170818329, 0.04959092289209366, -0.05474294349551201, -0.13538943231105804, 0.005736751481890678, -0.07534020394086838, -0.05711410939693451, 0.06573604047298431, -0.11453206837177277, -0.024341827258467674, 0.1293732225894928, -0.029497180134058, 0.09674722701311111, 0.08061115443706512, -0.07585363835096359, 0.02032829262316227, 0.15617427229881287, -0.07247176766395569, -0.10849180817604065, 0.04999847710132599, 0.04640531167387962, 0.17256882786750793, 0.004101871978491545, 0.02018604800105095, 0.08726977556943893, 0.045959215611219406, -0.007486662827432156, 0.007311292923986912, -0.11321697384119034, -0.04241771996021271, 0.0387241393327713, -0.005273692775517702, -0.10946331918239594, 0.16008898615837097, 0.056837860494852066, 0.004653505515307188, -0.06027700752019882, 0.09720424562692642, -0.06709636747837067, -0.07046061009168625, -0.1753035932779312, 0.018511172384023666, -0.12734080851078033, -0.09874535351991653, 0.06846235692501068, -0.09371624886989594, -0.04084605351090431, 0.08152704685926437, 0.046927981078624725, 0.14401860535144806, -0.006597559433430433, -0.023080874234437943, 0.149825319647789, -0.0884878933429718, -0.2241756170988083, 0.01969664730131626, -0.04083063453435898, -0.07065816223621368, -0.0007070365245454013, 0.06069544702768326, -0.0663156732916832, -0.11958606541156769, -0.20477768778800964, 0.10412076860666275, -0.12043121457099915, -0.03954985365271568, -0.1041841059923172, -0.053260523825883865, 0.07891252636909485, -0.02613759972155094, -0.04122013971209526, -0.047595683485269547, -0.16630595922470093, 0.054254453629255295, 0.07140932232141495, 0.11125344783067703, -0.0759999230504036, -0.018354382365942, 0.1398727148771286, 0.048581548035144806, 0.08479110151529312, 0.07578440010547638, 0.026255371049046516, 0.16728560626506805, -0.1708206981420517, -0.0542997270822525, 0.1068294569849968, -0.026716172695159912, 0.01994573324918747, 0.10631280392408371, -0.04839588701725006, 0.07042654603719711, -0.05095988139510155, 0.05859163776040077, -0.15704534947872162, -0.13073866069316864, -0.04184387996792793, 0.023728877305984497, -0.2260182797908783, 0.015071595087647438, -0.1769561767578125, 0.19692228734493256, -0.024228032678365707, 0.11490963399410248, 0.08052190393209457, 0.02052290178835392, 0.03539382666349411, -0.006019921973347664, 0.00946811307221651, -0.10524865239858627, -0.05784677714109421, -0.07560300827026367, -0.1168874129652977, -0.009665017947554588, 0.36614301800727844, 0.02430291846394539, -0.19682736694812775, 0.051222387701272964, 0.18285293877124786, 0.023639049381017685, -0.0073763905093073845, 0.26180747151374817, 0.08150359988212585, -0.023175053298473358, -0.1782374382019043, 0.0396091528236866, -0.08699734508991241, -0.15269799530506134, 0.11385007947683334, 0.09347525984048843, 0.05813581123948097, 0.022930078208446503, 0.10404518246650696, -0.035940010100603104, -0.05509711429476738, -0.13301853835582733, 0.13368983566761017, -0.001790675800293684, 0.0193882267922163, 0.0897885113954544, 0.19249756634235382, -0.045275162905454636, 0.05437124893069267, -0.07336640357971191, -0.001598604372702539, -0.15740543603897095, -0.13358698785305023, 0.06194563955068588, -0.08269550651311874, 0.06342913210391998, 0.050261519849300385, 0.04341990500688553, 0.31786394119262695, 0.039095040410757065, -0.046439893543720245, 0.003166865324601531, -0.14845187962055206, -0.08075450360774994, -0.06024569645524025, -0.03110554814338684, 
0.028620192781090736, -0.13928957283496857, -0.09898591786623001, -0.06917677819728851, -0.130235955119133, -0.06539803743362427, 0.025270747020840645, 0.014251931570470333, -0.053083837032318115, -0.17625881731510162, -0.04808593541383743, -0.06644169986248016, 0.10105955600738525, -0.08462738990783691, 0.1516820639371872, 0.0022449472453445196, 0.030281953513622284, 0.07627002149820328, 0.09585131704807281, 0.018900424242019653, -0.06975197046995163, 0.05599058046936989, 0.12436293810606003, 0.01323844213038683, 0.1259988248348236, -0.06034265458583832, -0.019420607015490532, -0.014145253226161003, 0.14038437604904175, 0.304447740316391, -0.01856905221939087, -0.013814439997076988, -0.022110093384981155, 0.021388787776231766, 0.10893569141626358, 0.19800719618797302, -0.03437356278300285, 0.2551359534263611, -0.058974795043468475, 0.0756678432226181, -0.013180435635149479, -0.005362013820558786, -0.053146667778491974, 0.06074550002813339, 0.06268858164548874, -0.06877048313617706, -0.10191375762224197, 0.15178529918193817, -0.14985080063343048, 0.13306055963039398, 0.14678068459033966, -0.06057753041386604, 0.03797250986099243, 0.0007459368789568543, 0.19896264374256134, -0.03570213168859482, 0.0984780564904213, -0.10653308779001236, -0.10261140763759613, -0.14764924347400665, 0.037690844386816025, -0.36797797679901123, -0.1756322830915451, 0.11731542646884918, 0.14115898311138153, 0.1759258657693863, -0.012341637164354324, 0.056479312479496, 0.0033020609989762306, 0.08296097069978714, -0.04232487455010414, 0.1519634872674942, 0.0612073615193367, -0.017103128135204315, -0.15296664834022522, -0.20328094065189362, -0.0012039330322295427, -0.058561209589242935, 0.055583830922842026, -0.02269243635237217, 0.025347469374537468, 0.07746459543704987, -0.06768939644098282, -0.029180381447076797, -0.02352982573211193, -0.13262848556041718, 0.052229251712560654, -0.04354005306959152, 0.0320255309343338, -0.03958037868142128, -0.022394726052880287, -0.039987675845623016, 0.10721533745527267, -0.22402705252170563, -0.08517231047153473, 0.1422796994447708, -0.03421911224722862, 0.1542559564113617, -0.02848726324737072, -0.12159585952758789, -0.024955326691269875, -0.06977712363004684, 0.10887379199266434, -0.1419300138950348, 0.038592495024204254, 0.13747453689575195, 0.008710617199540138, 0.031119761988520622, -0.2533661723136902, 0.050644006580114365, -0.03556957095861435, -0.016733208671212196, -0.057031940668821335 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chat_300STEPS_1e7rate_SFT This model is a fine-tuned version of [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.4992 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-07 - train_batch_size: 4 - eval_batch_size: 1 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 8 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 100 - training_steps: 300 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 1.6169 | 0.1 | 50 | 1.6126 | | 1.5653 | 0.2 | 100 | 1.5784 | | 1.5269 | 0.29 | 150 | 1.5290 | | 1.4991 | 0.39 | 200 | 1.5046 | | 1.5009 | 0.49 | 250 | 1.4995 | | 1.4926 | 0.59 | 300 | 1.4992 | ### Framework versions - Transformers 4.37.2 - Pytorch 2.0.0+cu117 - Datasets 2.17.0 - Tokenizers 0.15.2
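For readers who want to see how the hyperparameter list above maps onto code, here is a hedged sketch of the corresponding `transformers.TrainingArguments`. The output directory is a placeholder, the training dataset is not documented in the card, and the Adam betas/epsilon shown in the card are simply the `TrainingArguments` defaults; the card's trl/sft tags suggest these arguments were passed to TRL's `SFTTrainer`, but the exact trainer wiring is not given.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the card's hyperparameters, not the full training script.
args = TrainingArguments(
    output_dir="chat_300STEPS_1e7rate_SFT",  # placeholder output path
    learning_rate=1e-7,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=2,   # 4 x 2 = total train batch size of 8
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    max_steps=300,                   # training_steps: 300
)
```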
{"tags": ["trl", "sft", "generated_from_trainer"], "base_model": "meta-llama/Llama-2-7b-chat-hf", "model-index": [{"name": "chat_300STEPS_1e7rate_SFT", "results": []}]}
text-generation
tsavage68/chat_300STEPS_1e7rate_SFT
[ "transformers", "safetensors", "llama", "text-generation", "trl", "sft", "generated_from_trainer", "conversational", "base_model:meta-llama/Llama-2-7b-chat-hf", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:40:38+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #conversational #base_model-meta-llama/Llama-2-7b-chat-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
chat\_300STEPS\_1e7rate\_SFT ============================ This model is a fine-tuned version of meta-llama/Llama-2-7b-chat-hf on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 1.4992 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 1e-07 * train\_batch\_size: 4 * eval\_batch\_size: 1 * seed: 42 * gradient\_accumulation\_steps: 2 * total\_train\_batch\_size: 8 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: cosine * lr\_scheduler\_warmup\_steps: 100 * training\_steps: 300 ### Training results ### Framework versions * Transformers 4.37.2 * Pytorch 2.0.0+cu117 * Datasets 2.17.0 * Tokenizers 0.15.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-07\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* training\\_steps: 300", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #conversational #base_model-meta-llama/Llama-2-7b-chat-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-07\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* training\\_steps: 300", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ 84, 145, 4, 33 ]
[ "passage: TAGS\n#transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #conversational #base_model-meta-llama/Llama-2-7b-chat-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-07\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 8\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_steps: 100\n* training\\_steps: 300### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.0.0+cu117\n* Datasets 2.17.0\n* Tokenizers 0.15.2" ]
[ -0.12492161244153976, 0.09557868540287018, -0.00299114640802145, 0.06906754523515701, 0.11658298969268799, 0.01611233875155449, 0.10384144634008408, 0.14766092598438263, -0.06552407145500183, 0.08878865838050842, 0.14001023769378662, 0.11771455407142639, 0.06006374582648277, 0.17466315627098083, -0.019529949873685837, -0.328204482793808, -0.002541201887652278, -0.021807825192809105, -0.15777245163917542, 0.13359518349170685, 0.08356416970491409, -0.11745356023311615, 0.053438909351825714, -0.03296345844864845, -0.09962140768766403, -0.03568457067012787, -0.038455430418252945, -0.04571174085140228, 0.1374576985836029, 0.0007299284916371107, 0.08207694441080093, 0.04450170323252678, 0.09667569398880005, -0.24799096584320068, 0.013360013253986835, 0.05664946138858795, 0.031393300741910934, 0.08540059626102448, 0.06512777507305145, -0.04580775648355484, 0.1125800684094429, -0.10985741764307022, 0.07892909646034241, 0.039763014763593674, -0.11513494700193405, -0.22233805060386658, -0.08221293240785599, 0.035325512290000916, 0.15822620689868927, 0.07849845290184021, -0.0239225123077631, 0.06541944295167923, -0.09004147350788116, 0.08922221511602402, 0.26719823479652405, -0.2593250572681427, -0.08143581449985504, 0.04835839197039604, 0.0697430893778801, 0.07212886214256287, -0.1364506632089615, -0.000965715094935149, 0.024124154821038246, 0.0041911848820745945, 0.14433182775974274, 0.0004643235297407955, 0.08452083170413971, 0.020382901653647423, -0.14359711110591888, -0.04082384705543518, 0.09769897162914276, 0.07486476004123688, -0.022191854193806648, -0.12063668668270111, -0.05184625834226608, -0.2135319858789444, -0.047143787145614624, 0.005130645353347063, 0.023381957784295082, -0.05395262688398361, -0.09882871061563492, 0.0008752351859584451, -0.06546737998723984, -0.108488067984581, 0.07246625423431396, 0.16273275017738342, 0.044576313346624374, -0.04308009147644043, 0.03364534676074982, 0.14971736073493958, 0.08751988410949707, -0.16021299362182617, -0.0025066807866096497, 0.019494937732815742, -0.09174294024705887, -0.012873081490397453, -0.020678289234638214, 0.017759136855602264, 0.022879688069224358, 0.15180696547031403, -0.024246877059340477, 0.0737544596195221, 0.07237406820058823, 0.024146804586052895, -0.0930938571691513, 0.13865937292575836, -0.0610324889421463, -0.10349433869123459, -0.05002492666244507, 0.14513495564460754, 0.012934699654579163, -0.023925041779875755, -0.08346981555223465, 0.015187988989055157, 0.09888383746147156, 0.07577554881572723, -0.023887932300567627, 0.039946962147951126, -0.06848173588514328, -0.01680920086801052, 0.018677353858947754, -0.10652533918619156, 0.029706861823797226, 0.01308364700525999, -0.06766652315855026, -0.058653030544519424, -0.0012719337828457355, 0.008793084882199764, 0.0012127761729061604, 0.11935599893331528, -0.07302148640155792, -0.025661244988441467, -0.09042844921350479, -0.08537235110998154, 0.0011028199223801494, -0.09088238328695297, -0.010199243202805519, -0.06071057915687561, -0.1417161375284195, -0.05287041515111923, 0.060585059225559235, -0.06352595239877701, -0.06779872626066208, -0.08825372904539108, -0.0996432825922966, 0.03439394757151604, -0.00500468397513032, 0.15868380665779114, -0.052381984889507294, 0.12109770625829697, -0.006661498453468084, 0.08675026893615723, 0.09461503475904465, 0.044946737587451935, -0.03877023980021477, 0.06678801774978638, -0.18481895327568054, 0.08440668135881424, -0.0799451395869255, 0.07218611985445023, -0.12376739829778671, -0.09608382731676102, -0.03140048310160637, 
-0.00038279342697933316, 0.0852985754609108, 0.16809162497520447, -0.17266212403774261, -0.08323449641466141, 0.21098382771015167, -0.053295910358428955, -0.11439768224954605, 0.11404401808977127, -0.036097727715969086, 0.016176017001271248, 0.023731358349323273, 0.16206218302249908, 0.09643324464559555, -0.07128268480300903, 0.0339311845600605, -0.01831815019249916, 0.09747527539730072, 0.041082169860601425, 0.09252872318029404, -0.038035936653614044, 0.027140876278281212, -0.001084742951206863, -0.06144893541932106, 0.04607082158327103, -0.09254691749811172, -0.08893299102783203, -0.004326364956796169, -0.08368737250566483, 0.07185202836990356, 0.04143860191106796, 0.03070041351020336, -0.09257952123880386, -0.11875317990779877, -0.022729186341166496, 0.10401925444602966, -0.09027720242738724, 0.018020031973719597, -0.035168375819921494, 0.036848559975624084, 0.0068129138089716434, 0.006922150030732155, -0.1364578753709793, -0.04037010669708252, 0.0324149914085865, 0.015344955958425999, -0.012484638951718807, -0.021350281313061714, 0.0860389992594719, 0.06743359565734863, -0.07311496138572693, -0.08694091439247131, -0.05041178688406944, -0.005422496236860752, -0.10245712101459503, -0.2507999539375305, -0.06839659065008163, -0.03531922399997711, 0.20721746981143951, -0.23733964562416077, 0.04368286579847336, 0.010353227145969868, 0.12112042307853699, 0.029036186635494232, -0.04009917378425598, 0.011541263200342655, 0.04467253014445305, -0.027271989732980728, -0.09219150990247726, 0.04231894761323929, -0.011502457782626152, -0.14469951391220093, -0.005308539140969515, -0.1342044323682785, 0.10954342782497406, 0.0921364277601242, 0.014174208045005798, -0.1231493279337883, -0.09761389344930649, -0.06556415557861328, -0.04487822204828262, -0.025817865505814552, -0.0059634181670844555, 0.1382373422384262, 0.03618888184428215, 0.11900381743907928, -0.08011411875486374, -0.07020416855812073, 0.02341524325311184, -0.004049067851155996, 0.015237594954669476, 0.16158124804496765, 0.04402264580130577, -0.06340181082487106, 0.11789407581090927, 0.12433306872844696, -0.0398869588971138, 0.15855646133422852, -0.0485881008207798, -0.08314550668001175, -0.03310263156890869, 0.05897541716694832, 0.02929363213479519, 0.13181403279304504, -0.11458360403776169, -0.018040817230939865, 0.008251780644059181, 0.017806483432650566, 0.0033096258994191885, -0.1932353377342224, -0.04323501139879227, 0.05185787007212639, -0.06501477211713791, 0.024439474567770958, -0.026417991146445274, -0.020999068394303322, 0.1017405241727829, 0.03200656920671463, -0.05157705023884773, -0.0020263567566871643, -0.02227751351892948, -0.08417551219463348, 0.23143574595451355, -0.09338182210922241, -0.13647381961345673, -0.0988595113158226, 0.022493764758110046, -0.011628206819295883, 0.008166082203388214, 0.023729000240564346, -0.10774030536413193, 0.0031715109944343567, -0.08151848614215851, 0.012105254456400871, -0.03764709457755089, 0.03832124173641205, -0.023859763517975807, 0.014602876268327236, 0.02894744649529457, -0.08119004964828491, 0.017195886000990868, -0.019427789375185966, -0.04980187490582466, 0.04689914360642433, 0.011181257665157318, 0.10288840532302856, 0.16596847772598267, 0.028392702341079712, 0.028504973277449608, -0.04441125690937042, 0.1364877074956894, -0.1283193826675415, 0.023656677454710007, 0.09017807990312576, 0.026667285710573196, 0.057645708322525024, 0.1473352611064911, 0.038237519562244415, -0.08311396837234497, 0.0396568737924099, 0.0353616327047348, -0.03341898322105408, -0.20487166941165924, 
-0.006464815698564053, -0.04965364933013916, 0.023388834670186043, 0.12163837254047394, 0.040116846561431885, 0.01866219751536846, 0.060007963329553604, -0.029700294137001038, -0.0349760539829731, 0.02890755794942379, 0.07293960452079773, -0.010224984958767891, 0.019477015361189842, 0.11780459433794022, -0.0068725296296179295, -0.04846556857228279, 0.009517310187220573, 0.008744032122194767, 0.23035098612308502, -0.02603997476398945, 0.15797391533851624, 0.03859588876366615, 0.1592949777841568, -0.011855760589241982, 0.07677140086889267, 0.028064655140042305, -0.038978222757577896, -0.003630703315138817, -0.058661267161369324, -0.037239570170640945, 0.06302762031555176, 0.04246099665760994, 0.0522591657936573, -0.11521342396736145, 0.02834206633269787, 0.046616606414318085, 0.31457099318504333, 0.07615423947572708, -0.2915080487728119, -0.0715518370270729, 0.021821819245815277, -0.04991203919053078, -0.03330652788281441, 0.02348480187356472, 0.1425856351852417, -0.12411941587924957, 0.05420802906155586, -0.08539313077926636, 0.072427898645401, -0.06602002680301666, 0.00199496210552752, 0.043409377336502075, 0.08095905184745789, -0.029013074934482574, 0.0640212818980217, -0.27405276894569397, 0.3006431758403778, -0.01181994192302227, 0.05511616915464401, -0.05726097524166107, 0.020468877628445625, 0.03386414051055908, 0.011542761698365211, 0.11785030364990234, -0.004439159296452999, -0.033702120184898376, -0.16431596875190735, -0.09789863228797913, 0.007872787304222584, 0.1469767987728119, -0.14170552790164948, 0.12490073591470718, -0.02723393775522709, -0.036287590861320496, 0.044375866651535034, -0.06324479728937149, -0.05464053153991699, -0.10305701941251755, 0.011190749704837799, -0.04297277703881264, 0.06380696594715118, -0.10668399930000305, -0.09549897164106369, -0.03423098474740982, 0.16316461563110352, -0.1067972332239151, -0.040535472333431244, -0.1551324427127838, 0.06071015074849129, 0.13481104373931885, -0.07555676996707916, 0.05388864874839783, 0.01532089151442051, 0.1092909723520279, 0.004825675394386053, 0.004625583998858929, 0.12369687110185623, -0.08400938659906387, -0.23902827501296997, -0.06907153129577637, 0.1897241622209549, 0.04412965849041939, 0.06442315131425858, -0.028247451409697533, 0.018100358545780182, -0.01107620820403099, -0.08990166336297989, 0.06881952285766602, 0.0300348661839962, 0.04976670816540718, 0.04596269130706787, -0.05023700371384621, 0.08143647015094757, -0.06561712175607681, -0.05141843110322952, 0.14295320212841034, 0.3026931881904602, -0.10396695137023926, 0.05234414339065552, 0.059353746473789215, -0.0341879241168499, -0.18377672135829926, 0.004330878611654043, 0.1046108826994896, 0.040084727108478546, 0.011406287550926208, -0.20052532851696014, 0.0288961511105299, 0.08292859047651291, -0.024435028433799744, 0.09049829840660095, -0.33425864577293396, -0.12653712928295135, 0.0752321109175682, 0.11908982694149017, -0.013189694844186306, -0.16997921466827393, -0.06195434182882309, -0.012474635615944862, -0.04828086495399475, 0.04928017035126686, -0.04838865250349045, 0.12593768537044525, -0.0159571785479784, 0.007534995209425688, 0.02793300338089466, -0.062352102249860764, 0.13709115982055664, -0.0004951396840624511, 0.07326988875865936, -0.026342034339904785, -0.0015660347416996956, -0.001679684384725988, -0.0827425941824913, 0.010579206049442291, -0.12362983822822571, 0.037128694355487823, -0.10334640741348267, -0.019140366464853287, -0.0957132950425148, 0.030050039291381836, -0.06128634884953499, -0.07462889701128006, -0.025176307186484337, 
0.044401638209819794, 0.08098448812961578, -0.0004741779121104628, 0.11173088103532791, -0.040971312671899796, 0.1641789823770523, 0.09878019243478775, 0.11060464382171631, -0.008362961001694202, -0.07313361018896103, -0.010383637621998787, -0.006156345829367638, 0.04242454469203949, -0.14926402270793915, 0.007761950604617596, 0.12915152311325073, 0.06145890802145004, 0.13861528038978577, 0.06929782778024673, -0.06020393595099449, -0.019303275272250175, 0.07160398364067078, -0.10952842980623245, -0.12331501394510269, -0.020680122077465057, 0.0011125896126031876, -0.14692991971969604, 0.043444208800792694, 0.0943094789981842, -0.04791291058063507, -0.004549843724817038, 0.000020089933968847618, 0.029905853793025017, -0.013484717346727848, 0.2026648372411728, 0.06157873570919037, 0.10535869002342224, -0.08725851774215698, 0.07305968552827835, 0.026265345513820648, -0.10634627938270569, 0.027345167472958565, 0.12125151604413986, -0.0905134379863739, -0.02010556496679783, 0.05305950343608856, 0.07046777009963989, 0.014460536651313305, -0.0045630307868123055, -0.12701722979545593, -0.13858996331691742, 0.06364282220602036, 0.11112967133522034, 0.04105248302221298, 0.024716394022107124, -0.01856471225619316, 0.04795128479599953, -0.11528153717517853, 0.11665882915258408, 0.06186096742749214, 0.08216554671525955, -0.13178765773773193, 0.14864777028560638, -0.002153369365260005, 0.013726731762290001, -0.00891219824552536, 0.015462215058505535, -0.12157353013753891, 0.011864918284118176, -0.0480315275490284, -0.06060482934117317, -0.06290999799966812, -0.023280777037143707, -0.014346068724989891, -0.04433419182896614, -0.009503391571342945, -0.005594329442828894, -0.10649887472391129, -0.055902957916259766, -0.025481294840574265, 0.03811261057853699, -0.09836874902248383, -0.030349407345056534, 0.032604463398456573, -0.12960056960582733, 0.098830945789814, 0.032514795660972595, 0.034230977296829224, 0.005905539728701115, -0.09362833946943283, 0.042690668255090714, 0.02347765676677227, -0.02986643835902214, 0.028708932921290398, -0.13487882912158966, -0.027264533564448357, -0.07453182339668274, 0.019576076418161392, 0.012232691049575806, 0.0075958059169352055, -0.1368374228477478, 0.03110271506011486, -0.04593367874622345, -0.06517826020717621, -0.06973891705274582, 0.06076914444565773, 0.04824713617563248, -0.0009447062620893121, 0.13209383189678192, -0.07061675935983658, 0.06919295340776443, -0.22505095601081848, -0.019157567992806435, -0.008082657121121883, -0.07439237833023071, -0.06662759929895401, -0.036021023988723755, 0.0930376648902893, -0.05768537521362305, 0.0689421221613884, -0.04432343691587448, 0.038582317531108856, 0.020132767036557198, -0.10088493674993515, 0.10085061192512512, 0.05596490576863289, 0.17947492003440857, 0.05192200094461441, -0.045736316591501236, 0.05889424681663513, 0.034267108887434006, 0.06633535027503967, 0.0650341659784317, 0.16982461512088776, 0.13133728504180908, 0.028039880096912384, 0.08564265817403793, 0.035146262496709824, -0.12578175961971283, -0.14099088311195374, 0.09326834231615067, -0.03781190142035484, 0.09232058376073837, -0.026213254779577255, 0.2095469981431961, 0.1310584843158722, -0.21205663681030273, 0.03388601914048195, -0.020349876955151558, -0.10096615552902222, -0.09115560352802277, -0.06213212385773659, -0.0708351656794548, -0.17157895863056183, -0.0035139156971126795, -0.10427307337522507, 0.021384552121162415, 0.08323489874601364, 0.030529502779245377, 0.04136089235544205, 0.17448553442955017, 0.08933404088020325, 0.014833238907158375, 
0.10303179919719696, 0.04525718837976456, 0.011071449145674706, -0.03828956559300423, -0.1067974865436554, 0.006363595370203257, -0.0586368702352047, 0.03095630742609501, -0.06809566169977188, -0.09796846657991409, 0.0601549856364727, 0.03442404419183731, -0.10327968746423721, 0.02603258192539215, -0.00842572282999754, 0.06183425709605217, 0.07769656181335449, 0.022291207686066628, -0.029142098501324654, -0.036153484135866165, 0.26237279176712036, -0.1117410659790039, -0.04338083043694496, -0.1042357012629509, 0.2538636028766632, 0.016555435955524445, -0.0004518413043115288, 0.01313086785376072, -0.08034288883209229, 0.009370142593979836, 0.17251205444335938, 0.18086864054203033, -0.05245852842926979, -0.0038620815612375736, 0.017957409843802452, -0.013781404122710228, -0.01831909827888012, 0.0769590437412262, 0.1134575754404068, 0.04872583597898483, -0.07339390367269516, -0.009501928463578224, -0.012695389799773693, -0.07654258608818054, -0.05936230719089508, 0.0833243727684021, 0.048066504299640656, 0.006285509094595909, -0.032917320728302, 0.1063707172870636, -0.039084553718566895, -0.13852429389953613, 0.0555226169526577, -0.2065557986497879, -0.1684890240430832, -0.05484990403056145, 0.03794737532734871, 0.014782427810132504, 0.07520762830972672, 0.01324726827442646, -0.022410213947296143, 0.08368957042694092, 0.011068089865148067, -0.043715886771678925, -0.09499413520097733, 0.07041055709123611, -0.0964977890253067, 0.19249486923217773, -0.05728979408740997, -0.01559244841337204, 0.13099424540996552, 0.021795466542243958, -0.07459637522697449, 0.03940952196717262, 0.10178042203187943, -0.09093184769153595, 0.044595956802368164, 0.17050933837890625, -0.030156826600432396, 0.09833172708749771, 0.04876212030649185, -0.1273239403963089, 0.02351667732000351, -0.0832715630531311, -0.058147232979536057, -0.07168333977460861, 0.006267943419516087, -0.02141532301902771, 0.14904168248176575, 0.22995918989181519, -0.06986381858587265, 0.007358490023761988, -0.049419645220041275, -0.0005391067243181169, 0.04881279170513153, 0.10011403262615204, -0.02438896894454956, -0.25511544942855835, 0.007693781983107328, 0.04440634325146675, 0.005644710268825293, -0.25589489936828613, -0.09885728359222412, 0.027983104810118675, -0.04741508886218071, -0.07960935682058334, 0.09875524789094925, 0.04316912591457367, 0.054690685123205185, -0.03833874315023422, -0.09030016511678696, -0.05090774968266487, 0.19081753492355347, -0.1794241964817047, -0.055999644100666046 ]
null
null
transformers
# Model Trained Using AutoTrain This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain). # Usage ```python from transformers import AutoModelForCausalLM, AutoTokenizer model_path = "PATH_TO_THIS_REPO" tokenizer = AutoTokenizer.from_pretrained(model_path) model = AutoModelForCausalLM.from_pretrained( model_path, device_map="auto", torch_dtype='auto' ).eval() # Prompt content: "hi" messages = [ {"role": "user", "content": "hi"} ] input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt') output_ids = model.generate(input_ids.to('cuda')) response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True) # Model response: "Hello! How can I assist you today?" print(response) ```
{"license": "other", "tags": ["autotrain", "text-generation"], "widget": [{"text": "I love AutoTrain because "}]}
text-generation
Jimmyhd/mistral7bTimeSheet1000Rows
[ "transformers", "safetensors", "mistral", "text-generation", "autotrain", "conversational", "license:other", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:44:24+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #autotrain #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Trained Using AutoTrain This model was trained using AutoTrain. For more information, please visit AutoTrain. # Usage
[ "# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.", "# Usage" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #autotrain #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.", "# Usage" ]
[ 60, 29, 3 ]
[ "passage: TAGS\n#transformers #safetensors #mistral #text-generation #autotrain #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.# Usage" ]
[ -0.021511150524020195, 0.04406950995326042, -0.0018974434351548553, 0.042258311063051224, 0.12830689549446106, -0.032055750489234924, 0.25097373127937317, 0.046006232500076294, -0.055674921721220016, -0.0875626727938652, 0.1800176501274109, 0.18086481094360352, -0.044509101659059525, 0.18035292625427246, -0.050944287329912186, -0.25305476784706116, 0.03928804025053978, -0.03002876788377762, 0.08513160794973373, 0.11528930813074112, 0.15422189235687256, -0.06810108572244644, 0.07065033167600632, 0.02824488840997219, -0.20104824006557465, 0.019978217780590057, 0.061692725867033005, -0.13531090319156647, 0.1744065284729004, 0.06427885591983795, 0.10493632405996323, 0.055612560361623764, 0.116838239133358, -0.1132093071937561, 0.022470777854323387, -0.0016907136887311935, -0.014712871052324772, 0.08243117481470108, 0.06501992791891098, -0.05186719819903374, 0.07627531886100769, 0.15464681386947632, 0.10604750365018845, 0.0482061542570591, -0.11266434192657471, 0.019983580335974693, -0.009796111844480038, 0.024494826793670654, 0.11246276646852493, 0.10748797655105591, -0.019376493990421295, 0.17442357540130615, -0.11404631286859512, 0.08803998678922653, -0.06867949664592743, -0.2569102942943573, -0.02377595752477646, 0.17971466481685638, 0.04951933026313782, 0.009332622401416302, -0.11764580756425858, 0.07480819523334503, 0.10503014177083969, -0.0036651177797466516, 0.06326811760663986, -0.027249373495578766, -0.04522029682993889, -0.012807485647499561, -0.08539242297410965, 0.01736890710890293, 0.19978225231170654, -0.07611245661973953, -0.029304297640919685, -0.11826693266630173, -0.03709862753748894, 0.039590343832969666, -0.003653857856988907, -0.11187566071748734, -0.03012414276599884, 0.09636328369379044, -0.040389467030763626, -0.02467985264956951, -0.1315167397260666, -0.05055169388651848, -0.10564202070236206, 0.06777604669332504, -0.005282477010041475, -0.013226459734141827, -0.1092604473233223, 0.10347986221313477, 0.024692028760910034, -0.11134970933198929, 0.054500650614500046, -0.0958658754825592, 0.02550862729549408, -0.1014677956700325, -0.028109585866332054, -0.12843111157417297, 0.029125744476914406, 0.2112814038991928, 0.14645351469516754, 0.0026995285879820585, -0.08157949894666672, 0.03567685931921005, 0.021564165130257607, 0.10918251425027847, 0.04826131463050842, -0.05470609664916992, 0.05478431284427643, -0.05002056062221527, -0.01882634125649929, -0.03960404917597771, -0.18943770229816437, 0.035271406173706055, 0.03341565653681755, 0.055131714791059494, -0.06098131462931633, 0.10801588743925095, -0.030042583122849464, 0.02901764214038849, 0.05494352802634239, -0.03733232617378235, 0.025456679984927177, -0.04736876115202904, 0.0010906178504228592, -0.05055661499500275, 0.04184268042445183, 0.11400067061185837, 0.029079049825668335, 0.10226564854383469, -0.07642456144094467, -0.04008793830871582, -0.09685022383928299, -0.07439978420734406, 0.005647685844451189, 0.047459330409765244, 0.043970074504613876, -0.20281462371349335, -0.2693801820278168, -0.019272325560450554, 0.05773158743977547, -0.022137636318802834, -0.07310711592435837, -0.10689149051904678, 0.0075124879367649555, 0.06080935522913933, -0.03403749689459801, 0.04879654571413994, -0.019104430451989174, 0.027751440182328224, -0.07201311737298965, 0.022503772750496864, -0.07536090165376663, 0.020340394228696823, -0.13729903101921082, -0.02291964925825596, -0.019758576527237892, 0.050816237926483154, -0.02963360585272312, 0.15936978161334991, -0.03056740015745163, 0.03803997114300728, -0.027829257771372795, 
0.05552501976490021, 0.010931183584034443, 0.15893206000328064, -0.1403975933790207, -0.03167174383997917, 0.15015147626399994, -0.11030462384223938, -0.11657889932394028, 0.12217075377702713, -0.10353213548660278, 0.26386839151382446, 0.12699852883815765, 0.11197960376739502, 0.04626157879829407, -0.07833969593048096, 0.12762103974819183, 0.012613244354724884, -0.08764777332544327, -0.004540602210909128, 0.005323404911905527, 0.007409125100821257, -0.20041394233703613, 0.03927268087863922, 0.12687388062477112, 0.07012435793876648, -0.04830225929617882, -0.08430612087249756, -0.007972520776093006, -0.04824082925915718, 0.08512664586305618, -0.011973553337156773, 0.13039983808994293, -0.06363964080810547, -0.03081914782524109, 0.06789000332355499, 0.042220357805490494, 0.042559653520584106, -0.04828557372093201, -0.08518916368484497, -0.04232817888259888, -0.017585016787052155, 0.02622099779546261, -0.09633272886276245, -0.04491637647151947, -0.02053658664226532, 0.11658555269241333, 0.0512460358440876, 0.09422498941421509, 0.036101795732975006, 0.05028996989130974, -0.01500276941806078, 0.016517972573637962, 0.18897680938243866, 0.03480306640267372, -0.12018852680921555, -0.11630555242300034, 0.11346139758825302, -0.08252450078725815, 0.1552623063325882, -0.23897039890289307, 0.036255478858947754, -0.09490116685628891, 0.08064372092485428, 0.0017309013055637479, 0.08484213799238205, -0.07529973238706589, 0.022200478240847588, -0.10573600977659225, 0.0037568025290966034, 0.075276680290699, 0.04180307686328888, -0.05554218962788582, 0.15476444363594055, -0.16981041431427002, 0.22939729690551758, 0.11703455448150635, -0.13663136959075928, -0.0915972888469696, -0.1152295246720314, 0.007930442690849304, -0.016463875770568848, -0.07295192033052444, -0.009564582258462906, 0.12166782468557358, -0.03571236506104469, 0.18361419439315796, -0.018240265548229218, -0.028462545946240425, -0.013746020384132862, -0.09323040395975113, -0.012784305028617382, 0.017725640907883644, 0.04489867389202118, -0.231642484664917, 0.13959123194217682, 0.15498210489749908, -0.023230239748954773, 0.19611501693725586, 0.0333954393863678, 0.028921006247401237, 0.00525292381644249, -0.05130617320537567, 0.0011728843674063683, -0.026411594823002815, -0.051287874579429626, -0.04254357889294624, 0.020757026970386505, 0.0074368310160934925, 0.026900894939899445, -0.11432439088821411, -0.03215554356575012, 0.022868173196911812, 0.05850449204444885, 0.02872556447982788, 0.06238703802227974, -0.08062804490327835, 0.07846172899007797, -0.03310805931687355, -0.14237742125988007, 0.1218472421169281, 0.008713995106518269, -0.12699823081493378, 0.17084048688411713, -0.09632863849401474, -0.21641428768634796, -0.19598573446273804, -0.17537488043308258, -0.005434079095721245, 0.07832824438810349, 0.07291760295629501, -0.07093466073274612, -0.06506931781768799, -0.013085749931633472, -0.06344981491565704, 0.014589118771255016, -0.02378881722688675, -0.0763881728053093, 0.040016770362854004, -0.019185611978173256, -0.10781533271074295, -0.04439970850944519, 0.012527063488960266, -0.060712482780218124, 0.08484047651290894, -0.06380870193243027, 0.055258508771657944, 0.15709160268306732, -0.014931886456906796, 0.0185655876994133, -0.039996929466724396, 0.1619407683610916, -0.07088436186313629, 0.007049378007650375, 0.09842365980148315, -0.06707129627466202, 0.03206530585885048, 0.21750140190124512, 0.01955440454185009, -0.06769856065511703, 0.09086031466722488, -0.05018770694732666, -0.056951407343149185, -0.19610178470611572, 
-0.11177343130111694, -0.011685210280120373, 0.015133507549762726, 0.08075802773237228, 0.04754025116562843, 0.27289727330207825, 0.12313301116228104, 0.05691686272621155, 0.05817769095301628, 0.054087668657302856, 0.09844991564750671, 0.15277034044265747, -0.035599011927843094, 0.18337403237819672, -0.06327883899211884, -0.1762838214635849, 0.0463390089571476, -0.031888697296381, 0.07911207526922226, 0.17036698758602142, 0.018747372552752495, 0.034574054181575775, 0.06665497273206711, 0.14666350185871124, 0.12334746867418289, 0.084808848798275, -0.056319113820791245, -0.021449556574225426, -0.020032169297337532, -0.05785588547587395, 0.12442591786384583, -0.04964852333068848, -0.04578586295247078, -0.02340216189622879, 0.04389573261141777, 0.04918395355343819, 0.0898701474070549, 0.0074623748660087585, -0.286918967962265, 0.0050653028301894665, 0.04626360535621643, -0.05904335901141167, -0.08890440315008163, 0.08012378960847855, -0.007189115509390831, -0.16694433987140656, 0.014110126532614231, -0.03885189816355705, 0.0952671691775322, 0.01716073974967003, 0.07566265016794205, -0.09568360447883606, -0.02710309624671936, -0.042300328612327576, 0.13319268822669983, -0.39069533348083496, 0.21181745827198029, -0.01682322286069393, 0.045987967401742935, -0.10731089115142822, 0.007316072005778551, 0.08743616193532944, 0.1837569922208786, 0.096399687230587, -0.055255234241485596, -0.16648070514202118, -0.1134042739868164, -0.09045281261205673, -0.005390846636146307, 0.02524641342461109, -0.01680302806198597, 0.018255090340971947, -0.1121804416179657, -0.004623611457645893, 0.05001246929168701, 0.015457145869731903, -0.1535630226135254, -0.16051709651947021, 0.006068591494113207, 0.031445879489183426, 0.11646050214767456, -0.031057335436344147, -0.08076300472021103, -0.094660185277462, 0.1585388332605362, 0.03501934930682182, 0.0011602110462263227, -0.13256333768367767, -0.04017394408583641, -0.04944954439997673, -0.03613399714231491, 0.06408081203699112, 0.005305019672960043, 0.1273590475320816, -0.08283259719610214, -0.08643025904893875, 0.11136641353368759, -0.11668875813484192, -0.060902297496795654, -0.10979162901639938, 0.024973781779408455, -0.019845841452479362, 0.003304521320387721, 0.10422127693891525, 0.04288886487483978, -0.07108274102210999, -0.05980661138892174, -0.02113221026957035, -0.005155228078365326, -0.014997591264545918, -0.10641545802354813, -0.11598227173089981, -0.14479388296604156, -0.03176797181367874, -0.11093590408563614, 0.22375445067882538, 0.15719911456108093, -0.08290619403123856, 0.146688774228096, 0.2352554351091385, -0.11010316014289856, -0.3153008222579956, -0.06070736423134804, -0.0540805347263813, 0.006331745535135269, 0.03831018880009651, -0.12917447090148926, 0.08694636821746826, 0.007767969276756048, -0.07379741966724396, -0.04062706604599953, -0.14264778792858124, -0.1587120145559311, 0.25331321358680725, 0.013260179199278355, 0.23543070256710052, -0.09259489178657532, -0.059353094547986984, -0.14886894822120667, 0.005429987329989672, 0.06631099432706833, -0.09092322736978531, 0.06045657396316528, 0.051252949982881546, 0.07965859025716782, 0.02966240607202053, -0.024035006761550903, 0.055772002786397934, -0.05038310959935188, 0.0759878158569336, -0.17024953663349152, -0.04500401392579079, 0.05323336645960808, -0.018912682309746742, 0.0998123288154602, -0.05401824042201042, 0.02677818201482296, -0.008542642928659916, -0.06801259517669678, 0.023929402232170105, 0.06284014135599136, -0.004270337522029877, -0.11778664588928223, 0.014000087045133114, 
-0.010050134733319283, 0.0053947134874761105, -0.059911634773015976, 0.05780978873372078, -0.03476967662572861, 0.1485874503850937, 0.15241393446922302, 0.23329348862171173, -0.07141220569610596, 0.10244371742010117, -0.03340986743569374, -0.11548477411270142, 0.08455099910497665, -0.07695529609918594, 0.027050279080867767, 0.07731927186250687, -0.047056566923856735, 0.1748439520597458, 0.05639607086777687, 0.011933833360671997, -0.008426292799413204, 0.142360657453537, -0.15987731516361237, 0.009308046661317348, -0.07943334430456161, 0.13067156076431274, 0.04492649808526039, 0.0004141576064284891, 0.12906450033187866, -0.09000152349472046, -0.01941542886197567, 0.021079346537590027, 0.00479003693908453, -0.03201186656951904, 0.10242592543363571, 0.02793554961681366, 0.016818257048726082, -0.07914214581251144, 0.03550894185900688, 0.07390327006578445, 0.0034617725759744644, 0.04619156941771507, 0.02801675535738468, -0.10968965291976929, -0.11486479640007019, 0.003840886289253831, 0.2648932933807373, -0.17048506438732147, -0.0780344158411026, -0.012264362536370754, -0.1285719871520996, 0.0121929831802845, 0.08398593217134476, 0.07780637592077255, 0.04008983448147774, -0.06751032918691635, -0.02295597828924656, -0.09875822067260742, 0.09456682950258255, 0.012302984483540058, 0.047485072165727615, -0.13529013097286224, 0.0852399468421936, -0.03691168501973152, -0.007895942777395248, -0.09560481458902359, -0.049228549003601074, -0.12735776603221893, 0.02569100819528103, -0.13867126405239105, -0.03795100376009941, -0.05277736112475395, -0.007859637029469013, 0.05610223114490509, -0.015295255929231644, -0.018181191757321358, -0.03177627921104431, -0.09117025136947632, 0.030161157250404358, -0.0012549255043268204, 0.057158660143613815, -0.044383708387613297, -0.027280032634735107, 0.03378000855445862, -0.0044015152379870415, 0.06994954496622086, 0.02122250199317932, -0.032562851905822754, 0.052383456379175186, -0.19439010322093964, 0.013238121755421162, 0.07609103620052338, -0.001706396578811109, 0.032405223697423935, -0.04411722719669342, -0.01344985794275999, 0.10101554542779922, 0.043881531804800034, 0.04432373866438866, 0.011401806026697159, -0.08144962042570114, 0.04596644267439842, 0.06498777121305466, -0.13372421264648438, -0.03389865532517433, -0.03190535679459572, 0.0028287700843065977, -0.03814942017197609, 0.22334830462932587, -0.12197455763816833, 0.04868736490607262, -0.04305120185017586, 0.03215811774134636, -0.02491365373134613, -0.13802887499332428, -0.11349769681692123, -0.12090087682008743, -0.028644030913710594, -0.005757875274866819, 0.24444162845611572, 0.11631548404693604, -0.023379435762763023, 0.03680773451924324, 0.0689404159784317, 0.05268724635243416, 0.019476626068353653, 0.20020228624343872, 0.11798638105392456, 0.00801010336726904, -0.13824819028377533, 0.08099787682294846, 0.03446761891245842, -0.05022868886590004, -0.0002508852630853653, 0.006553226616233587, -0.07786662876605988, 0.06981057673692703, 0.051118154078722, -0.010024713352322578, -0.08102474361658096, -0.12497129291296005, -0.1094818115234375, 0.039945054799318314, -0.09261518716812134, 0.019266700372099876, 0.17661623656749725, -0.04462537169456482, -0.012400207109749317, -0.06642598658800125, -0.044931549578905106, -0.20246891677379608, -0.16378946602344513, -0.11721070855855942, -0.07838650792837143, 0.010228404775261879, -0.041436467319726944, 0.0513053722679615, 0.024983378127217293, 0.06237258389592171, -0.05817200243473053, 0.0788678303360939, -0.08986064791679382, -0.010724514722824097, 
0.01965920440852642, -0.06444259732961655, 0.005938014481216669, -0.1949824094772339, -0.015196557156741619, -0.11736103147268295, 0.0319199413061142, -0.02515447326004505, -0.018131008371710777, 0.0060718166641891, -0.005089160520583391, -0.045827340334653854, -0.036580640822649, -0.012856644578278065, 0.03945055603981018, -0.006096675992012024, 0.04251397028565407, 0.01255482342094183, -0.00021614103752654046, 0.045874763280153275, 0.20457006990909576, -0.03719383850693703, -0.1900760382413864, -0.14537177979946136, 0.24047856032848358, 0.016291463747620583, 0.13067831099033356, -0.05723630264401436, -0.007833994925022125, 0.038664985448122025, 0.28923484683036804, 0.30338987708091736, -0.056060388684272766, 0.011621600948274136, -0.01911630854010582, -0.005083557218313217, -0.006262179929763079, 0.1619090884923935, 0.01673344522714615, 0.15317486226558685, -0.04591422155499458, 0.06250827014446259, -0.013997954316437244, -0.08819081634283066, -0.04897632822394371, 0.12038447707891464, -0.013747419230639935, -0.010778333060443401, -0.023452607914805412, 0.08203831315040588, -0.0993853434920311, 0.1308000087738037, -0.16504473984241486, -0.043394576758146286, -0.061062831431627274, 0.037428054958581924, 0.1087816059589386, 0.004587629344314337, 0.04372462257742882, -0.020919285714626312, -0.015757059678435326, 0.04033498093485832, -0.04032468423247337, -0.0748470202088356, -0.010522308759391308, 0.07876913994550705, -0.023894717916846275, 0.19323021173477173, -0.01973799616098404, 0.05881485715508461, 0.08197367191314697, 0.0009901272132992744, -0.0923495963215828, 0.09593427926301956, -0.002569346921518445, -0.04332247003912926, 0.12296795845031738, -0.003559875302016735, 0.009056528098881245, 0.0047286273911595345, 0.014467168599367142, -0.16888590157032013, 0.12593960762023926, -0.09653013944625854, -0.09478563070297241, -0.04189549759030342, 0.10743332654237747, -0.01823744736611843, 0.14765001833438873, 0.07268229871988297, -0.018285823985934258, 0.017968133091926575, -0.027959873899817467, 0.05307796970009804, -0.01953253708779812, -0.10735776275396347, -0.042452070862054825, -0.20386724174022675, -0.031956519931554794, 0.07400371879339218, -0.012480479665100574, -0.2703115940093994, -0.07448812574148178, -0.0989629402756691, -0.02670217864215374, -0.12356587499380112, 0.07305317372083664, 0.21730293333530426, 0.030869849026203156, -0.009126100689172745, -0.1264304667711258, -0.009863014332950115, 0.0476914644241333, -0.0390271432697773, -0.09755232185125351 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.1659 - Accuracy: 0.932 - F1: 0.9323 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.2129 | 1.0 | 250 | 0.1866 | 0.9285 | 0.9292 | | 0.14 | 2.0 | 500 | 0.1659 | 0.932 | 0.9323 | ### Framework versions - Transformers 4.35.2 - Pytorch 2.1.0+cu121 - Datasets 2.17.0 - Tokenizers 0.15.1
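The card above reports evaluation results but leaves its usage section as "More information needed". Below is a minimal, hedged inference sketch for such a checkpoint using the `transformers` pipeline API; the repository ID is taken from this record's `id` field, and the example sentence and printed label are purely illustrative.

```python
# Minimal sketch: load the fine-tuned emotion classifier and score one sentence.
from transformers import pipeline

# Repository ID taken from this record's `id` field (swap in a local path if needed).
classifier = pipeline(
    "text-classification",
    model="zackdai/distilbert-base-uncased-finetuned-emotion",
)

result = classifier("I can't wait to see you this weekend!")
print(result)
# Illustrative output shape: [{'label': 'joy', 'score': 0.98}]
# The actual label string depends on the checkpoint's id2label mapping
# (it may come back as LABEL_0 ... LABEL_5 if no custom mapping was saved).
```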
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["emotion"], "metrics": ["accuracy", "f1"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.932, "name": "Accuracy"}, {"type": "f1", "value": 0.9323066206816922, "name": "F1"}]}]}]}
text-classification
zackdai/distilbert-base-uncased-finetuned-emotion
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "base_model:distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T21:46:05+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
distilbert-base-uncased-finetuned-emotion ========================================= This model is a fine-tuned version of distilbert-base-uncased on the emotion dataset. It achieves the following results on the evaluation set: * Loss: 0.1659 * Accuracy: 0.932 * F1: 0.9323 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 64 * eval\_batch\_size: 64 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 2 ### Training results ### Framework versions * Transformers 4.35.2 * Pytorch 2.1.0+cu121 * Datasets 2.17.0 * Tokenizers 0.15.1
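The hyperparameters listed above map onto a fairly standard `transformers` Trainer setup. The sketch below is a hedged reconstruction under those settings; it assumes the `emotion` dataset ID and the Trainer's default padding collator, and is meant to illustrate the listed configuration rather than reproduce the exact script behind this checkpoint.

```python
# Hedged reconstruction of a Trainer run with the hyperparameters reported above
# (lr 2e-5, batch size 64, 2 epochs, seed 42, linear schedule by default).
import numpy as np
from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("emotion")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Dynamic padding is handled later by the Trainer's default collator.
    return tokenizer(batch["text"], truncation=True)

encoded = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=6  # the emotion dataset has 6 classes
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),
    }

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-emotion",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=2,
    seed=42,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
# trainer.train()  # uncomment to actually run the fine-tuning
```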
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ 82, 98, 4, 33 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.35.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.15.1" ]
[ -0.12298925966024399, 0.16415931284427643, -0.0022249515168368816, 0.12838715314865112, 0.12877148389816284, 0.01904618926346302, 0.1514195203781128, 0.12019442021846771, -0.03650584816932678, 0.046193916350603104, 0.12899209558963776, 0.13060440123081207, 0.020094528794288635, 0.13745972514152527, -0.08555696159601212, -0.20673809945583344, 0.021901700645685196, 0.034984759986400604, -0.019554216414690018, 0.1276279240846634, 0.10021701455116272, -0.11031892150640488, 0.10418032854795456, -0.0035885514225810766, -0.13512423634529114, -0.007379281334578991, 0.016138669103384018, -0.038977932184934616, 0.12101870775222778, 0.019757069647312164, 0.08555757254362106, 0.03709016367793083, 0.061780042946338654, -0.2063295543193817, 0.01633494906127453, 0.037308480590581894, -0.006699043791741133, 0.08305735141038895, 0.033081576228141785, -0.027233779430389404, 0.06545687466859818, -0.09334614872932434, 0.04899849742650986, 0.025416752323508263, -0.12426608800888062, -0.2230583131313324, -0.08652891963720322, 0.05607512220740318, 0.06954199075698853, 0.09302328526973724, -0.02298552542924881, 0.12882556021213531, -0.03278733789920807, 0.09309118241071701, 0.1929847002029419, -0.2785753309726715, -0.06253271549940109, 0.019933240488171577, 0.024857191368937492, 0.07781456410884857, -0.11607067286968231, -0.02765176258981228, 0.05328065901994705, 0.025419652462005615, 0.1408209651708603, -0.03000331111252308, -0.0000100845136330463, -0.008367571979761124, -0.122127965092659, -0.0467156320810318, 0.19661131501197815, 0.08830983191728592, -0.05298612639307976, -0.08292418718338013, -0.062137067317962646, -0.12344267219305038, -0.02369128353893757, -0.015646615996956825, 0.048551563173532486, -0.0006837965920567513, -0.07165957242250443, -0.018645696341991425, -0.10995084792375565, -0.039818160235881805, -0.014501640573143959, 0.13194607198238373, 0.011023708619177341, 0.0013879250036552548, 0.006495350040495396, 0.09365515410900116, -0.005687214434146881, -0.15282344818115234, 0.011868472211062908, 0.008473686873912811, 0.02234680764377117, -0.03086315095424652, -0.06160522252321243, -0.08252810686826706, 0.007355652283877134, 0.12234635651111603, -0.06391282379627228, 0.0381796732544899, 0.02186567336320877, 0.02776717022061348, -0.0757814571261406, 0.18540902435779572, -0.03730515390634537, -0.04703554883599281, 0.018810899928212166, 0.12801603972911835, 0.06705563515424728, -0.01113202329725027, -0.11687757074832916, 0.036215994507074356, 0.12639904022216797, 0.008259239606559277, -0.06383469700813293, 0.07295916974544525, -0.09291710704565048, -0.03045615740120411, 0.040017351508140564, -0.08345825970172882, 0.010006575845181942, 0.010459926910698414, -0.051640018820762634, -0.05533228814601898, 0.02033049426972866, 0.023422755300998688, 0.008880945853888988, 0.05069858580827713, -0.08697323501110077, -0.004888592287898064, -0.06064574047923088, -0.09836641699075699, 0.013786204159259796, -0.07303544878959656, 0.0294688418507576, -0.10141290724277496, -0.22318731248378754, -0.023288331925868988, 0.06568637490272522, -0.015711693093180656, -0.05751540884375572, -0.0765494853258133, -0.05566942319273949, 0.01963070221245289, -0.0009356812224723399, 0.02586907520890236, -0.06492088735103607, 0.08844626694917679, 0.043852370232343674, 0.06694947183132172, -0.04401993751525879, 0.03836948052048683, -0.12762954831123352, 0.04506474360823631, -0.12825657427310944, 0.064083531498909, -0.03988830745220184, 0.08981011807918549, -0.07992232590913773, -0.07603347301483154, 0.015640147030353546, 
-0.02602706104516983, 0.05740893632173538, 0.1261247992515564, -0.16774190962314606, -0.07420198619365692, 0.16842208802700043, -0.08345673978328705, -0.15481406450271606, 0.13748565316200256, -0.05880170688033104, 0.07106537371873856, 0.07528718560934067, 0.20262359082698822, 0.06439574062824249, -0.048542819917201996, -0.022134937345981598, -0.005324368830770254, 0.07184375822544098, -0.00817684456706047, 0.09434540569782257, 0.008535358123481274, -0.004509931895881891, 0.022837214171886444, -0.05818382650613785, 0.060154400765895844, -0.06362628936767578, -0.10318101197481155, -0.03899693489074707, -0.1149534285068512, 0.07968122512102127, 0.06385963410139084, 0.05263492837548256, -0.10782752186059952, -0.08684656769037247, 0.011198734864592552, 0.0920170322060585, -0.08252817392349243, 0.01560035441070795, -0.0750403180718422, 0.0779593288898468, -0.06215186044573784, -0.012691549956798553, -0.14994220435619354, 0.0015186409000307322, 0.025656571611762047, 0.008467240259051323, -0.003673878498375416, -0.010716867633163929, 0.07532812654972076, 0.057596828788518906, -0.07332060486078262, -0.06730838119983673, -0.036077093333005905, 0.009738643653690815, -0.09620614349842072, -0.20298953354358673, -0.013542743399739265, -0.04094722494482994, 0.2085779309272766, -0.22458098828792572, 0.055817391723394394, 0.001140425680205226, 0.06879791617393494, 0.03843928128480911, -0.03587650880217552, -0.0029504087287932634, 0.03329670801758766, -0.04915175959467888, -0.07146980613470078, 0.07085344940423965, 0.02826550044119358, -0.1311282217502594, -0.0149794090539217, -0.14575673639774323, 0.16261208057403564, 0.11418330669403076, -0.03650867938995361, -0.048777587711811066, -0.0063068438321352005, -0.04165167361497879, -0.020295223221182823, -0.018865853548049927, 0.009229526855051517, 0.1351911425590515, 0.007918436080217361, 0.1456819772720337, -0.08592311292886734, -0.021741749718785286, 0.017022565007209778, -0.04692652076482773, -0.013722056522965431, 0.11608871072530746, 0.013309158384799957, -0.14549824595451355, 0.14500054717063904, 0.1914234459400177, -0.06624388694763184, 0.14039725065231323, -0.042862098664045334, -0.04359167069196701, -0.049233049154281616, -0.00044229652849026024, 0.008725951425731182, 0.10347870737314224, -0.11197403818368912, 0.002216064603999257, 0.01330116018652916, -0.00366339017637074, -0.00963506568223238, -0.19666317105293274, -0.038123928010463715, 0.06050406023859978, -0.04967494308948517, 0.004861168097704649, -0.009077387861907482, -0.02050962671637535, 0.08195244520902634, 0.005004735663533211, -0.06806423515081406, 0.05337528884410858, -0.008175288327038288, -0.08379372954368591, 0.19792059063911438, -0.08355582505464554, -0.1820179522037506, -0.1360347718000412, -0.04509621486067772, -0.08249577134847641, 0.03641597554087639, 0.06329063326120377, -0.07827824354171753, -0.02807299606502056, -0.11779822409152985, -0.022146549075841904, 0.026902813464403152, 0.01316885743290186, 0.046345897018909454, -0.02449585497379303, 0.09087736159563065, -0.09302917122840881, -0.015289775095880032, -0.015670262277126312, -0.02726726606488228, 0.043986979871988297, 0.0003908520156983286, 0.11644674837589264, 0.1344602108001709, -0.0024111997336149216, 0.001982764108106494, -0.02674761414527893, 0.24015909433364868, -0.060976624488830566, -0.021268535405397415, 0.14057914912700653, -0.025824066251516342, 0.0664115697145462, 0.14003711938858032, 0.04754038155078888, -0.10216362029314041, 0.01825735531747341, 0.02516135387122631, -0.022555982694029808, 
-0.19683779776096344, -0.023665999993681908, -0.03820670023560524, 0.013110728934407234, 0.0923033356666565, 0.02500477060675621, 0.05578959360718727, 0.07980433851480484, 0.011307960376143456, 0.046082641929388046, -0.01593147963285446, 0.07934863120317459, 0.11102430522441864, 0.03545748442411423, 0.10672835260629654, -0.02969278022646904, -0.036512281745672226, 0.05040682479739189, -0.005518865305930376, 0.17186596989631653, 0.0012750305468216538, 0.19665370881557465, 0.0376354418694973, 0.16429176926612854, -0.03006909415125847, 0.057630978524684906, -0.009824772365391254, -0.02506519854068756, -0.029358426108956337, -0.042727939784526825, -0.07352055609226227, 0.047611966729164124, -0.06342733651399612, 0.09695445001125336, -0.11881578713655472, 0.016130482777953148, 0.06686265766620636, 0.27346566319465637, 0.05122973397374153, -0.3381480276584625, -0.118779256939888, 0.03163718804717064, -0.012638452462852001, -0.024628782644867897, 0.006265256088227034, 0.11430396884679794, -0.0633852407336235, 0.053050447255373, -0.07694634050130844, 0.0762980729341507, -0.06483963876962662, 0.05347886681556702, 0.017030466347932816, 0.05816536024212837, -0.003607850056141615, 0.07206951081752777, -0.2507160007953644, 0.237730011343956, 0.011494971811771393, 0.06829703599214554, -0.05151974409818649, -0.004263938404619694, 0.06421706080436707, 0.0902511328458786, 0.08569177985191345, -0.0013113958993926644, 0.00467272661626339, -0.17602375149726868, -0.06630519032478333, 0.022091830149292946, 0.0539868026971817, -0.06566138565540314, 0.09515029937028885, -0.03607471287250519, 0.005177485756576061, 0.06704428791999817, 0.042426832020282745, -0.07040222734212875, -0.10223818570375443, 0.002481512725353241, 0.05932764708995819, 0.0026643802411854267, -0.08312251418828964, -0.10207563638687134, -0.09500055015087128, 0.15390591323375702, -0.012838419526815414, -0.04031655564904213, -0.1032203808426857, 0.05126528441905975, 0.04013681784272194, -0.08564738184213638, 0.019136466085910797, -0.003202908206731081, 0.11475201696157455, 0.01415063627064228, -0.05189846456050873, 0.10044047981500626, -0.06313786655664444, -0.1765647828578949, -0.05559659004211426, 0.10916933417320251, 0.02679434046149254, 0.04905441030859947, 0.0072315763682127, 0.006358620710670948, -0.04452360421419144, -0.06758596748113632, 0.04355176165699959, 0.02181336283683777, 0.04135998710989952, 0.01441817544400692, -0.01724773831665516, -0.004459582734853029, -0.07943835109472275, -0.029633482918143272, 0.16276660561561584, 0.2990800440311432, -0.0724276676774025, 0.008852411061525345, 0.053898707032203674, -0.053623735904693604, -0.17211347818374634, 0.02427157387137413, 0.03177283704280853, 0.015426634810864925, 0.0598289929330349, -0.14455188810825348, 0.0749862790107727, 0.06510473042726517, -0.0287938192486763, 0.0798645168542862, -0.2513330578804016, -0.12475714832544327, 0.12642055749893188, 0.14921393990516663, 0.1332807093858719, -0.15596584975719452, -0.038069210946559906, -0.04890201613306999, -0.10739458352327347, 0.10282620787620544, -0.10546713322401047, 0.10821212828159332, -0.006642368156462908, 0.07144784927368164, 0.012308468110859394, -0.037015244364738464, 0.1497841775417328, -0.0053465948440134525, 0.10033758729696274, -0.05874332785606384, -0.012331431731581688, 0.07007947564125061, -0.07364233583211899, 0.03307785093784332, -0.140011727809906, 0.04514949768781662, -0.10610543936491013, -0.03220716491341591, -0.06697490066289902, 0.02192913554608822, -0.03276041895151138, -0.06069423258304596, 
-0.03603912517428398, 0.04129990562796593, 0.09074567258358002, 0.004461248405277729, 0.13149239122867584, 0.012619505636394024, 0.11043127626180649, 0.15007495880126953, 0.09350668638944626, -0.045194126665592194, -0.03603368252515793, -0.031068192794919014, -0.0329701267182827, 0.051759686321020126, -0.15761122107505798, 0.038663070648908615, 0.10641254484653473, 0.008402816019952297, 0.17831099033355713, 0.05722370743751526, -0.0326821506023407, 0.015341658145189285, 0.061104100197553635, -0.1644946038722992, -0.12632444500923157, -0.03550735488533974, -0.023713625967502594, -0.16105681657791138, 0.02984657511115074, 0.11732334643602371, -0.05834845080971718, -0.0014169812202453613, -0.015113011002540588, 0.02170756831765175, -0.021074876189231873, 0.14048346877098083, 0.04846009239554405, 0.03503165766596794, -0.09008875489234924, 0.09292999655008316, 0.039195068180561066, -0.0952715054154396, 0.0248859915882349, 0.012673599645495415, -0.0900382325053215, -0.04714033380150795, 0.03031144291162491, 0.19745920598506927, -0.040961854159832, -0.051432423293590546, -0.15959081053733826, -0.10266458988189697, 0.051544420421123505, 0.1154305636882782, 0.09952162951231003, 0.023818695917725563, -0.04972377419471741, -0.0019818672444671392, -0.09349818527698517, 0.11611457169055939, 0.06941138207912445, 0.061568208038806915, -0.16531093418598175, 0.0715322345495224, -0.01668120175600052, 0.014330887235701084, -0.016147714108228683, 0.01841355860233307, -0.07997079938650131, -0.019082441926002502, -0.13620004057884216, 0.0056248800829052925, -0.04218827933073044, 0.017805742099881172, -0.0014917199732735753, -0.05678331479430199, -0.04375004023313522, 0.010821771807968616, -0.09677516669034958, -0.03369671106338501, 0.04078563302755356, 0.0690656453371048, -0.11939346790313721, -0.051640816032886505, 0.029735703021287918, -0.07973416894674301, 0.08691716939210892, 0.03120613284409046, 0.009364795871078968, 0.03232576325535774, -0.15181834995746613, 0.04097861424088478, 0.061356496065855026, 0.0010428341338410974, 0.018393993377685547, -0.10278753936290741, -0.026722000911831856, -0.0025697823148220778, 0.007415526546537876, 0.016234593465924263, 0.10535606741905212, -0.11115217953920364, 0.007499100640416145, 0.011769969016313553, -0.037147048860788345, -0.06605912744998932, 0.037295207381248474, 0.07689788937568665, 0.021435894072055817, 0.22531521320343018, -0.08321985602378845, 0.01358514092862606, -0.20898322761058807, 0.008152627386152744, 0.004840954206883907, -0.11456567794084549, -0.1384253352880478, -0.06088176742196083, 0.03864532336592674, -0.05609184131026268, 0.1072925254702568, -0.005791793577373028, 0.02195131406188011, 0.019817862659692764, 0.0024811180774122477, 0.05772358551621437, 0.00891666579991579, 0.2178145945072174, 0.016372209414839745, -0.04919351637363434, 0.0604318343102932, 0.027111297473311424, 0.11198686808347702, 0.11162023991346359, 0.1395191252231598, 0.1593293696641922, -0.004752764478325844, 0.10206945240497589, 0.011167536489665508, 0.003676127642393112, -0.14276158809661865, 0.04454519972205162, -0.03397553041577339, 0.10183025151491165, 0.004331657197326422, 0.23346753418445587, 0.08796288818120956, -0.16622917354106903, 0.03672078251838684, -0.04876039922237396, -0.07527824491262436, -0.07965926080942154, -0.10067258030176163, -0.09405376762151718, -0.13725221157073975, -0.005322910379618406, -0.12106827646493912, 0.0037962363567203283, 0.07040152698755264, -0.00980472657829523, -0.04000251367688179, 0.14826920628547668, -0.004319100175052881, 
0.01474982313811779, 0.08450791984796524, -0.01017678715288639, -0.07189807295799255, -0.06506330519914627, -0.08639872819185257, 0.020230503752827644, -0.0006633383454754949, 0.04267972707748413, -0.04681497439742088, -0.024334469810128212, 0.030356328934431076, -0.016857890412211418, -0.12031431496143341, 0.010171568021178246, 0.02395612746477127, 0.04880282282829285, 0.044744398444890976, 0.015952477231621742, 0.00639567943289876, 0.01082262210547924, 0.2127608209848404, -0.06947115808725357, -0.01553473062813282, -0.1102161779999733, 0.16648617386817932, 0.0038185124285519123, -0.013057523407042027, 0.02416670322418213, -0.09783343225717545, 0.042867351323366165, 0.1887388378381729, 0.16507670283317566, -0.09282473474740982, 0.004849682096391916, -0.03368780389428139, -0.0017054337076842785, -0.04403488710522652, 0.06038481742143631, 0.09466515481472015, -0.05141542851924896, -0.08203145116567612, 0.002195686334744096, -0.04958672448992729, -0.023554332554340363, -0.023252420127391815, 0.04895400255918503, 0.02790220081806183, 0.01331760548055172, -0.05317055433988571, 0.055212270468473434, -0.04259690269827843, -0.08908868581056595, 0.05131212994456291, -0.1913837343454361, -0.13959868252277374, -0.04022086411714554, 0.07758738100528717, 0.016668090596795082, 0.053110647946596146, -0.011013668961822987, 0.0170731358230114, 0.08375632762908936, -0.02727385051548481, -0.07615333795547485, -0.057335834950208664, 0.07826893031597137, -0.10985586792230606, 0.2154049277305603, -0.046016622334718704, 0.01907510496675968, 0.12245169281959534, 0.03985793516039848, -0.0819580927491188, 0.08338607102632523, 0.046950891613960266, -0.02591324783861637, 0.033736422657966614, 0.10738630592823029, -0.03027603216469288, 0.12614227831363678, 0.05635765939950943, -0.12484896183013916, 0.002519885776564479, -0.021606534719467163, -0.07779785990715027, -0.05144015699625015, -0.02547837793827057, -0.05639272183179855, 0.13560938835144043, 0.18014462292194366, -0.05217359960079193, -0.003382803173735738, -0.04032931104302406, 0.01732526533305645, 0.07948071509599686, 0.016613582149147987, -0.03237352892756462, -0.20988033711910248, 0.01582900993525982, 0.07813900709152222, 0.009509092196822166, -0.3007434606552124, -0.10395227372646332, -0.01963624358177185, -0.04528871551156044, -0.06816833466291428, 0.09115426242351532, 0.08285064995288849, 0.048586610704660416, -0.051645874977111816, -0.054796792566776276, -0.07190260291099548, 0.16186556220054626, -0.11737532913684845, -0.089300237596035 ]
null
null
transformers
# 🦜 EmertonOmniBeagle-7B-dpo

EmertonOmniBeagle-7B-dpo is a DPO fine-tune of [mlabonne/OmniBeagle14-7B](https://huggingface.co/mlabonne/OmniBeagle-7B) using the [yleo/emerton_dpo_pairs](https://huggingface.co/datasets/yleo/emerton_dpo_pairs) preference dataset. The pairs were created from [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs) by replacing the GPT-3.5 answers with GPT-4 Turbo answers; the GPT-4 Turbo answers are then marked as chosen, while the GPT-4 answers are marked as rejected.

## 🔍 Applications

This model uses a context window of 8k tokens. It is compatible with different chat templates, such as ChatML and Llama's chat template.

## 🏆 Evaluation

### Open LLM Leaderboard

To come...

## 💻 Usage

```python
!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

# Repository ID of this model on the Hugging Face Hub
model = "yleo/EmertonOmniBeagle-7B-dpo"
messages = [{"role": "user", "content": "How to improve LLM fine-tuning?"}]

# Build the prompt with the tokenizer's chat template
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Generate with a standard text-generation pipeline
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
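The card above describes the preference set as Intel/orca_dpo_pairs with the GPT-3.5 answers swapped for GPT-4 Turbo answers (used as chosen), while the GPT-4 answers become the rejected responses. The sketch below illustrates that relabelling step in the prompt/chosen/rejected shape most DPO trainers expect; all field names and example rows are illustrative assumptions, not the documented schema of the datasets involved.

```python
# Hedged sketch of the relabelling described above: a stronger answer becomes the
# "chosen" response and the previous best answer is demoted to "rejected".
# Field names and example rows are illustrative assumptions only.
from datasets import Dataset

original_rows = [
    {
        "question": "What is DPO?",
        "gpt4_answer": "DPO optimises a policy on preference pairs.",
        "gpt4_turbo_answer": (
            "DPO fine-tunes a model directly on chosen/rejected preference pairs, "
            "without training a separate reward model."
        ),
    },
]

def to_preference_pair(row):
    return {
        "prompt": row["question"],
        "chosen": row["gpt4_turbo_answer"],  # GPT-4 Turbo answer kept as the preferred response
        "rejected": row["gpt4_answer"],      # GPT-4 answer demoted to the rejected response
    }

preference_pairs = Dataset.from_list(original_rows).map(
    to_preference_pair,
    remove_columns=["question", "gpt4_answer", "gpt4_turbo_answer"],
)
print(preference_pairs[0])
```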
{"license": "cc-by-nc-4.0", "tags": ["dpo"], "datasets": ["yleo/emerton_dpo_pairs"], "base_model": "mlabonne/OmniBeagle14-7B"}
text-generation
yleo/EmertonOmniBeagle-7B-dpo
[ "transformers", "safetensors", "mistral", "text-generation", "dpo", "dataset:yleo/emerton_dpo_pairs", "base_model:mlabonne/OmniBeagle14-7B", "license:cc-by-nc-4.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:54:06+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #dpo #dataset-yleo/emerton_dpo_pairs #base_model-mlabonne/OmniBeagle14-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# EmertonOmniBeagle-7B-dpo

EmertonOmniBeagle-7B-dpo is a DPO fine-tune of mlabonne/OmniBeagle14-7B using the yleo/emerton_dpo_pairs preference dataset. The pairs were created from Intel/orca_dpo_pairs by replacing the GPT-3.5 answers with GPT-4 Turbo answers; the GPT-4 Turbo answers are then marked as chosen, while the GPT-4 answers are marked as rejected.

## Applications

This model uses a context window of 8k tokens. It is compatible with different chat templates, such as ChatML and Llama's chat template.

## Evaluation

### Open LLM Leaderboard

To come...

## Usage
[ "# EmertonOmniBeagle-7B-dpo\n\nEmertonOmniBeagle-7B-dpo is a DPO fine-tune of mlabonne/OmniBeagle14-7B using the yleo/emerton_dpo_pairs preference dataset created from Intel/orca_dpo_pairs by replacing gpt 3.5 answer by a gpt4 Turbo answer. Then, gpt4 Turbo is put as chosen whereas gpt4 is put as rejected.", "## Applications\n\nThis model uses a context window of 8k. It is compatible with different templates, like chatml and Llama's chat template.", "## Evaluation", "### Open LLM Leaderboard\n\nTo come...", "## Usage" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #dpo #dataset-yleo/emerton_dpo_pairs #base_model-mlabonne/OmniBeagle14-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# EmertonOmniBeagle-7B-dpo\n\nEmertonOmniBeagle-7B-dpo is a DPO fine-tune of mlabonne/OmniBeagle14-7B using the yleo/emerton_dpo_pairs preference dataset created from Intel/orca_dpo_pairs by replacing gpt 3.5 answer by a gpt4 Turbo answer. Then, gpt4 Turbo is put as chosen whereas gpt4 is put as rejected.", "## Applications\n\nThis model uses a context window of 8k. It is compatible with different templates, like chatml and Llama's chat template.", "## Evaluation", "### Open LLM Leaderboard\n\nTo come...", "## Usage" ]
[ 93, 113, 33, 3, 10, 3 ]
[ "passage: TAGS\n#transformers #safetensors #mistral #text-generation #dpo #dataset-yleo/emerton_dpo_pairs #base_model-mlabonne/OmniBeagle14-7B #license-cc-by-nc-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# EmertonOmniBeagle-7B-dpo\n\nEmertonOmniBeagle-7B-dpo is a DPO fine-tune of mlabonne/OmniBeagle14-7B using the yleo/emerton_dpo_pairs preference dataset created from Intel/orca_dpo_pairs by replacing gpt 3.5 answer by a gpt4 Turbo answer. Then, gpt4 Turbo is put as chosen whereas gpt4 is put as rejected.## Applications\n\nThis model uses a context window of 8k. It is compatible with different templates, like chatml and Llama's chat template.## Evaluation### Open LLM Leaderboard\n\nTo come...## Usage" ]
[ -0.10793977975845337, 0.1515197902917862, -0.004711492918431759, 0.04618701711297035, 0.04636171832680702, 0.03882050886750221, 0.1646241843700409, 0.11415281891822815, 0.11099720001220703, 0.0176865104585886, -0.0426727756857872, 0.1771545559167862, 0.053771715611219406, 0.10362639278173447, -0.030811667442321777, -0.18179795145988464, 0.04623178392648697, 0.015320499427616596, 0.07763510942459106, 0.04354372248053551, 0.0640454888343811, -0.004320908337831497, 0.03873451426625252, 0.040968071669340134, -0.024509364739060402, 0.012394050136208534, 0.004758123308420181, -0.04137595742940903, 0.06518720835447311, 0.07271458208560944, 0.06954905390739441, 0.055975865572690964, -0.024514488875865936, -0.12384343147277832, 0.03205541521310806, 0.05781178921461105, -0.018426144495606422, -0.000014119050320005044, 0.06302061676979065, -0.01689998432993889, 0.06581854075193405, -0.048226498067379, 0.031148377805948257, 0.019172847270965576, -0.03103179857134819, -0.12762586772441864, -0.07903577387332916, 0.012943542562425137, 0.06009618937969208, 0.046840209513902664, 0.02949729934334755, 0.10428699105978012, -0.0245374608784914, 0.09641002118587494, 0.11667925864458084, -0.2059849351644516, 0.008784127421677113, 0.14073112607002258, 0.028891736641526222, 0.016756262630224228, -0.02927480638027191, -0.015834711492061615, -0.038604818284511566, 0.004839015193283558, -0.0471443310379982, -0.07726521044969559, 0.03285445645451546, -0.05668117478489876, -0.058552198112010956, -0.0014650430530309677, 0.19578883051872253, 0.003552383044734597, -0.039467476308345795, -0.128699392080307, -0.11310500651597977, 0.02513067238032818, 0.00723608722910285, -0.007812389638274908, 0.023641718551516533, 0.03882187604904175, 0.11304399371147156, -0.014552712440490723, -0.042834874242544174, -0.054351359605789185, -0.1300714612007141, 0.1700189858675003, 0.027994869276881218, 0.028184425085783005, -0.06027716025710106, 0.06788868457078934, -0.1038847416639328, -0.14553503692150116, -0.13320882618427277, -0.028127435594797134, 0.040531158447265625, 0.013770322315394878, -0.057759955525398254, -0.030170854181051254, 0.14631198346614838, 0.20717056095600128, -0.01447149645537138, 0.10157784819602966, 0.021700169891119003, -0.013139949180185795, -0.009251262992620468, 0.07463371753692627, 0.01891608163714409, -0.06138208135962486, 0.1042739525437355, -0.01733843795955181, 0.10159225761890411, -0.012594670988619328, -0.07948215305805206, -0.10344681143760681, -0.020871246233582497, 0.021950971335172653, 0.06183289736509323, 0.08749222010374069, -0.05121464654803276, -0.05958817899227142, 0.23659318685531616, -0.139039546251297, 0.03009682521224022, 0.013943071477115154, -0.042996227741241455, 0.03255098685622215, 0.0992000475525856, -0.00996882189065218, -0.050968002527952194, 0.010733158327639103, -0.014213891699910164, -0.042237572371959686, -0.07403985410928726, -0.0602903738617897, 0.011485056951642036, 0.007785230875015259, 0.020398395135998726, -0.13794869184494019, -0.21089328825473785, -0.028744986280798912, 0.026765799149870872, -0.009784598834812641, -0.036973871290683746, -0.03246224299073219, -0.007188518065959215, 0.006315047387033701, -0.014156358316540718, -0.07677637785673141, -0.04169764369726181, 0.008379856124520302, 0.04118451103568077, 0.07217561453580856, -0.1664549857378006, 0.01961452141404152, -0.029844770208001137, 0.04512855410575867, -0.14371563494205475, 0.13861754536628723, -0.10056809335947037, -0.031053703278303146, -0.007297861855477095, 0.021868852898478508, -0.029312675818800926, 
-0.026549726724624634, 0.0238319244235754, 0.06218208000063896, -0.09551095962524414, -0.04636082425713539, 0.20691904425621033, -0.13529740273952484, -0.07958544045686722, 0.030136819928884506, 0.04173406586050987, -0.10244237631559372, 0.0597669743001461, 0.1605479121208191, 0.25643953680992126, -0.11376580595970154, -0.06438163667917252, 0.06912349909543991, 0.012764902785420418, -0.0073075308464467525, 0.019885240122675896, -0.013480771332979202, 0.030557192862033844, 0.08390446752309799, -0.05327986553311348, 0.059457600116729736, 0.06313665211200714, 0.0013277198886498809, -0.07227744162082672, 0.0066950442269444466, -0.010716396383941174, -0.04509943723678589, -0.08345100283622742, -0.047202177345752716, -0.05595104396343231, 0.03873478248715401, 0.13419081270694733, -0.04196031764149666, -0.02796253189444542, -0.09476301074028015, 0.0872730165719986, -0.03442496806383133, 0.026437733322381973, -0.05453087016940117, -0.0936153456568718, 0.0389244519174099, -0.10521350800991058, -0.0688917487859726, 0.07343566417694092, 0.044535648077726364, 0.07147232443094254, 0.005173940677195787, -0.051473479717969894, -0.021890223026275635, 0.015099778771400452, 0.02298043482005596, -0.06751789897680283, 0.018104514107108116, -0.043373964726924896, 0.1427346020936966, -0.038408197462558746, 0.06825988739728928, 0.10115846246480942, 0.09841454029083252, 0.009875290095806122, -0.04208609089255333, 0.010664032772183418, -0.025399766862392426, -0.026349270716309547, -0.06873472034931183, 0.014717687852680683, 0.06574279069900513, 0.0012893311213701963, 0.1308935433626175, -0.24763141572475433, -0.03427589684724808, 0.1274740844964981, 0.1001526415348053, 0.04242943972349167, -0.07177089899778366, -0.016940051689743996, -0.03804260119795799, -0.008014216087758541, -0.02759447507560253, 0.19290269911289215, 0.012385604903101921, 0.11620654910802841, -0.061031270772218704, -0.002359799575060606, -0.021087098866701126, -0.012366484850645065, -0.0018407247262075543, -0.014301669783890247, 0.022077444940805435, 0.051385048776865005, 0.06878254562616348, 0.047096919268369675, 0.0035111813340336084, 0.16046170890331268, -0.0035382171627134085, -0.008136607706546783, -0.030698977410793304, 0.024967892095446587, -0.005447440780699253, 0.021278580650687218, -0.2013874650001526, 0.06831666827201843, 0.058306124061346054, 0.03353553265333176, 0.06521663814783096, -0.11690445989370346, 0.04964056983590126, 0.05836329236626625, -0.05667320266366005, 0.010092254728078842, 0.0605584979057312, 0.009652158245444298, 0.03037949651479721, -0.04964251443743706, -0.01109329517930746, 0.024698249995708466, -0.032115958631038666, -0.06971672922372818, 0.11368267238140106, -0.14794644713401794, -0.1735442876815796, -0.09723614156246185, -0.09249266237020493, -0.13716500997543335, 0.023672044277191162, 0.07879273593425751, 0.04436752200126648, -0.05517897754907608, -0.0626382976770401, 0.021231843158602715, 0.0645466074347496, -0.004043933469802141, -0.01163871493190527, 0.004118519369512796, -0.005710121244192123, -0.16560892760753632, -0.05210181698203087, 0.06499000638723373, -0.09768989682197571, 0.07873915135860443, -0.05031564459204674, 0.04001382738351822, 0.06370382755994797, -0.004661571700125933, 0.011183514259755611, 0.05264393985271454, 0.18656283617019653, -0.06421422213315964, 0.07249437272548676, 0.1711367815732956, 0.06162894889712334, 0.07909486442804337, 0.14515167474746704, -0.01735788583755493, -0.059729885309934616, -0.005812646355479956, -0.016541732475161552, -0.011360601522028446, -0.18732091784477234, 
-0.10147812962532043, -0.09836868196725845, -0.04816657304763794, 0.005369945894926786, 0.07856705039739609, 0.06455328315496445, 0.11362506449222565, -0.0462627038359642, 0.08608163893222809, 0.07396229356527328, 0.0655757337808609, 0.22222952544689178, 0.0011921059340238571, 0.04840733855962753, -0.06908199191093445, 0.004552257247269154, 0.08513146638870239, 0.09955137968063354, 0.1112247109413147, -0.03016752004623413, 0.08281999081373215, 0.07169322669506073, 0.03903787583112717, 0.06066400185227394, 0.05654273182153702, 0.03804013133049011, 0.00936383567750454, 0.02006390504539013, -0.08433233946561813, -0.06242885813117027, 0.020622404292225838, -0.11019957065582275, -0.009878112934529781, 0.009483850561082363, 0.04548923671245575, 0.025138165801763535, 0.15300248563289642, 0.0516015999019146, -0.2380068600177765, -0.0031158272176980972, 0.032677698880434036, 0.025213679298758507, -0.08149322867393494, 0.009304547682404518, 0.09619098156690598, -0.058782171458005905, 0.17567364871501923, 0.020347047597169876, 0.07791212201118469, -0.1704484224319458, -0.057357415556907654, -0.03266220539808273, 0.10615495592355728, 0.01973474957048893, 0.07812157273292542, -0.19620545208454132, 0.015913264825940132, 0.021068641915917397, 0.05764211714267731, -0.07798463106155396, 0.0696742981672287, 0.04996621981263161, 0.18700867891311646, 0.01644793525338173, 0.015023577958345413, 0.011719661764800549, -0.01341279223561287, -0.10503392666578293, 0.06735620647668839, 0.013189263641834259, -0.02368801459670067, 0.03304653987288475, -0.0463949516415596, -0.0475715808570385, -0.0364411324262619, 0.03873879462480545, -0.1426916867494583, -0.17713028192520142, 0.05227959528565407, 0.1771015077829361, 0.05626142397522926, -0.08992428332567215, 0.007653943728655577, -0.0264846533536911, 0.1651042401790619, -0.029864924028515816, -0.1276809722185135, -0.08189217001199722, -0.10540211200714111, 0.028989281505346298, -0.046920012682676315, 0.024516446515917778, -0.02832658402621746, 0.12918420135974884, -0.010486006736755371, -0.12149277329444885, 0.04156220331788063, -0.09683845937252045, -0.018445724621415138, -0.03885860741138458, 0.06220398098230362, -0.037930041551589966, -0.016763068735599518, 0.06576284021139145, -0.014879636466503143, -0.038918230682611465, -0.0949307531118393, -0.07548469305038452, 0.22958174347877502, -0.03350817784667015, 0.07572881132364273, -0.023643553256988525, -0.09948178380727768, 0.024498816579580307, 0.026447726413607597, 0.010278439149260521, 0.15086022019386292, -0.021552592515945435, 0.07925190031528473, 0.026305869221687317, -0.08847855776548386, -0.14058248698711395, -0.10740391910076141, 0.05675249174237251, -0.005948056001216173, -0.06310880184173584, -0.08890342712402344, 0.0953289195895195, 0.09646108746528625, -0.00822617020457983, 0.19840966165065765, -0.22775837779045105, -0.09364460408687592, 0.057984065264463425, 0.06188446655869484, 0.05915122851729393, -0.07351940125226974, -0.08456144481897354, -0.05369267240166664, -0.16418178379535675, 0.07162677496671677, -0.04293324798345566, 0.07128233462572098, -0.05035843700170517, 0.05519450083374977, 0.03263077139854431, -0.03856514021754265, 0.15243254601955414, -0.00848492793738842, 0.04425151273608208, -0.14504659175872803, 0.1290869265794754, 0.11911334097385406, -0.05378517135977745, 0.09751200675964355, -0.08106700330972672, 0.08352743089199066, -0.057417452335357666, -0.04315223544836044, -0.033644694834947586, 0.0829286202788353, -0.005722061265259981, -0.052520040422677994, 0.002339157974347472, 
-0.060532186180353165, 0.05129053816199303, 0.005080387927591801, -0.07523349672555923, -0.01923232153058052, 0.03978722542524338, 0.11601995676755905, 0.019923290237784386, -0.16087371110916138, -0.09803131967782974, -0.048222895711660385, 0.01996164210140705, 0.03702494874596596, -0.004508227575570345, 0.034865353256464005, 0.08544821292161942, -0.00108272023499012, 0.07526124268770218, 0.030834121629595757, -0.04302302375435829, -0.0034915213473141193, 0.06592968851327896, -0.13499915599822998, -0.06212889403104782, -0.04400009661912918, 0.0864877924323082, -0.0025155365001410246, 0.020231908187270164, 0.1227450892329216, 0.022407611832022667, -0.018806474283337593, 0.027753591537475586, -0.019571563228964806, -0.00908056739717722, 0.11755799502134323, 0.033786773681640625, 0.021055955439805984, -0.08512718975543976, 0.010705415159463882, -0.01738104037940502, -0.08019295334815979, -0.027603188529610634, 0.06689662486314774, -0.13947667181491852, -0.07718385010957718, -0.05122194439172745, 0.12601636350154877, 0.03186541050672531, -0.08434322476387024, -0.08054676651954651, -0.036630719900131226, 0.025703249499201775, -0.05872401222586632, 0.052908871322870255, 0.03798903897404671, 0.016536422073841095, -0.03925847262144089, -0.03729365020990372, 0.08347619324922562, -0.09667426347732544, 0.11309315264225006, -0.1473436802625656, -0.034175630658864975, 0.005226729903370142, 0.012438161298632622, -0.013341855257749557, -0.04040242359042168, -0.05027247965335846, -0.045252442359924316, -0.13693514466285706, 0.08601808547973633, -0.11499357223510742, 0.049611080437898636, -0.0041083903051912785, -0.011379983276128769, -0.02004084549844265, 0.039077505469322205, -0.04579298570752144, -0.0658276230096817, -0.050232261419296265, 0.032151948660612106, -0.06979481875896454, -0.05942762270569801, 0.0034253110643476248, -0.08372162282466888, 0.07599230110645294, 0.03936642035841942, -0.051692377775907516, -0.06376747041940689, -0.14191676676273346, -0.045143045485019684, 0.06618524342775345, 0.09834057837724686, -0.020448733121156693, -0.047386594116687775, 0.028209678828716278, 0.0683501660823822, -0.019406698644161224, -0.049559690058231354, 0.10590850561857224, -0.1414821594953537, -0.03124532662332058, -0.061296600848436356, -0.04571147263050079, -0.0509035550057888, -0.005325454752892256, 0.11057096719741821, 0.061283178627491, 0.10350935906171799, -0.06145774573087692, 0.02815379947423935, -0.12728162109851837, -0.05407000333070755, 0.00732444180175662, -0.017564212903380394, 0.07634513825178146, -0.018911223858594894, 0.049466896802186966, 0.028634952381253242, 0.18861359357833862, 0.0324445478618145, 0.013921055011451244, -0.015643252059817314, 0.012827248312532902, 0.06038717180490494, -0.045473624020814896, 0.19692625105381012, 0.03714260458946228, 0.02239997684955597, 0.04336721450090408, 0.02953927405178547, 0.032063111662864685, -0.032718684524297714, -0.017269378527998924, 0.03882337361574173, -0.005816140677779913, 0.04460415616631508, 0.07787492871284485, -0.00824874546378851, -0.11727271974086761, -0.08057790249586105, -0.04345003515481949, 0.06653273105621338, -0.019121671095490456, 0.0617360882461071, 0.055355269461870193, -0.09311231970787048, 0.0684945359826088, 0.04839402064681053, -0.04410676658153534, -0.12484558671712875, -0.20414629578590393, -0.07585322856903076, -0.17746759951114655, -0.02863399311900139, -0.10772169381380081, 0.016676558181643486, 0.018789460882544518, -0.008914374746382236, -0.017492277547717094, 0.10451891273260117, -0.04362597316503525, 
-0.09251613169908524, -0.029504790902137756, -0.019501270726323128, -0.012003463692963123, -0.006187657359987497, -0.024803174659609795, 0.01082328800112009, 0.0529501736164093, 0.018907375633716583, 0.04431973770260811, 0.051422253251075745, 0.0263042114675045, -0.03099733777344227, -0.0356937050819397, -0.03391392529010773, 0.006127723027020693, -0.03971991315484047, 0.057379212230443954, 0.035273127257823944, -0.03016611747443676, 0.02139386162161827, 0.1529569774866104, -0.019823342561721802, -0.14341674745082855, -0.06885087490081787, -0.009966040961444378, 0.01100652664899826, 0.05780798941850662, 0.025370795279741287, -0.09660901874303818, -0.0294288732111454, 0.12045080959796906, 0.1103278249502182, 0.0693756639957428, 0.039630450308322906, 0.01965486817061901, 0.007400284055620432, 0.01456766203045845, 0.05370057001709938, 0.08658325672149658, 0.2388615757226944, 0.027532542124390602, -0.0052130152471363544, 0.011050784960389137, 0.021618101745843887, -0.10613453388214111, 0.0063519468531012535, -0.08074098825454712, -0.07228440791368484, 0.002573922974988818, 0.04991032928228378, 0.03634229302406311, -0.0002939369878731668, 0.04671533778309822, -0.05927859991788864, -0.09766895323991776, -0.057228781282901764, 0.09034797549247742, -0.014040674082934856, 0.023456452414393425, -0.018262097612023354, -0.005841326899826527, 0.22571112215518951, -0.0430884063243866, -0.21501824259757996, -0.042271699756383896, -0.020156050100922585, -0.03991103917360306, 0.12498924136161804, 0.004883849062025547, 0.12505018711090088, 0.0991305261850357, 0.06292189657688141, -0.11257253587245941, 0.11197904497385025, 0.007266299333423376, -0.13804800808429718, 0.06642331182956696, -0.0025319636333733797, -0.04145912453532219, -0.061690304428339005, 0.05867328122258186, -0.08882219344377518, 0.013940678909420967, 0.0919748917222023, -0.0037009355146437883, -0.1235138326883316, 0.07337981462478638, -0.08753075450658798, 0.07839559763669968, 0.08715921640396118, -0.001557297189719975, -0.015429080463945866, -0.03732332959771156, 0.03218802064657211, 0.03409027308225632, -0.04219873249530792, -0.0307286586612463, -0.07889296859502792, -0.025407828390598297, -0.00384093658067286, 0.049146540462970734, -0.1089351624250412, -0.05156705528497696, -0.0810028538107872, -0.08838081359863281, -0.057875797152519226, 0.05296989530324936, 0.015912452712655067, 0.04548175260424614, -0.05172182619571686, -0.09476976096630096, 0.013163607567548752, 0.09148523956537247, -0.14860926568508148, -0.11189195513725281 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
null
soominc/gptj-lora
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
2024-02-13T21:54:35+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 31, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.06646376848220825, 0.2168014943599701, -0.00225935154594481, 0.023818302899599075, 0.1271018385887146, -0.001635765191167593, 0.04218708351254463, 0.13324736058712006, -0.020175931975245476, 0.11144465953111649, 0.046588581055402756, 0.09377603232860565, 0.09928803145885468, 0.18404334783554077, 0.04859916493296623, -0.2059975117444992, 0.007056170143187046, -0.09090408682823181, 0.014076028019189835, 0.1116579994559288, 0.13719257712364197, -0.10291384905576706, 0.08272874355316162, -0.04045208916068077, -0.02019004337489605, 0.00012576708104461432, -0.09259183704853058, -0.07032395154237747, 0.06885425746440887, 0.06264153122901917, 0.051234472543001175, 0.001456156256608665, 0.09140396863222122, -0.2864592671394348, 0.017265573143959045, 0.08406311273574829, 0.0027674848679453135, 0.06290827691555023, 0.07236549258232117, -0.07389893382787704, 0.11328595131635666, -0.08021481335163116, 0.13019037246704102, 0.08625296503305435, -0.062064990401268005, -0.23071379959583282, -0.07525765895843506, 0.0963398814201355, 0.12251301854848862, 0.06215599179267883, -0.022921854630112648, 0.15455181896686554, -0.06248689442873001, 0.012971068732440472, 0.1294165402650833, -0.11526761949062347, -0.05572471022605896, 0.061741601675748825, 0.11775490641593933, 0.10740239918231964, -0.14110268652439117, -0.0017287094378843904, 0.04900608956813812, 0.029121357947587967, 0.08589313924312592, 0.022661056369543076, 0.12003941088914871, 0.04652795568108559, -0.13695219159126282, -0.04037507623434067, 0.12011898308992386, 0.038862764835357666, -0.06446044892072678, -0.2168138176202774, -0.006778308190405369, -0.0601806715130806, -0.014732478186488152, -0.07019448280334473, 0.039128515869379044, -0.02470310963690281, 0.07317749410867691, -0.04465159401297569, -0.1063927412033081, -0.0421026237308979, 0.0892222449183464, 0.07748593389987946, 0.011527054943144321, -0.02519804798066616, 0.04627908393740654, 0.13455867767333984, 0.05402068421244621, -0.10399353504180908, -0.07017925381660461, -0.06942764669656754, -0.09420394152402878, -0.04035796597599983, 0.056760527193546295, 0.031942449510097504, 0.02665667235851288, 0.22703726589679718, 0.016653569415211678, 0.04155244305729866, 0.0224777739495039, 0.01032855175435543, 0.043662428855895996, 0.0955500528216362, -0.05303520709276199, -0.15660029649734497, -0.04072032496333122, 0.09077946096658707, -0.0027527001220732927, -0.036689214408397675, -0.03966725245118141, 0.03849169611930847, 0.06843466311693192, 0.13122352957725525, 0.07552056759595871, -0.017929591238498688, -0.04813180863857269, -0.030096933245658875, 0.23523783683776855, -0.1493375599384308, 0.04426715523004532, -0.02271856553852558, -0.01804111897945404, -0.03908449783921242, 0.03597262129187584, 0.022118929773569107, -0.000004518366949923802, 0.09706240892410278, -0.058981191366910934, -0.05378659814596176, -0.10168042778968811, -0.03272576630115509, 0.04088849574327469, -0.013975566253066063, -0.010589460842311382, -0.09025166928768158, -0.09490354359149933, -0.04766594246029854, 0.05537205561995506, -0.05123869329690933, -0.03770573064684868, 0.009465423412621021, -0.08151785284280777, -0.005444355774670839, -0.005417742300778627, 0.10699385404586792, -0.03222226724028587, 0.04445803165435791, -0.027600755915045738, 0.05225523188710213, 0.09919606149196625, 0.031576547771692276, -0.0773419588804245, 0.0561848059296608, -0.22559374570846558, 0.07503069192171097, -0.11481974273920059, 0.04335082694888115, -0.1704932004213333, -0.042439818382263184, 0.005444696638733149, 0.0139949731528759, 
0.013206101022660732, 0.12720820307731628, -0.19255615770816803, -0.01654396951198578, 0.13260798156261444, -0.09212633967399597, -0.118110790848732, 0.07884611934423447, -0.029701577499508858, 0.1624738723039627, 0.04682036489248276, -0.027025915682315826, 0.09224298596382141, -0.16434773802757263, -0.07092688232660294, -0.00949116237461567, -0.01727987825870514, 0.12109188735485077, 0.07512219995260239, -0.05991523340344429, 0.046571120619773865, 0.02832140028476715, -0.038078423589468, -0.04424772411584854, -0.050857074558734894, -0.10884185880422592, -0.01070026308298111, -0.08987759798765182, 0.04065500199794769, -0.01250192429870367, -0.07916021347045898, -0.029885273426771164, -0.18612512946128845, -0.0030564051121473312, 0.10038342326879501, 0.0035033065360039473, -0.005652366206049919, -0.08666291832923889, 0.026358824223279953, -0.03112892620265484, -0.008404186926782131, -0.16764774918556213, -0.04399421438574791, 0.046902090311050415, -0.16094985604286194, 0.020117372274398804, -0.06413903087377548, 0.06334125250577927, 0.03641495108604431, -0.05590536445379257, -0.0248766727745533, -0.01730942726135254, 0.011945613659918308, -0.05083848536014557, -0.18994836509227753, -0.056277405470609665, -0.037882111966609955, 0.149809330701828, -0.25956398248672485, 0.032966937869787216, 0.051140617579221725, 0.14649195969104767, 0.00406361510977149, -0.05115427449345589, 0.01429014839231968, -0.05360214412212372, -0.054652128368616104, -0.06746816635131836, -0.006135428790003061, -0.027576493099331856, -0.05147203803062439, 0.019243421033024788, -0.1755700707435608, -0.021410830318927765, 0.09424154460430145, 0.12876708805561066, -0.1486445665359497, -0.018640631809830666, -0.048725154250860214, -0.06339836865663528, -0.0715010017156601, -0.07038594037294388, 0.10712739825248718, 0.0513901449739933, 0.04796046018600464, -0.07435787469148636, -0.07092321664094925, 0.02726263552904129, 0.006906150374561548, -0.03382374346256256, 0.08727246522903442, 0.05199531093239784, -0.09209315478801727, 0.0756213590502739, 0.1092359870672226, 0.07177663594484329, 0.09363535046577454, 0.01574566215276718, -0.11756632477045059, -0.028492970392107964, 0.036266472190618515, 0.02740776725113392, 0.1465986967086792, -0.05952361226081848, 0.04016614332795143, 0.04494241625070572, -0.04170418903231621, 0.022319864481687546, -0.08787637203931808, 0.024075502529740334, 0.025203049182891846, -0.0034381982404738665, 0.06284574419260025, -0.02525499276816845, -0.0050758360885083675, 0.07016654312610626, 0.047779910266399384, 0.04621000960469246, 0.009655474685132504, -0.01720241829752922, -0.1047825813293457, 0.16950392723083496, -0.0951867327094078, -0.269941508769989, -0.17632324993610382, 0.026197833940386772, 0.04035249724984169, -0.022378476336598396, 0.031619444489479065, -0.07056326419115067, -0.10630585998296738, -0.1060405746102333, -0.002429972169920802, 0.01714223250746727, -0.06364088505506516, -0.0741225928068161, 0.07348573952913284, 0.04382912442088127, -0.14902326464653015, 0.038552410900592804, 0.055694397538900375, -0.057955220341682434, -0.0233661737293005, 0.09118817001581192, 0.12397737801074982, 0.14583967626094818, -0.021366750821471214, -0.028626007959246635, 0.029004426673054695, 0.19620531797409058, -0.13469526171684265, 0.10371150821447372, 0.13814030587673187, -0.04545360431075096, 0.08360563963651657, 0.1560150384902954, 0.029186224564909935, -0.08317049592733383, 0.05044832453131676, 0.04082648828625679, -0.043159641325473785, -0.2666129767894745, -0.0534592866897583, 
0.012832709588110447, -0.06255637854337692, 0.09786593168973923, 0.10183793306350708, 0.11542957276105881, 0.034910861402750015, -0.07166364789009094, -0.043925940990448, -0.0058974819257855415, 0.11737963557243347, -0.05490213260054588, -0.012639665976166725, 0.07686592638492584, -0.05086168646812439, 0.005355054512619972, 0.10266812145709991, 0.02973790094256401, 0.17442677915096283, 0.020399179309606552, 0.11231429129838943, 0.06195578724145889, 0.08633565157651901, 0.0007386076031252742, 0.02951662428677082, 0.05147615820169449, 0.017203815281391144, -0.002300140680745244, -0.10421168059110641, -0.006156572140753269, 0.1449710875749588, 0.028103826567530632, 0.029669636860489845, -0.0018948549404740334, -0.005003341939300299, 0.05121048167347908, 0.1746254414319992, -0.011592294089496136, -0.22072425484657288, -0.0845772922039032, 0.06936841458082199, -0.06218599155545235, -0.12968985736370087, -0.026130788028240204, 0.045467354357242584, -0.17519839107990265, 0.026703642681241035, -0.027433741837739944, 0.0919293761253357, -0.09345759451389313, -0.02221956104040146, 0.03687324374914169, 0.084866963326931, -0.014529162086546421, 0.08703910559415817, -0.14498743414878845, 0.11886418610811234, 0.02978132851421833, 0.09024628251791, -0.11081171780824661, 0.07909037172794342, -0.007550720125436783, 0.009180475026369095, 0.19379350543022156, -0.011335089802742004, -0.03514958545565605, -0.08774717897176743, -0.11210042238235474, -0.013537433929741383, 0.12687496840953827, -0.1243172138929367, 0.08773399889469147, -0.015198243781924248, -0.044079482555389404, 0.00937260314822197, -0.12100647389888763, -0.17273177206516266, -0.19628387689590454, 0.05585884302854538, -0.09575839340686798, 0.025643249973654747, -0.11914430558681488, -0.07089093327522278, -0.02952558360993862, 0.241120383143425, -0.1745356321334839, -0.06510113179683685, -0.1468164622783661, -0.046294767409563065, 0.1662203073501587, -0.04437198117375374, 0.0718095526099205, -0.0208172257989645, 0.20345525443553925, 0.005988610442727804, -0.004939318168908358, 0.06724198162555695, -0.08892562240362167, -0.16873881220817566, -0.06771010160446167, 0.1510489284992218, 0.11680185794830322, 0.04907919466495514, -0.002248800592496991, 0.0011772146681323647, -0.016943959519267082, -0.1137804463505745, -0.0033210667315870523, 0.16037839651107788, 0.03878779336810112, 0.025986969470977783, -0.05243593826889992, -0.08797456324100494, -0.06899320334196091, -0.06853509694337845, 0.06221301481127739, 0.19590823352336884, -0.10376439243555069, 0.1700313836336136, 0.147536963224411, -0.07305635511875153, -0.23175598680973053, 0.035342130810022354, 0.04983805492520332, 0.0014306638622656465, 0.04886869341135025, -0.18252557516098022, 0.10521943867206573, 0.019543392583727837, -0.05505957826972008, 0.13485197722911835, -0.1557481735944748, -0.1552847921848297, 0.0722852572798729, 0.03904085233807564, -0.22423844039440155, -0.1354004591703415, -0.09622503817081451, -0.05825018882751465, -0.14065024256706238, 0.06054598465561867, -0.002136280992999673, 0.015948504209518433, 0.03500790148973465, -0.0015643214574083686, 0.027123261243104935, -0.058935679495334625, 0.18609118461608887, -0.004065449349582195, 0.020676052197813988, -0.060264769941568375, -0.0478842556476593, 0.09839435666799545, -0.06130504235625267, 0.12208222597837448, 0.004057085141539574, 0.01594383642077446, -0.10362856835126877, -0.048314861953258514, -0.04328322783112526, 0.05154227837920189, -0.07548051327466965, -0.10070807486772537, -0.043625857681035995, 0.08841723203659058, 
0.07005169242620468, -0.03383097052574158, 0.00549331633374095, -0.07189501076936722, 0.10019614547491074, 0.17795267701148987, 0.17573626339435577, 0.009926567785441875, -0.07241068035364151, 0.01677953451871872, -0.04142116755247116, 0.044231921434402466, -0.2513144314289093, 0.03756171092391014, 0.06098250672221184, 0.029438555240631104, 0.09217222779989243, -0.020435843616724014, -0.1820858269929886, -0.04050002992153168, 0.08094815909862518, -0.05452597141265869, -0.22617179155349731, -0.019085140898823738, 0.0954197570681572, -0.2020406424999237, -0.007372708059847355, 0.03995226323604584, -0.048725228756666183, -0.023169852793216705, 0.00010950004070764408, 0.06317184865474701, 0.002471912419423461, 0.09773622453212738, 0.0735151618719101, 0.09715340286493301, -0.08337292820215225, 0.10562895983457565, 0.10150538384914398, -0.09572599828243256, 0.03605884686112404, 0.06754924356937408, -0.05300498008728027, -0.043293699622154236, 0.03665391728281975, 0.033023297786712646, 0.005234600510448217, -0.060321882367134094, 0.013913018628954887, -0.036497246474027634, 0.044923391193151474, 0.08326134830713272, 0.03754979372024536, -0.013354414142668247, 0.06462216377258301, 0.03401726484298706, -0.10898099094629288, 0.10366570204496384, 0.01731540448963642, 0.04105307161808014, -0.08384523540735245, -0.019968897104263306, 0.035425446927547455, 0.030576206743717194, -0.01765924133360386, -0.02306121215224266, -0.02860277332365513, -0.01614218018949032, -0.14299540221691132, -0.023106401786208153, -0.07243485748767853, 0.006181265693157911, 0.014656842686235905, -0.031884219497442245, -0.011233693920075893, 0.02475680410861969, -0.06979699432849884, -0.07426341623067856, -0.006949664559215307, 0.09833318740129471, -0.15115703642368317, 0.008848577737808228, 0.06907843053340912, -0.11088496446609497, 0.08190931379795074, -0.008411259390413761, 0.016245156526565552, 0.022527478635311127, -0.15448406338691711, 0.05601610988378525, 0.0008648968650959432, 0.01916889287531376, 0.025886621326208115, -0.16471809148788452, 0.004104440100491047, -0.04661374166607857, -0.02149827405810356, -0.00004464812809601426, -0.02647159807384014, -0.12325995415449142, 0.06858719140291214, -0.015622655861079693, -0.035931166261434555, -0.02701525390148163, 0.0539589487016201, 0.07888586074113846, -0.027474910020828247, 0.10445091128349304, -0.008690856397151947, 0.04941811040043831, -0.16801609098911285, -0.02470702864229679, -0.04982255399227142, 0.019377702847123146, 0.009884213097393513, -0.007693959400057793, 0.04183054715394974, -0.00976533442735672, 0.21883612871170044, -0.05075952783226967, 0.1607085019350052, 0.05847611650824547, -0.017352959141135216, -0.0007513365126214921, 0.06180921941995621, 0.05997028574347496, 0.04658793285489082, 0.009480604901909828, 0.023740366101264954, -0.022450892254710197, -0.006695089396089315, -0.15932634472846985, 0.01890849508345127, 0.14999441802501678, 0.06301083415746689, 0.024745315313339233, 0.05866100639104843, -0.12775006890296936, -0.12135478109121323, 0.09311001747846603, -0.026755332946777344, 0.00928465835750103, -0.08245618641376495, 0.1358020007610321, 0.14980104565620422, -0.14000412821769714, 0.05256148427724838, -0.06134212389588356, -0.05217423290014267, -0.10388828068971634, -0.12032219022512436, -0.05887215584516525, -0.053666237741708755, 0.002330566756427288, -0.03760887682437897, 0.054546963423490524, 0.03344334661960602, -0.009351172484457493, -0.00022941511997487396, 0.13597318530082703, -0.019751882180571556, -0.0028988157864660025, 
0.048313532024621964, 0.03693558648228645, 0.02373051457107067, -0.05275435373187065, 0.02940409444272518, 0.02539868652820587, 0.032232340425252914, 0.06546790152788162, 0.033412106335163116, -0.047448933124542236, 0.03804153576493263, -0.0025254099164158106, -0.11207924783229828, 0.019641218706965446, -0.00460948096588254, -0.0742158442735672, 0.1268945336341858, 0.0407399944961071, 0.010224059224128723, -0.03741471841931343, 0.24361543357372284, -0.06653323769569397, -0.06378097087144852, -0.13251738250255585, 0.10491154342889786, -0.0027236645109951496, 0.06476365029811859, 0.023412218317389488, -0.1284150779247284, 0.005243356805294752, 0.13858191668987274, 0.12181595712900162, 0.0045748427510261536, 0.009228081442415714, 0.0518609918653965, 0.0025186820421367884, -0.06998204439878464, 0.054019294679164886, 0.06992026418447495, 0.12919506430625916, -0.07847554981708527, 0.07680778950452805, 0.0006860480643808842, -0.08370215445756912, -0.02947772853076458, 0.11312682181596756, -0.0409729965031147, 0.03491825982928276, -0.047444481402635574, 0.10916327685117722, -0.05787910893559456, -0.29412412643432617, 0.02350960113108158, -0.09588567912578583, -0.15202060341835022, -0.018367812037467957, 0.05944539234042168, -0.02624768204987049, 0.018029648810625076, 0.06971040368080139, -0.06011629104614258, 0.20098382234573364, 0.0335683599114418, -0.07864278554916382, -0.0664360448718071, 0.04837050288915634, -0.06564252078533173, 0.2949807047843933, 0.008418165147304535, 0.02863333560526371, 0.10770907253026962, -0.03253700211644173, -0.18271861970424652, 0.010723991319537163, 0.1133992001414299, -0.08056149631738663, 0.08200647681951523, 0.19000613689422607, -0.012578671798110008, 0.1209007054567337, 0.05294662341475487, -0.047376248985528946, 0.04217283055186272, -0.03389401361346245, -0.051268599927425385, -0.10752558708190918, 0.058453381061553955, -0.05909625440835953, 0.15447644889354706, 0.10152646154165268, -0.05671518296003342, -0.004550917539745569, -0.05555408447980881, 0.04875178262591362, 0.01804669201374054, 0.12263146042823792, 0.02951994352042675, -0.1865430772304535, 0.032826557755470276, -0.01144319772720337, 0.10186848044395447, -0.25588861107826233, -0.08421015739440918, 0.08833149075508118, -0.011924264021217823, -0.05105875805020332, 0.10560628771781921, 0.057650718837976456, 0.04243382066488266, -0.043439045548439026, -0.10480839014053345, -0.02186836116015911, 0.14663739502429962, -0.1469624787569046, -0.025013303384184837 ]
null
null
sentence-transformers
# annakotarba/sentence-similarity

This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.

<!--- Describe your model here -->

## Usage (Sentence-Transformers)

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('annakotarba/sentence-similarity')
embeddings = model.encode(sentences)
print(embeddings)
```

## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings.

```python
from transformers import AutoTokenizer, AutoModel
import torch


# Mean Pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)


# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('annakotarba/sentence-similarity')
model = AutoModel.from_pretrained('annakotarba/sentence-similarity')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```

## Evaluation Results

<!--- Describe how your model was evaluated -->

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=annakotarba/sentence-similarity)

## Training

The model was trained with the parameters:

**DataLoader**:

`torch.utils.data.dataloader.DataLoader` of length 1 with parameters:
```
{'batch_size': 16, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```

**Loss**:

`sentence_transformers.losses.TripletLoss.TripletLoss` with parameters:
```
{'distance_metric': 'TripletDistanceMetric.EUCLIDEAN', 'triplet_margin': 5}
```

Parameters of the fit()-Method:
```
{
    "epochs": 10,
    "evaluation_steps": 0,
    "evaluator": "NoneType",
    "max_grad_norm": 1,
    "optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
    "optimizer_params": {
        "lr": 2e-05
    },
    "scheduler": "WarmupLinear",
    "steps_per_epoch": null,
    "warmup_steps": 10000,
    "weight_decay": 0.01
}
```

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False})
)
```

## Citing & Authors

<!--- Describe where people can find more information -->
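The training configuration listed above maps onto the classic `fit()` API of sentence-transformers. The sketch below is a hypothetical reconstruction of that setup, not the original training script: the triplet examples and the base checkpoint are illustrative placeholders, since the underlying dataset is private.

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses
from sentence_transformers.losses import TripletDistanceMetric

# Illustrative (anchor, positive, negative) triplets; the real private dataset is not shown here.
train_examples = [
    InputExample(texts=[
        "How do I reset my password?",          # anchor
        "Password reset instructions",          # positive
        "Shipping costs for international orders",  # negative
    ]),
]

train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=16)

# Placeholder checkpoint; the base model actually used for fine-tuning is not stated in the card.
model = SentenceTransformer("annakotarba/sentence-similarity")

train_loss = losses.TripletLoss(
    model=model,
    distance_metric=TripletDistanceMetric.EUCLIDEAN,
    triplet_margin=5,
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=10,
    warmup_steps=10000,
    scheduler="WarmupLinear",
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
    max_grad_norm=1,
)
```

TripletLoss pushes the anchor embedding closer to the positive than to the negative by at least the Euclidean margin of 5, which is what the loss parameters reported above describe.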
{"library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "transformers"], "datasets": ["private"], "pipeline_tag": "sentence-similarity"}
sentence-similarity
annakotarba/sentence-similarity
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "transformers", "dataset:private", "endpoints_compatible", "region:us" ]
2024-02-13T21:56:32+00:00
[]
[]
TAGS #sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #dataset-private #endpoints_compatible #region-us
# annakotarba/sentence-similarity This is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model becomes easy when you have sentence-transformers installed: Then you can use the model like this: ## Usage (HuggingFace Transformers) Without sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings. ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL ## Training The model was trained with the parameters: DataLoader: 'URL.dataloader.DataLoader' of length 1 with parameters: Loss: 'sentence_transformers.losses.TripletLoss.TripletLoss' with parameters: Parameters of the fit()-Method: ## Full Model Architecture ## Citing & Authors
[ "# annakotarba/sentence-similarity\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.", "## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.", "## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL", "## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 1 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.TripletLoss.TripletLoss' with parameters:\n \n\nParameters of the fit()-Method:", "## Full Model Architecture", "## Citing & Authors" ]
[ "TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #dataset-private #endpoints_compatible #region-us \n", "# annakotarba/sentence-similarity\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.", "## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.", "## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL", "## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 1 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.TripletLoss.TripletLoss' with parameters:\n \n\nParameters of the fit()-Method:", "## Full Model Architecture", "## Citing & Authors" ]
[ 49, 54, 38, 64, 29, 73, 5, 6 ]
[ "passage: TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #transformers #dataset-private #endpoints_compatible #region-us \n# annakotarba/sentence-similarity\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 1 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.TripletLoss.TripletLoss' with parameters:\n \n\nParameters of the fit()-Method:## Full Model Architecture## Citing & Authors" ]
[ -0.0203290656208992, 0.07445977628231049, -0.006535157095640898, 0.04221360757946968, 0.11165902763605118, 0.017226768657565117, 0.1476672887802124, 0.07568365335464478, -0.029324961826205254, 0.06642097979784012, 0.016323504969477654, 0.11695820093154907, 0.005899087991565466, 0.02521183341741562, 0.01942664571106434, -0.29047271609306335, 0.034436602145433426, -0.05610763654112816, 0.004780228715389967, 0.07427725195884705, 0.12749417126178741, -0.06242383271455765, 0.05803424492478371, 0.014793245121836662, -0.02640269696712494, 0.018114585429430008, -0.016642270609736443, -0.04145033285021782, 0.10115005075931549, 0.06723339110612869, 0.038001898676157, 0.005723001901060343, 0.006590370554476976, -0.20580561459064484, 0.01740586943924427, 0.07721373438835144, -0.011528471484780312, 0.051566049456596375, 0.02678225375711918, -0.04597305878996849, 0.16139963269233704, -0.05883707478642464, 0.07470910996198654, 0.061854951083660126, -0.12251654267311096, -0.05884303152561188, -0.053253814578056335, -0.025069065392017365, 0.1440439224243164, 0.10329367220401764, -0.059343017637729645, 0.1432616114616394, -0.06341612339019775, 0.08957885205745697, 0.08299405872821808, -0.26399269700050354, -0.04052304849028587, 0.0215651523321867, 0.06684741377830505, 0.051853738725185394, -0.10806691646575928, -0.0011807745322585106, -0.01959656924009323, 0.045223236083984375, 0.06422587484121323, -0.04833720251917839, 0.05463056266307831, -0.012553178705275059, -0.11435534060001373, 0.009589295834302902, 0.2021404355764389, 0.029729735106229782, -0.03628914803266525, -0.18364545702934265, -0.06266508251428604, 0.08431364595890045, -0.038526955991983414, -0.049659427255392075, 0.019219180569052696, 0.05134625732898712, -0.022144172340631485, -0.08760639280080795, -0.11031381040811539, -0.0021711543668061495, -0.061435431241989136, 0.018511159345507622, -0.018902594223618507, -0.04600701853632927, -0.011787047609686852, 0.06173453852534294, -0.05532537028193474, -0.09699995815753937, -0.021781321614980698, -0.040014322847127914, -0.12140492349863052, -0.042238254100084305, -0.05503591522574425, -0.14544984698295593, 0.030506491661071777, 0.14743395149707794, 0.0842210128903389, 0.015526607632637024, -0.049184221774339676, 0.06167636811733246, 0.03869246691465378, 0.16645808517932892, -0.027478426694869995, -0.08395569771528244, -0.046776559203863144, 0.03258031606674194, 0.003957719542086124, -0.011659201234579086, -0.031860556453466415, 0.006437660660594702, 0.037162765860557556, 0.0652037188410759, 0.06401538848876953, 0.06068689003586769, -0.05472934991121292, -0.04250821843743324, 0.06336105614900589, -0.1187606081366539, 0.02248980663716793, 0.002629967639222741, -0.04884704574942589, 0.029383881017565727, 0.07764716446399689, -0.009353730827569962, -0.07191025465726852, 0.033764082938432693, -0.10191119462251663, -0.015677358955144882, -0.04481631517410278, -0.14170290529727936, -0.0064125582575798035, 0.026069333776831627, -0.03509696573019028, -0.12061986327171326, -0.13267721235752106, -0.07048953324556351, 0.0460851676762104, -0.042728718370199203, -0.010510502383112907, -0.11289963126182556, -0.008342372253537178, 0.005331722553819418, -0.006876038387417793, -0.07545803487300873, -0.008306032046675682, 0.020351432263851166, -0.06099558621644974, 0.044645730406045914, 0.04795396700501442, 0.0403132364153862, -0.13373756408691406, 0.019647223874926567, -0.13742035627365112, 0.16039830446243286, -0.03356669098138809, 0.08383464813232422, -0.14625512063503265, 0.02194720134139061, 
-0.0019283556612208486, 0.058322686702013016, 0.0038956766948103905, 0.15136201679706573, -0.21221919357776642, -0.0696382001042366, 0.12340591102838516, -0.0641983300447464, -0.10529650747776031, 0.11619074642658234, -0.02821851707994938, 0.15841613709926605, 0.12521110475063324, 0.12287644296884537, 0.12029744684696198, -0.04720059409737587, 0.004850405268371105, 0.023531684651970863, -0.06905191391706467, 0.1399015337228775, 0.039851896464824677, -0.07257422804832458, 0.06287319958209991, -0.004621455911546946, -0.029812850058078766, 0.019529670476913452, 0.002579808235168457, -0.05959746986627579, 0.018362538889050484, -0.03657526522874832, 0.059031207114458084, -0.04618753120303154, -0.002907779533416033, 0.0003229136054869741, -0.10774790495634079, 0.11088958382606506, 0.07444929331541061, -0.06295900046825409, 0.020000746473670006, -0.08036550134420395, -0.015891453251242638, -0.014434210024774075, 0.014998107217252254, -0.1892932951450348, -0.12127726525068283, 0.019774332642555237, 0.005752758122980595, 0.10294531285762787, 0.061359383165836334, 0.05746205151081085, 0.03605642169713974, -0.019520167261362076, -0.011459354311227798, 0.05915283039212227, 0.0025063420180231333, -0.09128314256668091, -0.11889339238405228, 0.007526426576077938, -0.03075709007680416, 0.07760936766862869, -0.12322713434696198, 0.017199909314513206, 0.011013330891728401, 0.045948658138513565, 0.04887205734848976, -0.022703183814883232, -0.012863478623330593, -0.03607340529561043, -0.023367051035165787, -0.031705692410469055, 0.040818825364112854, 0.03359012305736542, -0.12842446565628052, 0.10464444011449814, -0.21612975001335144, -0.14503560960292816, 0.07541688531637192, 0.015118679963052273, -0.07468892633914948, -0.04948630556464195, -0.02812116965651512, -0.01336300652474165, -0.041993606835603714, -0.0741516724228859, 0.1572306603193283, 0.0855652242898941, 0.10658151656389236, -0.03229834511876106, -0.023464061319828033, -0.04484507814049721, -0.04574122652411461, -0.0618283674120903, 0.09192105382680893, -0.07783632725477219, -0.16164088249206543, 0.048462651669979095, 0.07842402160167694, -0.06036699190735817, 0.102540984749794, 0.0065105995163321495, -0.05984725430607796, -0.06420359015464783, 0.04393709823489189, 0.05523160099983215, 0.011198614723980427, -0.075465127825737, 0.020642239600419998, 0.04207289218902588, 0.03755277395248413, 0.015117739327251911, -0.05087657272815704, 0.04617351293563843, 0.05824759975075722, 0.011368513107299805, 0.10811569541692734, 0.028251290321350098, -0.004831952974200249, 0.0665997639298439, 0.01791372336447239, 0.019607355818152428, -0.03436237946152687, -0.04253090173006058, -0.126539409160614, 0.16949892044067383, -0.14396408200263977, -0.22690536081790924, -0.1441107839345932, -0.00598150072619319, -0.03881999850273132, 0.021779101341962814, 0.09667228162288666, -0.06466351449489594, -0.06623915582895279, -0.09023299813270569, 0.07496050000190735, 0.04936760291457176, -0.054325297474861145, -0.023070432245731354, 0.04211743175983429, 0.020696675404906273, -0.12107087671756744, -0.014116000384092331, 0.009907261468470097, -0.07855018973350525, -0.003424824681133032, -0.02347068302333355, 0.045773543417453766, 0.12682801485061646, 0.056825533509254456, -0.015406533144414425, -0.01592549867928028, 0.21605409681797028, -0.08006826043128967, 0.06288570165634155, 0.1609872430562973, -0.004901964217424393, 0.06427299976348877, 0.10802759975194931, 0.02084958925843239, -0.05059925466775894, 0.0712713822722435, 0.06782513856887817, -0.024404961615800858, 
-0.15426287055015564, -0.10887264460325241, -0.062396857887506485, 0.033221740275621414, 0.12745952606201172, 0.048208512365818024, 0.03136320784687996, 0.060451339930295944, -0.02655627578496933, 0.012433059513568878, 0.11055266857147217, 0.12926743924617767, 0.1219182163476944, -0.028995202854275703, 0.10265043377876282, -0.033455170691013336, -0.06793212890625, 0.06094476208090782, -0.001068002195097506, 0.12038195878267288, 0.014179498888552189, 0.13197536766529083, 0.07434994727373123, -0.022939374670386314, -0.04110826551914215, 0.07854444533586502, -0.022949183359742165, 0.014697566628456116, -0.03274816274642944, -0.09452993422746658, -0.017458826303482056, 0.08753377944231033, 0.0941973403096199, -0.008889000862836838, -0.03319482505321503, 0.0638514831662178, 0.13174493610858917, 0.15165060758590698, 0.09205964207649231, -0.23614546656608582, -0.06547988206148148, 0.03902939334511757, -0.07797745615243912, -0.06610633432865143, 0.007769776042550802, 0.037406232208013535, -0.13602019846439362, 0.032198481261730194, -0.03391696885228157, 0.09143631905317307, -0.07736791670322418, 0.03223039582371712, -0.04682927578687668, 0.03364674746990204, -0.021021133288741112, 0.0759185180068016, -0.24770109355449677, 0.08812529593706131, 0.02614179439842701, 0.05569639429450035, -0.05341007187962532, 0.028079545125365257, 0.06773163378238678, 0.003627484431490302, 0.16953596472740173, -0.02772166021168232, 0.012443441897630692, 0.0031443419866263866, -0.06827345490455627, -0.010227234102785587, 0.04554319009184837, -0.10189773142337799, 0.08011898398399353, -0.06649111211299896, -0.02029651403427124, -0.009262917563319206, 0.03524983301758766, -0.08332603424787521, -0.1666892021894455, 0.012223709374666214, 0.013070769608020782, 0.006563820410519838, -0.02327672578394413, 0.002319569466635585, 0.036325328052043915, 0.21290920674800873, -0.09191618859767914, -0.05706128105521202, -0.1283298283815384, -0.013028525747358799, 0.10420060902833939, -0.08513680845499039, -0.005871844012290239, -0.005683587864041328, 0.14458812773227692, -0.056669678539037704, -0.0683673769235611, 0.06731918454170227, -0.07400403171777725, -0.04154951497912407, -0.055787865072488785, 0.09117738157510757, 0.0654570609331131, 0.04677141457796097, 0.04204729571938515, 0.054448243230581284, -0.02905092015862465, -0.08500903099775314, -0.07467767596244812, 0.08781350404024124, -0.01842576637864113, 0.08776482939720154, -0.12719443440437317, -0.04964432492852211, -0.11978419125080109, 0.02988395094871521, 0.1877163052558899, 0.22221192717552185, -0.05958769470453262, 0.0750831663608551, 0.14982403814792633, -0.09405437856912613, -0.21689894795417786, -0.06623661518096924, 0.023326914757490158, 0.027286347001791, 0.11063642054796219, -0.10641716420650482, 0.06258773058652878, 0.058366961777210236, -0.008099835366010666, -0.05423639342188835, -0.27043384313583374, -0.1375323086977005, 0.15241990983486176, 0.029248127713799477, -0.034937016665935516, -0.10856253653764725, -0.05326618254184723, -0.0830078125, -0.02519683726131916, 0.06980322301387787, -0.04188555106520653, 0.09221905469894409, 0.0540594644844532, 0.018840309232473373, 0.05768529325723648, 0.0022619899827986956, 0.13020195066928864, 0.04674554616212845, 0.03220941871404648, -0.056219473481178284, -0.04180053249001503, 0.07775326818227768, -0.0922950804233551, 0.15025967359542847, -0.04653095453977585, 0.03773853927850723, -0.08636080473661423, -0.026796748861670494, -0.043331779539585114, 0.02585139125585556, -0.04903075098991394, -0.06465785950422287, 
-0.02140823006629944, 0.06474161893129349, 0.10994692891836166, -0.01000426895916462, 0.022242417559027672, -0.07630138844251633, 0.05819704383611679, 0.16413098573684692, 0.1204347163438797, 0.07775961607694626, -0.1698163002729416, 0.0281270332634449, 0.005731325596570969, 0.05427483096718788, -0.08968747407197952, 0.07374846935272217, 0.05758246034383774, -0.019305283203721046, 0.15942402184009552, 0.030881844460964203, -0.09514196217060089, -0.00521049601957202, 0.05131964385509491, -0.07052134722471237, -0.1566087156534195, -0.04325111210346222, 0.010304038412868977, -0.11771677434444427, -0.05842041224241257, 0.17049571871757507, -0.03169042244553566, 0.01249381247907877, 0.04260776937007904, 0.05167873948812485, -0.03908122330904007, 0.1162012591958046, -0.03388043865561485, 0.05080505833029747, -0.04281800612807274, 0.10434463620185852, 0.08606117218732834, -0.08683577924966812, 0.03053836151957512, 0.11717382073402405, -0.07012349367141724, -0.09537126123905182, -0.05421692132949829, 0.06692282855510712, -0.08334483206272125, 0.0343858078122139, -0.039257269352674484, -0.0773136094212532, 0.011349325999617577, 0.041553810238838196, 0.032633811235427856, 0.07526183873414993, -0.09736880660057068, -0.004652323201298714, -0.09697326272726059, 0.0889827087521553, 0.0834343284368515, 0.022265829145908356, -0.0369793176651001, 0.07224886119365692, -0.022493984550237656, 0.011970850639045238, -0.021612979471683502, -0.06014310568571091, -0.07921506464481354, -0.0051216441206634045, -0.05171654373407364, -0.010186134837567806, -0.10564478486776352, -0.012245898135006428, 0.03768644854426384, 0.051190249621868134, -0.003611128544434905, -0.017961863428354263, -0.04394859820604324, -0.07370489090681076, -0.04188866913318634, 0.10509738326072693, -0.1388051062822342, -0.01253074873238802, 0.0416015163064003, -0.09558098763227463, 0.07387690991163254, 0.0032668542116880417, -0.035362351685762405, 0.030994316563010216, -0.038860321044921875, -0.041634123772382736, 0.01861220970749855, 0.021961510181427002, 0.05922829732298851, -0.11220861971378326, 0.0004415248695295304, -0.04384313523769379, 0.023479385301470757, 0.005251146852970123, 0.04381786286830902, -0.10307671129703522, 0.02146882936358452, -0.022122055292129517, -0.028968840837478638, -0.10358769446611404, 0.014947580173611641, -0.003437994047999382, 0.027323422953486443, 0.16615261137485504, -0.06899725645780563, 0.07199451327323914, -0.13620179891586304, -0.001296494621783495, 0.014792696572840214, -0.050982650369405746, 0.0900001972913742, -0.09552925825119019, 0.05248045176267624, -0.03538035601377487, 0.05510484799742699, -0.004096889868378639, 0.06078046187758446, 0.06793629378080368, 0.0032209884375333786, -0.0018264774698764086, 0.010429063811898232, 0.06423996388912201, 0.03437791392207146, -0.01649487391114235, -0.07591769099235535, 0.019083455204963684, 0.010515233501791954, -0.052260469645261765, 0.06137485057115555, 0.08151029795408249, 0.019688842818140984, 0.09034086018800735, 0.06695627421140671, -0.01008689310401678, -0.07959946990013123, -0.025267723947763443, -0.023608194664120674, 0.06830358505249023, -0.026052577421069145, 0.0481124073266983, 0.17821744084358215, -0.15327391028404236, 0.11407805979251862, -0.004256350453943014, -0.06440103054046631, -0.08774135261774063, -0.15558554232120514, -0.06030113622546196, -0.050618287175893784, -0.01791217550635338, -0.12910108268260956, -0.006110135465860367, 0.00851812306791544, 0.013868581503629684, -0.0008761155768297613, 0.13511110842227936, -0.11402484774589539, 
-0.1064307764172554, 0.07924854010343552, -0.028215477243065834, 0.050684139132499695, 0.02709541656076908, 0.037426117807626724, -0.0010673352517187595, 0.06881116330623627, 0.05160582810640335, 0.05400484427809715, 0.038702066987752914, 0.03300291299819946, -0.08161389827728271, -0.07885216176509857, 0.004476676695048809, -0.004348530899733305, -0.05656641349196434, 0.08152356743812561, 0.056889407336711884, -0.07837066054344177, -0.010168716311454773, 0.2517341077327728, -0.0985972210764885, -0.10931572318077087, -0.17698709666728973, 0.21630340814590454, 0.04207519069314003, 0.0407150536775589, -0.03794870898127556, -0.09932614862918854, -0.012525556609034538, 0.1497890055179596, 0.19305837154388428, -0.0816613957285881, 0.013782192952930927, 0.03275506943464279, 0.015856636688113213, 0.005497629754245281, 0.045945074409246445, 0.04469497501850128, 0.20561841130256653, -0.05437977612018585, 0.1280871331691742, -0.016095492988824844, -0.052379097789525986, -0.061504848301410675, 0.07870013266801834, 0.024597836658358574, 0.03443840891122818, -0.010557441972196102, 0.13016898930072784, -0.0339970700442791, -0.07815010100603104, -0.05355323851108551, -0.05870889872312546, -0.10942589491605759, -0.03263264149427414, 0.03336091339588165, 0.01541627012193203, 0.11367986351251602, 0.02485257014632225, -0.030314579606056213, 0.1206551268696785, -0.016910867765545845, -0.04231375455856323, -0.051613349467515945, 0.027871085330843925, -0.015012085437774658, 0.1603289544582367, -0.007341534830629826, -0.02966357208788395, 0.1167818009853363, 0.0065767052583396435, -0.051667969673871994, 0.08489213138818741, 0.04129301756620407, -0.060147110372781754, 0.13436256349086761, 0.08540207892656326, -0.018229791894555092, 0.10507390648126602, 0.08143499493598938, -0.19240230321884155, 0.07007978111505508, -0.027391090989112854, -0.020758002996444702, -0.058181408792734146, 0.05553685873746872, -0.07989069074392319, 0.12857232987880707, 0.18079876899719238, -0.0315757654607296, -0.001342034200206399, -0.009564126841723919, -0.011297059245407581, 0.026464290916919708, 0.07969494163990021, -0.04252208024263382, -0.10093005001544952, -0.0077896155416965485, -0.00826910138130188, 0.02198563516139984, -0.31392544507980347, -0.12349334359169006, 0.01829085685312748, -0.02093052677810192, -0.031577203422784805, 0.1316796839237213, 0.07585089653730392, 0.005069606006145477, -0.04274666681885719, -0.17746874690055847, 0.016816340386867523, 0.12377428263425827, -0.11008509993553162, -0.08509239554405212 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# smolm-autoreg-bpe-counterfactual-babylm-only_other_det_removal-seed_211-1e-3

This model was trained from scratch on the kanishka/counterfactual-babylm-only_other_det_removal dataset.
It achieves the following results on the evaluation set:
- Loss: 3.4143
- Accuracy: 0.4110

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 64
- seed: 211
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 32000
- num_epochs: 20.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:--------:|
| 3.6004        | 1.0   | 18597  | 3.8219          | 0.3575   |
| 3.3852        | 2.0   | 37194  | 3.6092          | 0.3797   |
| 3.2597        | 3.0   | 55791  | 3.4837          | 0.3910   |
| 3.1758        | 4.0   | 74388  | 3.4364          | 0.3981   |
| 3.1197        | 5.0   | 92985  | 3.4116          | 0.4017   |
| 3.08          | 6.0   | 111582 | 3.3782          | 0.4040   |
| 3.0418        | 7.0   | 130179 | 3.3885          | 0.4055   |
| 3.0088        | 8.0   | 148776 | 3.3884          | 0.4062   |
| 2.9856        | 9.0   | 167373 | 3.3548          | 0.4077   |
| 2.9598        | 10.0  | 185970 | 3.3782          | 0.4090   |
| 2.9364        | 11.0  | 204567 | 3.3851          | 0.4093   |
| 2.9156        | 12.0  | 223164 | 3.3803          | 0.4097   |
| 2.8949        | 13.0  | 241761 | 3.3869          | 0.4100   |
| 2.8719        | 14.0  | 260358 | 3.3813          | 0.4104   |
| 2.8526        | 15.0  | 278955 | 3.3859          | 0.4108   |
| 2.8289        | 16.0  | 297552 | 3.3980          | 0.4103   |
| 2.8104        | 17.0  | 316149 | 3.3981          | 0.4109   |
| 2.7958        | 18.0  | 334746 | 3.4054          | 0.4110   |
| 2.781         | 19.0  | 353343 | 3.4057          | 0.4110   |
| 2.7571        | 20.0  | 371940 | 3.4143          | 0.4110   |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1
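The hyperparameters above correspond directly to fields of the Hugging Face `TrainingArguments` class. The following is a minimal sketch of that configuration, assuming fp16 for the "Native AMP" setting and a placeholder `output_dir`; it is a reconstruction, not the original training script.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration listed above.
training_args = TrainingArguments(
    output_dir="smolm-autoreg-bpe-counterfactual-babylm-only_other_det_removal-seed_211-1e-3",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=211,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=32000,
    num_train_epochs=20.0,
    fp16=True,                      # "Native AMP" mixed precision (requires a CUDA device)
    evaluation_strategy="epoch",    # assumption: per-epoch validation, matching the results table
)
```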
{"tags": ["generated_from_trainer"], "datasets": ["kanishka/counterfactual-babylm-only_other_det_removal"], "metrics": ["accuracy"], "model-index": [{"name": "smolm-autoreg-bpe-counterfactual-babylm-only_other_det_removal-seed_211-1e-3", "results": [{"task": {"type": "text-generation", "name": "Causal Language Modeling"}, "dataset": {"name": "kanishka/counterfactual-babylm-only_other_det_removal", "type": "kanishka/counterfactual-babylm-only_other_det_removal"}, "metrics": [{"type": "accuracy", "value": 0.4109943845202858, "name": "Accuracy"}]}]}]}
text-generation
kanishka/smolm-autoreg-bpe-counterfactual-babylm-only_other_det_removal-seed_211-1e-3
[ "transformers", "tensorboard", "safetensors", "opt", "text-generation", "generated_from_trainer", "dataset:kanishka/counterfactual-babylm-only_other_det_removal", "model-index", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:56:55+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #opt #text-generation #generated_from_trainer #dataset-kanishka/counterfactual-babylm-only_other_det_removal #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
smolm-autoreg-bpe-counterfactual-babylm-only\_other\_det\_removal-seed\_211-1e-3 ================================================================================ This model was trained from scratch on the kanishka/counterfactual-babylm-only\_other\_det\_removal dataset. It achieves the following results on the evaluation set: * Loss: 3.4143 * Accuracy: 0.4110 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.001 * train\_batch\_size: 32 * eval\_batch\_size: 64 * seed: 211 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 32000 * num\_epochs: 20.0 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.37.2 * Pytorch 2.1.0+cu121 * Datasets 2.16.1 * Tokenizers 0.15.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 211\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 32000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #opt #text-generation #generated_from_trainer #dataset-kanishka/counterfactual-babylm-only_other_det_removal #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 211\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 32000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ 86, 132, 4, 33 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #opt #text-generation #generated_from_trainer #dataset-kanishka/counterfactual-babylm-only_other_det_removal #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 211\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 32000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ -0.14435753226280212, 0.10658734291791916, -0.0016812057001516223, 0.052540574222803116, 0.12382002919912338, 0.013362912461161613, 0.1440768986940384, 0.13717742264270782, -0.06002655252814293, 0.0815717875957489, 0.13704311847686768, 0.08653424680233002, 0.050680093467235565, 0.1455516815185547, -0.052041154354810715, -0.27463966608047485, 0.0298977829515934, 0.04340428113937378, -0.11333858966827393, 0.12140950560569763, 0.10584647208452225, -0.10508369654417038, 0.06525955349206924, 0.04829183593392372, -0.1495746374130249, -0.003784097731113434, 0.0016174722695723176, -0.06055495887994766, 0.09939466416835785, 0.04239531233906746, 0.11834776401519775, 0.028225457295775414, 0.06623182445764542, -0.19973988831043243, 0.013727649115025997, 0.06070396676659584, 0.03316764533519745, 0.09172575920820236, 0.08184771984815598, -0.02780858427286148, 0.10584073513746262, -0.07318709790706635, 0.06619293987751007, 0.058559831231832504, -0.11767080426216125, -0.26866045594215393, -0.08412777632474899, 0.05951007083058357, 0.09734556823968887, 0.08611050248146057, -0.020801711827516556, 0.13069108128547668, -0.01586167886853218, 0.08853505551815033, 0.2174135148525238, -0.2220102846622467, -0.10279925912618637, -0.024367336183786392, 0.0564151369035244, 0.04011799022555351, -0.11428537964820862, -0.010050070472061634, 0.0633133053779602, 0.018478838726878166, 0.09010551124811172, 0.015384572558104992, 0.04853922128677368, -0.020207375288009644, -0.1345278024673462, -0.0670672133564949, 0.19203335046768188, 0.08487967401742935, -0.061909668147563934, -0.06796187907457352, -0.03642607107758522, -0.18206575512886047, -0.04440198466181755, 0.01691300980746746, 0.011760901659727097, -0.04301098361611366, -0.11321805417537689, -0.031477440148591995, -0.11113853752613068, -0.09654473513364792, 0.02489273063838482, 0.23617686331272125, 0.046680986881256104, -0.015263544395565987, -0.015500323846936226, 0.11257030069828033, 0.058382436633110046, -0.1502818614244461, -0.032920725643634796, 0.019241323694586754, -0.052354734390974045, -0.03304334357380867, -0.059581153094768524, -0.05050259083509445, 0.002352793700993061, 0.1629267781972885, -0.04441932588815689, 0.0691230446100235, 0.01315090898424387, 0.029953449964523315, -0.08367534726858139, 0.1717383712530136, -0.03505879268050194, 0.02531569078564644, -0.021003447473049164, 0.11101797968149185, -0.005707615986466408, -0.017030753195285797, -0.04027051106095314, 0.02549343928694725, 0.13540862500667572, 0.036178261041641235, -0.025484614074230194, 0.041068606078624725, -0.06735756248235703, -0.03294952213764191, 0.028446318581700325, -0.09213444590568542, 0.0019585248082876205, 0.008288722485303879, -0.0693914070725441, -0.01766727864742279, 0.025656849145889282, 0.017128437757492065, 0.016352832317352295, 0.06877873092889786, -0.10474332422018051, -0.01272072084248066, -0.0959540531039238, -0.08572159707546234, 0.011645120568573475, -0.028415456414222717, 0.012061459943652153, -0.10158644616603851, -0.15529200434684753, -0.03680447116494179, 0.05012428015470505, -0.02381390891969204, -0.05978698655962944, -0.044211264699697495, -0.07975542545318604, 0.055786244571208954, -0.01243574172258377, 0.12148415297269821, -0.04651030898094177, 0.10719124227762222, 0.06824695318937302, 0.0332743339240551, 0.011103780940175056, 0.038351353257894516, -0.077996626496315, 0.07027335464954376, -0.10240603983402252, 0.06414292752742767, -0.08378888666629791, 0.06515029817819595, -0.11559886485338211, -0.10616176575422287, -0.036905575543642044, 
0.003955894615501165, 0.08671445399522781, 0.11048512905836105, -0.13288514316082, -0.08677691221237183, 0.1633877456188202, -0.10245339572429657, -0.1618831306695938, 0.10598240047693253, -0.023065736517310143, 0.06239394471049309, 0.04410969465970993, 0.14330722391605377, 0.09088075906038284, -0.07415138185024261, -0.03585275635123253, -0.05752870440483093, 0.1057698130607605, 0.008222084492444992, 0.11400759965181351, -0.007997730746865273, -0.04378236085176468, 0.012386937625706196, -0.07456938177347183, 0.05525719374418259, -0.10685046762228012, -0.09507995843887329, -0.04117697849869728, -0.1021147072315216, 0.028054116293787956, 0.05451294034719467, 0.06565798819065094, -0.09565027803182602, -0.11684959381818771, 0.0701955184340477, 0.1271713376045227, -0.09092918038368225, 0.014630360528826714, -0.0836399495601654, 0.02889331616461277, -0.07567696273326874, -0.019841449335217476, -0.16233058273792267, -0.10442477464675903, 0.02825566753745079, -0.06265784054994583, -0.01166998315602541, -0.062108591198921204, 0.09188605844974518, 0.06279357522726059, -0.049490541219711304, -0.08234231919050217, -0.08921192586421967, -0.010649493895471096, -0.08545082807540894, -0.18335458636283875, -0.08937695622444153, -0.027649588882923126, 0.19607579708099365, -0.2457522302865982, 0.04208621755242348, -0.010298894718289375, 0.13618391752243042, 0.04231742024421692, -0.05816413089632988, -0.010111689567565918, 0.03962108865380287, -0.04838475212454796, -0.07996070384979248, 0.033406708389520645, 0.02135961316525936, -0.13234268128871918, 0.0289304256439209, -0.12574900686740875, 0.11782364547252655, 0.0960455909371376, -0.01565435715019703, -0.07950352132320404, -0.026332003995776176, -0.08671987056732178, -0.05956598371267319, -0.018437976017594337, -0.03562483564019203, 0.11383377015590668, 0.03128066286444664, 0.12983322143554688, -0.09612378478050232, -0.0563541017472744, 0.032298747450113297, -0.01867430843412876, -0.034077238291502, 0.11832160502672195, 0.05051739141345024, -0.10561857372522354, 0.1107766181230545, 0.08254780620336533, -0.07204549759626389, 0.17664538323879242, -0.06019667536020279, -0.11423991620540619, -0.031133878976106644, 0.036536362022161484, 0.05142349377274513, 0.12473546713590622, -0.11597217619419098, 0.01484653726220131, 0.027454154565930367, 0.006433304864913225, 0.03432997688651085, -0.19903042912483215, -0.017606599256396294, 0.03835097700357437, -0.04159635305404663, -0.025141870602965355, 0.0024825932923704386, -0.012831257656216621, 0.08569350093603134, -0.010200885124504566, 0.00853764172643423, 0.018840136006474495, -0.003113183891400695, -0.0893486887216568, 0.22116051614284515, -0.06844115257263184, -0.1581544131040573, -0.16715136170387268, 0.0040047806687653065, -0.04930833727121353, -0.0010651125339791179, 0.031209317967295647, -0.08474624156951904, -0.0333673357963562, -0.07820668071508408, 0.017162727192044258, -0.05594252049922943, 0.025393523275852203, 0.034326132386922836, 0.0033212678972631693, 0.10454350709915161, -0.11513502895832062, 0.011072000488638878, 0.0024483238812536, -0.039890434592962265, 0.038847122341394424, 0.009494847618043423, 0.10020659118890762, 0.11189346015453339, 0.018659550696611404, 0.018094949424266815, -0.018361708149313927, 0.18581818044185638, -0.08708573132753372, -0.04660814255475998, 0.12202172726392746, -0.00020235990814398974, 0.048489589244127274, 0.07263320684432983, 0.04756084457039833, -0.09374615550041199, 0.04503446817398071, 0.06437144428491592, -0.02364841289818287, -0.23675993084907532, 
-0.01370223518460989, -0.04753265157341957, -0.022579442709684372, 0.13249802589416504, 0.042176906019449234, 0.025340436026453972, 0.08021745830774307, -0.04699805751442909, 0.022798053920269012, -0.03149135410785675, 0.10449746251106262, 0.06794684380292892, 0.05269990488886833, 0.1282292753458023, -0.023877792060375214, -0.044071268290281296, 0.027626793831586838, -0.012292386963963509, 0.24154876172542572, -0.001279335585422814, 0.18459665775299072, 0.05530749633908272, 0.14873090386390686, 0.00760100269690156, 0.08869952708482742, 0.03512001782655716, -0.03239999711513519, 0.026493757963180542, -0.061308473348617554, -0.03820866346359253, 0.04946354776620865, 0.0033155763521790504, 0.07038962095975876, -0.13056184351444244, 0.009099659509956837, 0.01149006374180317, 0.29828235507011414, 0.05711667984724045, -0.357538104057312, -0.12524741888046265, 0.00931291002780199, -0.056981444358825684, -0.08441131561994553, -0.001588444341905415, 0.08786484599113464, -0.10396703332662582, 0.05217359587550163, -0.10525590181350708, 0.0999210998415947, -0.040508516132831573, -0.011001438833773136, 0.05822388082742691, 0.07257115840911865, -0.021441899240016937, 0.07104480266571045, -0.24883894622325897, 0.2806391716003418, -0.010001568123698235, 0.08648858219385147, -0.04169638827443123, 0.02351689338684082, 0.04535827413201332, -0.015604855492711067, 0.05515436455607414, -0.007932578213512897, -0.09163559973239899, -0.2110484093427658, -0.0917612835764885, 0.036345843225717545, 0.11518517136573792, -0.0894569456577301, 0.14614707231521606, -0.034885846078395844, 0.011496917344629765, 0.06549738347530365, -0.04949392005801201, -0.15627077221870422, -0.09502546489238739, 0.04600962996482849, 0.02551509067416191, 0.06985963135957718, -0.12397342920303345, -0.12782296538352966, -0.02050255797803402, 0.15952135622501373, -0.04110533744096756, -0.050653595477342606, -0.14826780557632446, 0.08276517689228058, 0.16486655175685883, -0.07687171548604965, 0.03704419359564781, -0.0002713931316975504, 0.1750059723854065, 0.028157707303762436, -0.02085886336863041, 0.07395429164171219, -0.08425861597061157, -0.21833492815494537, -0.03553058207035065, 0.15033622086048126, 0.038273658603429794, 0.038814812898635864, -0.012049456126987934, 0.014489826746284962, -0.030733218416571617, -0.08369676023721695, 0.06051954627037048, -0.003878731746226549, 0.0076351286843419075, 0.030754806473851204, -0.04001728445291519, 0.05774599686264992, -0.06883446872234344, -0.0517890490591526, 0.12811526656150818, 0.3299266993999481, -0.05286720022559166, -0.02950029820203781, 0.021236270666122437, -0.05431481823325157, -0.1208660826086998, 0.034098025411367416, 0.14392946660518646, 0.025622377172112465, 0.01693742722272873, -0.21499304473400116, 0.07170915603637695, 0.10030902922153473, -0.03389238938689232, 0.11892525851726532, -0.251908540725708, -0.13854260742664337, 0.10185274481773376, 0.14261028170585632, 0.03830169513821602, -0.167332261800766, -0.07423195242881775, -0.022431135177612305, -0.1316080540418625, 0.1338070034980774, -0.000504341209307313, 0.12137855589389801, -0.01638714410364628, 0.07038915902376175, 0.02604222297668457, -0.05874868854880333, 0.17618010938167572, -0.02971285954117775, 0.06644345819950104, -0.012037971056997776, 0.03307882696390152, 0.09827376902103424, -0.07343722134828568, 0.027837390080094337, -0.07149972766637802, 0.04521362483501434, -0.1402539312839508, -0.0454833023250103, -0.08735623955726624, 0.06102778762578964, -0.04950559139251709, -0.04093915969133377, -0.00402285810559988, 
0.044734735041856766, 0.03814121335744858, 0.001948187593370676, 0.16426397860050201, -0.008953629061579704, 0.16355682909488678, 0.08069203794002533, 0.08817990869283676, -0.0060503832064569, -0.04929398372769356, -0.030487941578030586, -0.020878219977021217, 0.0593833327293396, -0.11865448206663132, 0.03285910189151764, 0.12666289508342743, 0.04543084651231766, 0.14560261368751526, 0.059090111404657364, -0.074110247194767, 0.025905558839440346, 0.07558692991733551, -0.09276081621646881, -0.1091805025935173, -0.04913351684808731, 0.0708838477730751, -0.18439790606498718, 0.03602393716573715, 0.12325865775346756, -0.07763439416885376, -0.010506621561944485, -0.0116881113499403, -0.005998685024678707, -0.025128081440925598, 0.1972949057817459, 0.051515012979507446, 0.08075594902038574, -0.07621990144252777, 0.08191204816102982, 0.024457011371850967, -0.10592184215784073, 0.027258889749646187, 0.02997477352619171, -0.04795351251959801, -0.017051521688699722, -0.0003174956073053181, 0.10346046090126038, -0.06839585304260254, -0.05291342735290527, -0.15189139544963837, -0.10688212513923645, 0.06649981439113617, 0.10954352468252182, 0.045639779418706894, 0.04380170255899429, -0.014881174080073833, 0.043307773768901825, -0.12428368628025055, 0.10902638733386993, 0.09368042647838593, 0.10570523887872696, -0.1598726511001587, 0.14496563374996185, -0.005682756658643484, 0.010699191130697727, -0.0009108392405323684, -0.005180228501558304, -0.09315072000026703, 0.0010400754399597645, -0.12057602405548096, -0.011572103016078472, -0.05213293060660362, -0.011587431654334068, 0.011608853004872799, -0.06989127397537231, -0.09139741212129593, 0.020603569224476814, -0.11651327461004257, -0.06268849223852158, 0.02814069762825966, 0.06487801671028137, -0.09759511798620224, -0.016385124996304512, 0.06405677646398544, -0.11478208005428314, 0.06660913676023483, 0.06408341228961945, 0.03903971239924431, 0.04363925755023956, -0.0693257674574852, 0.04656928405165672, 0.024459831416606903, -0.009416138753294945, 0.01366568636149168, -0.13931334018707275, 0.005524197593331337, -0.009768834337592125, 0.04163065925240517, 0.0024526906199753284, 0.035471636801958084, -0.13713131844997406, -0.057690899819135666, -0.004617213737219572, -0.02887793444097042, -0.060367170721292496, 0.032225076109170914, 0.03856046125292778, 0.04620344564318657, 0.18251140415668488, -0.06832800060510635, 0.013360433280467987, -0.24035562574863434, 0.01587994024157524, -0.021974019706249237, -0.09850379824638367, -0.04284638911485672, -0.03205152228474617, 0.0665743350982666, -0.06817380338907242, 0.09304530918598175, -0.07483810931444168, 0.04555812478065491, 0.029206475242972374, -0.07002078741788864, 0.05444811284542084, 0.027906810864806175, 0.2937453091144562, 0.0535288043320179, -0.00913335382938385, 0.07751061767339706, 0.012876978144049644, 0.06542480736970901, 0.08323168754577637, 0.16189944744110107, 0.15797725319862366, -0.04588516429066658, 0.11029339581727982, 0.03812868148088455, -0.06663493812084198, -0.12478704750537872, 0.05604679137468338, -0.02804550901055336, 0.0989859327673912, 0.007565723266452551, 0.20454375445842743, 0.1427862048149109, -0.17496609687805176, 0.017776336520910263, -0.01639529876410961, -0.0729077160358429, -0.08957172185182571, -0.07796164602041245, -0.08446740359067917, -0.1504935324192047, 0.024117477238178253, -0.1327294558286667, 0.021410565823316574, 0.07268636673688889, 0.019317148253321648, 0.004755955655127764, 0.17787718772888184, 0.06932752579450607, -0.004195555578917265, 0.10117898881435394, 
-0.01345760840922594, -0.022671204060316086, -0.0540795736014843, -0.12743286788463593, 0.04454484209418297, -0.013391408137977123, 0.046943239867687225, -0.06772498786449432, -0.06907224655151367, 0.06789340078830719, 0.018873097375035286, -0.1331579089164734, 0.010391592979431152, 0.004423345904797316, 0.05744417384266853, 0.014613423496484756, 0.011559812352061272, 0.01708213798701763, -0.022257056087255478, 0.2450239211320877, -0.07350572198629379, -0.01439501903951168, -0.1428598016500473, 0.20357024669647217, 0.010639057494699955, -0.041457898914813995, 0.029853835701942444, -0.08086522668600082, 0.016795190051198006, 0.15505634248256683, 0.08194061368703842, 0.002250856952741742, -0.021258454769849777, 0.0007957206107676029, -0.0175788551568985, -0.054802436381578445, 0.10468584299087524, 0.0682067722082138, -0.0009656796464696527, -0.06975696980953217, -0.050018392503261566, -0.053819794207811356, -0.032378848642110825, -0.01676342822611332, 0.07861198484897614, 0.01981944963335991, -0.010598245076835155, -0.04137555882334709, 0.08927024155855179, -0.04637850821018219, -0.0785273015499115, 0.016421392560005188, -0.15162138640880585, -0.17985692620277405, -0.048373814672231674, 0.044613827019929886, -0.0001935917098307982, 0.05585839971899986, -0.019957957789301872, -0.010852972976863384, 0.09851735830307007, -0.00557751813903451, -0.039822760969400406, -0.1155891865491867, 0.07814148813486099, -0.05775967612862587, 0.23828057944774628, -0.04054082930088043, 0.011553609743714333, 0.12511838972568512, 0.02320616878569126, -0.10895443707704544, 0.047204453498125076, 0.06577235460281372, -0.10053378343582153, 0.056843169033527374, 0.20059886574745178, -0.03266487643122673, 0.12995819747447968, 0.03783425688743591, -0.1429465413093567, -0.006040728650987148, -0.06471134722232819, -0.033468857407569885, -0.07881179451942444, -0.018595481291413307, -0.05239235609769821, 0.1271543800830841, 0.21856941282749176, -0.07816492766141891, -0.03368552029132843, -0.0650767832994461, 0.055221669375896454, 0.08115807920694351, 0.10450413823127747, -0.03471476212143898, -0.30772122740745544, 0.00999431498348713, 0.0470186322927475, -0.007102907635271549, -0.2891097366809845, -0.08555103093385696, 0.017278527840971947, -0.07510074973106384, -0.02848178893327713, 0.10396669059991837, 0.0913410410284996, 0.04699656367301941, -0.0463990643620491, -0.06530129164457321, -0.06371541321277618, 0.18344537913799286, -0.1574190855026245, -0.07202012091875076 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # smolm-autoreg-bpe-counterfactual-babylm-only_other_det_removal-seed_1024-1e-3 This model was trained from scratch on the kanishka/counterfactual-babylm-only_other_det_removal dataset. It achieves the following results on the evaluation set: - Loss: 3.4160 - Accuracy: 0.4105 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.001 - train_batch_size: 32 - eval_batch_size: 64 - seed: 1024 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 32000 - num_epochs: 20.0 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:------:|:---------------:|:--------:| | 3.6007 | 1.0 | 18597 | 3.8014 | 0.3575 | | 3.3846 | 2.0 | 37194 | 3.5881 | 0.3796 | | 3.2609 | 3.0 | 55791 | 3.4855 | 0.3918 | | 3.1804 | 4.0 | 74388 | 3.4168 | 0.3979 | | 3.1278 | 5.0 | 92985 | 3.4013 | 0.4018 | | 3.081 | 6.0 | 111582 | 3.3683 | 0.4041 | | 3.0471 | 7.0 | 130179 | 3.3773 | 0.4055 | | 3.0189 | 8.0 | 148776 | 3.3797 | 0.4069 | | 2.988 | 9.0 | 167373 | 3.3716 | 0.4074 | | 2.9624 | 10.0 | 185970 | 3.3675 | 0.4088 | | 2.9372 | 11.0 | 204567 | 3.3803 | 0.4093 | | 2.9153 | 12.0 | 223164 | 3.3654 | 0.4096 | | 2.8939 | 13.0 | 241761 | 3.3777 | 0.4098 | | 2.8704 | 14.0 | 260358 | 3.3811 | 0.4103 | | 2.8503 | 15.0 | 278955 | 3.3847 | 0.4102 | | 2.8343 | 16.0 | 297552 | 3.3952 | 0.4100 | | 2.8131 | 17.0 | 316149 | 3.4062 | 0.4103 | | 2.7975 | 18.0 | 334746 | 3.4120 | 0.4102 | | 2.7753 | 19.0 | 353343 | 3.4110 | 0.4105 | | 2.7567 | 20.0 | 371940 | 3.4160 | 0.4105 | ### Framework versions - Transformers 4.37.2 - Pytorch 2.1.0+cu121 - Datasets 2.16.1 - Tokenizers 0.15.1
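For readers who want to reproduce the hyperparameters listed above with the Hugging Face `Trainer` (which this card says generated the training information), a minimal sketch follows. The `output_dir`, the per-epoch evaluation/logging strategies, and the surrounding model/tokenizer/dataset wiring are assumptions for illustration, not the authors' actual script.

```python
# A minimal sketch, assuming the Hugging Face Trainer API: it only mirrors the
# hyperparameters listed in this card. Model, tokenizer, and dataset setup are
# deliberately omitted; evaluation/logging strategies are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smolm-autoreg-bpe-counterfactual-babylm-only_other_det_removal-seed_1024-1e-3",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=1024,
    lr_scheduler_type="linear",
    warmup_steps=32_000,
    num_train_epochs=20.0,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="epoch",    # assumption: the results table reports one eval per epoch
    logging_strategy="epoch",       # assumption
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer,
# so no extra optimizer arguments are needed to match the listed settings.
```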
{"tags": ["generated_from_trainer"], "datasets": ["kanishka/counterfactual-babylm-only_other_det_removal"], "metrics": ["accuracy"], "model-index": [{"name": "smolm-autoreg-bpe-counterfactual-babylm-only_other_det_removal-seed_1024-1e-3", "results": [{"task": {"type": "text-generation", "name": "Causal Language Modeling"}, "dataset": {"name": "kanishka/counterfactual-babylm-only_other_det_removal", "type": "kanishka/counterfactual-babylm-only_other_det_removal"}, "metrics": [{"type": "accuracy", "value": 0.4105017384701812, "name": "Accuracy"}]}]}]}
text-generation
kanishka/smolm-autoreg-bpe-counterfactual-babylm-only_other_det_removal-seed_1024-1e-3
[ "transformers", "tensorboard", "safetensors", "opt", "text-generation", "generated_from_trainer", "dataset:kanishka/counterfactual-babylm-only_other_det_removal", "model-index", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:57:20+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #opt #text-generation #generated_from_trainer #dataset-kanishka/counterfactual-babylm-only_other_det_removal #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
smolm-autoreg-bpe-counterfactual-babylm-only\_other\_det\_removal-seed\_1024-1e-3 ================================================================================= This model was trained from scratch on the kanishka/counterfactual-babylm-only\_other\_det\_removal dataset. It achieves the following results on the evaluation set: * Loss: 3.4160 * Accuracy: 0.4105 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.001 * train\_batch\_size: 32 * eval\_batch\_size: 64 * seed: 1024 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 32000 * num\_epochs: 20.0 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.37.2 * Pytorch 2.1.0+cu121 * Datasets 2.16.1 * Tokenizers 0.15.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 1024\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 32000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #opt #text-generation #generated_from_trainer #dataset-kanishka/counterfactual-babylm-only_other_det_removal #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 1024\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 32000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ 86, 132, 4, 33 ]
[ "passage: TAGS\n#transformers #tensorboard #safetensors #opt #text-generation #generated_from_trainer #dataset-kanishka/counterfactual-babylm-only_other_det_removal #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 1024\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 32000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.37.2\n* Pytorch 2.1.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.15.1" ]
[ -0.14568059146404266, 0.10777363181114197, -0.001654144492931664, 0.0538385771214962, 0.12346795201301575, 0.013450076803565025, 0.14533382654190063, 0.1381264477968216, -0.062093187123537064, 0.08275678753852844, 0.13637693226337433, 0.08809170871973038, 0.05167549103498459, 0.14028701186180115, -0.05023467540740967, -0.27506038546562195, 0.02981751225888729, 0.04402748867869377, -0.10711584985256195, 0.12202731519937515, 0.10570227354764938, -0.10592283308506012, 0.06472203135490417, 0.047630541026592255, -0.1478109359741211, -0.005436806008219719, 0.002959693782031536, -0.06042778491973877, 0.10117758810520172, 0.03931259736418724, 0.11875077337026596, 0.028426796197891235, 0.06860212236642838, -0.202187642455101, 0.013539453037083149, 0.06203124299645424, 0.0332513302564621, 0.09099076688289642, 0.08057572692632675, -0.02503000944852829, 0.10743462294340134, -0.07131776958703995, 0.06768732517957687, 0.059729430824518204, -0.11841326951980591, -0.26831886172294617, -0.0850137323141098, 0.059906795620918274, 0.09860929101705551, 0.08457789570093155, -0.02116006426513195, 0.13269832730293274, -0.014564599841833115, 0.09088630229234695, 0.21606563031673431, -0.2253575325012207, -0.10140379518270493, -0.021490024402737617, 0.057806603610515594, 0.0444304496049881, -0.11487804353237152, -0.008917903527617455, 0.06415211409330368, 0.019323324784636497, 0.09002364426851273, 0.016342977061867714, 0.04662396013736725, -0.02088622935116291, -0.13478614389896393, -0.06770570576190948, 0.18498173356056213, 0.08647742122411728, -0.0631881058216095, -0.06842194497585297, -0.037425022572278976, -0.18655487895011902, -0.044032491743564606, 0.018734127283096313, 0.010783952660858631, -0.04319878667593002, -0.11326490342617035, -0.029948312789201736, -0.1101989597082138, -0.09752939641475677, 0.027232658118009567, 0.2319440245628357, 0.048260051757097244, -0.014489026740193367, -0.01493010763078928, 0.11285663396120071, 0.05861290171742439, -0.1497252732515335, -0.03381079435348511, 0.019386863335967064, -0.053417451679706573, -0.031629253178834915, -0.0604918897151947, -0.04668838158249855, 0.002107565989717841, 0.1666485071182251, -0.043777257204055786, 0.06914077699184418, 0.013434131629765034, 0.030698828399181366, -0.08568429201841354, 0.17312629520893097, -0.034739766269922256, 0.02180374227464199, -0.020627299323678017, 0.11195272207260132, -0.003928627818822861, -0.017949461936950684, -0.040732599794864655, 0.022259870544075966, 0.13522392511367798, 0.035999614745378494, -0.02494914084672928, 0.041091591119766235, -0.06596735119819641, -0.03126965090632439, 0.02949836477637291, -0.09175808727741241, 0.002851808676496148, 0.008456065319478512, -0.06852386146783829, -0.01670784316956997, 0.02600247599184513, 0.017544565722346306, 0.015147311612963676, 0.06567548215389252, -0.10234545916318893, -0.012839147821068764, -0.0957108810544014, -0.08747159689664841, 0.011781677603721619, -0.03441572189331055, 0.012460136786103249, -0.10380320250988007, -0.1547432839870453, -0.03886904940009117, 0.04786626249551773, -0.023226654157042503, -0.059950388967990875, -0.044731248170137405, -0.07942292094230652, 0.05459209904074669, -0.01286495290696621, 0.11649253964424133, -0.046084750443696976, 0.107466921210289, 0.06850273162126541, 0.034779515117406845, 0.010618738830089569, 0.035899125039577484, -0.07666251808404922, 0.06947684288024902, -0.0998569205403328, 0.06440743058919907, -0.08234164118766785, 0.06565629690885544, -0.11480021476745605, -0.10732290893793106, -0.034576013684272766, 
0.0037795465905219316, 0.08893691748380661, 0.1101188138127327, -0.1324603110551834, -0.08447150886058807, 0.16715103387832642, -0.10237768292427063, -0.16424523293972015, 0.10865399986505508, -0.022554315626621246, 0.06133481115102768, 0.046578530222177505, 0.14106735587120056, 0.09296943992376328, -0.07552652806043625, -0.035476360470056534, -0.057437408715486526, 0.1076783835887909, 0.0041085402481257915, 0.11278814822435379, -0.009096027351915836, -0.04468842223286629, 0.01214772928506136, -0.0718301385641098, 0.05692470073699951, -0.10734532028436661, -0.09565702825784683, -0.04155024141073227, -0.10321133583784103, 0.028059611096978188, 0.05435275286436081, 0.06589222699403763, -0.0960192158818245, -0.11640821397304535, 0.06501670181751251, 0.12691600620746613, -0.09239806979894638, 0.016911080107092857, -0.08432703465223312, 0.030501265078783035, -0.07722976058721542, -0.020766615867614746, -0.1624048501253128, -0.10107887536287308, 0.029606513679027557, -0.06162577122449875, -0.014439266175031662, -0.06401759386062622, 0.09240748733282089, 0.0638182982802391, -0.04935973882675171, -0.08250715583562851, -0.08975151926279068, -0.009987312369048595, -0.08774460107088089, -0.18114930391311646, -0.08975709229707718, -0.028650807216763496, 0.19425447285175323, -0.24572379887104034, 0.04375801607966423, -0.013617248274385929, 0.1355232298374176, 0.04085835441946983, -0.058977287262678146, -0.01111556962132454, 0.041815854609012604, -0.048343755304813385, -0.07982248067855835, 0.03494369611144066, 0.02208363637328148, -0.1322445571422577, 0.02801056206226349, -0.12657296657562256, 0.11099554598331451, 0.09512981027364731, -0.01752190664410591, -0.0802648589015007, -0.025843381881713867, -0.08751218020915985, -0.058777518570423126, -0.0186862014234066, -0.03640493005514145, 0.11016606539487839, 0.0315372496843338, 0.13056005537509918, -0.09640049189329147, -0.05527418479323387, 0.03292032703757286, -0.017908724024891853, -0.03108546882867813, 0.11763429641723633, 0.05118212848901749, -0.09809587150812149, 0.10710014402866364, 0.08095408976078033, -0.07361190021038055, 0.17297619581222534, -0.0622098445892334, -0.11442991346120834, -0.029767118394374847, 0.03425789251923561, 0.05141020938754082, 0.12341565638780594, -0.11526159942150116, 0.015611734241247177, 0.028229225426912308, 0.00650536734610796, 0.032961130142211914, -0.1992826610803604, -0.016838693991303444, 0.03908881917595863, -0.042147472500801086, -0.028176922351121902, 0.003040732815861702, -0.01364947110414505, 0.08564984798431396, -0.013156377710402012, 0.005781413055956364, 0.017862113192677498, -0.0032981233671307564, -0.09005775302648544, 0.22112973034381866, -0.06812290847301483, -0.152446910738945, -0.16752378642559052, 0.0032650968059897423, -0.04808132350444794, -0.0007839384488761425, 0.03204382583498955, -0.08415640145540237, -0.032200224697589874, -0.0809469223022461, 0.015295353718101978, -0.05554187297821045, 0.02732645533978939, 0.03683580830693245, 0.0022505915258079767, 0.10471362620592117, -0.11474411189556122, 0.012016400694847107, 0.002932000672444701, -0.04016891494393349, 0.039233963936567307, 0.009275078773498535, 0.10040666908025742, 0.11489369720220566, 0.017674574628472328, 0.01748187094926834, -0.018167436122894287, 0.18959642946720123, -0.0865672305226326, -0.0440966933965683, 0.12266767770051956, -0.0006299616652540863, 0.04982706904411316, 0.07262231409549713, 0.047156922519207, -0.09328918159008026, 0.04477832093834877, 0.06349319219589233, -0.024402430281043053, -0.23613625764846802, 
-0.010741320438683033, -0.0462968647480011, -0.020793605595827103, 0.13317567110061646, 0.04084504768252373, 0.025901000946760178, 0.08229222893714905, -0.04649422690272331, 0.022088773548603058, -0.03222781792283058, 0.1043325737118721, 0.06623926758766174, 0.05201614648103714, 0.12689831852912903, -0.02494075708091259, -0.043682049959897995, 0.02620788663625717, -0.013203723356127739, 0.23682577908039093, -0.0019516961183398962, 0.18418535590171814, 0.05575406178832054, 0.15131202340126038, 0.006835507228970528, 0.09007024019956589, 0.03377699479460716, -0.031349338591098785, 0.027517804875969887, -0.060894161462783813, -0.03865077346563339, 0.048302020877599716, 0.0024230352137237787, 0.06859838217496872, -0.131269633769989, 0.00851095374673605, 0.010222296230494976, 0.30003663897514343, 0.06001564860343933, -0.3600427508354187, -0.12508760392665863, 0.008189977146685123, -0.05531827732920647, -0.08487389236688614, -0.0005556477699428797, 0.09011045098304749, -0.10396550595760345, 0.05239620432257652, -0.1041208878159523, 0.10096549242734909, -0.04189302399754524, -0.011219419538974762, 0.057683270424604416, 0.07027950137853622, -0.021265584975481033, 0.06950693577528, -0.24780438840389252, 0.2771577537059784, -0.011151060461997986, 0.08805380016565323, -0.04098198190331459, 0.023203831166028976, 0.043234068900346756, -0.013466506265103817, 0.05501571670174599, -0.007396674249321222, -0.09473433345556259, -0.21482256054878235, -0.0919838547706604, 0.035705652087926865, 0.11462630331516266, -0.08776235580444336, 0.1475687026977539, -0.03389175981283188, 0.011536910198628902, 0.06578005850315094, -0.053660664707422256, -0.15797486901283264, -0.09443237632513046, 0.04353151097893715, 0.029730921611189842, 0.06952057033777237, -0.12279097735881805, -0.12702180445194244, -0.02430361695587635, 0.15268021821975708, -0.04391118139028549, -0.05106257274746895, -0.14836761355400085, 0.08130573481321335, 0.16518238186836243, -0.07707948982715607, 0.037734437733888626, -0.00035837534232996404, 0.17387454211711884, 0.0276284608989954, -0.01829334907233715, 0.0721231997013092, -0.08446943759918213, -0.22098304331302643, -0.03602875396609306, 0.14986960589885712, 0.03833436965942383, 0.038464345037937164, -0.010746384039521217, 0.01357244048267603, -0.029552582651376724, -0.0842873603105545, 0.06122220307588577, -0.0013105778489261866, 0.00906376913189888, 0.030655452981591225, -0.039775583893060684, 0.06287138164043427, -0.0693000853061676, -0.051775217056274414, 0.12778453528881073, 0.3329415023326874, -0.054273635149002075, -0.030788768082857132, 0.02364950068295002, -0.05470690876245499, -0.11840948462486267, 0.030543476343154907, 0.1426914930343628, 0.02621147781610489, 0.01926814205944538, -0.21155337989330292, 0.06884628534317017, 0.09568425267934799, -0.03376294672489166, 0.11631811410188675, -0.2551769018173218, -0.13868430256843567, 0.10154388099908829, 0.1436792016029358, 0.04067866504192352, -0.16663596034049988, -0.07419198006391525, -0.02270696498453617, -0.13282577693462372, 0.13067492842674255, -0.001646992634050548, 0.12165559828281403, -0.017043380066752434, 0.06850123405456543, 0.02651406265795231, -0.05926208198070526, 0.17673878371715546, -0.027331624180078506, 0.06565336883068085, -0.013041971251368523, 0.03643304482102394, 0.09810007363557816, -0.07404005527496338, 0.027453146874904633, -0.07081861793994904, 0.046985942870378494, -0.14366598427295685, -0.045261044055223465, -0.08805785328149796, 0.05852995440363884, -0.0500950925052166, -0.0398402214050293, 
-0.003957428503781557, 0.04312073066830635, 0.04071188345551491, 0.0011510071344673634, 0.1615355908870697, -0.008342995308339596, 0.16316254436969757, 0.08292342722415924, 0.08631571382284164, -0.0020868456922471523, -0.048505257815122604, -0.03140191361308098, -0.021176492795348167, 0.05843222886323929, -0.116500124335289, 0.032186832278966904, 0.12581977248191833, 0.044645968824625015, 0.14443396031856537, 0.05844654515385628, -0.07416858524084091, 0.025792855769395828, 0.07571154832839966, -0.09121449291706085, -0.1067904531955719, -0.04694733768701553, 0.06520835310220718, -0.1853918433189392, 0.035912878811359406, 0.12411240488290787, -0.07820108532905579, -0.009392649866640568, -0.012574163265526295, -0.005569173954427242, -0.02500765398144722, 0.19807763397693634, 0.05462665483355522, 0.08079599589109421, -0.07486496865749359, 0.08151809871196747, 0.025042260065674782, -0.10614557564258575, 0.02703060582280159, 0.027836672961711884, -0.04825277626514435, -0.016879387199878693, 0.0035654674284160137, 0.10124775022268295, -0.07009574770927429, -0.053688645362854004, -0.1538785845041275, -0.10484553128480911, 0.06580886989831924, 0.1071614995598793, 0.04532429575920105, 0.04447300732135773, -0.014463095925748348, 0.042118605226278305, -0.12521886825561523, 0.1089891865849495, 0.09379151463508606, 0.10566418617963791, -0.16062207520008087, 0.14835461974143982, -0.0061759925447404385, 0.012648987583816051, 0.00004937317135045305, -0.003420460969209671, -0.09172023087739944, 0.0018334704218432307, -0.12168929725885391, -0.011292564682662487, -0.05134230852127075, -0.011614222079515457, 0.011417113244533539, -0.06988752633333206, -0.09028217941522598, 0.019752344116568565, -0.11649777740240097, -0.06272923201322556, 0.028098544105887413, 0.06424572318792343, -0.09677276015281677, -0.016543027013540268, 0.06345745176076889, -0.11501943320035934, 0.06740608811378479, 0.06069301441311836, 0.03680405765771866, 0.04405459016561508, -0.068482406437397, 0.048541296273469925, 0.023943224921822548, -0.009544843807816505, 0.012182400561869144, -0.1345403492450714, 0.005876796320080757, -0.010267464444041252, 0.04143267869949341, 0.0025607491843402386, 0.034000832587480545, -0.13761568069458008, -0.057760968804359436, -0.004271598998457193, -0.027487564831972122, -0.061314795166254044, 0.031005844473838806, 0.034918561577796936, 0.04624399542808533, 0.18149058520793915, -0.06953427195549011, 0.015038312412798405, -0.24028830230236053, 0.015218089334666729, -0.02112387679517269, -0.09733004868030548, -0.03969499096274376, -0.03377469629049301, 0.06808499246835709, -0.0668664425611496, 0.09359853714704514, -0.0750366821885109, 0.04587528109550476, 0.028049863874912262, -0.07158203423023224, 0.05583733692765236, 0.02702411264181137, 0.29703158140182495, 0.05253862962126732, -0.01040821522474289, 0.07764947414398193, 0.011473159305751324, 0.06563001871109009, 0.08515314012765884, 0.15992562472820282, 0.15801048278808594, -0.043071188032627106, 0.11162769049406052, 0.03863269463181496, -0.06415286660194397, -0.1209225133061409, 0.052438102662563324, -0.028353789821267128, 0.09896580874919891, 0.006045577581971884, 0.20676656067371368, 0.14154520630836487, -0.17282800376415253, 0.016937237232923508, -0.014074500650167465, -0.07340161502361298, -0.0919017419219017, -0.07680431753396988, -0.08417511731386185, -0.150224506855011, 0.02471476048231125, -0.13073550164699554, 0.020965851843357086, 0.07409678399562836, 0.019582122564315796, 0.003992079291492701, 0.17606234550476074, 0.07154477387666702, 
-0.0015431492356583476, 0.10200542211532593, -0.012864947319030762, -0.0212390199303627, -0.05048154667019844, -0.12898530066013336, 0.046535734087228775, -0.012861314229667187, 0.048242971301078796, -0.0682547464966774, -0.07085804641246796, 0.06876223534345627, 0.020504094660282135, -0.13362672924995422, 0.010341066867113113, 0.004375956952571869, 0.0608254075050354, 0.015166870318353176, 0.013834756799042225, 0.01796882413327694, -0.023120639845728874, 0.24475879967212677, -0.07431414723396301, -0.01589420810341835, -0.1429746001958847, 0.19825270771980286, 0.012026755139231682, -0.0426441989839077, 0.0295709278434515, -0.08242131024599075, 0.017019344493746758, 0.1577194333076477, 0.07951898872852325, 0.0022473791614174843, -0.021898962557315826, 0.0005142598529346287, -0.017851408571004868, -0.05350204184651375, 0.10663028806447983, 0.06655260920524597, -0.0036713082809001207, -0.06813367456197739, -0.049088314175605774, -0.05361182242631912, -0.03283706679940224, -0.017176806926727295, 0.07909901440143585, 0.018476584926247597, -0.011158050037920475, -0.0420047827064991, 0.09104369580745697, -0.04555635526776314, -0.0802939161658287, 0.01868598721921444, -0.1479380875825882, -0.17856788635253906, -0.0466996431350708, 0.04583654925227165, 0.0024671396240592003, 0.05444708466529846, -0.020780513063073158, -0.009451990947127342, 0.10175083577632904, -0.007349093910306692, -0.03835099935531616, -0.11397512257099152, 0.07588013261556625, -0.05774259567260742, 0.23822888731956482, -0.039301060140132904, 0.014763526618480682, 0.12601646780967712, 0.023169491440057755, -0.109395831823349, 0.04782623052597046, 0.06452518701553345, -0.09883161634206772, 0.057257309556007385, 0.20117777585983276, -0.032002002000808716, 0.12416047602891922, 0.036904603242874146, -0.1396152228116989, -0.004878288134932518, -0.06319070607423782, -0.03414873778820038, -0.0809842124581337, -0.01974937506020069, -0.05087096616625786, 0.1280623823404312, 0.2164745330810547, -0.08052768558263779, -0.032327789813280106, -0.06357238441705704, 0.052749134600162506, 0.08069675415754318, 0.09946285933256149, -0.03415115550160408, -0.3075272738933563, 0.010243833996355534, 0.047102417796850204, -0.0055086989887058735, -0.28775301575660706, -0.08500432968139648, 0.01851627416908741, -0.07618071138858795, -0.03055783174932003, 0.10561308264732361, 0.08976505696773529, 0.046467434614896774, -0.04580552503466606, -0.06781630963087082, -0.0641111508011818, 0.18472956120967865, -0.16042183339595795, -0.07511773705482483 ]
null
null
transformers
# NeuralTrix-v3-fp16 NeuralTrix-v3-fp16 is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing): * [mlabonne/OmniBeagle-7B](https://huggingface.co/mlabonne/OmniBeagle-7B) * [flemmingmiguel/MBX-7B-v3](https://huggingface.co/flemmingmiguel/MBX-7B-v3) * [bardsai/jaskier-7b-dpo-v3.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v3.3) ## 🧩 Configuration ```yaml models: - model: mistralai/Mistral-7B-v0.1 # no parameters necessary for base model - model: mlabonne/OmniBeagle-7B parameters: density: 0.65 weight: 0.4 - model: flemmingmiguel/MBX-7B-v3 parameters: density: 0.6 weight: 0.35 - model: bardsai/jaskier-7b-dpo-v3.3 parameters: density: 0.6 weight: 0.35 merge_method: dare_ties base_model: mistralai/Mistral-7B-v0.1 parameters: int8_mask: true dtype: float16 ``` ## 💻 Usage ```python !pip install -qU transformers accelerate from transformers import AutoTokenizer import transformers import torch model = "CultriX/NeuralTrix-v3-fp16" messages = [{"role": "user", "content": "What is a large language model?"}] tokenizer = AutoTokenizer.from_pretrained(model) prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True) pipeline = transformers.pipeline( "text-generation", model=model, torch_dtype=torch.float16, device_map="auto", ) outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95) print(outputs[0]["generated_text"]) ```
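The card shows the merge configuration but not how to run it. Below is a small driver sketch that writes that YAML to disk and shells out to mergekit's `mergekit-yaml` entry point; the CLI name, the output directory, and the config filename are assumptions based on common mergekit usage rather than anything stated in this card, so check the mergekit documentation for the version you have installed.

```python
# A hedged sketch, not part of the original card: reproduce the merge shown above
# by writing the YAML config to disk and invoking the mergekit CLI.
# Assumptions: the `mergekit-yaml` entry point, the "config.yaml" filename, and
# the "./NeuralTrix-v3-fp16" output directory.
import pathlib
import subprocess

CONFIG = """\
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: mlabonne/OmniBeagle-7B
    parameters:
      density: 0.65
      weight: 0.4
  - model: flemmingmiguel/MBX-7B-v3
    parameters:
      density: 0.6
      weight: 0.35
  - model: bardsai/jaskier-7b-dpo-v3.3
    parameters:
      density: 0.6
      weight: 0.35
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: float16
"""

pathlib.Path("config.yaml").write_text(CONFIG)
subprocess.run(["mergekit-yaml", "config.yaml", "./NeuralTrix-v3-fp16"], check=True)
```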
{"tags": ["merge", "mergekit", "lazymergekit", "mlabonne/OmniBeagle-7B", "flemmingmiguel/MBX-7B-v3", "bardsai/jaskier-7b-dpo-v3.3"], "base_model": ["mlabonne/OmniBeagle-7B", "flemmingmiguel/MBX-7B-v3", "bardsai/jaskier-7b-dpo-v3.3"]}
text-generation
CultriX/NeuralTrix-v3-fp16
[ "transformers", "safetensors", "mistral", "text-generation", "merge", "mergekit", "lazymergekit", "mlabonne/OmniBeagle-7B", "flemmingmiguel/MBX-7B-v3", "bardsai/jaskier-7b-dpo-v3.3", "base_model:mlabonne/OmniBeagle-7B", "base_model:flemmingmiguel/MBX-7B-v3", "base_model:bardsai/jaskier-7b-dpo-v3.3", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2024-02-13T21:58:12+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/OmniBeagle-7B #flemmingmiguel/MBX-7B-v3 #bardsai/jaskier-7b-dpo-v3.3 #base_model-mlabonne/OmniBeagle-7B #base_model-flemmingmiguel/MBX-7B-v3 #base_model-bardsai/jaskier-7b-dpo-v3.3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# NeuralTrix-v3-fp16 NeuralTrix-v3-fp16 is a merge of the following models using LazyMergekit: * mlabonne/OmniBeagle-7B * flemmingmiguel/MBX-7B-v3 * bardsai/jaskier-7b-dpo-v3.3 ## Configuration ## Usage
[ "# NeuralTrix-v3-fp16\n\nNeuralTrix-v3-fp16 is a merge of the following models using LazyMergekit:\n* mlabonne/OmniBeagle-7B\n* flemmingmiguel/MBX-7B-v3\n* bardsai/jaskier-7b-dpo-v3.3", "## Configuration", "## Usage" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/OmniBeagle-7B #flemmingmiguel/MBX-7B-v3 #bardsai/jaskier-7b-dpo-v3.3 #base_model-mlabonne/OmniBeagle-7B #base_model-flemmingmiguel/MBX-7B-v3 #base_model-bardsai/jaskier-7b-dpo-v3.3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# NeuralTrix-v3-fp16\n\nNeuralTrix-v3-fp16 is a merge of the following models using LazyMergekit:\n* mlabonne/OmniBeagle-7B\n* flemmingmiguel/MBX-7B-v3\n* bardsai/jaskier-7b-dpo-v3.3", "## Configuration", "## Usage" ]
[ 152, 76, 4, 3 ]
[ "passage: TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #mlabonne/OmniBeagle-7B #flemmingmiguel/MBX-7B-v3 #bardsai/jaskier-7b-dpo-v3.3 #base_model-mlabonne/OmniBeagle-7B #base_model-flemmingmiguel/MBX-7B-v3 #base_model-bardsai/jaskier-7b-dpo-v3.3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# NeuralTrix-v3-fp16\n\nNeuralTrix-v3-fp16 is a merge of the following models using LazyMergekit:\n* mlabonne/OmniBeagle-7B\n* flemmingmiguel/MBX-7B-v3\n* bardsai/jaskier-7b-dpo-v3.3## Configuration## Usage" ]
[ -0.05213717743754387, 0.06623387336730957, -0.00777696305885911, 0.026291394606232643, 0.058034636080265045, 0.02851688489317894, 0.146467387676239, 0.1167910248041153, -0.004930292721837759, 0.05034543573856354, 0.07896433025598526, 0.1749364286661148, 0.06141754239797592, 0.14396540820598602, -0.04149903729557991, -0.2007533609867096, 0.06780695170164108, 0.02038366161286831, -0.0742264986038208, 0.08550137281417847, 0.1109648272395134, -0.03518925607204437, 0.08982761949300766, 0.021304672583937645, -0.0846468061208725, 0.018001195043325424, -0.008609901182353497, -0.024886788800358772, 0.07663101702928543, 0.09399260580539703, 0.04150735214352608, 0.05809932202100754, -0.026329096406698227, -0.16715465486049652, 0.02993500977754593, 0.018911931663751602, -0.028510192409157753, 0.07158421725034714, 0.12389364838600159, -0.05333666875958443, 0.08839349448680878, -0.05535547807812691, 0.02284124121069908, 0.07735863327980042, -0.10009220242500305, -0.12575244903564453, -0.09120561927556992, 0.0841505378484726, 0.06349294632673264, 0.030364295467734337, -0.02034507691860199, 0.13330571353435516, 0.0024769618175923824, 0.1094052717089653, 0.20818446576595306, -0.29532527923583984, -0.03407833352684975, 0.14377090334892273, 0.06359394639730453, -0.057340800762176514, -0.01772741600871086, 0.03510022163391113, 0.002188062062487006, -0.00029287044890224934, 0.05809810757637024, -0.0799129530787468, 0.14005188643932343, -0.06684184819459915, -0.11836396902799606, 0.009067145176231861, 0.12059874832630157, 0.033361807465553284, -0.025250455364584923, -0.10050313919782639, -0.05978541448712349, 0.05558698624372482, -0.05514559894800186, -0.024058153852820396, 0.021144378930330276, -0.030455030500888824, 0.06725244224071503, -0.05071215704083443, -0.025421055033802986, -0.021837633103132248, -0.04560470208525658, 0.12947700917720795, 0.011516647413372993, 0.0011245445348322392, -0.011334069073200226, 0.04762842878699303, -0.10277187824249268, -0.11809761077165604, -0.016036638990044594, -0.05102349445223808, -0.03832254931330681, -0.028446858748793602, -0.03585948795080185, -0.1003429964184761, 0.10873875021934509, 0.2043026089668274, -0.02001841552555561, 0.0715901255607605, -0.0034397938288748264, 0.025419054552912712, -0.0011514307698234916, -0.02360328659415245, -0.0959308072924614, -0.13531236350536346, 0.055874258279800415, 0.08515884727239609, 0.03231596574187279, -0.0014836547197774053, -0.07126404345035553, -0.03386489301919937, 0.029525624588131905, 0.023370033130049706, 0.07530537992715836, 0.0911330133676529, -0.06927034258842468, -0.07339714467525482, 0.12969696521759033, -0.0931280329823494, 0.01773376204073429, 0.012660030275583267, -0.04051036015152931, 0.023836947977542877, 0.04633598029613495, 0.03303247317671776, -0.018616512417793274, 0.05552423745393753, -0.05425772815942764, -0.02931961417198181, -0.03533417731523514, -0.09529617428779602, 0.02872559055685997, -0.01721087470650673, -0.01962899975478649, -0.10717175155878067, -0.17199835181236267, -0.025834470987319946, 0.032976020127534866, -0.05815284326672554, -0.03471049293875694, -0.04389413446187973, -0.020566411316394806, -0.008420058526098728, 0.0053579919040203094, 0.06130843982100487, -0.0054866960272192955, 0.014467825181782246, 0.042932771146297455, 0.05611356347799301, -0.07259122282266617, 0.006602955982089043, -0.035535942763090134, 0.09129799157381058, -0.20710735023021698, 0.08715970069169998, -0.07391289621591568, 0.036132194101810455, -0.13216949999332428, -0.02138022519648075, 0.0046523637138307095, 
0.006605539005249739, 0.034115515649318695, 0.13432955741882324, -0.07895651459693909, -0.0798049047589302, 0.1364143043756485, -0.10447517037391663, -0.11264850944280624, 0.08403033018112183, 0.01031423918902874, 0.0026299043092876673, 0.054395146667957306, 0.19235411286354065, 0.1308000236749649, -0.046385202556848526, -0.02896673046052456, -0.016412358731031418, 0.01681208796799183, 0.019327646121382713, 0.09478407353162766, -0.01062196958810091, -0.08478063344955444, 0.039399124681949615, -0.026182854548096657, 0.06814642995595932, -0.013935002498328686, -0.041754256933927536, -0.020329991355538368, -0.061442453414201736, 0.06670607626438141, -0.04559991508722305, 0.010336680337786674, -0.07069043070077896, -0.03465497866272926, 0.037181928753852844, 0.1052037924528122, -0.03242814168334007, 0.0008440684177912772, -0.09480806440114975, 0.09075804054737091, -0.04658954590559006, 0.04873622953891754, -0.11070740222930908, -0.06644969433546066, 0.014260860159993172, -0.07503943145275116, 0.010004072450101376, -0.03286353871226311, 0.08557916432619095, 0.051605112850666046, -0.05489691346883774, -0.049547407776117325, 0.08955501019954681, 0.0099312299862504, -0.0039249905385077, -0.17693550884723663, -0.06789547204971313, -0.06711667776107788, 0.205022931098938, -0.10471487045288086, 0.06337965279817581, -0.0028671342879533768, 0.1972646415233612, 0.00803370401263237, -0.01343239564448595, 0.05716914311051369, -0.0009537390433251858, -0.014232761226594448, -0.014871221967041492, 0.0846240296959877, -0.0056348745711147785, -0.10320308059453964, 0.08992911130189896, -0.11556520313024521, 0.15327471494674683, 0.10575664788484573, 0.04588227719068527, -0.03100532293319702, -0.06662455946207047, -0.01061303447932005, -0.036944955587387085, 0.07811484485864639, -0.06272725760936737, 0.043687138706445694, 0.03333372622728348, 0.11719494313001633, -0.08189723640680313, -0.04927430674433708, 0.020806485787034035, -0.038680367171764374, -0.050320882350206375, 0.07635419070720673, -0.04467132315039635, -0.19898296892642975, 0.10488788783550262, 0.11035333573818207, 0.022025834769010544, 0.1158464252948761, 0.01497263740748167, 0.012450028210878372, -0.0878148078918457, 0.03474413603544235, 0.05188623443245888, 0.0015724824042990804, -0.08812401443719864, 0.04080682247877121, 0.06403397768735886, -0.005312266293913126, 0.046834059059619904, -0.04037848114967346, 0.013684363104403019, 0.01855107583105564, -0.019000163301825523, 0.11847005039453506, 0.07691828161478043, 0.0061105890199542046, 0.07632996141910553, 0.009336613118648529, -0.04646574333310127, 0.007844757288694382, -0.009901946410536766, -0.08982476592063904, 0.16095753014087677, -0.142525777220726, -0.2544017732143402, -0.13138484954833984, -0.12724334001541138, -0.1300799548625946, -0.01374422013759613, 0.020593922585248947, -0.0299354437738657, -0.04284125939011574, -0.10142877697944641, 0.06545080989599228, -0.00852538738399744, -0.012898988090455532, 0.003761815372854471, 0.020571893081068993, 0.05648575723171234, -0.1103026419878006, -0.021538572385907173, 0.03891924023628235, -0.030512673780322075, 0.04921864718198776, -0.05078136920928955, 0.0652625560760498, 0.06690965592861176, 0.03645123913884163, 0.012418076395988464, -0.017776984721422195, 0.23528611660003662, -0.031729236245155334, 0.06774503737688065, 0.16301320493221283, -0.024973193183541298, 0.0369928739964962, 0.1485113948583603, 0.012670893222093582, -0.053137969225645065, -0.011478712782263756, 0.010509799234569073, -0.010268204845488071, -0.17933103442192078, 
-0.08553578704595566, -0.05740594118833542, 0.07706079632043839, 0.06793173402547836, 0.05467575788497925, 0.08014538884162903, 0.06767681241035461, -0.06971409916877747, 0.0006363761494867504, 0.08534383773803711, 0.08302349597215652, 0.18287871778011322, -0.011062399484217167, 0.1011301726102829, -0.019687846302986145, -0.030612636357545853, 0.043930500745773315, 0.05663833022117615, 0.02178836800158024, 0.04503786936402321, 0.13151755928993225, 0.05116179585456848, 0.07714880257844925, 0.03158291429281235, 0.08112823218107224, -0.01988459937274456, -0.016222229227423668, -0.027111118659377098, -0.0888892337679863, -0.004643856547772884, 0.0023600333370268345, 0.018644537776708603, 0.07231469452381134, -0.028395194560289383, -0.014017683453857899, 0.07189780473709106, 0.09864617884159088, 0.08130554109811783, -0.2603948712348938, -0.03168616443872452, 0.034563932567834854, 0.012233210727572441, -0.048694368451833725, -0.019697878509759903, 0.008856001310050488, -0.09589952975511551, 0.1594468355178833, -0.04400787875056267, 0.07040766626596451, -0.022640671581029892, 0.008828181773424149, -0.013275552541017532, 0.11148715764284134, -0.011752101592719555, 0.050219349563121796, -0.2515852749347687, 0.10909236967563629, 0.05043704807758331, 0.0029635634273290634, 0.007317398674786091, 0.0538911372423172, 0.0374157540500164, 0.10764691978693008, 0.09414758533239365, 0.0008940189727582037, 0.07161977887153625, -0.0618424192070961, -0.10865087807178497, -0.01783623918890953, 0.04410877078771591, -0.06505794823169708, 0.08945393562316895, -0.026096543297171593, -0.0688151866197586, -0.0182573813945055, 0.07900387793779373, -0.16917604207992554, -0.1382807046175003, 0.08559980988502502, 0.056573692709207535, 0.042863257229328156, -0.09789405018091202, -0.03763896971940994, -0.08398589491844177, 0.20391499996185303, -0.0484284982085228, -0.06892531365156174, -0.11045242846012115, 0.02556668594479561, 0.12649789452552795, -0.052984051406383514, 0.04621017351746559, -0.05644284188747406, 0.0942128449678421, -0.09454268962144852, -0.12087487429380417, 0.04697946086525917, -0.10547531396150589, -0.10224751383066177, -0.036802299320697784, 0.16888602077960968, -0.04379814863204956, 0.03901160880923271, 0.026275543496012688, 0.015684522688388824, 0.013790590688586235, -0.05092410743236542, 0.018223104998469353, 0.0635404959321022, 0.0051457262597978115, 0.08246190845966339, -0.026129109784960747, -0.11888391524553299, -0.060927726328372955, 0.008722146973013878, 0.14841148257255554, 0.2831839919090271, -0.006539258640259504, 0.06501942873001099, 0.10125081986188889, -0.051626890897750854, -0.15786416828632355, -0.059950072318315506, 0.05884680896997452, -0.0076687573455274105, 0.020921669900417328, -0.10749205201864243, 0.04848369210958481, 0.10067995637655258, 0.0011528655886650085, 0.058486368507146835, -0.28754353523254395, -0.11154503375291824, 0.06494557112455368, 0.08838161826133728, 0.11082667857408524, -0.1383742392063141, -0.0846535935997963, -0.06897789239883423, -0.17393572628498077, 0.09335139393806458, -0.03360741212964058, 0.07382600754499435, -0.027505110949277878, -0.023727068677544594, 0.030724510550498962, -0.037482284009456635, 0.1447255164384842, -0.026703370735049248, 0.02604404278099537, -0.07170010358095169, -0.04129759594798088, 0.08221615105867386, -0.04651612415909767, 0.051240481436252594, -0.1236150860786438, 0.009828625246882439, -0.07868313044309616, -0.033475302159786224, -0.07732650637626648, 0.08048541843891144, -0.052372418344020844, -0.044757917523384094, 
-0.025853361934423447, 0.06397686898708344, 0.035214632749557495, 0.028856629505753517, 0.0836983248591423, -0.04852443188428879, 0.08803597837686539, 0.25145047903060913, 0.09899391233921051, -0.04392393305897713, -0.06389517337083817, -0.003919860348105431, -0.03636845946311951, 0.027670998126268387, -0.016556182876229286, -0.0018203473882749677, 0.07829174399375916, -0.015907850116491318, 0.09798908978700638, 0.026165667921304703, -0.09364451467990875, -0.04379931092262268, 0.0899234339594841, -0.14301340281963348, -0.12249570339918137, -0.02996746636927128, 0.03736383095383644, -0.04076119884848595, 0.031796980649232864, 0.20997804403305054, -0.01829272136092186, -0.02853783220052719, 0.040872253477573395, -0.0028271886985749006, -0.07200653105974197, 0.15495219826698303, 0.012506445869803429, 0.0703837051987648, -0.10759046673774719, 0.031554315239191055, 0.027629680931568146, -0.059157099574804306, -0.023161068558692932, 0.1178898960351944, -0.0936337411403656, -0.0734967440366745, -0.047530896961688995, 0.17299355566501617, -0.00653908122330904, -0.011240766383707523, -0.09031573683023453, -0.12057515233755112, 0.041615668684244156, 0.1316007375717163, 0.04696614667773247, 0.010039378888905048, 0.04153592884540558, -0.04494884982705116, -0.05222518369555473, 0.08499804884195328, 0.0219377800822258, 0.08077087253332138, -0.11678411066532135, 0.042366448789834976, -0.03646047040820122, 0.013146501034498215, -0.021295996382832527, 0.006601222325116396, -0.15571144223213196, -0.05012767016887665, -0.10088963806629181, -0.021174784749746323, -0.13907159864902496, -0.01847614161670208, -0.01749250665307045, 0.010089742951095104, -0.004664476495236158, -0.0023972627241164446, -0.045906104147434235, -0.05610997602343559, -0.01608162187039852, 0.06369087845087051, -0.07944697141647339, -0.03186094015836716, 0.005500277038663626, -0.08517134189605713, 0.05939942225813866, 0.03322076052427292, 0.0031293006613850594, -0.01908220909535885, -0.08778081834316254, -0.07004594057798386, 0.042266860604286194, -0.0009659461211413145, 0.02614620327949524, -0.14710432291030884, -0.0015872783260419965, -0.0073427134193480015, -0.05377596244215965, -0.0016319439746439457, 0.1216093897819519, -0.09035094082355499, 0.01778334565460682, -0.046576935797929764, -0.03232857957482338, -0.04485825449228287, -0.007706534117460251, 0.06051280349493027, 0.017301736399531364, 0.12188102304935455, -0.06629420071840286, 0.056025054305791855, -0.18606014549732208, -0.010096444748342037, -0.025824761018157005, -0.13124258816242218, -0.009430088102817535, -0.008237247355282307, 0.03198348730802536, -0.009898028336465359, 0.08225210756063461, -0.018619149923324585, -0.1078457459807396, 0.021457906812429428, -0.0652860552072525, 0.01940411701798439, 0.040673281997442245, 0.1456710547208786, 0.05898960307240486, -0.015576083213090897, -0.003035137429833412, 0.06586345285177231, 0.04455428570508957, 0.04524541646242142, 0.028667405247688293, 0.11721377819776535, 0.01072584930807352, 0.07285789400339127, 0.0997687578201294, -0.03131932020187378, -0.0024444295559078455, 0.04912671446800232, -0.003918141592293978, 0.06502652168273926, -0.012018497101962566, 0.11491362750530243, 0.150068998336792, -0.14509008824825287, 0.05988273024559021, 0.029533563181757927, -0.028000302612781525, -0.08276274800300598, -0.15882709622383118, -0.10425274819135666, -0.1001519039273262, -0.024157408624887466, -0.12435177713632584, -0.020449500530958176, -0.013826164416968822, 0.028673483058810234, 0.01702558435499668, 0.11330825835466385, 
-0.014369776472449303, -0.03704267740249634, 0.054243095219135284, -0.007195149082690477, -0.05414624884724617, -0.0020096139051020145, -0.021591700613498688, -0.0027625849470496178, 0.0008617936400696635, 0.004334623459726572, 0.010753576643764973, 0.0051648905500769615, 0.054436229169368744, -0.0335257425904274, -0.1266639530658722, 0.013918912038207054, 0.015184210613369942, 0.03407520428299904, 0.08762829750776291, 0.017159070819616318, -0.05603522062301636, -0.01967620849609375, 0.09630592167377472, -0.00424550985917449, -0.08407235890626907, -0.07140957564115524, 0.12352818250656128, -0.01981601119041443, 0.049486804753541946, 0.00959210004657507, -0.027752142399549484, -0.003971570171415806, 0.17816272377967834, 0.2663080394268036, -0.033158402889966965, 0.010942046530544758, 0.025775805115699768, 0.007731806021183729, 0.06193717569112778, 0.059115245938301086, 0.0537027008831501, 0.13323526084423065, -0.040633123368024826, 0.06734846532344818, -0.015131428837776184, -0.05945846438407898, -0.055733680725097656, 0.019475534558296204, 0.04814697429537773, 0.0024856470990926027, 0.04394860938191414, 0.092182457447052, -0.058974023908376694, -0.03756776079535484, 0.013155622407793999, -0.12369842827320099, -0.11116901785135269, -0.09464442729949951, 0.02567771077156067, 0.01122554112225771, 0.09196174889802933, -0.023687876760959625, -0.038508158177137375, 0.0742141380906105, -0.03433670476078987, -0.08894310146570206, -0.09871866554021835, 0.0346992053091526, -0.06909579038619995, 0.09618084877729416, -0.024428697302937508, 0.03927824646234512, 0.11968570947647095, -0.02813294529914856, -0.1260935366153717, 0.031158171594142914, 0.02403743378818035, -0.036857351660728455, 0.043301090598106384, 0.09542708098888397, -0.0020173757802695036, 0.11701749265193939, 0.021624868735671043, -0.1564760059118271, 0.033794011920690536, 0.058893367648124695, -0.04207659512758255, -0.0577811673283577, 0.0955289676785469, -0.06577108055353165, 0.10199129581451416, 0.16947659850120544, -0.04882703721523285, -0.017414892092347145, -0.0373520702123642, 0.036997124552726746, 0.08594486117362976, 0.07700994610786438, -0.047934845089912415, -0.19873875379562378, -0.007645979057997465, -0.03971622884273529, 0.002448405371978879, -0.2424560785293579, -0.07970584183931351, -0.1120544895529747, -0.03884035721421242, -0.0812382698059082, 0.0782976746559143, 0.09088709950447083, 0.02281326800584793, -0.017847733572125435, -0.11391130834817886, -0.025448786094784737, 0.10858932137489319, -0.10402266681194305, -0.0836792066693306 ]
null
null
stable-baselines3
# **PPO** Agent playing **LunarLander-v2** This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) TODO: Add your code ```python from stable_baselines3 import ... from huggingface_sb3 import load_from_hub ... ```
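Because the usage section above is still a TODO stub, here is a hedged sketch of what loading and evaluating this checkpoint might look like with `huggingface_sb3` and `stable-baselines3`. The repo id is taken from this record, but the checkpoint filename inside the repository is an assumption, and the snippet assumes a Box2D-enabled Gymnasium install.

```python
# A hedged sketch, not the author's code: the repo id comes from this record,
# but the checkpoint filename is an assumption -- inspect the repository's file
# list before running. Requires gymnasium[box2d] for LunarLander-v2.
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

checkpoint = load_from_hub(
    repo_id="huxin/ppo-LunarLander-v2-3e6",   # id of this model record
    filename="ppo-LunarLander-v2.zip",        # assumed filename
)
model = PPO.load(checkpoint)

eval_env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```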
{"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "261.89 +/- 63.96", "name": "mean_reward", "verified": false}]}]}]}
reinforcement-learning
huxin/ppo-LunarLander-v2-3e6
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
2024-02-13T22:00:17+00:00
[]
[]
TAGS #stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# PPO Agent playing LunarLander-v2 This is a trained model of a PPO agent playing LunarLander-v2 using the stable-baselines3 library. ## Usage (with Stable-baselines3) TODO: Add your code
[ "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 39, 41, 17 ]
[ "passage: TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 0.03942384943366051, 0.04900386184453964, -0.005304091144353151, 0.026427261531352997, 0.107408307492733, -0.026511888951063156, 0.11188238859176636, 0.0814051404595375, 0.10722193866968155, 0.04762078449130058, 0.08338645845651627, 0.06030960753560066, 0.05080918222665787, 0.2571701407432556, 0.04754156619310379, -0.22987541556358337, 0.036159250885248184, -0.04869936779141426, 0.12395193427801132, 0.07178173214197159, -0.0038484656251966953, -0.06485428661108017, 0.020415637642145157, -0.013290755450725555, 0.05367108806967735, 0.04282612353563309, -0.01716216839849949, -0.08207534998655319, 0.07169748842716217, -0.06345846503973007, 0.06986866891384125, 0.07677983492612839, 0.13218913972377777, -0.17832116782665253, 0.029566360637545586, 0.02571309357881546, -0.07189024239778519, 0.01342033501714468, 0.008019951172173023, 0.05120139941573143, 0.17303818464279175, 0.019879888743162155, 0.07844575494527817, -0.0025605305563658476, -0.15412317216396332, -0.018950799480080605, 0.0436202734708786, 0.12546207010746002, 0.08808347582817078, 0.04605821147561073, 0.01970590092241764, 0.17503218352794647, -0.054352790117263794, -0.028833400458097458, 0.21759237349033356, -0.2881564497947693, -0.031460098922252655, 0.321048766374588, 0.06997483223676682, 0.09725230932235718, -0.07540661096572876, -0.03619609400629997, 0.007783263456076384, -0.013137873262166977, -0.028666524216532707, -0.07447073608636856, 0.17313385009765625, 0.05152064561843872, -0.05057951435446739, -0.09541505575180054, 0.16948209702968597, 0.006921638268977404, 0.0018855923553928733, -0.019282981753349304, 0.009060598909854889, 0.07402525842189789, -0.016097044572234154, -0.07255112379789352, 0.057438433170318604, 0.05330665782094002, 0.019649166613817215, -0.1435653269290924, -0.10762494057416916, -0.022740179672837257, -0.008012006990611553, 0.17786912620067596, -0.009255532175302505, 0.042902372777462006, 0.003065188182517886, 0.10384012013673782, -0.12480384111404419, -0.03354184702038765, -0.0454259067773819, -0.07565800100564957, -0.0223417766392231, -0.02058211714029312, -0.03580251708626747, 0.07184842973947525, 0.11971849203109741, 0.027368178591132164, 0.09350208193063736, 0.047715865075588226, -0.03206788748502731, 0.06343851238489151, 0.05555703118443489, 0.14222665131092072, 0.05807621404528618, 0.012854371219873428, 0.13179877400398254, 0.055213116109371185, 0.033023182302713394, -0.0613492950797081, -0.18252409994602203, 0.07489913702011108, -0.07031869143247604, 0.007941240444779396, 0.12051256000995636, -0.04480670019984245, -0.1183447614312172, -0.037500523030757904, -0.017392054200172424, -0.06224250793457031, -0.025395862758159637, 0.0547584593296051, -0.02883218228816986, -0.03973718360066414, 0.0011496668448671699, 0.09384800493717194, 0.00953749567270279, -0.1752052903175354, 0.03303423151373863, -0.025042934343218803, -0.10782608389854431, 0.009975161403417587, 0.0022444494534283876, 0.03394931182265282, 0.04408763721585274, -0.11822668462991714, -0.30899152159690857, -0.07652641832828522, 0.05490870401263237, -0.06516939401626587, -0.18425025045871735, -0.13193942606449127, 0.02454492449760437, -0.09037084132432938, -0.044885024428367615, -0.12759265303611755, -0.028549788519740105, 0.01743689924478531, 0.011519349180161953, 0.10758619755506516, -0.0106219332665205, -0.012188062071800232, -0.1571401208639145, 0.008273907005786896, -0.20951123535633087, 0.0890483483672142, -0.019150104373693466, 0.037884220480918884, -0.032381169497966766, -0.07404014468193054, 0.030707746744155884, 
0.052499737590551376, -0.01474119070917368, 0.13510210812091827, -0.15592676401138306, -0.03691192343831062, -0.007996266707777977, -0.13611900806427002, -0.04786273464560509, -0.10358831286430359, -0.04357128217816353, 0.13354332745075226, 0.018664736300706863, 0.15356586873531342, -0.08709818124771118, -0.0722038671374321, 0.20489206910133362, -0.010411538183689117, -0.12820468842983246, -0.076752208173275, 0.10165707021951675, 0.021510310471057892, -0.056606587022542953, -0.02523270808160305, -0.1839766949415207, -0.0152357779443264, -0.04550420492887497, -0.047039128839969635, 0.01796751655638218, -0.010888241231441498, 0.13837894797325134, 0.08494598418474197, 0.05018039792776108, -0.06086122244596481, -0.006730288732796907, 0.10779471695423126, 0.08823856711387634, 0.008680110797286034, 0.023406028747558594, -0.05774238705635071, 0.09552932530641556, -0.04003755748271942, -0.0142367510125041, -0.08283266425132751, -0.036246106028556824, -0.026256313547492027, 0.17507147789001465, 0.09440762549638748, 0.2257927656173706, 0.09567736834287643, 0.039160262793302536, 0.031270865350961685, -0.13181598484516144, -0.1425403207540512, -0.0017254541162401438, 0.09020978957414627, -0.14270411431789398, -0.04119925573468208, -0.08974775671958923, -0.17768175899982452, -0.12202505767345428, 0.0006432619411498308, -0.17960017919540405, 0.06390921026468277, 0.05408334732055664, -0.035177867859601974, 0.03272094577550888, 0.13032332062721252, -0.011533179320394993, -0.03967514634132385, 0.0831870287656784, 0.0379033200442791, -0.041234664618968964, -0.021742934361100197, 0.11885567009449005, 0.15673065185546875, 0.13124459981918335, -0.03511447086930275, 0.004914294462651014, 0.07076404243707657, -0.02309088408946991, 0.06539414077997208, 0.0558244064450264, 0.20973342657089233, 0.188301220536232, 0.038996949791908264, 0.008822928182780743, -0.07048165798187256, 0.0855446457862854, -0.0742373839020729, -0.14302679896354675, -0.05579735338687897, 0.08729292452335358, 0.016605578362941742, 0.023469142615795135, 0.08711627870798111, 0.024545932188630104, 0.09132762253284454, 0.15968108177185059, 0.01990218088030815, -0.09659269452095032, -0.050218869000673294, 0.01175848301500082, 0.027713103219866753, 0.04794301092624664, -0.04514073207974434, -0.00937939714640379, 0.017020760104060173, -0.10303554683923721, 0.031789086759090424, -0.1413339376449585, -0.1358717679977417, 0.044326696544885635, 0.003906996920704842, 0.010907664895057678, 0.02786896750330925, -0.0038291432429105043, 0.019039705395698547, 0.04351753741502762, -0.06975466758012772, 0.047416772693395615, -0.024745507165789604, -0.020031947642564774, 0.03340689837932587, -0.057257164269685745, -0.205775648355484, -0.17696654796600342, 0.00013708483311347663, -0.09910997003316879, 0.10194740444421768, 0.018308809027075768, -0.12373185902833939, 0.047737859189510345, -0.05822649225592613, 0.027574289590120316, -0.01875593699514866, -0.049130141735076904, 0.10507171601057053, 0.1525275856256485, -0.016146350651979446, 0.018018173053860664, -0.04865182936191559, -0.10157987475395203, -0.19632206857204437, 0.0691583976149559, 0.04680244252085686, 0.014610917307436466, 0.10669491440057755, 0.018072687089443207, 0.02367905154824257, -0.007674071006476879, -0.016521066427230835, -0.011659215204417706, -0.08781040459871292, 0.31909599900245667, 0.04510033503174782, -0.025173069909214973, 0.02041010931134224, -0.0043001663871109486, -0.028083480894565582, 0.03263787180185318, -0.0985708013176918, -0.07548979669809341, -0.08774089068174362, 
-0.04367410019040108, -0.09784720093011856, 0.053299110382795334, 0.05916472524404526, 0.003188040340319276, -0.07727594673633575, 0.04221395403146744, 0.11369874328374863, -0.0923808291554451, -0.07137343287467957, 0.07477962225675583, 0.0972946360707283, -0.07331304252147675, 0.00012658814375754446, 0.00874367356300354, 0.023951783776283264, 0.037102166563272476, 0.06778035312891006, -0.03966575115919113, 0.08589404821395874, -0.19917890429496765, 0.0372927263379097, 0.106058269739151, 0.023754918947815895, 0.0638108178973198, 0.07643651217222214, -0.1058402881026268, -0.008500572293996811, -0.032518330961465836, -0.21341575682163239, 0.1668180525302887, 0.1355515867471695, 0.06788124144077301, -0.025637222453951836, -0.00461410591378808, -0.0649740919470787, 0.05773647129535675, 0.02723747305572033, -0.14758841693401337, 0.004883295856416225, 0.06064270809292793, 0.026899009943008423, 0.01614922471344471, 0.07971042394638062, 0.014697225764393806, -0.1801026314496994, -0.014406266622245312, 0.10730406641960144, 0.002390873385593295, 0.0053148469887673855, -0.03175045922398567, -0.1755964607000351, 0.0751047357916832, 0.004285442177206278, 0.07233936339616776, -0.1676585078239441, 0.14297930896282196, -0.10089799761772156, 0.07726949453353882, -0.004285062663257122, -0.021311495453119278, 0.02507244050502777, -0.0541163794696331, 0.15163759887218475, 0.01058570109307766, -0.021810131147503853, -0.1200498715043068, -0.1717042326927185, -0.019227758049964905, -0.11788936704397202, -0.11679866164922714, 0.050424277782440186, 0.062185097485780716, 0.04923136904835701, -0.061147067695856094, 0.1518532931804657, -0.047422297298908234, 0.060713399201631546, -0.06893875449895859, -0.06755045056343079, 0.03764858841896057, -0.12588608264923096, -0.08176055550575256, 0.05573027580976486, 0.19166934490203857, 0.15833087265491486, -0.02816431224346161, -0.03472423925995827, -0.047419581562280655, -0.006212298292666674, -0.007802055217325687, 0.0275666993111372, 0.023223137483000755, 0.07315318286418915, -0.07681374251842499, -0.11649256944656372, 0.033787861466407776, -0.06713802367448807, -0.055589709430933, -0.015439179725944996, 0.1513158082962036, 0.04671623185276985, 0.07720734924077988, -0.018946662545204163, 0.03887668624520302, -0.001724981120787561, -0.056474871933460236, 0.16197094321250916, 0.03885216265916824, -0.05193585529923439, 0.06837689876556396, 0.053174007683992386, 0.043745119124650955, 0.03011113777756691, -0.026783017441630363, 0.206032395362854, 0.1980147808790207, 0.014206883497536182, 0.2175983190536499, 0.03177616000175476, -0.03772832080721855, -0.1300560086965561, -0.065880686044693, -0.006372632458806038, 0.03559038043022156, 0.08070417493581772, -0.18207235634326935, -0.015011128038167953, -0.05689644813537598, -0.034518610686063766, -0.15059494972229004, -0.28553900122642517, -0.05957856774330139, 0.20075850188732147, 0.14706264436244965, 0.27519428730010986, -0.10432573407888412, 0.035197313874959946, 0.02663275972008705, -0.04912831634283066, -0.006501141935586929, 0.00018665487004909664, 0.10268618166446686, -0.15421873331069946, 0.1176437959074974, 0.08486983180046082, -0.019002694636583328, 0.01058861706405878, -0.1619086116552353, 0.00936629343777895, -0.12191236019134521, 0.05354422330856323, 0.1400289237499237, -0.048128653317689896, -0.054873593151569366, 0.14033560454845428, -0.024562934413552284, -0.22685599327087402, -0.04648222774267197, -0.043600670993328094, -0.010640020482242107, 0.026607351377606392, -0.1013401448726654, 0.04101909324526787, 
0.1330099105834961, 0.009380043484270573, 0.1147187277674675, 0.11749245226383209, -0.052566803991794586, 0.10792597383260727, 0.2257719188928604, -0.018785694614052773, 0.04689010605216026, -0.12743118405342102, -0.0012336712097749114, -0.028270328417420387, 0.013657891191542149, -0.09504974633455276, -0.09938385337591171, 0.02366873063147068, 0.02872389927506447, 0.009118586778640747, 0.0921793207526207, -0.029922157526016235, 0.0759170651435852, 0.06817561388015747, -0.13014446198940277, -0.16288450360298157, 0.015828335657715797, -0.007344507612287998, 0.08354310691356659, 0.00027861111448146403, 0.08878035843372345, -0.11932205408811569, -0.018093237653374672, -0.03153328225016594, -0.03319635987281799, -0.130486860871315, -0.07138993591070175, 0.06156524643301964, 0.028095467016100883, -0.06602972000837326, 0.1398407518863678, 0.026440169662237167, 0.15942534804344177, 0.049197953194379807, 0.012499804608523846, 0.07227300107479095, -0.05345509201288223, 0.1283530443906784, 0.13818155229091644, -0.00868943240493536, -0.05460423603653908, -0.1013643890619278, -0.10236792266368866, 0.08925779908895493, -0.05773641914129257, 0.07476430386304855, -0.14885357022285461, -0.06675903499126434, 0.015772046521306038, 0.016141414642333984, -0.09562095999717712, 0.02571965754032135, -0.01625603251159191, -0.18119946122169495, 0.056570518761873245, -0.048285093158483505, 0.0440407395362854, -0.06347788125276566, -0.1110161691904068, -0.17226378619670868, 0.06091433763504028, 0.08593481779098511, -0.053876690566539764, -0.12229149043560028, 0.011023230850696564, -0.00012518465518951416, -0.06341652572154999, -0.05023367330431938, 0.09722746908664703, -0.11020902544260025, 0.031452205032110214, -0.012567701749503613, 0.08853451162576675, -0.03510405123233795, -0.011538895778357983, 0.044220831245183945, -0.08039166033267975, -0.009481523185968399, 0.03534642979502678, -0.026372017338871956, -0.04127239063382149, -0.2689029574394226, 0.0036654395516961813, 0.0341104120016098, 0.02497158572077751, 0.07856601476669312, 0.011906822212040424, 0.021174922585487366, 0.03993808850646019, -0.15396519005298615, -0.013395369984209538, 0.14574195444583893, -0.07689505815505981, -0.022186370566487312, 0.05703273415565491, -0.09054436534643173, 0.013882770203053951, -0.030287226662039757, 0.1345842480659485, 0.023923413828015327, 0.06404478847980499, -0.0851147472858429, 0.10106813907623291, -0.1451139897108078, -0.04998219385743141, -0.01244612317532301, 0.09761348366737366, 0.07019034773111343, -0.10272270441055298, 0.014697125181555748, 0.04210108891129494, 0.19416837394237518, 0.016384804621338844, -0.0356343574821949, -0.03396720811724663, 0.004015897400677204, 0.22076453268527985, 0.03044266067445278, 0.10457023978233337, 0.07281364500522614, -0.026583973318338394, 0.12624378502368927, 0.09929762035608292, 0.11280370503664017, -0.055645186454057693, 0.13904185593128204, 0.04667386785149574, 0.038641396909952164, 0.0614289753139019, 0.06836545467376709, 0.09098632633686066, -0.0008288522367365658, 0.1138714924454689, 0.013811973854899406, -0.02422109805047512, -0.021335409954190254, 0.17759373784065247, 0.10501719266176224, -0.14769648015499115, 0.029047364369034767, -0.01258957851678133, 0.039933037012815475, -0.014194529503583908, -0.15634691715240479, -0.07240267097949982, -0.3315149247646332, 0.1226184144616127, -0.07119352370500565, 0.019930170848965645, 0.007913772016763687, -0.037425633519887924, -0.03296699747443199, -0.04477746784687042, 0.13151589035987854, -0.013641550205647945, 
-0.006079165264964104, -0.04815853759646416, -0.015360191464424133, -0.11607866734266281, -0.11200575530529022, -0.013207737356424332, -0.13671602308750153, -0.010119039565324783, 0.05595948174595833, 0.003977729007601738, 0.01821410097181797, -0.03142618387937546, 0.0024383175186812878, 0.06541839241981506, -0.05751744285225868, 0.056182678788900375, 0.12097269296646118, 0.08766137808561325, -0.1058853268623352, 0.031048951670527458, 0.2011747509241104, 0.04359564557671547, -0.12483977526426315, 0.01449228823184967, 0.1819491684436798, 0.004885740112513304, 0.017068125307559967, -0.006097703706473112, -0.0540788508951664, -0.07554277032613754, 0.1251034289598465, 0.08296554535627365, -0.09985227137804031, 0.015833314508199692, -0.0726347416639328, -0.01594804972410202, -0.06374675035476685, 0.10130585730075836, 0.09538925439119339, 0.04440245032310486, -0.10621760785579681, -0.08487539738416672, -0.10891728103160858, 0.040588874369859695, -0.08629853278398514, -0.07311757653951645, 0.09629398584365845, -0.07057105004787445, -0.07029950618743896, 0.025521177798509598, -0.17978744208812714, -0.009467960335314274, 0.1711762249469757, -0.24654000997543335, -0.0916430801153183, -0.10857923328876495, 0.14477859437465668, 0.016497576609253883, 0.1013975441455841, -0.006207061931490898, -0.007889035157859325, -0.20577777922153473, 0.024890204891562462, -0.05293011665344238, -0.02073732763528824, 0.07814782857894897, -0.09476397186517715, 0.22629831731319427, -0.08276885002851486, 0.020940175279974937, 0.012659613974392414, 0.0870661810040474, -0.030675338581204414, 0.09283176809549332, -0.03660329803824425, -0.12576518952846527, -0.03620953485369682, 0.03001813031733036, 0.013904244638979435, 0.10071761906147003, 0.09772487729787827, -0.03414725139737129, 0.03389119729399681, 0.09747414290904999, 0.04172342270612717, -0.023843804374337196, 0.0360250361263752, -0.17077107727527618, 0.02182629331946373, -0.018498148769140244, -0.06935930997133255, 0.03687669709324837, -0.06603235751390457, 0.1639697551727295, 0.04022442549467087, 0.0670473501086235, -0.036152735352516174, 0.0073931049555540085, -0.014454689808189869, -0.013775371946394444, -0.026180334389209747, -0.17259705066680908, -0.10422050207853317, -0.1347656100988388, -0.012701659463346004, -0.034971047192811966, 0.04591470584273338, 0.023234914988279343, -0.0003200018545612693, -0.014577031135559082, -0.12090865522623062, 0.04360328987240791, 0.11146783083677292, -0.04631396010518074, -0.026193076744675636 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-azahead-v1.1 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the azaheadhealth dataset. It achieves the following results on the evaluation set: - Loss: 0.4785 - Accuracy: 0.7917 - F1: 0.6154 - Precision: 0.6667 - Recall: 0.5714 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:| | 0.6318 | 1.0 | 10 | 0.5247 | 0.6667 | 0.0 | 0.0 | 0.0 | | 0.5623 | 2.0 | 20 | 0.4065 | 0.7917 | 0.5455 | 0.75 | 0.4286 | | 0.4688 | 3.0 | 30 | 0.3514 | 0.7917 | 0.5455 | 0.75 | 0.4286 | | 0.4252 | 4.0 | 40 | 0.3224 | 0.8333 | 0.6667 | 0.8 | 0.5714 | | 0.2409 | 5.0 | 50 | 0.4115 | 0.75 | 0.4 | 0.6667 | 0.2857 | | 0.2196 | 6.0 | 60 | 0.3672 | 0.7917 | 0.6667 | 0.625 | 0.7143 | | 0.1417 | 7.0 | 70 | 0.4441 | 0.7917 | 0.5455 | 0.75 | 0.4286 | | 0.0842 | 8.0 | 80 | 0.4422 | 0.7917 | 0.6154 | 0.6667 | 0.5714 | | 0.065 | 9.0 | 90 | 0.4556 | 0.7917 | 0.6154 | 0.6667 | 0.5714 | | 0.0657 | 10.0 | 100 | 0.4785 | 0.7917 | 0.6154 | 0.6667 | 0.5714 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.2.0+cu121 - Datasets 2.16.1 - Tokenizers 0.13.2
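The card above reports evaluation numbers but no usage snippet. As a hedged sketch (not taken from the card), the checkpoint can presumably be loaded for inference with the standard `transformers` pipeline; the example sentence is made up and the label names are whatever the checkpoint's config defines, since the card does not document them.

```python
# Hedged sketch: load zwellington/bert-azahead-v1.1 for inference.
# The input sentence is illustrative; label names come from the checkpoint's
# config and are not documented in the card.
from transformers import pipeline

classifier = pipeline("text-classification", model="zwellington/bert-azahead-v1.1")
print(classifier("Patient reports sticking to the recommended exercise plan."))
# e.g. [{'label': '...', 'score': 0.9}]  (actual label set depends on the checkpoint)
```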
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["azaheadhealth"], "metrics": ["accuracy", "f1", "precision", "recall"], "base_model": "bert-base-uncased", "model-index": [{"name": "bert-azahead-v1.1", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "azaheadhealth", "type": "azaheadhealth", "config": "small", "split": "test", "args": "small"}, "metrics": [{"type": "accuracy", "value": 0.7916666666666666, "name": "Accuracy"}, {"type": "f1", "value": 0.6153846153846154, "name": "F1"}, {"type": "precision", "value": 0.6666666666666666, "name": "Precision"}, {"type": "recall", "value": 0.5714285714285714, "name": "Recall"}]}]}]}
text-classification
zwellington/bert-azahead-v1.1
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "dataset:azaheadhealth", "base_model:bert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T22:01:38+00:00
[]
[]
TAGS #transformers #pytorch #bert #text-classification #generated_from_trainer #dataset-azaheadhealth #base_model-bert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
bert-azahead-v1.1 ================= This model is a fine-tuned version of bert-base-uncased on the azaheadhealth dataset. It achieves the following results on the evaluation set: * Loss: 0.4785 * Accuracy: 0.7917 * F1: 0.6154 * Precision: 0.6667 * Recall: 0.5714 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 2 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 8 * total\_train\_batch\_size: 16 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 10 ### Training results ### Framework versions * Transformers 4.31.0 * Pytorch 2.2.0+cu121 * Datasets 2.16.1 * Tokenizers 0.13.2
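Those hyperparameters map fairly directly onto `transformers.TrainingArguments`; a minimal, hedged template is sketched below. Dataset loading, tokenization, and the azaheadhealth column names are omitted because the card does not document them, and the Adam betas/epsilon listed above are the library defaults.

```python
# Hedged template only: mirrors the hyperparameters listed above, not a
# verified reproduction of the original training run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-azahead-v1.1",
    learning_rate=2e-5,
    per_device_train_batch_size=2,   # train_batch_size: 2
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    gradient_accumulation_steps=8,   # 2 * 8 = total_train_batch_size 16
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults.
)
```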
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.31.0\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.13.2" ]
[ "TAGS\n#transformers #pytorch #bert #text-classification #generated_from_trainer #dataset-azaheadhealth #base_model-bert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.31.0\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.13.2" ]
[ 74, 126, 4, 33 ]
[ "passage: TAGS\n#transformers #pytorch #bert #text-classification #generated_from_trainer #dataset-azaheadhealth #base_model-bert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10### Training results### Framework versions\n\n\n* Transformers 4.31.0\n* Pytorch 2.2.0+cu121\n* Datasets 2.16.1\n* Tokenizers 0.13.2" ]
[ -0.14127081632614136, 0.1864759624004364, -0.0021274934988468885, 0.10424915701150894, 0.16139240562915802, 0.0411984920501709, 0.09665961563587189, 0.16047438979148865, -0.06995446234941483, 0.07859545946121216, 0.1170661449432373, 0.11532752960920334, 0.048261608928442, 0.1709911972284317, -0.03300221264362335, -0.2644162178039551, 0.005149866454303265, 0.015973426401615143, -0.06476719677448273, 0.14438249170780182, 0.08685208857059479, -0.12232634425163269, 0.07114017754793167, -0.006103219464421272, -0.151322603225708, -0.043146681040525436, -0.01886911131441593, -0.04045267775654793, 0.12756195664405823, -0.005591659341007471, 0.09897663444280624, 0.041397470980882645, 0.10399807244539261, -0.1972762644290924, 0.002702245255932212, 0.046034689992666245, 0.014317108318209648, 0.11512135714292526, 0.06115106865763664, -0.027378160506486893, 0.08831348270177841, -0.10459308326244354, 0.05159498378634453, 0.021838895976543427, -0.10899312049150467, -0.21185755729675293, -0.11950904130935669, 0.09088119119405746, 0.10292777419090271, 0.08313575387001038, -0.011096211150288582, 0.08093763887882233, -0.10638406127691269, 0.06778804212808609, 0.24467378854751587, -0.29610079526901245, -0.054039131850004196, 0.03894338756799698, 0.010569361038506031, 0.05511569604277611, -0.10942810773849487, -0.049115315079689026, 0.029175879433751106, 0.04128964617848396, 0.1253642737865448, -0.001187211717478931, -0.027478337287902832, 0.011576994322240353, -0.15125608444213867, -0.05799096077680588, 0.12069139629602432, 0.05133119970560074, -0.02598169632256031, -0.07082369178533554, -0.057566240429878235, -0.2149919867515564, -0.02103768289089203, 0.010003091767430305, 0.0380285419523716, -0.04840398207306862, -0.07877475023269653, 0.004690429661422968, -0.07428869605064392, -0.06928467750549316, -0.018266549333930016, 0.05852936580777168, 0.046550072729587555, 0.0029882837552577257, 0.028031853958964348, 0.1402897983789444, 0.028728270903229713, -0.1426321268081665, 0.016581634059548378, 0.010873832739889622, -0.034182414412498474, -0.02684025466442108, -0.032280124723911285, 0.025546548888087273, 0.018748655915260315, 0.13143500685691833, -0.056466326117515564, 0.016166256740689278, 0.03911618888378143, 0.03681951388716698, -0.07623564451932907, 0.157741978764534, -0.09282277524471283, -0.07517924159765244, -0.015181156806647778, 0.1292739063501358, 0.016605226323008537, -0.011730381287634373, -0.11783138662576675, 0.006963612046092749, 0.12169007211923599, 0.04637373611330986, -0.03323576599359512, 0.03546226769685745, -0.07188700139522552, -0.052415598183870316, 0.0701947957277298, -0.09543529152870178, 0.024555452167987823, 0.013931580819189548, -0.08915293216705322, -0.03565755859017372, 0.014496754854917526, 0.014446967281401157, -0.03270026668906212, 0.12385275214910507, -0.09183446317911148, 0.011804219335317612, -0.08124564588069916, -0.0991930142045021, 0.026934713125228882, -0.12851475179195404, -0.012565318495035172, -0.0652870237827301, -0.1553211659193039, -0.0443529412150383, 0.04472268745303154, -0.05281347036361694, -0.09623245894908905, -0.08027415722608566, -0.06537126749753952, 0.023914672434329987, -0.025282086804509163, 0.11605982482433319, -0.06143581494688988, 0.12920324504375458, -0.009269634261727333, 0.06923095136880875, 0.0005015188362449408, 0.06873252987861633, -0.08971290290355682, 0.03733775019645691, -0.15379005670547485, 0.06790109723806381, -0.04615835100412369, 0.0384652279317379, -0.12274801731109619, -0.1161331832408905, 0.02438691258430481, -0.04286254569888115, 
0.09511931240558624, 0.12228018790483475, -0.15864330530166626, -0.07560718804597855, 0.18320980668067932, -0.06765521317720413, -0.128435879945755, 0.11255306750535965, -0.039516303688287735, 0.016338437795639038, 0.03643537312746048, 0.17483724653720856, 0.07144530862569809, -0.0599316842854023, -0.02721746265888214, -0.03464500978589058, 0.08572261780500412, -0.058981217443943024, 0.11344155669212341, -0.023745518177747726, 0.04591171070933342, 0.015777956694364548, -0.07529664784669876, 0.0553227961063385, -0.10885971784591675, -0.09356837719678879, -0.016123291105031967, -0.09886518120765686, 0.08518724143505096, 0.05733353644609451, 0.05697333812713623, -0.06697414070367813, -0.09370288997888565, 0.03950369730591774, 0.1170216053724289, -0.07246814668178558, 0.023786133155226707, -0.05330865830183029, 0.10182330757379532, -0.061599548906087875, -0.02524830400943756, -0.18748271465301514, -0.0349770225584507, 0.022848550230264664, 0.002758128335699439, -0.00993169192224741, -0.017361046746373177, 0.0417480431497097, 0.0738673210144043, -0.058925777673721313, -0.04812018200755119, -0.05413039028644562, -0.017480624839663506, -0.11442628502845764, -0.22577176988124847, -0.07461325079202652, -0.01681804470717907, 0.1882954239845276, -0.20670275390148163, 0.032163411378860474, -0.019752616062760353, 0.10774653404951096, 0.01718405820429325, -0.027778951451182365, -0.005517193116247654, 0.08016590029001236, -0.03266865015029907, -0.06278157234191895, 0.08637140691280365, -0.004601651802659035, -0.09308689087629318, -0.01796088181436062, -0.07292357087135315, 0.11872553825378418, 0.1113000363111496, -0.0036994542460888624, -0.07172217965126038, -0.0620441772043705, -0.08095709979534149, -0.03884583339095116, -0.046778615564107895, 0.027385378256440163, 0.12990021705627441, 0.03193431347608566, 0.14844189584255219, -0.07856467366218567, -0.04073641821742058, 0.026665978133678436, -0.00322238402441144, 0.02206859551370144, 0.1295170933008194, 0.1323527842760086, -0.0791301354765892, 0.14664654433727264, 0.12752455472946167, -0.033318039029836655, 0.12026464194059372, -0.04110056534409523, -0.09488081187009811, -0.0325460322201252, -0.005023829638957977, 0.000028738915716530755, 0.11758357286453247, -0.12879306077957153, -0.017689872533082962, 0.04299033060669899, 0.030267102643847466, 0.02068479359149933, -0.20113427937030792, -0.028088146820664406, 0.018417825922369957, -0.061687566339969635, -0.03704506531357765, -0.014491057954728603, 0.013942467980086803, 0.11672292649745941, 0.030312659218907356, -0.06691212207078934, 0.02960408292710781, -0.0020557164680212736, -0.09025237709283829, 0.1850239485502243, -0.08625318109989166, -0.16994495689868927, -0.11629921197891235, -0.048050325363874435, -0.03683965653181076, -0.012148686684668064, 0.03917708992958069, -0.0708274394273758, -0.0015245777321979403, -0.06367745995521545, 0.003522875951603055, 0.003907621372491121, 0.01987592689692974, -0.00558721087872982, 0.01807420141994953, 0.05119389295578003, -0.09222930669784546, 0.004427181091159582, -0.04823664203286171, -0.053029630333185196, 0.04593199864029884, 0.0163131020963192, 0.11184298992156982, 0.17597880959510803, 0.0050448025576770306, 0.021537791937589645, -0.047673482447862625, 0.1839546263217926, -0.0732189193367958, -0.019143855199217796, 0.11918159574270248, 0.0036738452035933733, 0.073428213596344, 0.15878435969352722, 0.05013475567102432, -0.07198843359947205, 0.0035586562007665634, 0.026660839095711708, -0.019185636192560196, -0.235756054520607, -0.05628002807497978, 
-0.05609964206814766, -0.006391466595232487, 0.1203928291797638, 0.0236598439514637, -0.0006392924697138369, 0.05573131889104843, 0.00644369563087821, 0.02952377125620842, -0.04678509011864662, 0.0640186294913292, 0.07165749371051788, 0.03690120205283165, 0.13306930661201477, -0.02267155982553959, -0.019150758162140846, 0.04843023419380188, 0.007902189157903194, 0.24744261801242828, -0.05376720800995827, 0.145536407828331, 0.06594677269458771, 0.21467973291873932, 0.007792002521455288, 0.0694933608174324, -0.007150101009756327, 0.0012126195942983031, -0.011551759205758572, -0.04061839357018471, -0.06805576384067535, 0.014639237895607948, 0.004530936945229769, 0.057096872478723526, -0.15097790956497192, 0.012759430333971977, 0.05358272045850754, 0.3124788999557495, 0.10052923113107681, -0.316965788602829, -0.10108186304569244, -0.005783815402537584, -0.022219233214855194, -0.032989807426929474, 0.0022967467084527016, 0.11597004532814026, -0.11678086221218109, 0.02158324234187603, -0.07074619084596634, 0.09345478564500809, -0.06986122578382492, 0.016062794253230095, 0.08375514298677444, 0.05791877955198288, 0.005887477658689022, 0.08642281591892242, -0.2498178631067276, 0.279199481010437, 0.0008258603629656136, 0.035278983414173126, -0.06887393444776535, 0.0015795101644471288, 0.03244584798812866, 0.050278741866350174, 0.09281091392040253, 0.01105443574488163, -0.011134047992527485, -0.1914634257555008, -0.12381315976381302, 0.01627488061785698, 0.088870570063591, -0.09809532016515732, 0.10698918253183365, -0.023631324991583824, -0.010934937745332718, 0.023217691108584404, -0.017427831888198853, -0.029151760041713715, -0.10526515543460846, 0.02569182962179184, -0.02307957224547863, 0.015126022510230541, -0.07527557760477066, -0.12209361046552658, -0.06813462823629379, 0.17455744743347168, -0.08058120310306549, -0.05946014076471329, -0.12338925153017044, 0.11811818182468414, 0.12655401229858398, -0.09712811559438705, 0.0378694050014019, 0.0038995807990431786, 0.08173547685146332, 0.014315362088382244, -0.046552207320928574, 0.08862484991550446, -0.053133729845285416, -0.224472314119339, -0.07051357626914978, 0.15655635297298431, 0.03542015329003334, 0.07985285669565201, -0.026277678087353706, 0.020668797194957733, -0.015312189236283302, -0.0803060308098793, 0.03612704947590828, 0.04278501495718956, 0.10574064403772354, 0.04223858565092087, -0.07362866401672363, 0.009950379841029644, -0.06556343287229538, -0.0398503802716732, 0.18948224186897278, 0.2419034242630005, -0.11058363318443298, 0.07420258224010468, 0.06324613094329834, -0.05679332837462425, -0.18612177670001984, 0.006712668109685183, 0.11100301146507263, 0.03329078108072281, 0.012590218335390091, -0.20081324875354767, 0.07359792292118073, 0.11369103193283081, -0.02585228905081749, 0.1030125766992569, -0.3415584862232208, -0.10900771617889404, 0.06373754143714905, 0.10018293559551239, 0.04961664229631424, -0.14633291959762573, -0.0498814657330513, 0.014755301177501678, -0.12875200808048248, 0.11685744673013687, -0.04002353176474571, 0.12659603357315063, -0.038292042911052704, 0.06694597750902176, 0.014854933135211468, -0.05861499533057213, 0.1247631385922432, 0.03245483338832855, 0.04952489957213402, -0.03643379732966423, -0.06357422471046448, 0.06179909408092499, -0.06983409076929092, 0.02463187277317047, -0.055432118475437164, 0.04479823634028435, -0.13757532835006714, -0.017355510964989662, -0.10136662423610687, 0.00912002008408308, -0.04788844287395477, -0.06183500215411186, -0.03362394496798515, 0.074741430580616, 
0.07468817383050919, 0.0010655024088919163, 0.11381200700998306, 0.0011966784950345755, 0.140844464302063, 0.09420072287321091, 0.05746756121516228, -0.05459190905094147, -0.06880869716405869, -0.03389598801732063, -0.024578409269452095, 0.038798797875642776, -0.15556876361370087, 0.022187627851963043, 0.16114452481269836, 0.041971251368522644, 0.14620760083198547, 0.06587249040603638, -0.0467383898794651, 0.010964994318783283, 0.06899778544902802, -0.13358120620250702, -0.06317835301160812, -0.0049306596629321575, -0.032583363354206085, -0.17732658982276917, 0.030667372047901154, 0.09865017980337143, -0.049882449209690094, -0.03411152958869934, -0.007748755626380444, 0.03236904367804527, -0.03187074139714241, 0.19375410676002502, 0.0621078684926033, 0.08324558287858963, -0.10850345343351364, 0.06966184079647064, 0.08743541687726974, -0.09937988221645355, 0.00629840325564146, 0.07550092786550522, -0.1109996885061264, -0.03346339613199234, 0.024204310029745102, 0.11793318390846252, -0.03097749873995781, -0.03941933065652847, -0.12023061513900757, -0.09957966208457947, 0.08915084600448608, 0.10646586865186691, 0.0861656442284584, 0.03833673894405365, -0.02888593077659607, -0.01644478738307953, -0.12253622710704803, 0.10349784791469574, 0.048694539815187454, 0.060052696615457535, -0.12291423976421356, 0.14592991769313812, 0.007289527449756861, 0.06879901140928268, -0.011344565078616142, 0.0246249008923769, -0.10229372978210449, -0.008676169440150261, -0.10086910426616669, -0.017865225672721863, -0.05460657551884651, -0.011767772026360035, -0.016110649332404137, -0.04897799715399742, -0.038538381457328796, 0.00977371260523796, -0.10650322586297989, -0.05104677751660347, -0.004283013753592968, 0.05547734722495079, -0.1226024255156517, -0.028147554025053978, 0.018405532464385033, -0.08434253931045532, 0.11313821375370026, 0.03414430096745491, 0.04222068563103676, 0.01723821833729744, -0.06521940231323242, -0.00910167209804058, 0.03339637443423271, 0.006653174292296171, 0.06788811832666397, -0.13621468842029572, 0.001089948695152998, -0.025188032537698746, 0.007170765195041895, 0.030866410583257675, 0.07430111616849899, -0.14288954436779022, 0.006612309720367193, -0.030341416597366333, -0.04076759144663811, -0.05830672010779381, 0.05320395156741142, 0.06644909083843231, 0.03280789777636528, 0.17092567682266235, -0.0603521466255188, 0.062109120190143585, -0.23004478216171265, -0.01680106855928898, -0.03457419574260712, -0.060912929475307465, -0.10194990783929825, -0.04097886011004448, 0.07108846306800842, -0.049799732863903046, 0.10787665098905563, -0.0035699172876775265, 0.0583021305501461, 0.013646440580487251, -0.02074703574180603, 0.012750934809446335, 0.02058008499443531, 0.1489838808774948, 0.03874386101961136, -0.039208896458148956, 0.07821463793516159, 0.04018693044781685, 0.059270069003105164, 0.11211079359054565, 0.20900467038154602, 0.11719157546758652, 0.07841084152460098, 0.057651229202747345, 0.03807773068547249, -0.09577836841344833, -0.1826714426279068, 0.0698709636926651, -0.024866951629519463, 0.09828642010688782, -0.013450751081109047, 0.22091828286647797, 0.08052441477775574, -0.20819498598575592, 0.050898682326078415, -0.031098531559109688, -0.08726285398006439, -0.09326760470867157, -0.05647807568311691, -0.07532718032598495, -0.14121797680854797, -0.008344919420778751, -0.11898063123226166, 0.0012024195166304708, 0.12735964357852936, 0.0039659952744841576, 0.003293442539870739, 0.08084432035684586, 0.030589578673243523, 0.026656493544578552, 0.07401549071073532, 
0.035209596157073975, 0.008335165679454803, -0.07433343678712845, -0.06524138897657394, 0.01510921772569418, -0.0015715458430349827, 0.06753412634134293, -0.06351180374622345, -0.044485725462436676, 0.04704992473125458, -0.0037244409322738647, -0.104335717856884, 0.013667169027030468, 0.003649177961051464, 0.07202170789241791, 0.08885862678289413, 0.022051818668842316, 0.01612880453467369, -0.021628940477967262, 0.24559904634952545, -0.07242792844772339, -0.05830920487642288, -0.1338450163602829, 0.25588271021842957, 0.03622464835643768, -0.04215199127793312, 0.049713656306266785, -0.09397856146097183, 0.000381006597308442, 0.1926053762435913, 0.21330267190933228, -0.08773264288902283, -0.02499544434249401, 0.03452182933688164, -0.009213840588927269, -0.011576919816434383, 0.08950873464345932, 0.13300198316574097, 0.07952138781547546, -0.0976974293589592, -0.04008413478732109, -0.06390684843063354, -0.030290232971310616, -0.019111745059490204, 0.06610561162233353, 0.029097983613610268, 0.00596980657428503, -0.04125797003507614, 0.06417300552129745, -0.056970562785863876, -0.09111447632312775, 0.06772065162658691, -0.2414911836385727, -0.20197297632694244, -0.026457034051418304, 0.05272345989942551, 0.015144290402531624, 0.05388526991009712, -0.00642564008012414, -0.0007998377550393343, 0.12559929490089417, -0.019794490188360214, -0.07905639708042145, -0.08233987540006638, 0.09176471084356308, -0.06930475682020187, 0.20817683637142181, -0.03988676890730858, 0.03206069394946098, 0.12849678099155426, 0.06662611663341522, -0.10415007919073105, 0.036304984241724014, 0.05944171920418739, -0.03757873550057411, 0.035683877766132355, 0.11354789137840271, -0.02518998645246029, 0.06960902363061905, 0.05311840772628784, -0.12953029572963715, 0.0011058790842071176, -0.07067456096410751, -0.05169299617409706, -0.05857735499739647, -0.0067614600993692875, -0.02686091512441635, 0.1382051706314087, 0.22443358600139618, -0.05740164592862129, -0.0005910610780119896, -0.06672739237546921, 0.0043013859540224075, 0.0361604169011116, 0.0699709802865982, -0.04349817708134651, -0.21775926649570465, 0.02631438709795475, 0.013902408070862293, 0.00020685579511336982, -0.252655953168869, -0.08254816383123398, 0.023855911567807198, -0.06646929681301117, -0.11178884655237198, 0.11155018210411072, 0.013062269426882267, 0.05060139298439026, -0.049825675785541534, -0.02781360223889351, -0.06737172603607178, 0.15686413645744324, -0.17622804641723633, -0.08274970948696136 ]
null
null
null
## Navarna 7B GGUF VERSION here is the original [TokenBender/Navarna_v0_1_OpenHermes_Hindi](https://huggingface.co/TokenBender/Navarna_v0_1_OpenHermes_Hindi)
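The card gives no run instructions, so the following is only a sketch under stated assumptions: the repo id comes from this record, the GGUF filename is a placeholder (check the repository's file list), and the `llama-cpp-python` bindings are just one of several ways to run a GGUF file with llama.cpp.

```python
# Hedged sketch: download a GGUF file from the repo and run it with the
# llama-cpp-python bindings. The filename below is a placeholder, not a file
# confirmed to exist in the repository.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="IndiaBuild/GGUF_Navarna_v0_1_OpenHermes_Hindi",
    filename="navarna-7b.Q4_K_M.gguf",  # placeholder; pick a real file from the repo
)

llm = Llama(model_path=gguf_path, n_ctx=2048)
out = llm("नमस्ते, आप कैसे हैं?", max_tokens=64)
print(out["choices"][0]["text"])
```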
{"tags": ["gguf", "llama.cpp", "hindi"]}
null
IndiaBuild/GGUF_Navarna_v0_1_OpenHermes_Hindi
[ "gguf", "llama.cpp", "hindi", "region:us" ]
2024-02-13T22:08:29+00:00
[]
[]
TAGS #gguf #llama.cpp #hindi #region-us
## Navarna 7B GGUF VERSION here is the original TokenBender/Navarna_v0_1_OpenHermes_Hindi
[ "## Navarna 7B GGUF VERSION here is the orignal TokenBender/Navarna_v0_1_OpenHermes_Hindi" ]
[ "TAGS\n#gguf #llama.cpp #hindi #region-us \n", "## Navarna 7B GGUF VERSION here is the orignal TokenBender/Navarna_v0_1_OpenHermes_Hindi" ]
[ 17, 33 ]
[ "passage: TAGS\n#gguf #llama.cpp #hindi #region-us \n## Navarna 7B GGUF VERSION here is the orignal TokenBender/Navarna_v0_1_OpenHermes_Hindi" ]
[ -0.029690802097320557, 0.009596111252903938, -0.0061983950436115265, 0.05897752568125725, -0.029509080573916435, 0.014696097001433372, -0.0011263947235420346, 0.09273182600736618, 0.08952407538890839, 0.07670019567012787, 0.143319234251976, 0.05157115310430527, 0.020227788016200066, 0.07494297623634338, 0.051406893879175186, -0.2603375017642975, 0.0037216024938970804, 0.057700395584106445, 0.10499405115842819, 0.07055439800024033, 0.031125536188483238, -0.008876672014594078, 0.04064173996448517, 0.021446824073791504, -0.14473870396614075, 0.08136313408613205, -0.053331270813941956, -0.010894124396145344, -0.018074266612529755, -0.03755040839314461, 0.03144364804029465, -0.01709536276757717, -0.0896412804722786, -0.10957492142915726, 0.04809590056538582, -0.12468007951974869, -0.04603680968284607, -0.02753400430083275, -0.03201799839735031, -0.0764077678322792, 0.27882519364356995, 0.07398219406604767, -0.055813271552324295, 0.05473632737994194, -0.13228102028369904, -0.16161304712295532, 0.03188032656908035, 0.06607022136449814, -0.06364618986845016, 0.034006159752607346, -0.00012658536434173584, 0.10366202890872955, -0.09165796637535095, 0.047872111201286316, 0.1314440816640854, -0.2383430302143097, -0.05337908864021301, 0.20701101422309875, -0.036895085126161575, 0.10175586491823196, -0.13314104080200195, 0.07955997437238693, 0.0613882876932621, -0.038631584495306015, -0.1672360748052597, -0.09581875801086426, -0.09761660546064377, 0.0727192834019661, -0.0005027066799812019, -0.0018590084509924054, 0.17900343239307404, -0.05084836855530739, 0.062324292957782745, 0.05771522596478462, -0.08370394259691238, -0.05591016262769699, -0.0276976078748703, 0.03689039126038551, 0.010787906125187874, 0.14849132299423218, 0.2214156985282898, -0.050716109573841095, -0.10169260948896408, 0.010844683274626732, -0.09152234345674515, 0.055312324315309525, 0.006767312530428171, 0.07872376590967178, -0.1288965791463852, -0.11153481900691986, -0.2507131099700928, -0.09761343896389008, -0.050966691225767136, -0.025611279532313347, 0.09888004511594772, 0.0854402706027031, -0.015173033811151981, -0.057702627032995224, 0.10803874582052231, 0.025250844657421112, -0.1461920291185379, 0.09914113581180573, -0.09925083070993423, 0.07835806906223297, 0.05611632391810417, 0.030722949653863907, 0.018815910443663597, 0.20667417347431183, 0.06817509979009628, -0.14640942215919495, 0.17391197383403778, -0.059286441653966904, -0.07846979796886444, -0.01652298867702484, -0.09723438322544098, 0.06099211424589157, -0.12254229187965393, 0.00024090266379062086, -0.01471512671560049, -0.03572523966431618, 0.09588813036680222, -0.057700157165527344, 0.01150923129171133, -0.049290332943201065, 0.022521981969475746, -0.026786277070641518, -0.07246556878089905, -0.008532881736755371, 0.04490184411406517, -0.10539592802524567, -0.12551109492778778, 0.045456912368535995, -0.07143198698759079, 0.08788546174764633, 0.02469569258391857, -0.19667485356330872, 0.05585106089711189, -0.1557004749774933, -0.13721485435962677, 0.03734494745731354, 0.01516416110098362, -0.07287146151065826, 0.019825994968414307, -0.06460695713758469, 0.026973148807883263, 0.060045477002859116, -0.01655811443924904, -0.0920494943857193, -0.05925128981471062, 0.05637224018573761, 0.01664220727980137, 0.08029504120349884, -0.06054596230387688, 0.0019507278921082616, -0.06640620529651642, 0.0762806087732315, -0.09598702192306519, -0.03829106688499451, -0.09710618853569031, 0.08075954020023346, 0.033553220331668854, 0.04450196772813797, -0.08973512798547745, 
-0.009569652378559113, -0.052228160202503204, 0.13566766679286957, -0.05697343870997429, -0.06193503364920616, 0.21930702030658722, -0.18123158812522888, -0.12910978496074677, 0.05132537707686424, 0.07835107296705246, 0.006198980379849672, -0.0637170597910881, 0.411785364151001, 0.012874208390712738, -0.04201488196849823, 0.010152066126465797, 0.07886454463005066, 0.007124979514628649, -0.04000083729624748, 0.011105134151875973, -0.08055099099874496, 0.04547278955578804, 0.02100759744644165, 0.0758819654583931, 0.04273137077689171, -0.008886478841304779, -0.03413313999772072, -0.04110727831721306, -0.05839243903756142, -0.06292544305324554, 0.014684046618640423, 0.07236410677433014, -0.10984434187412262, 0.05536695942282677, -0.1365603655576706, 0.11304345726966858, 0.039260417222976685, 0.04841255769133568, -0.16904900968074799, 0.18751925230026245, -0.10447743535041809, 0.041165146976709366, 0.0016969083808362484, 0.07182566076517105, 0.03111107647418976, 0.020306935533881187, 0.014893796294927597, -0.017021173611283302, 0.08596225827932358, -0.08731300383806229, 0.0035125166177749634, 0.0057375687174499035, 0.07202013581991196, -0.030019156634807587, 0.0659760907292366, -0.12622365355491638, 0.15274541079998016, -0.026212312281131744, 0.04613593593239784, -0.09927752614021301, -0.03654405474662781, 0.22569283843040466, 0.0979616791009903, 0.028393564745783806, -0.06682637333869934, 0.09068354964256287, 0.010199982672929764, 0.018968602642416954, -0.010561616159975529, 0.014590329490602016, 0.054230205714702606, -0.07114182412624359, 0.1711309552192688, -0.07642707228660583, 0.17741136252880096, 0.0744885727763176, -0.1814703345298767, 0.06854032725095749, 0.0202287919819355, -0.03205778822302818, -0.007733422797173262, 0.13576997816562653, 0.10264015197753906, -0.005415435414761305, -0.08149061352014542, 0.057450369000434875, -0.05254824087023735, -0.057818736881017685, 0.04052073508501053, -0.09037775546312332, -0.09499116986989975, 0.04994690418243408, 0.08367130160331726, -0.11736403405666351, 0.14958477020263672, 0.16012124717235565, 0.055941008031368256, 0.3019252121448517, -0.007941046729683876, 0.021840598434209824, -0.027602266520261765, 0.02110670693218708, -0.052247054874897, 0.043973781168460846, -0.2982626259326935, -0.06043653190135956, -0.004692485090345144, 0.007977373898029327, 0.07581581175327301, -0.01801350526511669, -0.07848745584487915, -0.017578423023223877, -0.0015955098206177354, 0.03577107563614845, 0.17556168138980865, -0.11497798562049866, 0.031806644052267075, 0.012025807052850723, 0.03150400146842003, 0.05774126946926117, 0.05021297559142113, -0.08910051733255386, 0.09699449688196182, -0.1884620636701584, -0.21343934535980225, -0.019800810143351555, -0.1279134601354599, -0.03310389071702957, 0.031459685415029526, 0.03516257554292679, -0.20130422711372375, -0.06159763038158417, 0.029924681410193443, 0.0009262826642952859, -0.11313942074775696, -0.005758755374699831, 0.042338695377111435, -0.0760255753993988, -0.08879071474075317, -0.10866910219192505, -0.04133692756295204, -0.03245675191283226, -0.13708244264125824, 0.13380545377731323, -0.10546742379665375, 0.05689295008778572, 0.07797522842884064, 0.02423599548637867, 0.08342184871435165, 0.03795485943555832, 0.10413379222154617, -0.12461695820093155, -0.002035395475104451, 0.03950878605246544, 0.05781860649585724, 0.029810404404997826, 0.11923828721046448, 0.011659794487059116, -0.15686534345149994, -0.0727064460515976, -0.0008906037546694279, -0.07794029265642166, -0.14853018522262573, 
-0.08543643355369568, -0.08645841479301453, 0.08349299430847168, -0.06359736621379852, 0.07043594866991043, 0.16971531510353088, 0.03319637477397919, 0.03006322868168354, -0.0283381175249815, -0.0648188516497612, 0.030006472021341324, 0.13694532215595245, -0.014637653715908527, 0.02086314745247364, -0.06130943447351456, 0.03366682305932045, 0.11078038066625595, 0.0030271781142801046, 0.09302593767642975, 0.1107817217707634, 0.12804186344146729, 0.12075305730104446, 0.1185925155878067, 0.12899482250213623, -0.0900360569357872, -0.15896041691303253, -0.09499021619558334, 0.048970337957143784, -0.041100479662418365, -0.07128310948610306, 0.06820946931838989, 0.0017602299340069294, -0.10293365269899368, 0.11381377279758453, -0.11717325448989868, 0.06572624295949936, -0.03492068871855736, 0.007632479537278414, 0.07556439191102982, 0.008590131998062134, 0.02469249628484249, 0.04728872328996658, -0.05393310263752937, 0.08543195575475693, 0.01670311577618122, 0.032304514199495316, 0.12875103950500488, 0.06938730180263519, 0.027389051392674446, -0.02494828961789608, 0.07248959690332413, -0.264226496219635, -0.15792858600616455, 0.012928448617458344, 0.06292523443698883, -0.22308450937271118, 0.21096284687519073, 0.07885731756687164, -0.061323124915361404, -0.015686703845858574, -0.07086798548698425, -0.006228193175047636, 0.09493187069892883, 0.054867930710315704, 0.06360042095184326, -0.007130638230592012, -0.01568782702088356, 0.06574474275112152, 0.012547826394438744, 0.1334509253501892, 0.056265927851200104, -0.03878915309906006, 0.04816548898816109, 0.016806481406092644, -0.0750405341386795, 0.151646688580513, -0.05373380333185196, -0.06404188275337219, 0.1177932620048523, 0.062432482838630676, -0.053949564695358276, -0.06876121461391449, 0.009262665174901485, -0.04153565689921379, 0.1284426897764206, -0.0723426416516304, -0.08244627714157104, 0.023399854078888893, -0.027973182499408722, 0.012736820615828037, -0.04470893740653992, 0.06249864026904106, -0.07636493444442749, -0.02827306278049946, -0.08599095791578293, -0.013447868637740612, 0.06133430823683739, 0.03079601377248764, -0.00041309025255031884, 0.023439297452569008, 0.09361962229013443, 0.03992845490574837, -0.04610920324921608, -0.030474968254566193, 0.08504468947649002, 0.003919881768524647, -0.08489397913217545, 0.12786968052387238, 0.07698345184326172, -0.1500663459300995, -0.007733872625976801, -0.02597363479435444, 0.09189119189977646, 0.06630781292915344, -0.1671917736530304, 0.0860399603843689, 0.20941942930221558, 0.04435255378484726, 0.21225130558013916, 0.17773115634918213, -0.08901257067918777, -0.14058919250965118, -0.10454748570919037, -0.1827014535665512, -0.029204638674855232, -0.08969667553901672, -0.20928935706615448, 0.039398908615112305, 0.08930273354053497, -0.041922688484191895, 0.2579009532928467, -0.1712198108434677, -0.03205182030797005, 0.060305994004011154, 0.032777611166238785, 0.35139191150665283, -0.15202103555202484, -0.13233347237110138, 0.056052666157484055, -0.11538011580705643, -0.02677581086754799, 0.01800606958568096, 0.048335958272218704, -0.10416660457849503, 0.05832836776971817, 0.03451237827539444, 0.007745470851659775, 0.09596529603004456, -0.057563796639442444, 0.004949626978486776, -0.15831705927848816, -0.0361316092312336, 0.1270211786031723, 0.03429471701383591, 0.007174602709710598, -0.028306758031249046, -0.043232958763837814, -0.19458657503128052, 0.015040833503007889, -0.07407040148973465, 0.007376617752015591, 0.023065535351634026, -0.06380072236061096, -0.03402210399508476, 
0.04783644527196884, -0.06794072687625885, 0.07365745306015015, 0.19949279725551605, -0.0951242595911026, 0.19932328164577484, 0.05529070645570755, 0.02842114306986332, -0.20911289751529694, 0.19101294875144958, -0.09733153879642487, -0.03246092051267624, 0.0784681886434555, -0.15957967936992645, -0.05539122596383095, 0.018345486372709274, 0.02641662023961544, 0.013588089495897293, -0.006883447524160147, -0.05142727121710777, 0.07770966738462448, 0.1581287682056427, -0.06474379450082779, -0.0637318417429924, -0.025072308257222176, 0.12435717135667801, 0.1860051155090332, -0.015049034729599953, 0.08332493901252747, 0.027817485854029655, -0.04175892472267151, 0.0065043712966144085, -0.0032499178778380156, -0.06469608098268509, -0.0618598647415638, 0.07319288700819016, -0.008194858208298683, -0.08822046965360641, 0.07690909504890442, 0.06666602194309235, -0.1438886523246765, 0.04691608250141144, 0.24628183245658875, -0.10987032949924469, -0.10391587018966675, -0.1440819948911667, 0.16102179884910583, -0.16027876734733582, 0.04707515612244606, -0.0051747034303843975, -0.006708287168294191, 0.05954284965991974, 0.23099374771118164, -0.025039097294211388, 0.0683944970369339, 0.037720128893852234, 0.053145211189985275, 0.06495961546897888, -0.007187035866081715, -0.10861101001501083, 0.04611394181847572, -0.1006455048918724, -0.18880535662174225, -0.014270292595028877, 0.14086095988750458, -0.0476527214050293, -0.07460255175828934, -0.12146657705307007, 0.006985588930547237, -0.07566478848457336, 0.03866200894117355, -0.05341969430446625, -0.05298394709825516, 0.0020660697482526302, -0.022880736738443375, -0.05795795097947121, 0.0016153710894286633, -0.05986974388360977, 0.0348719097673893, -0.009298098273575306, 0.04110180586576462, -0.03808751702308655, -0.09996084868907928, 0.07382063567638397, 0.01745578460395336, 0.09532157331705093, 0.14832264184951782, -0.01512835267931223, 0.11627322435379028, -0.1065797507762909, 0.01088810060173273, 0.01619112305343151, 0.0016392098041251302, 0.007738306652754545, 0.03564338758587837, -0.06513804197311401, 0.03183840215206146, -0.05571558326482773, 0.05326997861266136, 0.08428376913070679, -0.08387507498264313, 0.024982281029224396, -0.0571044497191906, -0.09621825069189072, -0.05440825968980789, 0.00046037312131375074, 0.05515209212899208, 0.006336457096040249, -0.054010942578315735, -0.044974587857723236, -0.03625853732228279, 0.05197390913963318, 0.0010326706105843186, 0.04350657016038895, -0.08456248044967651, -0.021016811951994896, -0.01145076286047697, -0.03560324013233185, -0.08121905475854874, 0.21467168629169464, -0.07163070142269135, -0.15617740154266357, 0.04772046580910683, 0.0014368814881891012, 0.03823260962963104, -0.054822102189064026, 0.23103083670139313, 0.14331257343292236, -0.02868623659014702, -0.013872389681637287, 0.048618581146001816, -0.026618070900440216, 0.020869648084044456, -0.1343841701745987, -0.0011580368736758828, 0.09008828550577164, -0.017666852101683617, 0.0008835501503199339, -0.06257601827383041, 0.010576365515589714, 0.013550732284784317, -0.08885064721107483, 0.032710108906030655, 0.016186919063329697, -0.025525657460093498, 0.35315290093421936, 0.002074561547487974, 0.036100246012210846, -0.0014856799971312284, 0.024596435949206352, -0.02274860069155693, -0.23380088806152344, -0.07568084448575974, -0.14311803877353668, 0.11002453416585922, -0.09207392483949661, 0.05908745154738426, -0.03985286131501198, 0.08731649816036224, -0.0008245303761214018, 0.20169886946678162, 0.06074285879731178, -0.046225275844335556, 
0.004893940407782793, -0.032506536692380905, -0.027688762173056602, -0.02870791219174862, -0.023812051862478256, -0.07589501142501831, -0.05233988165855408, -0.03405420482158661, 0.028306782245635986, 0.046936195343732834, -0.08038859069347382, -0.11016358435153961, -0.05157720297574997, 0.00290570966899395, 0.06879866123199463, -0.05280846729874611, 0.07672528922557831, -0.0060515073128044605, -0.042766354978084564, -0.010224049910902977, 0.07739482820034027, 0.06398507952690125, 0.02483086660504341, 0.07458514720201492, 0.019761277362704277, -0.00854872539639473, 0.054151296615600586, -0.047911230474710464, 0.03128122538328171, -0.03311414271593094, 0.18127810955047607, 0.18026158213615417, 0.0027090031653642654, 0.04752570763230324, 0.06920080631971359, 0.059232763946056366, 0.01847023516893387, 0.008975498378276825, 0.08606921881437302, 0.17108072340488434, -0.07645556330680847, -0.08145767450332642, -0.11517085134983063, 0.044244591146707535, -0.16448214650154114, -0.04108228534460068, 0.06312581896781921, -0.032475728541612625, -0.045396629720926285, 0.032142359763383865, -0.20568111538887024, 0.18363219499588013, 0.044544290751218796, -0.18433590233325958, -0.028450744226574898, -0.05826161429286003, -0.08915206044912338, 0.031717561185359955, 0.05256799980998039, -0.018773015588521957, -0.10258545726537704, -0.08669281750917435, 0.11931692808866501, -0.24194689095020294, -0.16232703626155853, 0.06868740171194077, -0.08607848733663559, 0.04778178036212921, -0.07134608179330826, -0.06271997839212418, 0.06766481697559357, 0.03845038264989853, 0.010289949364960194, 0.08535940200090408, 0.06044488772749901, 0.06082728132605553, -0.2079353630542755, 0.11444977670907974, 0.05028538033366203, -0.02142486535012722, 0.08593978732824326, 0.07273752242326736, -0.04834184795618057, 0.18225322663784027, 0.08351865410804749, 0.003680796828120947, 0.12733547389507294, -0.1285339742898941, 0.08644896745681763, -0.01820792257785797, 0.022937355563044548, -0.11034785956144333, -0.07854393869638443, 0.013398691080510616, 0.026125192642211914, -0.09368151426315308, -0.07612001150846481, -0.07784563302993774, 0.02112554758787155, 0.15361016988754272, 0.06605928391218185, -0.09108510613441467, -0.03879108279943466, -0.05145176127552986, 0.10612453520298004, -0.09094265848398209, 0.11845836788415909, 0.08762560039758682, 0.06285087764263153, -0.010250800289213657, -0.1265300065279007, 0.10216827690601349, -0.00720532052218914, -0.09670009464025497, -0.06342291831970215 ]
null
null
transformers
# Model Trained Using AutoTrain - Problem type: Text Classification ## Validation Metrics loss: 1.0708461999893188 f1_macro: 0.2661337426604252 f1_micro: 0.7226277372262774 f1_weighted: 0.6581589218161064 precision_macro: 0.2486322319655653 precision_micro: 0.7226277372262774 precision_weighted: 0.6234505964432971 recall_macro: 0.30050505050505044 recall_micro: 0.7226277372262774 recall_weighted: 0.7226277372262774 accuracy: 0.7226277372262774
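A brief, hedged aside on how those averages relate (the label arrays below are made up, not drawn from autotrain-suricata2's data): for a standard single-label multiclass setup, micro-averaged precision, recall, and F1 all reduce to plain accuracy, which is consistent with every `*_micro` value above equaling the reported accuracy of 0.7226..., while the macro average treats each class equally and the weighted average scales each class by its support.

```python
# Hedged illustration with made-up labels: how micro / macro / weighted
# averaging differ for a single-label multiclass problem.
from sklearn.metrics import accuracy_score, f1_score

y_true = [0, 1, 2, 2, 1, 0, 2, 2]   # hypothetical gold labels
y_pred = [0, 1, 2, 0, 1, 0, 2, 1]   # hypothetical predictions

print(accuracy_score(y_true, y_pred))                    # 0.75
print(f1_score(y_true, y_pred, average="micro"))         # 0.75 -- equals accuracy
print(f1_score(y_true, y_pred, average="macro"))         # unweighted mean of per-class F1
print(f1_score(y_true, y_pred, average="weighted"))      # per-class F1 weighted by support
```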
{"tags": ["autotrain", "text-classification"], "datasets": ["autotrain-suricata2/autotrain-data"], "widget": [{"text": "I love AutoTrain"}]}
text-classification
tali1/autotrain-suricata2
[ "transformers", "safetensors", "bert", "text-classification", "autotrain", "dataset:autotrain-suricata2/autotrain-data", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T22:08:41+00:00
[]
[]
TAGS #transformers #safetensors #bert #text-classification #autotrain #dataset-autotrain-suricata2/autotrain-data #autotrain_compatible #endpoints_compatible #region-us
# Model Trained Using AutoTrain - Problem type: Text Classification ## Validation Metrics loss: 1.0708461999893188 f1_macro: 0.2661337426604252 f1_micro: 0.7226277372262774 f1_weighted: 0.6581589218161064 precision_macro: 0.2486322319655653 precision_micro: 0.7226277372262774 precision_weighted: 0.6234505964432971 recall_macro: 0.30050505050505044 recall_micro: 0.7226277372262774 recall_weighted: 0.7226277372262774 accuracy: 0.7226277372262774
[ "# Model Trained Using AutoTrain\n\n- Problem type: Text Classification", "## Validation Metrics\nloss: 1.0708461999893188\n\nf1_macro: 0.2661337426604252\n\nf1_micro: 0.7226277372262774\n\nf1_weighted: 0.6581589218161064\n\nprecision_macro: 0.2486322319655653\n\nprecision_micro: 0.7226277372262774\n\nprecision_weighted: 0.6234505964432971\n\nrecall_macro: 0.30050505050505044\n\nrecall_micro: 0.7226277372262774\n\nrecall_weighted: 0.7226277372262774\n\naccuracy: 0.7226277372262774" ]
[ "TAGS\n#transformers #safetensors #bert #text-classification #autotrain #dataset-autotrain-suricata2/autotrain-data #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Trained Using AutoTrain\n\n- Problem type: Text Classification", "## Validation Metrics\nloss: 1.0708461999893188\n\nf1_macro: 0.2661337426604252\n\nf1_micro: 0.7226277372262774\n\nf1_weighted: 0.6581589218161064\n\nprecision_macro: 0.2486322319655653\n\nprecision_micro: 0.7226277372262774\n\nprecision_weighted: 0.6234505964432971\n\nrecall_macro: 0.30050505050505044\n\nrecall_micro: 0.7226277372262774\n\nrecall_weighted: 0.7226277372262774\n\naccuracy: 0.7226277372262774" ]
[ 58, 16, 153 ]
[ "passage: TAGS\n#transformers #safetensors #bert #text-classification #autotrain #dataset-autotrain-suricata2/autotrain-data #autotrain_compatible #endpoints_compatible #region-us \n# Model Trained Using AutoTrain\n\n- Problem type: Text Classification## Validation Metrics\nloss: 1.0708461999893188\n\nf1_macro: 0.2661337426604252\n\nf1_micro: 0.7226277372262774\n\nf1_weighted: 0.6581589218161064\n\nprecision_macro: 0.2486322319655653\n\nprecision_micro: 0.7226277372262774\n\nprecision_weighted: 0.6234505964432971\n\nrecall_macro: 0.30050505050505044\n\nrecall_micro: 0.7226277372262774\n\nrecall_weighted: 0.7226277372262774\n\naccuracy: 0.7226277372262774" ]
[ -0.12731701135635376, 0.14181536436080933, -0.0008813882013782859, 0.10024545341730118, 0.10352060198783875, 0.022672511637210846, 0.08482895791530609, 0.11507615447044373, -0.04602707177400589, 0.14486812055110931, 0.1384495347738266, 0.0852150022983551, 0.06021098420023918, 0.1444980502128601, -0.16533911228179932, -0.10960286110639572, 0.0207266416400671, 0.002573245670646429, 0.10717029124498367, 0.08235249668359756, 0.06564276665449142, -0.12182890623807907, 0.09803164750337601, -0.05244677886366844, -0.18612003326416016, 0.03868981450796127, 0.1350616067647934, -0.1104898676276207, 0.07769685238599777, 0.07679197192192078, 0.09111444652080536, -0.021318333223462105, 0.10463929921388626, -0.06131429225206375, -0.0290415920317173, 0.011889485642313957, 0.013082251884043217, 0.06251688301563263, 0.12248159199953079, 0.020290706306695938, -0.03739115968346596, -0.06310401856899261, 0.06958931684494019, 0.061450209468603134, -0.093319371342659, -0.035192739218473434, -0.11319134384393692, 0.057710640132427216, 0.07206607609987259, 0.1073707789182663, -0.028104377910494804, 0.20881077647209167, -0.02788708545267582, 0.08092167228460312, 0.09334150701761246, -0.25396955013275146, -0.0642644390463829, 0.13905160129070282, -0.04616799205541611, -0.038058433681726456, -0.0684436485171318, 0.06050916016101837, 0.14127030968666077, 0.02798914536833763, 0.030061978846788406, -0.0143370246514678, -0.017528552561998367, 0.012393531389534473, -0.08898286521434784, -0.036466334015131, 0.22784413397312164, 0.018546679988503456, -0.08026851713657379, -0.10420919209718704, -0.04065779596567154, -0.09480556100606918, -0.05758059024810791, -0.058660056442022324, -0.010409893468022346, -0.06005467474460602, -0.07611455768346786, 0.10677646845579147, -0.04912113398313522, -0.10364820063114166, -0.15338577330112457, 0.06987795233726501, -0.018963439390063286, 0.025089766830205917, -0.006611892022192478, 0.023613234981894493, -0.036669276654720306, -0.10667414218187332, 0.009356924332678318, 0.009818424470722675, -0.04203719273209572, -0.06613843888044357, 0.00804898887872696, 0.020272336900234222, 0.041090209037065506, 0.10903742909431458, -0.0011470247991383076, 0.07521295547485352, -0.040548175573349, -0.02548348158597946, -0.056749630719423294, 0.16808277368545532, -0.07517591863870621, -0.07856590300798416, 0.02432093396782875, 0.023065710440278053, 0.03918185830116272, -0.019713984802365303, -0.03783561289310455, -0.09785941243171692, 0.16555075347423553, 0.056291237473487854, -0.028338400647044182, 0.008058685809373856, -0.06603492051362991, -0.00835493952035904, -0.045151643455028534, -0.10494133830070496, 0.02350529283285141, -0.046714019030332565, -0.13486216962337494, -0.004857243970036507, 0.008444934152066708, 0.025135790929198265, -0.04410238936543465, 0.07777898758649826, -0.129678413271904, -0.012987889349460602, -0.04946709796786308, -0.11548468470573425, 0.04071479290723801, -0.033584918826818466, -0.01970704272389412, -0.11659850180149078, -0.21561937034130096, -0.07081073522567749, -0.08227083832025528, -0.10236748307943344, -0.022058378905057907, -0.049694813787937164, -0.06612729281187057, 0.07707715779542923, 0.011068718507885933, 0.09185624867677689, -0.012835271656513214, 0.041482411324977875, 0.060571473091840744, 0.06769163906574249, -0.018338626250624657, 0.011681446805596352, -0.07195784896612167, -0.010084784589707851, -0.14669741690158844, 0.016764100641012192, -0.04312332719564438, 0.04601744934916496, -0.11771688610315323, -0.008253062143921852, 0.06083763390779495, 
-0.013444055803120136, 0.07647280395030975, 0.18574996292591095, -0.15400025248527527, -0.01175019983202219, 0.07586630433797836, -0.0783301368355751, -0.10430210828781128, 0.10711255669593811, -0.005927893333137035, 0.05662060156464577, 0.06715795397758484, 0.08994722366333008, 0.018698832020163536, -0.08922801911830902, -0.0687238946557045, -0.06305649131536484, -0.005464798770844936, -0.027130819857120514, 0.0607229620218277, -0.02471880055963993, -0.13282257318496704, 0.02599240466952324, 0.11963528394699097, -0.03541168197989464, -0.08037757873535156, -0.047249678522348404, -0.0495079830288887, -0.05925966054201126, 0.01683371514081955, 0.015879878774285316, 0.055137861520051956, -0.10222258418798447, -0.0314084067940712, 0.056772224605083466, 0.08213915675878525, 0.007719273678958416, -0.057815298438072205, -0.13195250928401947, 0.052173975855112076, -0.1533859521150589, -0.0460098460316658, -0.17081664502620697, -0.08396827429533005, 0.013669980689883232, 0.03446508198976517, -0.03572985902428627, -0.0628729909658432, 0.0895453616976738, 0.02505366876721382, -0.05343552306294441, -0.005864047445356846, 0.0976613387465477, 0.00597491767257452, -0.13176316022872925, -0.09928316622972488, 0.036224596202373505, 0.016352100297808647, 0.1717352569103241, -0.1926945596933365, -0.010416463948786259, 0.038661107420921326, 0.0823366567492485, 0.017793182283639908, -0.025525303557515144, -0.027508888393640518, 0.038514986634254456, -0.017818551510572433, -0.061755403876304626, 0.036802027374506, -0.023798737674951553, -0.08552869409322739, 0.008417289704084396, -0.2600274384021759, 0.2571115791797638, 0.12432071566581726, 0.06633927673101425, -0.09432777762413025, -0.04633428901433945, 0.01798294670879841, -0.047515980899333954, -0.11126973479986191, -0.009752173908054829, 0.0058450112119317055, -0.005580557975918055, 0.11218054592609406, -0.06582943350076675, -0.046894632279872894, 0.04958784952759743, -0.011184257455170155, -0.08008014410734177, 0.17848342657089233, -0.05991891399025917, -0.1653938591480255, 0.12023510783910751, 0.04706002026796341, -0.05406377464532852, 0.09487427026033401, -0.00431462936103344, -0.04911626875400543, -0.07042720913887024, -0.059681881219148636, 0.049882758408784866, 0.09884537756443024, 0.05916181951761246, 0.04964283108711243, 0.032697826623916626, 0.016474170610308647, -0.00918834563344717, -0.08721260726451874, -0.00743497209623456, 0.0008766849641688168, -0.005809219088405371, -0.0031923295464366674, -0.00028860202291980386, 0.06045932695269585, 0.17787806689739227, 0.008808298036456108, -0.036471497267484665, 0.03874815255403519, -0.011388278566300869, -0.10667113959789276, 0.19376392662525177, -0.09109658747911453, -0.13168761134147644, -0.18159262835979462, -0.016684724017977715, -0.0800916850566864, -0.008430243469774723, -0.012735764496028423, -0.09319724887609482, -0.08865559846162796, -0.10104212164878845, -0.028089703992009163, -0.003826980944722891, -0.019232485443353653, 0.057111479341983795, -0.011872894130647182, 0.06996321678161621, -0.0718684196472168, -0.061238858848810196, -0.02323254570364952, -0.04993819445371628, 0.07130597531795502, -0.012846056371927261, 0.12329348176717758, 0.13594713807106018, -0.03375837579369545, 0.04437769949436188, -0.048404958099126816, 0.13105835020542145, -0.0068851071409881115, -0.0110784275457263, 0.19799819588661194, 0.05622689425945282, 0.0065010832622647285, 0.12557737529277802, 0.004680037498474121, -0.09835360944271088, 0.051933690905570984, 0.04886849969625473, -0.015108239836990833, 
-0.1773596704006195, -0.13614006340503693, 0.020960060879588127, 0.024679357185959816, 0.11924319714307785, 0.005703965201973915, 0.06728260219097137, 0.09012532979249954, 0.008822795003652573, 0.08421104401350021, -0.03911415487527847, 0.07384838908910751, 0.137332022190094, 0.04326407238841057, 0.16951856017112732, -0.0700252577662468, -0.008163480088114738, 0.07867959141731262, -0.08030221611261368, 0.05872029438614845, 0.028717773035168648, 0.02432185783982277, -0.038858018815517426, 0.09690921753644943, 0.05343404784798622, 0.1106506809592247, 0.09630919247865677, -0.051415394991636276, 0.029185175895690918, -0.1027977392077446, -0.0936548188328743, 0.027431631460785866, -0.03706127032637596, 0.10706605017185211, -0.1498466581106186, 0.06883054971694946, 0.038652706891298294, 0.1318284273147583, 0.09497127681970596, -0.43414467573165894, -0.10510079562664032, 0.06477292627096176, 0.0147759560495615, -0.12755143642425537, 0.010358132421970367, 0.06353896856307983, -0.10922055691480637, 0.03156409040093422, -0.020607439801096916, 0.10197129845619202, -0.008429153822362423, 0.009406539611518383, -0.0604296438395977, 0.05545378848910332, -0.0484171062707901, 0.03970123827457428, -0.2443276047706604, 0.18673886358737946, 0.07251451164484024, 0.08848076313734055, -0.09456128627061844, -0.003891907399520278, 0.050748370587825775, -0.022282712161540985, 0.15635527670383453, -0.015869971364736557, -0.17458324134349823, -0.32708871364593506, -0.1018182635307312, 0.012067251838743687, 0.04035959392786026, 0.01699083484709263, 0.09933105111122131, -0.05600137263536453, 0.00297226058319211, 0.03345635533332825, -0.08477921038866043, -0.15116006135940552, -0.016008172184228897, 0.010667961090803146, 0.11978159844875336, -0.029935304075479507, -0.06127924472093582, -0.09339634329080582, -0.04615812748670578, 0.09370268881320953, -0.10347233712673187, -0.03597163036465645, -0.16752049326896667, 0.11113699525594711, 0.12266343832015991, -0.07528635859489441, 0.03620750084519386, 0.00698471674695611, 0.12458891421556473, 0.015730075538158417, -0.09380586445331573, 0.09948988258838654, -0.08754897862672806, -0.10852690041065216, -0.008993523195385933, 0.06949928402900696, 0.01741657592356205, 0.03810613602399826, 0.04490286484360695, 0.0689331516623497, 0.0314159169793129, -0.09431171417236328, 0.04749978706240654, 0.040844570845365524, 0.19706052541732788, 0.07626543939113617, -0.03243529424071312, -0.16487444937229156, -0.05597468838095665, -0.024478977546095848, 0.10417397320270538, 0.28700220584869385, -0.06699450314044952, -0.049798376858234406, 0.055361829698085785, -0.03873394802212715, -0.2452358901500702, 0.06813593208789825, -0.03710472956299782, 0.06722482293844223, -0.027485938742756844, 0.028688255697488785, 0.1182774156332016, 0.16704875230789185, -0.030346142128109932, -0.009043088182806969, -0.24927043914794922, -0.14005663990974426, 0.20810846984386444, 0.1000455692410469, 0.1312168836593628, -0.1175837367773056, -0.06457370519638062, -0.13946276903152466, -0.10943891853094101, 0.02769557572901249, -0.052058134227991104, 0.05661831796169281, -0.058552902191877365, 0.04337698593735695, 0.06810557097196579, -0.0756714791059494, 0.12848106026649475, -0.03876037895679474, 0.07710496336221695, -0.06960643082857132, -0.05157773196697235, 0.0009167435928247869, -0.07917504757642746, 0.1199033185839653, 0.056137386709451675, 0.08111125975847244, -0.13267292082309723, -0.022498931735754013, -0.009980854578316212, 0.06171397492289543, -0.06680252403020859, -0.0508752278983593, 
-0.00646496145054698, 0.040491778403520584, -0.054413288831710815, -0.094204843044281, -0.03795643523335457, -0.048326317220926285, 0.12371328473091125, 0.1648155301809311, 0.10961638391017914, -0.03481407091021538, 0.06253990530967712, 0.07596477121114731, -0.06960757821798325, 0.06260297447443008, -0.047906599938869476, 0.07537387311458588, 0.13985012471675873, 0.01727289706468582, 0.13021017611026764, 0.05707821622490883, -0.02288607321679592, -0.03061673603951931, 0.029839856550097466, -0.14844845235347748, 0.018462397158145905, -0.028095347806811333, 0.03871610015630722, -0.05673864111304283, -0.028212860226631165, 0.12034770846366882, -0.05072678253054619, 0.01011507585644722, -0.024771256372332573, 0.02082349732518196, 0.029374413192272186, 0.2689836323261261, -0.012351722456514835, 0.05942273512482643, -0.08530466258525848, 0.10795201361179352, 0.05675971508026123, -0.1432880163192749, 0.07131950557231903, 0.01429983600974083, -0.06705819070339203, -0.013602199032902718, 0.0768527090549469, 0.2031833976507187, -0.14516650140285492, -0.0740564838051796, -0.10819108784198761, -0.16448432207107544, 0.06870846450328827, 0.29642823338508606, 0.08628708124160767, 0.02137630060315132, 0.006291192956268787, -0.08592035621404648, -0.1099415048956871, 0.0884694904088974, 0.13225263357162476, 0.025324523448944092, -0.1257660835981369, 0.14817029237747192, -0.06254454702138901, -0.04278462380170822, -0.043174274265766144, 0.03152338042855263, -0.18218299746513367, -0.007996040396392345, -0.08020708709955215, 0.07236113399267197, -0.03510065749287605, -0.00655775610357523, -0.025245625525712967, -0.00011093066859757528, -0.054888635873794556, 0.03442075103521347, -0.020543191581964493, -0.022294268012046814, 0.016389716416597366, 0.062327586114406586, -0.1178411990404129, -0.06041298061609268, 0.05171116068959236, -0.024760877713561058, 0.02956731617450714, 0.07242073863744736, 0.08808264881372452, 0.01201438345015049, -0.05789773911237717, 0.011760372668504715, 0.07211537659168243, -0.02981586754322052, 0.06973204761743546, -0.17745234072208405, 0.03147311136126518, 0.03858034685254097, 0.016710110008716583, 0.07892592996358871, 0.08239642530679703, -0.10170489549636841, -0.034418366849422455, -0.05143549293279648, -0.038583237677812576, -0.13294796645641327, 0.03453958034515381, 0.1835619956254959, 0.00012475071707740426, 0.07946790009737015, -0.11398062109947205, -0.012663415633141994, -0.18513154983520508, 0.01402211282402277, -0.04438570514321327, -0.10705505311489105, -0.07006388902664185, 0.03610294684767723, 0.10258802771568298, 0.001613953267224133, 0.04427020624279976, -0.021965626627206802, 0.016517404466867447, 0.0717543214559555, 0.06271570920944214, -0.03635736182332039, 0.003913485910743475, 0.1971011608839035, 0.08520887047052383, -0.044543929398059845, 0.06421986222267151, 0.06907098740339279, 0.10804153978824615, 0.04804183170199394, 0.07835160940885544, 0.09816961735486984, -0.07109697163105011, 0.12026596069335938, 0.025195591151714325, -0.07984532415866852, -0.012334245257079601, 0.0974215567111969, -0.14747069776058197, 0.019434131681919098, -0.03999282047152519, -0.013897569850087166, 0.15624144673347473, -0.1214345246553421, -0.011353673413395882, -0.12506012618541718, -0.07854656875133514, -0.17120739817619324, -0.059451572597026825, -0.15244072675704956, 0.0028333195950835943, -0.004480070434510708, -0.11214260756969452, 0.053944483399391174, 0.14884857833385468, 0.038926150649785995, 0.0015047398628666997, 0.13437113165855408, -0.2005823403596878, 
0.011886455118656158, 0.03661691024899483, -0.01019970327615738, -0.028936179354786873, -0.04265085980296135, -0.0495326854288578, 0.03300277516245842, 0.017931735143065453, 0.04894696921110153, 0.003533288836479187, 0.03319129720330238, 0.03331347554922104, 0.006637332960963249, -0.09141324460506439, -0.028352944180369377, 0.026931550353765488, 0.04747963696718216, 0.0400555357336998, 0.026679271832108498, 0.03700621426105499, -0.03294362127780914, 0.24578739702701569, -0.1042223572731018, -0.014371576718986034, -0.09234172850847244, 0.24102947115898132, 0.01596577651798725, 0.07463837414979935, 0.003722965018823743, -0.054362762719392776, 0.06724660098552704, 0.13071095943450928, 0.0700896754860878, -0.02478717640042305, -0.05492594093084335, -0.04084701091051102, -0.013982849195599556, -0.04949215427041054, 0.07218311727046967, 0.04450305551290512, 0.05379320681095123, -0.06560728698968887, 0.07140635699033737, -0.010538711212575436, -0.04002857953310013, -0.01685497909784317, 0.08622767776250839, -0.01388455554842949, 0.04913150146603584, -0.02912948466837406, 0.07027892768383026, 0.005596762988716364, 0.0491507388651371, 0.083610400557518, -0.1164887547492981, -0.1483301967382431, 0.062320686876773834, 0.011992795392870903, -0.06577848643064499, 0.09250746667385101, -0.059222958981990814, 0.040718887001276016, -0.06024373695254326, -0.01321166381239891, -0.10801246017217636, -0.10886966437101364, -0.00219698715955019, 0.14826208353042603, 0.26702558994293213, 0.018538320437073708, 0.09925223886966705, 0.13999246060848236, -0.04643534496426582, -0.16704382002353668, 0.11616399884223938, -0.002357342978939414, -0.10828699916601181, 0.11189129948616028, 0.13040220737457275, -0.010613080114126205, 0.17674845457077026, 0.030427271500229836, -0.18700779974460602, 0.007847451604902744, -0.04080072045326233, -0.0070678056217730045, -0.0498981736600399, 0.0403883196413517, -0.05794673040509224, 0.16949626803398132, 0.10835883766412735, -0.0448114238679409, 0.018421990796923637, -0.043838754296302795, 0.05398363992571831, -0.007800818886607885, 0.05269942432641983, 0.03680189698934555, -0.14637771248817444, 0.07992095500230789, -0.08238304406404495, -0.02673347480595112, -0.35841935873031616, -0.05342112481594086, -0.034770384430885315, -0.06995800137519836, -0.06714019179344177, 0.09287466108798981, 0.08514925837516785, 0.04051394388079643, -0.0535004623234272, -0.219204843044281, 0.04922128841280937, 0.15412922203540802, -0.054688677191734314, -0.11429524421691895 ]
null
null
stable-baselines3
# **PPO** Agent playing **LunarLander-v2** This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) TODO: Add your code ```python from stable_baselines3 import ... from huggingface_sb3 import load_from_hub ... ```
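The usage snippet in this card is left as a TODO placeholder. As a rough, hypothetical completion (not the author's code): the sketch below assumes SB3 ≥ 2.x with `gymnasium`, that `gymnasium[box2d]` is installed for LunarLander-v2, and that the checkpoint stored in the `dhajnes/drl_course` repository is named `ppo-LunarLander-v2.zip`; check the repository's file list before running it.

```python
# Hypothetical completion of the TODO above: download and evaluate the PPO agent.
# Assumptions (not stated in the card): SB3 >= 2.x with gymnasium, gymnasium[box2d]
# installed, and a checkpoint file named "ppo-LunarLander-v2.zip" in the repo.
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

# Download the checkpoint from the Hugging Face Hub.
checkpoint = load_from_hub(
    repo_id="dhajnes/drl_course",
    filename="ppo-LunarLander-v2.zip",  # assumed filename -- verify against the repo
)
model = PPO.load(checkpoint)

# Evaluate for a few episodes on LunarLander-v2.
eval_env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```

If the checkpoint matches the metadata below, the evaluation should land near the reported mean reward of 245.73 +/- 17.59.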
{"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "245.73 +/- 17.59", "name": "mean_reward", "verified": false}]}]}]}
reinforcement-learning
dhajnes/drl_course
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
2024-02-13T22:13:30+00:00
[]
[]
TAGS #stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# PPO Agent playing LunarLander-v2 This is a trained model of a PPO agent playing LunarLander-v2 using the stable-baselines3 library. ## Usage (with Stable-baselines3) TODO: Add your code
[ "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 39, 41, 17 ]
[ "passage: TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 0.03942384943366051, 0.04900386184453964, -0.005304091144353151, 0.026427261531352997, 0.107408307492733, -0.026511888951063156, 0.11188238859176636, 0.0814051404595375, 0.10722193866968155, 0.04762078449130058, 0.08338645845651627, 0.06030960753560066, 0.05080918222665787, 0.2571701407432556, 0.04754156619310379, -0.22987541556358337, 0.036159250885248184, -0.04869936779141426, 0.12395193427801132, 0.07178173214197159, -0.0038484656251966953, -0.06485428661108017, 0.020415637642145157, -0.013290755450725555, 0.05367108806967735, 0.04282612353563309, -0.01716216839849949, -0.08207534998655319, 0.07169748842716217, -0.06345846503973007, 0.06986866891384125, 0.07677983492612839, 0.13218913972377777, -0.17832116782665253, 0.029566360637545586, 0.02571309357881546, -0.07189024239778519, 0.01342033501714468, 0.008019951172173023, 0.05120139941573143, 0.17303818464279175, 0.019879888743162155, 0.07844575494527817, -0.0025605305563658476, -0.15412317216396332, -0.018950799480080605, 0.0436202734708786, 0.12546207010746002, 0.08808347582817078, 0.04605821147561073, 0.01970590092241764, 0.17503218352794647, -0.054352790117263794, -0.028833400458097458, 0.21759237349033356, -0.2881564497947693, -0.031460098922252655, 0.321048766374588, 0.06997483223676682, 0.09725230932235718, -0.07540661096572876, -0.03619609400629997, 0.007783263456076384, -0.013137873262166977, -0.028666524216532707, -0.07447073608636856, 0.17313385009765625, 0.05152064561843872, -0.05057951435446739, -0.09541505575180054, 0.16948209702968597, 0.006921638268977404, 0.0018855923553928733, -0.019282981753349304, 0.009060598909854889, 0.07402525842189789, -0.016097044572234154, -0.07255112379789352, 0.057438433170318604, 0.05330665782094002, 0.019649166613817215, -0.1435653269290924, -0.10762494057416916, -0.022740179672837257, -0.008012006990611553, 0.17786912620067596, -0.009255532175302505, 0.042902372777462006, 0.003065188182517886, 0.10384012013673782, -0.12480384111404419, -0.03354184702038765, -0.0454259067773819, -0.07565800100564957, -0.0223417766392231, -0.02058211714029312, -0.03580251708626747, 0.07184842973947525, 0.11971849203109741, 0.027368178591132164, 0.09350208193063736, 0.047715865075588226, -0.03206788748502731, 0.06343851238489151, 0.05555703118443489, 0.14222665131092072, 0.05807621404528618, 0.012854371219873428, 0.13179877400398254, 0.055213116109371185, 0.033023182302713394, -0.0613492950797081, -0.18252409994602203, 0.07489913702011108, -0.07031869143247604, 0.007941240444779396, 0.12051256000995636, -0.04480670019984245, -0.1183447614312172, -0.037500523030757904, -0.017392054200172424, -0.06224250793457031, -0.025395862758159637, 0.0547584593296051, -0.02883218228816986, -0.03973718360066414, 0.0011496668448671699, 0.09384800493717194, 0.00953749567270279, -0.1752052903175354, 0.03303423151373863, -0.025042934343218803, -0.10782608389854431, 0.009975161403417587, 0.0022444494534283876, 0.03394931182265282, 0.04408763721585274, -0.11822668462991714, -0.30899152159690857, -0.07652641832828522, 0.05490870401263237, -0.06516939401626587, -0.18425025045871735, -0.13193942606449127, 0.02454492449760437, -0.09037084132432938, -0.044885024428367615, -0.12759265303611755, -0.028549788519740105, 0.01743689924478531, 0.011519349180161953, 0.10758619755506516, -0.0106219332665205, -0.012188062071800232, -0.1571401208639145, 0.008273907005786896, -0.20951123535633087, 0.0890483483672142, -0.019150104373693466, 0.037884220480918884, -0.032381169497966766, -0.07404014468193054, 0.030707746744155884, 
0.052499737590551376, -0.01474119070917368, 0.13510210812091827, -0.15592676401138306, -0.03691192343831062, -0.007996266707777977, -0.13611900806427002, -0.04786273464560509, -0.10358831286430359, -0.04357128217816353, 0.13354332745075226, 0.018664736300706863, 0.15356586873531342, -0.08709818124771118, -0.0722038671374321, 0.20489206910133362, -0.010411538183689117, -0.12820468842983246, -0.076752208173275, 0.10165707021951675, 0.021510310471057892, -0.056606587022542953, -0.02523270808160305, -0.1839766949415207, -0.0152357779443264, -0.04550420492887497, -0.047039128839969635, 0.01796751655638218, -0.010888241231441498, 0.13837894797325134, 0.08494598418474197, 0.05018039792776108, -0.06086122244596481, -0.006730288732796907, 0.10779471695423126, 0.08823856711387634, 0.008680110797286034, 0.023406028747558594, -0.05774238705635071, 0.09552932530641556, -0.04003755748271942, -0.0142367510125041, -0.08283266425132751, -0.036246106028556824, -0.026256313547492027, 0.17507147789001465, 0.09440762549638748, 0.2257927656173706, 0.09567736834287643, 0.039160262793302536, 0.031270865350961685, -0.13181598484516144, -0.1425403207540512, -0.0017254541162401438, 0.09020978957414627, -0.14270411431789398, -0.04119925573468208, -0.08974775671958923, -0.17768175899982452, -0.12202505767345428, 0.0006432619411498308, -0.17960017919540405, 0.06390921026468277, 0.05408334732055664, -0.035177867859601974, 0.03272094577550888, 0.13032332062721252, -0.011533179320394993, -0.03967514634132385, 0.0831870287656784, 0.0379033200442791, -0.041234664618968964, -0.021742934361100197, 0.11885567009449005, 0.15673065185546875, 0.13124459981918335, -0.03511447086930275, 0.004914294462651014, 0.07076404243707657, -0.02309088408946991, 0.06539414077997208, 0.0558244064450264, 0.20973342657089233, 0.188301220536232, 0.038996949791908264, 0.008822928182780743, -0.07048165798187256, 0.0855446457862854, -0.0742373839020729, -0.14302679896354675, -0.05579735338687897, 0.08729292452335358, 0.016605578362941742, 0.023469142615795135, 0.08711627870798111, 0.024545932188630104, 0.09132762253284454, 0.15968108177185059, 0.01990218088030815, -0.09659269452095032, -0.050218869000673294, 0.01175848301500082, 0.027713103219866753, 0.04794301092624664, -0.04514073207974434, -0.00937939714640379, 0.017020760104060173, -0.10303554683923721, 0.031789086759090424, -0.1413339376449585, -0.1358717679977417, 0.044326696544885635, 0.003906996920704842, 0.010907664895057678, 0.02786896750330925, -0.0038291432429105043, 0.019039705395698547, 0.04351753741502762, -0.06975466758012772, 0.047416772693395615, -0.024745507165789604, -0.020031947642564774, 0.03340689837932587, -0.057257164269685745, -0.205775648355484, -0.17696654796600342, 0.00013708483311347663, -0.09910997003316879, 0.10194740444421768, 0.018308809027075768, -0.12373185902833939, 0.047737859189510345, -0.05822649225592613, 0.027574289590120316, -0.01875593699514866, -0.049130141735076904, 0.10507171601057053, 0.1525275856256485, -0.016146350651979446, 0.018018173053860664, -0.04865182936191559, -0.10157987475395203, -0.19632206857204437, 0.0691583976149559, 0.04680244252085686, 0.014610917307436466, 0.10669491440057755, 0.018072687089443207, 0.02367905154824257, -0.007674071006476879, -0.016521066427230835, -0.011659215204417706, -0.08781040459871292, 0.31909599900245667, 0.04510033503174782, -0.025173069909214973, 0.02041010931134224, -0.0043001663871109486, -0.028083480894565582, 0.03263787180185318, -0.0985708013176918, -0.07548979669809341, -0.08774089068174362, 
-0.04367410019040108, -0.09784720093011856, 0.053299110382795334, 0.05916472524404526, 0.003188040340319276, -0.07727594673633575, 0.04221395403146744, 0.11369874328374863, -0.0923808291554451, -0.07137343287467957, 0.07477962225675583, 0.0972946360707283, -0.07331304252147675, 0.00012658814375754446, 0.00874367356300354, 0.023951783776283264, 0.037102166563272476, 0.06778035312891006, -0.03966575115919113, 0.08589404821395874, -0.19917890429496765, 0.0372927263379097, 0.106058269739151, 0.023754918947815895, 0.0638108178973198, 0.07643651217222214, -0.1058402881026268, -0.008500572293996811, -0.032518330961465836, -0.21341575682163239, 0.1668180525302887, 0.1355515867471695, 0.06788124144077301, -0.025637222453951836, -0.00461410591378808, -0.0649740919470787, 0.05773647129535675, 0.02723747305572033, -0.14758841693401337, 0.004883295856416225, 0.06064270809292793, 0.026899009943008423, 0.01614922471344471, 0.07971042394638062, 0.014697225764393806, -0.1801026314496994, -0.014406266622245312, 0.10730406641960144, 0.002390873385593295, 0.0053148469887673855, -0.03175045922398567, -0.1755964607000351, 0.0751047357916832, 0.004285442177206278, 0.07233936339616776, -0.1676585078239441, 0.14297930896282196, -0.10089799761772156, 0.07726949453353882, -0.004285062663257122, -0.021311495453119278, 0.02507244050502777, -0.0541163794696331, 0.15163759887218475, 0.01058570109307766, -0.021810131147503853, -0.1200498715043068, -0.1717042326927185, -0.019227758049964905, -0.11788936704397202, -0.11679866164922714, 0.050424277782440186, 0.062185097485780716, 0.04923136904835701, -0.061147067695856094, 0.1518532931804657, -0.047422297298908234, 0.060713399201631546, -0.06893875449895859, -0.06755045056343079, 0.03764858841896057, -0.12588608264923096, -0.08176055550575256, 0.05573027580976486, 0.19166934490203857, 0.15833087265491486, -0.02816431224346161, -0.03472423925995827, -0.047419581562280655, -0.006212298292666674, -0.007802055217325687, 0.0275666993111372, 0.023223137483000755, 0.07315318286418915, -0.07681374251842499, -0.11649256944656372, 0.033787861466407776, -0.06713802367448807, -0.055589709430933, -0.015439179725944996, 0.1513158082962036, 0.04671623185276985, 0.07720734924077988, -0.018946662545204163, 0.03887668624520302, -0.001724981120787561, -0.056474871933460236, 0.16197094321250916, 0.03885216265916824, -0.05193585529923439, 0.06837689876556396, 0.053174007683992386, 0.043745119124650955, 0.03011113777756691, -0.026783017441630363, 0.206032395362854, 0.1980147808790207, 0.014206883497536182, 0.2175983190536499, 0.03177616000175476, -0.03772832080721855, -0.1300560086965561, -0.065880686044693, -0.006372632458806038, 0.03559038043022156, 0.08070417493581772, -0.18207235634326935, -0.015011128038167953, -0.05689644813537598, -0.034518610686063766, -0.15059494972229004, -0.28553900122642517, -0.05957856774330139, 0.20075850188732147, 0.14706264436244965, 0.27519428730010986, -0.10432573407888412, 0.035197313874959946, 0.02663275972008705, -0.04912831634283066, -0.006501141935586929, 0.00018665487004909664, 0.10268618166446686, -0.15421873331069946, 0.1176437959074974, 0.08486983180046082, -0.019002694636583328, 0.01058861706405878, -0.1619086116552353, 0.00936629343777895, -0.12191236019134521, 0.05354422330856323, 0.1400289237499237, -0.048128653317689896, -0.054873593151569366, 0.14033560454845428, -0.024562934413552284, -0.22685599327087402, -0.04648222774267197, -0.043600670993328094, -0.010640020482242107, 0.026607351377606392, -0.1013401448726654, 0.04101909324526787, 
0.1330099105834961, 0.009380043484270573, 0.1147187277674675, 0.11749245226383209, -0.052566803991794586, 0.10792597383260727, 0.2257719188928604, -0.018785694614052773, 0.04689010605216026, -0.12743118405342102, -0.0012336712097749114, -0.028270328417420387, 0.013657891191542149, -0.09504974633455276, -0.09938385337591171, 0.02366873063147068, 0.02872389927506447, 0.009118586778640747, 0.0921793207526207, -0.029922157526016235, 0.0759170651435852, 0.06817561388015747, -0.13014446198940277, -0.16288450360298157, 0.015828335657715797, -0.007344507612287998, 0.08354310691356659, 0.00027861111448146403, 0.08878035843372345, -0.11932205408811569, -0.018093237653374672, -0.03153328225016594, -0.03319635987281799, -0.130486860871315, -0.07138993591070175, 0.06156524643301964, 0.028095467016100883, -0.06602972000837326, 0.1398407518863678, 0.026440169662237167, 0.15942534804344177, 0.049197953194379807, 0.012499804608523846, 0.07227300107479095, -0.05345509201288223, 0.1283530443906784, 0.13818155229091644, -0.00868943240493536, -0.05460423603653908, -0.1013643890619278, -0.10236792266368866, 0.08925779908895493, -0.05773641914129257, 0.07476430386304855, -0.14885357022285461, -0.06675903499126434, 0.015772046521306038, 0.016141414642333984, -0.09562095999717712, 0.02571965754032135, -0.01625603251159191, -0.18119946122169495, 0.056570518761873245, -0.048285093158483505, 0.0440407395362854, -0.06347788125276566, -0.1110161691904068, -0.17226378619670868, 0.06091433763504028, 0.08593481779098511, -0.053876690566539764, -0.12229149043560028, 0.011023230850696564, -0.00012518465518951416, -0.06341652572154999, -0.05023367330431938, 0.09722746908664703, -0.11020902544260025, 0.031452205032110214, -0.012567701749503613, 0.08853451162576675, -0.03510405123233795, -0.011538895778357983, 0.044220831245183945, -0.08039166033267975, -0.009481523185968399, 0.03534642979502678, -0.026372017338871956, -0.04127239063382149, -0.2689029574394226, 0.0036654395516961813, 0.0341104120016098, 0.02497158572077751, 0.07856601476669312, 0.011906822212040424, 0.021174922585487366, 0.03993808850646019, -0.15396519005298615, -0.013395369984209538, 0.14574195444583893, -0.07689505815505981, -0.022186370566487312, 0.05703273415565491, -0.09054436534643173, 0.013882770203053951, -0.030287226662039757, 0.1345842480659485, 0.023923413828015327, 0.06404478847980499, -0.0851147472858429, 0.10106813907623291, -0.1451139897108078, -0.04998219385743141, -0.01244612317532301, 0.09761348366737366, 0.07019034773111343, -0.10272270441055298, 0.014697125181555748, 0.04210108891129494, 0.19416837394237518, 0.016384804621338844, -0.0356343574821949, -0.03396720811724663, 0.004015897400677204, 0.22076453268527985, 0.03044266067445278, 0.10457023978233337, 0.07281364500522614, -0.026583973318338394, 0.12624378502368927, 0.09929762035608292, 0.11280370503664017, -0.055645186454057693, 0.13904185593128204, 0.04667386785149574, 0.038641396909952164, 0.0614289753139019, 0.06836545467376709, 0.09098632633686066, -0.0008288522367365658, 0.1138714924454689, 0.013811973854899406, -0.02422109805047512, -0.021335409954190254, 0.17759373784065247, 0.10501719266176224, -0.14769648015499115, 0.029047364369034767, -0.01258957851678133, 0.039933037012815475, -0.014194529503583908, -0.15634691715240479, -0.07240267097949982, -0.3315149247646332, 0.1226184144616127, -0.07119352370500565, 0.019930170848965645, 0.007913772016763687, -0.037425633519887924, -0.03296699747443199, -0.04477746784687042, 0.13151589035987854, -0.013641550205647945, 
-0.006079165264964104, -0.04815853759646416, -0.015360191464424133, -0.11607866734266281, -0.11200575530529022, -0.013207737356424332, -0.13671602308750153, -0.010119039565324783, 0.05595948174595833, 0.003977729007601738, 0.01821410097181797, -0.03142618387937546, 0.0024383175186812878, 0.06541839241981506, -0.05751744285225868, 0.056182678788900375, 0.12097269296646118, 0.08766137808561325, -0.1058853268623352, 0.031048951670527458, 0.2011747509241104, 0.04359564557671547, -0.12483977526426315, 0.01449228823184967, 0.1819491684436798, 0.004885740112513304, 0.017068125307559967, -0.006097703706473112, -0.0540788508951664, -0.07554277032613754, 0.1251034289598465, 0.08296554535627365, -0.09985227137804031, 0.015833314508199692, -0.0726347416639328, -0.01594804972410202, -0.06374675035476685, 0.10130585730075836, 0.09538925439119339, 0.04440245032310486, -0.10621760785579681, -0.08487539738416672, -0.10891728103160858, 0.040588874369859695, -0.08629853278398514, -0.07311757653951645, 0.09629398584365845, -0.07057105004787445, -0.07029950618743896, 0.025521177798509598, -0.17978744208812714, -0.009467960335314274, 0.1711762249469757, -0.24654000997543335, -0.0916430801153183, -0.10857923328876495, 0.14477859437465668, 0.016497576609253883, 0.1013975441455841, -0.006207061931490898, -0.007889035157859325, -0.20577777922153473, 0.024890204891562462, -0.05293011665344238, -0.02073732763528824, 0.07814782857894897, -0.09476397186517715, 0.22629831731319427, -0.08276885002851486, 0.020940175279974937, 0.012659613974392414, 0.0870661810040474, -0.030675338581204414, 0.09283176809549332, -0.03660329803824425, -0.12576518952846527, -0.03620953485369682, 0.03001813031733036, 0.013904244638979435, 0.10071761906147003, 0.09772487729787827, -0.03414725139737129, 0.03389119729399681, 0.09747414290904999, 0.04172342270612717, -0.023843804374337196, 0.0360250361263752, -0.17077107727527618, 0.02182629331946373, -0.018498148769140244, -0.06935930997133255, 0.03687669709324837, -0.06603235751390457, 0.1639697551727295, 0.04022442549467087, 0.0670473501086235, -0.036152735352516174, 0.0073931049555540085, -0.014454689808189869, -0.013775371946394444, -0.026180334389209747, -0.17259705066680908, -0.10422050207853317, -0.1347656100988388, -0.012701659463346004, -0.034971047192811966, 0.04591470584273338, 0.023234914988279343, -0.0003200018545612693, -0.014577031135559082, -0.12090865522623062, 0.04360328987240791, 0.11146783083677292, -0.04631396010518074, -0.026193076744675636 ]
null
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
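The "How to Get Started with the Model" section of this card is an unfilled template. A minimal, hypothetical loading sketch for the `Ruiz3/phi-2-kingshipAIv2` repository follows, assuming standard 🤗 Transformers usage; the `custom_code` tag suggests `trust_remote_code=True` is required, and the prompt format shown is a guess for illustration only.

```python
# Hypothetical quick-start for this repo (not taken from the card). The `custom_code`
# tag suggests the model ships custom modeling code, hence trust_remote_code=True.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ruiz3/phi-2-kingshipAIv2"  # repo id taken from this card's metadata

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True, torch_dtype="auto")

prompt = "Instruct: Briefly describe what this model does.\nOutput:"  # prompt format is an assumption
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```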
{"library_name": "transformers", "tags": []}
text-generation
Ruiz3/phi-2-kingshipAIv2
[ "transformers", "safetensors", "phi", "text-generation", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2024-02-13T22:15:46+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #phi #text-generation #custom_code #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #phi #text-generation #custom_code #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 51, 6, 3, 82, 28, 3, 4, 9, 9, 10, 42, 20, 3, 4, 5, 9, 11, 13, 3, 12, 5, 4, 5, 3, 4, 9, 53, 9, 8, 6, 3, 14, 8, 7, 9, 4 ]
[ "passage: TAGS\n#transformers #safetensors #phi #text-generation #custom_code #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
[ -0.07441530376672745, 0.13596796989440918, -0.0039025098085403442, 0.027505790814757347, 0.12266930937767029, 0.005486504640430212, 0.0640522912144661, 0.10635033994913101, -0.02424517273902893, 0.12324316799640656, 0.022558843716979027, 0.10996841639280319, 0.10686804354190826, 0.18769802153110504, -0.005247652996331453, -0.20407868921756744, 0.052618369460105896, -0.1339786946773529, -0.005346748046576977, 0.12320936471223831, 0.12734514474868774, -0.11981219798326492, 0.07206547260284424, -0.04122542217373848, -0.006963358726352453, -0.03489234298467636, -0.05797455459833145, -0.048964742571115494, 0.06430605798959732, 0.06026479974389076, 0.0595458559691906, 0.01642736792564392, 0.09416768699884415, -0.2770709991455078, 0.02226952090859413, 0.08239509910345078, 0.005618265364319086, 0.06627916544675827, 0.07169265300035477, -0.07693344354629517, 0.08495151996612549, -0.0664825439453125, 0.14687298238277435, 0.07982858270406723, -0.09438081085681915, -0.1880333423614502, -0.09057903289794922, 0.09343822300434113, 0.18868158757686615, 0.059760916978120804, -0.030585795640945435, 0.13118816912174225, -0.06512762606143951, 0.01845625787973404, 0.06895748525857925, -0.07347869127988815, -0.05359777435660362, 0.06372594088315964, 0.0695996955037117, 0.09846188873052597, -0.12773703038692474, -0.009679428301751614, 0.0320039726793766, 0.013185882940888405, 0.10711356997489929, 0.015842726454138756, 0.12049184739589691, 0.03668152913451195, -0.14256839454174042, -0.048429738730192184, 0.08701564371585846, 0.036932192742824554, -0.0556931346654892, -0.24876362085342407, -0.020193742588162422, -0.038236599415540695, -0.035524725914001465, -0.04393884912133217, 0.045244693756103516, -0.02178187482059002, 0.08087658137083054, -0.0036660772748291492, -0.06963636726140976, -0.05113578215241432, 0.08353596180677414, 0.07143381237983704, 0.028143590316176414, -0.026850108057260513, 0.01174293551594019, 0.11898057907819748, 0.11549804359674454, -0.11481842398643494, -0.051060616970062256, -0.06272978335618973, -0.08522246032953262, -0.04741254821419716, 0.03236919641494751, 0.04775122553110123, 0.05512697622179985, 0.21314972639083862, -0.0013204539427533746, 0.04992509260773659, 0.03274988383054733, 0.01066634152084589, 0.06731437146663666, 0.08677016198635101, -0.06419411301612854, -0.13046035170555115, -0.02145533822476864, 0.11218346655368805, 0.01231331005692482, -0.0314481221139431, -0.038787633180618286, 0.06697112321853638, 0.030200589448213577, 0.12535712122917175, 0.07337819784879684, 0.02016271837055683, -0.07914318144321442, -0.06071794033050537, 0.17465178668498993, -0.16488726437091827, 0.031738489866256714, 0.025622278451919556, -0.050521157681941986, -0.018783841282129288, 0.01899137534201145, 0.016399497166275978, -0.02034466527402401, 0.0874326080083847, -0.057896632701158524, -0.03757679834961891, -0.11592794209718704, -0.05162535607814789, 0.026356054469943047, 0.005852686706930399, -0.030844759196043015, -0.04131776839494705, -0.11982329934835434, -0.07797703891992569, 0.07937561720609665, -0.06747156381607056, -0.04716470465064049, -0.03360702842473984, -0.07821470499038696, 0.012420105747878551, 0.0004091960145160556, 0.11594925820827484, -0.030400289222598076, 0.05072459205985069, -0.05188135802745819, 0.07153815776109695, 0.14497826993465424, 0.027334017679095268, -0.06379645317792892, 0.056507088243961334, -0.23278117179870605, 0.10387995839118958, -0.07912862300872803, 0.04115021973848343, -0.16289231181144714, -0.01469900831580162, 0.03982052579522133, 0.02627355046570301, 
-0.006934128236025572, 0.1390630006790161, -0.19293013215065002, -0.03536631911993027, 0.18081475794315338, -0.11522960662841797, -0.08808764070272446, 0.052698906511068344, -0.054467298090457916, 0.12280778586864471, 0.050916753709316254, -0.02316596917808056, 0.030826477333903313, -0.1417923867702484, -0.01687287911772728, -0.06468956172466278, -0.025315633043646812, 0.15359161794185638, 0.05711430311203003, -0.051229238510131836, 0.052433162927627563, 0.020726939663290977, -0.021120568737387657, -0.04788779839873314, -0.03371148556470871, -0.09498532861471176, 0.009590674191713333, -0.0753822848200798, 0.01856466382741928, -0.029109936207532883, -0.09291869401931763, -0.035687901079654694, -0.15461255609989166, 0.005384957883507013, 0.09633282572031021, -0.0055311573669314384, -0.025023676455020905, -0.10565053671598434, -0.004781804513186216, 0.016855718567967415, -0.00016373902326449752, -0.15187151730060577, -0.05655598267912865, 0.019132478162646294, -0.16505438089370728, 0.02767706662416458, -0.047599732875823975, 0.045580726116895676, 0.04182368889451027, -0.03936924785375595, -0.03521854057908058, 0.018329832702875137, 0.020268244668841362, -0.01465248316526413, -0.2745935916900635, -0.018261034041643143, -0.04086336866021156, 0.17035499215126038, -0.2475264072418213, 0.04439546912908554, 0.059322256594896317, 0.1307375431060791, 0.01173730194568634, -0.03705551475286484, 0.03161298856139183, -0.06255152821540833, -0.033266592770814896, -0.0667189359664917, -0.009362515062093735, -0.03631554916501045, -0.03938153013586998, 0.03835300728678703, -0.17000539600849152, -0.03406575322151184, 0.11603987962007523, 0.04540814831852913, -0.15477602183818817, -0.05056281015276909, -0.04004296287894249, -0.05738358572125435, -0.07204438000917435, -0.05216284841299057, 0.09743187576532364, 0.05571887642145157, 0.05545826256275177, -0.05951985344290733, -0.061445724219083786, 0.009554018266499043, -0.02068808674812317, -0.01867086999118328, 0.08103828877210617, 0.07077015191316605, -0.11522045731544495, 0.09927672892808914, 0.08799781650304794, 0.08139653503894806, 0.10005104541778564, 0.0010666352463886142, -0.09291157126426697, -0.02339044213294983, 0.027750657871365547, 0.014336124062538147, 0.14687193930149078, -0.04000900313258171, 0.0429966077208519, 0.04120675474405289, -0.01584675721824169, 0.008143718354403973, -0.09446796029806137, 0.02997143194079399, 0.02818182110786438, -0.010246568359434605, 0.037614114582538605, -0.056816551834344864, 0.019303709268569946, 0.10318583995103836, 0.03345242142677307, 0.04412994161248207, 0.009559271857142448, -0.04857930168509483, -0.11974377185106277, 0.1767151653766632, -0.11110951006412506, -0.23173309862613678, -0.12149907648563385, -0.01399032212793827, 0.02910485304892063, -0.011633564718067646, 0.02006695233285427, -0.06405475735664368, -0.1171390563249588, -0.09921693801879883, 0.045627593994140625, 0.07059439271688461, -0.08641253411769867, -0.06378761678934097, 0.06134819984436035, 0.04600827768445015, -0.13516096770763397, 0.02277415804564953, 0.03902184218168259, -0.08909334987401962, 0.007867258042097092, 0.07907920330762863, 0.07120607048273087, 0.17945720255374908, 0.012182043865323067, -0.024257373064756393, 0.019671371206641197, 0.20478034019470215, -0.13766378164291382, 0.10145840793848038, 0.14393620193004608, -0.06286554783582687, 0.08066798746585846, 0.20545852184295654, 0.036268092691898346, -0.1057758554816246, 0.044006094336509705, 0.03648979216814041, -0.02651887945830822, -0.24340394139289856, -0.08019697666168213, 
0.004161624237895012, -0.06197261065244675, 0.08161719888448715, 0.08306818455457687, 0.09198566526174545, 0.02785661816596985, -0.1081320270895958, -0.06691340357065201, 0.05036139488220215, 0.11249116063117981, -0.008557078428566456, -0.007815919816493988, 0.09523359686136246, -0.02225065603852272, 0.029176659882068634, 0.09147068858146667, 0.01374965999275446, 0.18282483518123627, 0.045852772891521454, 0.14848409593105316, 0.09157159924507141, 0.059395745396614075, 0.01233623269945383, 0.01314469799399376, 0.019094863906502724, 0.026712998747825623, -0.015145753510296345, -0.08685000985860825, -0.012303872965276241, 0.1268911212682724, 0.010885220021009445, 0.04597875103354454, 0.0076150596141815186, -0.04230163246393204, 0.08450151234865189, 0.17545753717422485, 0.01328173466026783, -0.21406996250152588, -0.06688741594552994, 0.06981010735034943, -0.08051439374685287, -0.10911136865615845, -0.024429909884929657, 0.03406251221895218, -0.18049609661102295, 0.02387341856956482, -0.025180401280522346, 0.10069414228200912, -0.12370731681585312, -0.018827902153134346, 0.052628833800554276, 0.07052139192819595, -0.018700284883379936, 0.06386490911245346, -0.17778280377388, 0.13549263775348663, 0.013200430199503899, 0.07557245343923569, -0.09068016707897186, 0.08482389152050018, 0.0111467270180583, -0.002043683547526598, 0.1468254029750824, -0.0010637122904881835, -0.05408002436161041, -0.11050406098365784, -0.0906725600361824, -0.011339336633682251, 0.11465787142515182, -0.12593887746334076, 0.10165182501077652, -0.016582757234573364, -0.044178079813718796, -0.0030248011462390423, -0.12813955545425415, -0.14044401049613953, -0.17314541339874268, 0.04187968373298645, -0.13014033436775208, 0.0451013408601284, -0.10672678053379059, -0.05035872012376785, -0.05017208307981491, 0.19719818234443665, -0.21763156354427338, -0.07621806859970093, -0.15351133048534393, -0.06420157849788666, 0.11623851954936981, -0.04613782465457916, 0.08647869527339935, 0.012962628155946732, 0.18781377375125885, 0.014061033725738525, -0.015962716192007065, 0.10993410646915436, -0.10395599156618118, -0.21440888941287994, -0.10220180451869965, 0.13403694331645966, 0.13545545935630798, 0.03708728775382042, 0.00035940390080213547, 0.03232092037796974, -0.007850716821849346, -0.11358384788036346, 0.023570599034428596, 0.18197759985923767, 0.11685380339622498, 0.037179335951805115, -0.034665536135435104, -0.13531899452209473, -0.0839521661400795, -0.042324043810367584, 0.008525622077286243, 0.18976294994354248, -0.06857912987470627, 0.1652597337961197, 0.15934355556964874, -0.055173277854919434, -0.21036414802074432, 0.0313970185816288, 0.033629804849624634, 0.0021239151246845722, 0.05604655668139458, -0.20132838189601898, 0.0957157164812088, 0.00788893923163414, -0.057729288935661316, 0.12271249294281006, -0.18383583426475525, -0.14666838943958282, 0.0679788589477539, 0.07568002492189407, -0.18666845560073853, -0.12836617231369019, -0.09530680626630783, -0.04426150023937225, -0.1240062341094017, 0.0767902210354805, -0.019116053357720375, 0.009703016839921474, 0.03049294650554657, 0.017553992569446564, 0.010632803663611412, -0.04766656085848808, 0.18440434336662292, -0.005318623501807451, 0.050052255392074585, -0.07833196222782135, -0.05977580323815346, 0.04439995810389519, -0.06766178458929062, 0.07768969982862473, -0.011583259329199791, 0.012072126381099224, -0.10826653987169266, -0.05835650488734245, -0.03404201939702034, 0.024099772796034813, -0.08059826493263245, -0.09612218290567398, -0.037487708032131195, 
0.09951330721378326, 0.09140417724847794, -0.03928857669234276, -0.06511175632476807, -0.08731205761432648, 0.032564677298069, 0.21537625789642334, 0.17581914365291595, 0.05872897058725357, -0.06627403944730759, -0.004332480486482382, -0.013938636519014835, 0.0518467053771019, -0.20769350230693817, 0.054770588874816895, 0.037577200680971146, 0.03502080589532852, 0.11540760844945908, -0.02692747861146927, -0.15991008281707764, -0.04947725683450699, 0.054892536252737045, -0.07749488949775696, -0.16381342709064484, 0.014566776342689991, 0.0698530450463295, -0.15249542891979218, -0.023638471961021423, 0.04465564340353012, -0.019799569621682167, -0.033337272703647614, 0.003100043162703514, 0.08220522850751877, 0.016223285347223282, 0.09557998180389404, 0.05498101934790611, 0.09563100337982178, -0.10772447288036346, 0.06952618062496185, 0.07929857820272446, -0.10225345939397812, 0.03691693767905235, 0.06498466432094574, -0.07187005877494812, -0.035860974341630936, 0.04298849776387215, 0.09064827114343643, 0.03834117203950882, -0.05791795626282692, 0.006768029183149338, -0.1019822508096695, 0.058791015297174454, 0.11939579248428345, 0.043310075998306274, 0.008954020217061043, 0.03660706803202629, 0.039979103952646255, -0.09563424438238144, 0.12395429611206055, 0.04702746868133545, 0.03306521847844124, -0.05115185305476189, -0.030670443549752235, 0.033765602856874466, -0.03013032302260399, -0.016100779175758362, -0.04014170542359352, -0.06690575927495956, -0.012527769431471825, -0.17551913857460022, 0.004580955021083355, -0.05460330843925476, 0.004814730025827885, 0.01808975264430046, -0.030708830803632736, 0.006335208658128977, 0.01802607998251915, -0.07054036855697632, -0.05601222440600395, -0.007826367393136024, 0.10339196026325226, -0.17446856200695038, 0.014777671545743942, 0.07612703740596771, -0.12507562339305878, 0.0856013149023056, 0.019583139568567276, 0.0037027799990028143, 0.030126722529530525, -0.12974882125854492, 0.04562569037079811, -0.008532330393791199, 0.0110886599868536, 0.04916682839393616, -0.2144635170698166, -0.00006298656080616638, -0.048669952899217606, -0.06047982722520828, -0.008075869642198086, -0.022350680083036423, -0.11814062297344208, 0.10634202510118484, 0.011746841482818127, -0.07358573377132416, -0.025499174371361732, 0.039712198078632355, 0.09615115076303482, -0.03705960139632225, 0.15897074341773987, -0.017435546964406967, 0.06202258542180061, -0.18289825320243835, -0.022887926548719406, -0.01893971487879753, 0.02133350446820259, -0.041365109384059906, -0.009923189878463745, 0.053397808223962784, -0.0207329411059618, 0.20566853880882263, -0.017383437603712082, 0.03804474696516991, 0.06314582377672195, -0.013027135282754898, -0.014618457295000553, 0.10731750726699829, 0.048464275896549225, 0.011181623674929142, 0.02981831505894661, 0.010703807696700096, -0.03417062386870384, -0.005388245452195406, -0.1523161679506302, 0.077239029109478, 0.16865576803684235, 0.08118049800395966, -0.008775852620601654, 0.054247766733169556, -0.11114349961280823, -0.11427582800388336, 0.0913289338350296, -0.056236591190099716, -0.013130106031894684, -0.05957726389169693, 0.14178913831710815, 0.15433260798454285, -0.1893889456987381, 0.06022936478257179, -0.06838760524988174, -0.04934924468398094, -0.10593041032552719, -0.172664612531662, -0.05883026123046875, -0.05516333505511284, -0.0200477484613657, -0.055225640535354614, 0.065329410135746, 0.09025082737207413, 0.016052771359682083, 0.013859162107110023, 0.08703679591417313, -0.017933277413249016, 0.0062139625661075115, 
0.030186165124177933, 0.06423898041248322, 0.011570695787668228, -0.04630725830793381, 0.008485673926770687, -0.0025986286345869303, 0.032309141010046005, 0.051715679466724396, 0.03794151917099953, -0.025615144520998, 0.009215989150106907, -0.028480391949415207, -0.11046841740608215, 0.039721209555864334, -0.025451408699154854, -0.0637175589799881, 0.14837713539600372, 0.02840440161526203, -0.0067321667447686195, -0.023941997438669205, 0.2540489137172699, -0.07580790668725967, -0.08472823351621628, -0.1372121125459671, 0.14538848400115967, -0.03086796961724758, 0.06126725301146507, 0.039360348135232925, -0.11396372318267822, 0.03298129141330719, 0.1398884356021881, 0.14529621601104736, -0.051987502723932266, 0.017118316143751144, 0.014656228013336658, 0.0029644023161381483, -0.038789160549640656, 0.05346855893731117, 0.06594305485486984, 0.1253807246685028, -0.05399900674819946, 0.08479224145412445, -0.004516707267612219, -0.10068650543689728, -0.03351442143321037, 0.1215558722615242, -0.0051100910641252995, 0.021343640983104706, -0.07243939489126205, 0.12681175768375397, -0.04128829762339592, -0.26241305470466614, 0.06438590586185455, -0.06042512133717537, -0.14796359837055206, -0.0250939279794693, 0.042521025985479355, -0.005583211313933134, 0.02851193957030773, 0.06835387647151947, -0.06482896208763123, 0.1890970915555954, 0.037300046533346176, -0.05160627141594887, -0.06661834567785263, 0.07295969873666763, -0.10394251346588135, 0.2989484369754791, 0.006831952836364508, 0.056051596999168396, 0.10134656727313995, -0.031050942838191986, -0.14268989861011505, 0.031161298975348473, 0.08590974658727646, -0.06741449981927872, 0.055566806346178055, 0.2119295448064804, -0.00888585951179266, 0.10927559435367584, 0.07252391427755356, -0.08921248465776443, 0.04874938353896141, -0.10592100024223328, -0.09221301972866058, -0.08717742562294006, 0.09235067665576935, -0.0555795282125473, 0.14914315938949585, 0.12058565765619278, -0.04728509485721588, 0.021935351192951202, -0.021421335637569427, 0.050473760813474655, 0.004591043572872877, 0.12314941734075546, 0.022128529846668243, -0.19706779718399048, 0.026956038549542427, -0.0006028058123774827, 0.10154236108064651, -0.21685825288295746, -0.09420528262853622, 0.04897189885377884, 0.00334720266982913, -0.06139703094959259, 0.1250954121351242, 0.05242267996072769, 0.041330695152282715, -0.046910692006349564, -0.030460618436336517, -0.00651969201862812, 0.1657109558582306, -0.10882015526294708, -0.004513995256274939 ]
null
null
null
Compiled with `runwayml/stable-diffusion-inpainting` for Neuronx devices.
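As a usage sketch for the record above (not part of the original card): optimum-neuron exposes Neuron-compiled Stable Diffusion pipelines, and the snippet below assumes a `NeuronStableDiffusionInpaintPipeline` class that can load this pre-compiled checkpoint directly; class and argument names may differ between optimum-neuron versions, and a Neuron (Inferentia/Trainium) device is required.

```python
# Hypothetical sketch: loading the pre-compiled Neuronx inpainting checkpoint with
# optimum-neuron. Class/argument names are assumptions about the optimum-neuron API.
from optimum.neuron import NeuronStableDiffusionInpaintPipeline
from PIL import Image, ImageDraw

# Load the already-compiled artifacts; no recompilation should be needed on a Neuron device.
pipeline = NeuronStableDiffusionInpaintPipeline.from_pretrained(
    "Jingya/stable-diffusion-inpainting-neuronx"
)

# Self-contained example inputs: a blank 512x512 image and a mask whose white
# rectangle marks the region to be inpainted.
image = Image.new("RGB", (512, 512), "white")
mask = Image.new("RGB", (512, 512), "black")
ImageDraw.Draw(mask).rectangle([128, 128, 384, 384], fill="white")

result = pipeline(
    prompt="a red brick fireplace",
    image=image,
    mask_image=mask,
).images[0]
result.save("inpainted.png")
```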
{"license": "creativeml-openrail-m"}
null
Jingya/stable-diffusion-inpainting-neuronx
[ "license:creativeml-openrail-m", "region:us" ]
2024-02-13T22:19:19+00:00
[]
[]
TAGS #license-creativeml-openrail-m #region-us
Compiled with 'runwayml/stable-diffusion-inpainting' for Neuronx devices.
[]
[ "TAGS\n#license-creativeml-openrail-m #region-us \n" ]
[ 18 ]
[ "passage: TAGS\n#license-creativeml-openrail-m #region-us \n" ]
[ -0.07587551325559616, 0.1441737711429596, -0.0062791393138468266, 0.012048184871673584, -0.001431003911420703, -0.022854028269648552, 0.2091037780046463, -0.018623588606715202, 0.08854977041482925, -0.11491455882787704, 0.14648450911045074, 0.18939465284347534, -0.10384178161621094, 0.0838744044303894, -0.061768148094415665, -0.13200531899929047, 0.029243366792798042, -0.07651498913764954, -0.0865340456366539, 0.028722204267978668, 0.056829702109098434, -0.01273291651159525, -0.003666024887934327, -0.0012952570104971528, -0.11045186221599579, 0.07173702865839005, -0.029841862618923187, -0.037320639938116074, 0.060927797108888626, -0.04866224527359009, 0.04899880662560463, 0.11812204867601395, -0.033462416380643845, -0.13358792662620544, 0.004443002864718437, -0.11795501410961151, -0.13281011581420898, 0.007506446447223425, 0.121794693171978, -0.0353701114654541, 0.12644833326339722, 0.17882929742336273, 0.0022871040273457766, 0.07042364031076431, -0.1692226231098175, -0.17680460214614868, -0.04340395703911781, -0.018681490793824196, -0.026622790843248367, 0.0532202385365963, 0.11296376585960388, 0.0959911122918129, -0.1474708467721939, 0.059626504778862, 0.08025065064430237, -0.29932230710983276, 0.03342466056346893, 0.23123668134212494, 0.11160528659820557, 0.03646189346909523, -0.04899992793798447, 0.06103713810443878, 0.037279851734638214, -0.055691562592983246, -0.011489230208098888, -0.07466674596071243, 0.033063821494579315, 0.1203068420290947, -0.048032116144895554, -0.025952165946364403, 0.3207513689994812, -0.011608880013227463, 0.004257023800164461, 0.03850623592734337, -0.046627260744571686, 0.03471478819847107, 0.053042974323034286, 0.07628075033426285, 0.05806995555758476, 0.1503586620092392, 0.06162842735648155, -0.11057397723197937, -0.12041215598583221, 0.018044639378786087, -0.14939343929290771, 0.16419777274131775, -0.05087574943900108, 0.0932750254869461, -0.11752020567655563, 0.018267955631017685, -0.0651155412197113, -0.03550999239087105, -0.010290741920471191, -0.14436741173267365, 0.09543514996767044, -0.00750720826908946, -0.044816359877586365, -0.06333030760288239, 0.06353012472391129, 0.134693443775177, 0.06326734274625778, -0.01916888915002346, 0.03110724687576294, 0.18312698602676392, 0.02453736774623394, -0.039170458912849426, 0.02620672434568405, 0.14288429915905, 0.03429737314581871, -0.1762668490409851, -0.0059744445607066154, -0.0644608810544014, -0.1936662793159485, -0.02320769429206848, -0.19997692108154297, 0.16352415084838867, -0.030033577233552933, -0.016221072524785995, -0.03707468882203102, 0.022218478843569756, 0.04353277385234833, 0.007484832778573036, 0.018807580694556236, -0.044244956225156784, -0.08294660598039627, -0.08514150232076645, -0.020517800003290176, 0.05681263282895088, 0.07853931933641434, 0.18057872354984283, -0.12033670395612717, 0.0023163571022450924, -0.04746192321181297, -0.002028648741543293, 0.10751507431268692, -0.1799560934305191, 0.05942503362894058, -0.10612065345048904, -0.21264076232910156, -0.0035186251625418663, 0.11188323050737381, 0.02211635187268257, 0.00010340322478441522, 0.023470120504498482, -0.042402785271406174, -0.03322858735918999, -0.06714189052581787, -0.09123854339122772, -0.07618846744298935, 0.0644230917096138, -0.15088342130184174, -0.06908489763736725, -0.27447474002838135, 0.021657612174749374, -0.11370886117219925, 0.030269425362348557, 0.09551744163036346, -0.08233252167701721, -0.11906278878450394, 0.24992190301418304, 0.07235409319400787, 0.07105377316474915, -0.037106942385435104, 
-0.02335505001246929, -0.040998950600624084, 0.07576625794172287, -0.051450882107019424, 0.006896975915879011, 0.06892602890729904, -0.05309505760669708, -0.13028347492218018, -0.018723927438259125, -0.04109232872724533, 0.13036558032035828, -0.005558064207434654, 0.30143606662750244, 0.04775548353791237, -0.18540549278259277, 0.20458267629146576, 0.13462620973587036, -0.17578788101673126, -0.3525811433792114, 0.10510481148958206, -0.08032525330781937, -0.12903624773025513, 0.02135874517261982, 0.05760384723544121, 0.08029629290103912, -0.016704760491847992, -0.03554001823067665, 0.003427563700824976, -0.061561521142721176, -0.016107140108942986, 0.031175263226032257, 0.09541988372802734, -0.08737137913703918, 0.08379733562469482, 0.03426050394773483, -0.0114505710080266, 0.14006270468235016, -0.02073829248547554, -0.0763879269361496, 0.02079492248594761, 0.04172089695930481, -0.020384199917316437, -0.056601639837026596, -0.019958069548010826, 0.024005193263292313, -0.017852509394288063, 0.10743143409490585, 0.29301881790161133, 0.0457768440246582, -0.015894168987870216, 0.050522804260253906, 0.02892244979739189, 0.031187754124403, 0.04622279107570648, 0.002081167884171009, -0.15730762481689453, 0.07284589111804962, -0.05682012811303139, -0.09314198791980743, -0.03167767822742462, -0.0017506676958873868, 0.0981268361210823, -0.05222945287823677, 0.06663653254508972, 0.04907272756099701, 0.008146014995872974, -0.0024776349309831858, 0.019724633544683456, 0.03505800664424896, 0.15693770349025726, 0.06973138451576233, -0.09330075234174728, 0.2326427847146988, -0.07795968651771545, 0.3451519012451172, 0.06519531458616257, -0.17186447978019714, 0.0015280802035704255, -0.16536928713321686, -0.08274903148412704, 0.009426575154066086, 0.06846177577972412, 0.04244798794388771, -0.06766051799058914, -0.0681324228644371, 0.1076645776629448, -0.05602144077420235, -0.05967314541339874, -0.09208252280950546, -0.06438151746988297, -0.09841792285442352, 0.11479154229164124, 0.17103825509548187, -0.17601613700389862, 0.14707137644290924, 0.31644511222839355, 0.0033473046496510506, 0.20550797879695892, -0.06598898768424988, 0.06533558666706085, -0.11870601028203964, 0.06948951631784439, -0.033792875707149506, 0.1264963299036026, -0.10152938961982727, 0.04339653253555298, 0.01719778962433338, 0.05835990980267525, 0.12580721080303192, -0.1375611275434494, -0.2047722488641739, 0.05393601953983307, 0.04846670478582382, -0.08490802347660065, 0.15654030442237854, -0.07621043175458908, 0.03958071768283844, -0.04002580791711807, -0.10932640731334686, 0.16022461652755737, -0.07396190613508224, -0.03576399013400078, 0.04601873457431793, -0.162797212600708, 0.04817049205303192, -0.13655415177345276, -0.20034807920455933, -0.03256381303071976, 0.011739566922187805, 0.09091648459434509, 0.0064963698387146, -0.045913100242614746, 0.008927296847105026, -0.1321311742067337, -0.24660253524780273, -0.10214889049530029, -0.04224977269768715, 0.1463703066110611, -0.09529456496238708, -0.08689732849597931, -0.008191614411771297, -0.027925807982683182, 0.0383632630109787, 0.0873899981379509, -0.04390016943216324, 0.15604910254478455, 0.13776685297489166, 0.03233470022678375, 0.07692384719848633, -0.0302706528455019, 0.16908830404281616, 0.07715359330177307, -0.09182680398225784, 0.09044599533081055, -0.006939579267054796, 0.07778391242027283, 0.26205286383628845, 0.13615888357162476, -0.10827198624610901, 0.0021787171717733145, -0.09298930317163467, -0.13136249780654907, -0.25473496317863464, -0.03117409534752369, 
-0.15477068722248077, 0.13437145948410034, -0.08579761534929276, 0.08686056733131409, 0.13696706295013428, 0.05041143670678139, 0.10572081059217453, 0.018525123596191406, -0.016791416332125664, 0.022843502461910248, 0.17746564745903015, -0.02853401191532612, -0.043541014194488525, -0.14404186606407166, -0.022182300686836243, 0.15260697901248932, 0.10192563384771347, 0.16757766902446747, 0.16616763174533844, 0.11930298805236816, 0.1956932544708252, 0.11704401671886444, 0.10304278880357742, 0.052189555019140244, -0.013531852513551712, -0.004093863070011139, -0.01228472962975502, -0.042497504502534866, 0.05230056867003441, 0.05571495369076729, 0.027585504576563835, -0.19872500002384186, 0.02184155583381653, -0.19329896569252014, -0.02313016541302204, -0.08243345469236374, 0.01644495315849781, 0.05239224433898926, 0.2096434086561203, 0.04210057109594345, 0.10118018835783005, 0.021744482219219208, 0.10573884844779968, 0.015865135937929153, -0.07006605714559555, -0.0065298317931592464, -0.024272896349430084, 0.09974277764558792, 0.10174193233251572, 0.021700428798794746, -0.016679642722010612, -0.09889253973960876, 0.04607788100838661, 0.17424549162387848, -0.17494839429855347, 0.3187439739704132, -0.0007240860140882432, -0.04524024948477745, -0.04190666601061821, -0.08219234645366669, 0.04142151027917862, 0.1647384762763977, 0.1017698273062706, 0.0333428718149662, -0.14635729789733887, -0.06874663382768631, -0.029922528192400932, -0.029125673696398735, 0.10087492316961288, -0.06689736992120743, -0.13817089796066284, -0.025579528883099556, 0.0344909206032753, 0.003919827751815319, 0.21354736387729645, -0.10228335112333298, -0.15175104141235352, 0.00922450888901949, 0.13133007287979126, -0.06745465099811554, -0.04906000941991806, 0.09594502300024033, -0.02669750526547432, 0.0972210094332695, -0.0541548989713192, 0.002656505908817053, -0.14727191627025604, -0.2363637089729309, 0.010592032223939896, -0.02335694245994091, 0.020698489621281624, -0.07203120738267899, -0.11125075072050095, -0.1240958720445633, -0.1789770871400833, 0.11374562233686447, -0.06521226465702057, 0.09276589751243591, -0.09726036339998245, 0.08684233576059341, -0.08414942771196365, 0.02816055528819561, -0.05099964141845703, -0.0012100528692826629, -0.09757094830274582, -0.14613427221775055, 0.024435222148895264, -0.13409870862960815, -0.001014217734336853, 0.034934982657432556, -0.11161556839942932, 0.14066044986248016, 0.13931402564048767, -0.08724056929349899, 0.17418785393238068, 0.42831170558929443, -0.05984934791922569, 0.25173598527908325, 0.2527628242969513, -0.13718484342098236, -0.2734082341194153, -0.059651490300893784, -0.23391994833946228, -0.08160211890935898, 0.1082993745803833, -0.1578003615140915, 0.015907390043139458, 0.05020333454012871, -0.11690597236156464, 0.1467704027891159, -0.32824045419692993, -0.07495500147342682, 0.09672868996858597, 0.007048844825476408, 0.4732857048511505, -0.1068139299750328, -0.12494277954101562, -0.07125994563102722, -0.10485164821147919, 0.10395017266273499, -0.07008004188537598, 0.08493339270353317, -0.030203424394130707, 0.025772906839847565, 0.011868835426867008, -0.04774972423911095, 0.14879614114761353, -0.0427577942609787, 0.19098854064941406, -0.11560776084661484, 0.0027590321842581034, 0.14695321023464203, -0.03108292631804943, 0.038532279431819916, -0.07178329676389694, 0.04545990377664566, -0.042950090020895004, -0.027814088389277458, -0.018928585574030876, 0.11621513217687607, -0.004339784849435091, -0.1380559802055359, -0.06945756077766418, 0.01972813345491886, 
-0.07362999767065048, -0.05320021137595177, 0.15675771236419678, 0.03502804413437843, 0.05609925836324692, 0.11970125883817673, 0.004991572815924883, -0.146412655711174, 0.00884049292653799, -0.07536338269710541, 0.01455683447420597, 0.04314182698726654, -0.08771193772554398, -0.050023581832647324, 0.11971840262413025, 0.021750157698988914, 0.0665673241019249, 0.06486256420612335, -0.042168524116277695, 0.02131110616028309, 0.11186312884092331, -0.12857086956501007, -0.06895474344491959, -0.017605429515242577, 0.2739332914352417, 0.20882153511047363, 0.06424131989479065, 0.011942589655518532, 0.03977527841925621, 0.08851079642772675, 0.025800030678510666, -0.024320857599377632, -0.027894796803593636, -0.07533380389213562, 0.08076632767915726, -0.026636533439159393, -0.08794095367193222, 0.1338292956352234, 0.04866079241037369, -0.0795087143778801, -0.08115667849779129, 0.10095386952161789, -0.03139214217662811, -0.0645640566945076, -0.04291141778230667, 0.16875873506069183, -0.142974391579628, -0.05379750579595566, 0.05253109708428383, -0.06923473626375198, 0.03050602227449417, 0.1983366161584854, 0.06317481398582458, 0.10652732849121094, 0.020412208512425423, -0.03693949803709984, 0.09139978885650635, -0.008889229968190193, -0.1458244025707245, 0.04242372885346413, -0.1516965925693512, -0.1209954097867012, -0.03220202773809433, 0.059742625802755356, -0.06468313187360764, -0.0443362258374691, -0.16110824048519135, 0.08512833714485168, -0.059125129133462906, -0.04787873104214668, -0.07900126278400421, -0.034204404801130295, -0.011031275615096092, -0.027199620380997658, -0.08409348875284195, 0.0068776607513427734, -0.22133535146713257, 0.051574207842350006, 0.04428314045071602, 0.017113016918301582, -0.03435007482767105, -0.08292978256940842, 0.07848229259252548, 0.04986674711108208, 0.10280575603246689, 0.03711284324526787, -0.059191394597291946, 0.0037306465674191713, -0.20414716005325317, -0.038815271109342575, 0.04232484847307205, -0.021390240639448166, 0.0267819594591856, 0.08142497390508652, -0.03312315046787262, 0.05886727198958397, -0.04134150594472885, 0.031092548742890358, -0.12302310764789581, -0.19250139594078064, -0.07369648665189743, 0.0737677738070488, -0.1768668293952942, -0.007294799666851759, -0.158339723944664, 0.12045895308256149, 0.0037357027176767588, 0.19128042459487915, 0.05877019464969635, 0.07969143241643906, 0.07085993885993958, -0.03897101804614067, 0.1005023792386055, -0.05584702640771866, -0.09622103720903397, -0.019361555576324463, -0.12480172514915466, -0.049345120787620544, 0.42032214999198914, 0.05109545961022377, -0.34862402081489563, 0.03209015727043152, 0.10416815429925919, 0.09029489010572433, 0.0010600913083180785, 0.1751212626695633, -0.02115757390856743, 0.00999172031879425, -0.09422436356544495, 0.09467131644487381, -0.0020058725494891405, -0.11290951073169708, 0.0739678293466568, 0.09658773243427277, 0.08477838337421417, -0.024424241855740547, 0.13553570210933685, -0.010457966476678848, 0.03920025750994682, -0.11343693733215332, 0.15077632665634155, 0.06773624569177628, -0.05210328474640846, 0.062154389917850494, 0.1635616272687912, 0.05306112766265869, 0.07038675248622894, 0.04032095894217491, 0.0014122785069048405, -0.1754148155450821, -0.1602102369070053, 0.02099275030195713, -0.05523645877838135, 0.07993361353874207, 0.02664482593536377, 0.06025690957903862, 0.05930217728018761, 0.08369890600442886, -0.02683570235967636, -0.012045243754982948, -0.21370548009872437, -0.059094905853271484, -0.014421275816857815, -0.06632379442453384, 
-0.06530799716711044, -0.13236206769943237, -0.007965253666043282, -0.11605394631624222, -0.1677420735359192, -0.11075370758771896, 0.06186629459261894, -0.03134578466415405, -0.07950954884290695, -0.1361609846353531, 0.005552724003791809, -0.051663242280483246, 0.0591781884431839, 0.020678075030446053, 0.14382748305797577, -0.055859338492155075, -0.007769476156681776, 0.03557850420475006, 0.17586101591587067, 0.03452156111598015, -0.019137056544423103, 0.05009777843952179, -0.11230028420686722, -0.013903132639825344, 0.09447801858186722, -0.05355257913470268, 0.03868480771780014, 0.05060523375868797, 0.14069905877113342, 0.3000718951225281, -0.15852685272693634, 0.022173447534441948, -0.0156106511130929, 0.027616411447525024, 0.03752091899514198, 0.10538272559642792, -0.047601912170648575, 0.30318450927734375, -0.03754459694027901, 0.015319152735173702, -0.05392564833164215, 0.03960913047194481, -0.0902356207370758, 0.13807453215122223, 0.07016881555318832, -0.1437612622976303, -0.11773919314146042, 0.13123241066932678, -0.2251790165901184, 0.21079330146312714, 0.05835592746734619, -0.018531115725636482, 0.0006959201418794692, -0.017787374556064606, 0.20127902925014496, -0.06664536148309708, 0.07648804783821106, -0.10087135434150696, -0.11177007853984833, -0.14956814050674438, 0.008278977125883102, -0.3149573504924774, -0.07720612734556198, 0.10045251995325089, 0.1509818434715271, 0.17898774147033691, -0.022407056763768196, 0.060840118676424026, 0.03429623693227768, 0.016734736040234566, -0.09003262221813202, 0.09443855285644531, 0.08975303173065186, -0.14206120371818542, -0.09327292442321777, -0.12793666124343872, -0.015153053216636181, -0.009946417063474655, -0.008153465576469898, 0.0022670275066047907, 0.04026666656136513, 0.12014163285493851, -0.04463301971554756, -0.05576737970113754, 0.06202622875571251, -0.09607529640197754, 0.03486022725701332, -0.03752650320529938, 0.012558498419821262, -0.07468373328447342, -0.03885192796587944, -0.04395401477813721, 0.06765811145305634, -0.2736577093601227, -0.04237256944179535, 0.10482975840568542, -0.0006625195383094251, 0.22920070588588715, 0.053381726145744324, -0.108866386115551, -0.028044672682881355, -0.11392955482006073, 0.06305203586816788, -0.12086670845746994, -0.0018355880165472627, 0.1538183093070984, 0.022182224318385124, 0.03804173693060875, -0.16429899632930756, 0.040075428783893585, -0.10011276602745056, -0.03175477311015129, -0.06921384483575821 ]
null
null
transformers
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl) An instruct-based fine-tune of [migtissera/Tess-10.7B-v1.5b](https://huggingface.co/migtissera/Tess-10.7B-v1.5b). It works well with long system prompts. It is not a general-purpose model: it should not be used for storytelling, for example, but for reasoning and text comprehension. This model is trained on a private dataset. The high GSM8K score is **NOT** due to the MetaMath dataset. # Prompt Format: ``` SYSTEM: <ANY SYSTEM CONTEXT> USER: ASSISTANT: ```
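A minimal usage sketch for the prompt format described in the card above (not taken from the card itself): it assembles the SYSTEM/USER/ASSISTANT template and generates with the standard transformers API. The exact whitespace between the role markers, the generation settings, and the `build_prompt` helper are illustrative assumptions.

```python
# Sketch of the SYSTEM/USER/ASSISTANT prompt format with transformers (assumed usage).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mihaiii/Bucharest-0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

def build_prompt(system: str, user: str) -> str:
    # Follows the card's stated format; the trailing "ASSISTANT:" cues the model to answer.
    # Newline separators between roles are an assumption.
    return f"SYSTEM: {system}\nUSER: {user}\nASSISTANT:"

prompt = build_prompt(
    system="You answer strictly based on the provided text.",
    user="Summarize the main argument of the passage in one sentence.",
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt itself.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```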
{"license": "apache-2.0", "metrics": ["accuracy"], "base_model": "migtissera/Tess-10.7B-v1.5b", "inference": false}
text-generation
Mihaiii/Bucharest-0.1
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "base_model:migtissera/Tess-10.7B-v1.5b", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "region:us" ]
2024-02-13T22:20:37+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #conversational #base_model-migtissera/Tess-10.7B-v1.5b #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
<img src="URL" alt="Built with Axolotl" width="200" height="32"/> An instruct-based fine-tune of migtissera/Tess-10.7B-v1.5b. It works well with long system prompts. It is not a general-purpose model: it should not be used for storytelling, for example, but for reasoning and text comprehension. This model is trained on a private dataset. The high GSM8K score is NOT due to the MetaMath dataset. # Prompt Format:
[ "# Prompt Format:" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #base_model-migtissera/Tess-10.7B-v1.5b #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n", "# Prompt Format:" ]
[ 72, 6 ]
[ "passage: TAGS\n#transformers #safetensors #llama #text-generation #conversational #base_model-migtissera/Tess-10.7B-v1.5b #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n# Prompt Format:" ]
[ -0.0667615607380867, 0.0492016039788723, -0.005820530466735363, 0.009457980282604694, 0.0832534059882164, -0.026303747668862343, 0.21226821839809418, 0.09660342335700989, -0.06850859522819519, -0.04509070888161659, 0.12288939207792282, 0.12926027178764343, 0.0017344342777505517, 0.1447494924068451, -0.08972685784101486, -0.1547650843858719, 0.0960673913359642, -0.03715558722615242, 0.023672012612223625, 0.087949737906456, 0.13172438740730286, -0.011691070161759853, 0.07227994501590729, -0.05237778276205063, -0.022380467504262924, 0.03776712343096733, 0.0503731369972229, -0.10288827866315842, 0.07476606965065002, 0.06246991828083992, 0.03417700529098511, 0.08527399599552155, 0.013927011750638485, -0.19002848863601685, 0.028932495042681694, 0.009844033047556877, -0.06935075670480728, 0.03486566245555878, 0.00843476876616478, -0.025301393121480942, 0.07541759312152863, -0.007611300330609083, -0.03111089952290058, 0.07470235973596573, -0.09449689090251923, -0.04125212877988815, -0.060989491641521454, 0.023644549772143364, 0.10487734526395798, 0.07879094779491425, 0.0041889771819114685, 0.11355999112129211, -0.019846992567181587, 0.08104651421308517, 0.0991344228386879, -0.3066892921924591, -0.01569933444261551, 0.10184647142887115, 0.04592043161392212, 0.1578555852174759, -0.01863047480583191, 0.0606587752699852, 0.07732006907463074, -0.03215279430150986, 0.024802522733807564, -0.06037675961852074, -0.10203588008880615, 0.022780777886509895, -0.0915650948882103, -0.014048561453819275, 0.34758779406547546, 0.00989597849547863, -0.017053445801138878, 0.005966647062450647, -0.07146203517913818, 0.042726997286081314, -0.036764826625585556, 0.04303533211350441, 0.03125224635004997, 0.09017525613307953, 0.06411805003881454, -0.08603981882333755, -0.12773708999156952, -0.060160230845212936, -0.14144140481948853, 0.05753654986619949, 0.004059222061187029, 0.05262307450175285, -0.12190323323011398, 0.01686909981071949, -0.0481673888862133, -0.11987601220607758, -0.008841149508953094, -0.07575637847185135, 0.08735644072294235, 0.023458603769540787, -0.03845939040184021, -0.09798724204301834, 0.1706831455230713, 0.17410439252853394, 0.041415195912122726, 0.03113267756998539, -0.09582138061523438, 0.06120609492063522, -0.04181018844246864, 0.04071084037423134, 0.01679825410246849, -0.03807198256254196, 0.13317830860614777, 0.03196816146373749, 0.12115273624658585, -0.012366382405161858, -0.16674871742725372, 0.024555139243602753, 0.003297380870208144, 0.1332901269197464, 0.06173960864543915, 0.0927991271018982, -0.01822093315422535, 0.028430156409740448, 0.12249957025051117, -0.09443935751914978, -0.05023004859685898, -0.0051699429750442505, 0.022342579439282417, 0.008058344013988972, 0.053695231676101685, 0.03944159299135208, -0.025021139532327652, -0.06735316663980484, -0.047416072338819504, -0.046156492084264755, -0.039734624326229095, -0.0464618019759655, 0.03855373337864876, -0.06109074130654335, 0.004771757870912552, -0.17044319212436676, -0.2648060917854309, 0.01283093448728323, 0.03761496767401695, -0.012894638814032078, -0.07302844524383545, -0.033551912754774094, -0.034687578678131104, 0.01898358203470707, -0.06541788578033447, -0.0503978505730629, -0.09223446249961853, 0.0654548853635788, -0.02341398596763611, 0.05550781264901161, -0.19468730688095093, 0.041393209248781204, -0.1067432090640068, 0.006827783305197954, -0.05774137005209923, 0.027297263965010643, -0.1021437868475914, 0.16556838154792786, -0.06161380559206009, 0.04077814146876335, -0.010071425698697567, 0.02167360857129097, 
-0.01723497174680233, 0.177653968334198, -0.18887780606746674, -0.010863961651921272, 0.17842544615268707, -0.15790782868862152, -0.19562146067619324, 0.08813022077083588, 0.036951757967472076, 0.0674525797367096, 0.10245611518621445, 0.17253142595291138, 0.03974981978535652, -0.06828606128692627, 0.08999764919281006, 0.0933963879942894, -0.025508154183626175, -0.12295679748058319, 0.057601988315582275, -0.04836403205990791, -0.14371879398822784, 0.052878543734550476, -0.01697651855647564, 0.07486513257026672, 0.016354769468307495, -0.08225110918283463, -0.06633041054010391, -0.08928040415048599, -0.018969304859638214, -0.05630774050951004, 0.00877184234559536, -0.08427271991968155, -0.005430325400084257, 0.013226861134171486, 0.06253113597631454, 0.020036088302731514, -0.00035507086431607604, -0.06467902660369873, 0.06266552954912186, -0.01752348057925701, 0.06451866775751114, -0.08423545956611633, -0.04767867550253868, -0.019720571115612984, 0.05521466210484505, 0.05877422168850899, 0.01129127386957407, 0.04891545698046684, -0.01307392306625843, -0.032440632581710815, -0.017540613189339638, 0.12015931308269501, 0.04161373898386955, -0.0325523316860199, -0.15326917171478271, 0.10069335252046585, -0.03785727173089981, 0.1120515912771225, -0.08171103894710541, 0.06233591586351395, -0.001515827840194106, 0.048269208520650864, 0.008486802689731121, 0.0718991607427597, 0.03075573220849037, -0.01933092623949051, -0.054664645344018936, -0.004920435603708029, 0.07616341859102249, 0.044075582176446915, -0.08304447680711746, 0.17303265631198883, -0.16809521615505219, 0.17718751728534698, 0.1830892711877823, -0.053329963237047195, 0.0672716572880745, -0.06440312415361404, 0.004117555916309357, 0.006375542376190424, 0.01725166104733944, -0.029144395142793655, -0.06810282170772552, -0.023921646177768707, 0.13922525942325592, -0.08995883911848068, 0.024277323856949806, -0.030226178467273712, -0.09610135853290558, -0.03148989379405975, 0.04611137881875038, 0.13965949416160583, -0.12359024584293365, 0.14076389372348785, 0.2744406759738922, -0.05968451499938965, 0.16629481315612793, -0.07826003432273865, -0.01959741674363613, 0.03112717904150486, 0.029360584914684296, 0.01139301247894764, 0.03514685854315758, -0.0725974440574646, 0.009150093421339989, 0.06436678767204285, 0.011256769299507141, 0.05925733223557472, -0.12017325311899185, -0.03813193365931511, -0.015018261037766933, -0.08150938898324966, 0.0009838978294283152, 0.02637113630771637, -0.03382857143878937, 0.1125638410449028, -0.0729447677731514, -0.08112064749002457, 0.06444236636161804, -0.0029733648989349604, -0.09096300601959229, 0.14989474415779114, -0.16414055228233337, -0.2094060182571411, -0.17010006308555603, -0.07492123544216156, -0.12305452674627304, 0.048035554587841034, 0.1272132247686386, -0.06392382085323334, -0.06527968496084213, -0.11073275655508041, -0.0012270922306925058, 0.060171011835336685, 0.0026205533649772406, -0.03321259841322899, 0.06188766658306122, 0.039951056241989136, -0.14063549041748047, -0.026298563927412033, 0.019183989614248276, -0.05959146097302437, 0.08417683839797974, -0.10241653770208359, 0.11120116710662842, 0.11055506765842438, 0.043524689972400665, -0.004277043044567108, -0.03438449651002884, 0.14587050676345825, -0.009748073294758797, 0.05508562549948692, 0.22224508225917816, 0.013213586062192917, 0.051645271480083466, 0.20982638001441956, 0.005477667786180973, -0.08577021211385727, 0.045732077211141586, -0.03411991894245148, -0.05770830437541008, -0.24539126455783844, -0.09791956841945648, 
-0.07684389501810074, 0.07248588651418686, 0.009598531760275364, 0.0696263313293457, 0.1477127969264984, 0.09115298837423325, -0.06468984484672546, 0.01826886273920536, 0.08616035431623459, 0.0813201442360878, 0.21068477630615234, -0.02505980245769024, 0.12948471307754517, -0.11114134639501572, -0.041467756032943726, 0.12198812514543533, 0.03180159628391266, 0.12506850063800812, 0.11781278252601624, 0.10692165791988373, 0.07706333696842194, 0.060735661536455154, 0.07706188410520554, 0.10518407821655273, 0.029941746965050697, -0.015622164122760296, -0.0536126047372818, -0.08164680004119873, -0.025674663484096527, 0.0657922625541687, -0.11142819374799728, -0.07183738052845001, 0.0023527902085334063, -0.011875869706273079, 0.08690844476222992, 0.13079726696014404, 0.025948891416192055, -0.18194083869457245, 0.0184720978140831, 0.1184418722987175, 0.04635327681899071, -0.058172523975372314, 0.09430422633886337, -0.009970699436962605, 0.006994126830250025, 0.1069587990641594, 0.013259246945381165, 0.12724830210208893, 0.03328077122569084, 0.04165519401431084, -0.11655936390161514, 0.03325515612959862, 0.005277303513139486, 0.11643866449594498, -0.30314141511917114, 0.1928418129682541, 0.027279319241642952, 0.0050629316829144955, -0.03560766205191612, 0.01813984289765358, 0.0611964650452137, 0.219783753156662, 0.07941992580890656, -0.0047753495164215565, -0.08889753371477127, 0.01757178083062172, -0.08586373925209045, 0.05729738250374794, 0.04406236857175827, 0.028443533927202225, 0.005903454497456551, -0.05087564140558243, -0.018348515033721924, 0.017288824543356895, -0.017828693613409996, -0.1543223112821579, -0.14135712385177612, 0.02261722832918167, 0.14534009993076324, 0.08568084239959717, -0.084896519780159, 0.011265935376286507, -0.09592677652835846, 0.15376612544059753, -0.027444632723927498, -0.07461334764957428, -0.09205807000398636, -0.14226296544075012, 0.03247056528925896, -0.021434089168906212, 0.05520476773381233, -0.06949447095394135, 0.041975244879722595, -0.05020643025636673, -0.1539601981639862, 0.09926464408636093, -0.1273404210805893, -0.027598252519965172, -0.04579196125268936, 0.11798780411481857, -0.1030544564127922, -0.01273755356669426, 0.0345345102250576, 0.009853051975369453, -0.05537281930446625, -0.11659607291221619, -0.025196272879838943, 0.06052599474787712, -0.04499102756381035, 0.010884511284530163, -0.14632302522659302, -0.20027562975883484, -0.01653350330889225, -0.06948523223400116, 0.1833057403564453, 0.27016839385032654, -0.0315040722489357, 0.12052420526742935, 0.2116612195968628, -0.08003512024879456, -0.28749004006385803, -0.12836797535419464, -0.11941742151975632, -0.06740547716617584, -0.02364007756114006, -0.12820754945278168, 0.06323830038309097, 0.06627941876649857, -0.07269101589918137, 0.08624351024627686, -0.19305437803268433, -0.10786213725805283, 0.19673633575439453, 0.042226627469062805, 0.24839472770690918, -0.21482205390930176, -0.11261734366416931, -0.1454024314880371, -0.07696432620286942, 0.12207481265068054, -0.20665915310382843, 0.06223846226930618, 0.020409753546118736, -0.030911283567547798, -0.009877045638859272, -0.01631244644522667, 0.15445595979690552, -0.049792125821113586, 0.05481055751442909, -0.11701096594333649, 0.079950712621212, 0.13287971913814545, -0.021422740072011948, 0.11282487958669662, -0.21738342940807343, 0.04274078458547592, -0.07302132248878479, -0.022964250296354294, -0.021218452602624893, 0.05340224876999855, -0.0007234873482957482, -0.056492917239665985, -0.0075147515162825584, -0.061102546751499176, 
0.006166626699268818, -0.018161194398999214, 0.155462384223938, -0.06920675933361053, 0.10247216373682022, 0.21642564237117767, 0.08755423128604889, -0.13951261341571808, 0.09652771800756454, -0.04488880932331085, -0.08193007111549377, 0.06820577383041382, -0.17080377042293549, 0.040336236357688904, 0.026233894750475883, -0.06358291208744049, 0.1280638575553894, 0.01455137599259615, 0.010795916430652142, -0.02340371161699295, 0.14545486867427826, -0.15846948325634003, -0.04339389130473137, -0.044168245047330856, 0.16354355216026306, 0.0039930627681314945, 0.08865193277597427, 0.16048061847686768, -0.009130967780947685, 0.011998888105154037, 0.002728817518800497, 0.05790924280881882, -0.013513414189219475, 0.05963899940252304, 0.0895831435918808, 0.009855004958808422, -0.11495652049779892, 0.12304353713989258, 0.039257124066352844, -0.07396791875362396, -0.007531404495239258, 0.0941539779305458, -0.13737262785434723, -0.1321203112602234, 0.03299177065491676, 0.10634152591228485, -0.09039632976055145, -0.10006871074438095, -0.09107930213212967, -0.15139538049697876, 0.024321207776665688, 0.15865613520145416, 0.06392929702997208, -0.00010183639824390411, 0.03341728076338768, -0.05309927836060524, 0.04169714078307152, 0.0689893364906311, -0.03929172083735466, 0.03780718147754669, -0.0985812321305275, -0.056535396724939346, -0.023127563297748566, 0.007951898500323296, -0.05747132748365402, -0.006196987349539995, -0.0963079035282135, -0.007496025413274765, -0.2119990736246109, 0.041098035871982574, -0.053164608776569366, -0.0002984351012855768, 0.010556970722973347, -0.057321153581142426, -0.044076550751924515, -0.016458099707961082, -0.08620326966047287, -0.021515976637601852, -0.04038630798459053, 0.07653401792049408, -0.1292489767074585, -0.0319129079580307, 0.04604264721274376, -0.03188573941588402, 0.08212695270776749, 0.05306163802742958, -0.06792660802602768, 0.07363590598106384, -0.20191174745559692, -0.03474697098135948, 0.09469491988420486, 0.04383878409862518, -0.004978463053703308, 0.022007986903190613, -0.01860205829143524, 0.13079720735549927, 0.014634901657700539, 0.01930716633796692, 0.015630660578608513, -0.10655492544174194, 0.01708611659705639, -0.02658320590853691, -0.07790646702051163, -0.0057700141333043575, -0.08486640453338623, 0.08650428056716919, 0.0018685830291360617, 0.18639275431632996, -0.08676538616418839, -0.022674664855003357, -0.0720110833644867, 0.0381673127412796, -0.02560056746006012, -0.1313818097114563, -0.1762942522764206, -0.06337277591228485, -0.021084830164909363, -0.007815372198820114, 0.21378540992736816, 0.014353619888424873, -0.04676142334938049, 0.07153958082199097, 0.09873894602060318, 0.09228567034006119, 0.013960029929876328, 0.26284921169281006, 0.0887332633137703, 0.03282386437058449, -0.11040163040161133, -0.012386230751872063, 0.05591204762458801, -0.09134384244680405, 0.026184460148215294, 0.08786897361278534, -0.03497622162103653, 0.09507035464048386, 0.07097338885068893, 0.014710699208080769, 0.05696908384561539, -0.038030292838811874, -0.0021602946799248457, 0.06715422123670578, 0.0048540448769927025, 0.14419765770435333, 0.21381741762161255, -0.02674802392721176, -0.0069418433122336864, -0.017627215012907982, -0.026980970054864883, -0.13891726732254028, -0.12202943116426468, -0.11298098415136337, -0.16397960484027863, -0.022037072107195854, -0.0767233744263649, 0.02763640508055687, 0.05141616612672806, 0.04265535622835159, -0.010997787117958069, 0.06171019375324249, -0.032388679683208466, -0.0809764415025711, 0.025967011228203773, 
-0.05303575471043587, -0.015245763584971428, -0.02222135104238987, -0.06461866199970245, 0.022383319213986397, -0.023051073774695396, -0.048190951347351074, 0.07158275693655014, 0.04472176358103752, 0.06972743570804596, -0.10445868223905563, -0.06150073558092117, -0.02343711629509926, 0.03235926106572151, -0.01636676676571369, 0.14021536707878113, 0.03439825773239136, -0.0340581014752388, 0.11505091935396194, 0.2074914574623108, -0.04652822017669678, -0.24068953096866608, -0.06632117182016373, 0.12054569274187088, -0.02005736716091633, 0.08964470773935318, -0.054980579763650894, -0.04005076363682747, -0.003485280554741621, 0.2788628041744232, 0.2695721387863159, -0.06229916960000992, 0.019052477553486824, -0.09314917027950287, 0.02148972824215889, 0.02630120888352394, 0.08219660073518753, 0.10783253610134125, 0.11821684241294861, -0.015220877714455128, 0.0030716354958713055, -0.02996818907558918, 0.018354663625359535, -0.15759898722171783, 0.0637439712882042, -0.008683894760906696, -0.08680132031440735, 0.0034067558590322733, 0.10097310692071915, -0.0568966269493103, 0.07674241065979004, -0.1328793168067932, -0.048040926456451416, -0.04869907721877098, -0.04755163937807083, 0.16926243901252747, 0.03800574317574501, -0.018009144812822342, -0.04039900004863739, -0.028146198019385338, 0.06489887833595276, -0.03307562321424484, -0.18693524599075317, -0.08015161752700806, 0.04905511066317558, -0.061932243406772614, 0.08554902672767639, 0.008967136032879353, 0.03716901317238808, 0.06404893100261688, -0.01698748953640461, -0.07769337296485901, 0.20521694421768188, 0.013772769831120968, -0.05274898186326027, 0.06791714578866959, -0.07306820899248123, -0.029527632519602776, 0.026723287999629974, 0.047836851328611374, -0.030750853940844536, 0.031172269955277443, 0.08646542578935623, -0.130409374833107, -0.02953852340579033, 0.04979884251952171, -0.10372445732355118, 0.06555480509996414, 0.010140878148376942, -0.04328235983848572, -0.0021322076208889484, -0.050101302564144135, 0.03340009227395058, 0.022227177396416664, -0.18462032079696655, -0.004702125210314989, -0.07916490733623505, -0.05864090472459793, 0.084066241979599, 0.05860688537359238, -0.2224002182483673, -0.007419212721288204, -0.1517416387796402, 0.04410700500011444, -0.14301082491874695, 0.05957864597439766, 0.1906365156173706, 0.002866483526304364, -0.020421650260686874, -0.11243137717247009, 0.039018161594867706, 0.05933419615030289, -0.060840390622615814, -0.09305339306592941 ]