pipeline_tag | library_name | text | metadata | id | last_modified | tags | sha | created_at
---|---|---|---|---|---|---|---|---|
fill-mask
|
transformers
|
# Transformer language model for Croatian and Serbian
Trained for one epoch (3 million steps) on 28 GB of data containing Croatian and Serbian text.
Leipzig Corpus, OSCAR, srWac, hrWac, cc100-hr and cc100-sr datasets
| Model | #params | Arch. | Training data |
|--------------------------------|--------------------------------|-------|-----------------------------------|
| `Andrija/SRoBERTa-XL` | 80M | Fourth | Leipzig Corpus, OSCAR, srWac, hrWac, cc100-hr and cc100-sr (28 GB of text) |
|
{"language": ["hr", "sr", "multilingual"], "license": "apache-2.0", "tags": ["masked-lm"], "datasets": ["oscar", "srwac", "leipzig", "cc100", "hrwac"], "widget": [{"text": "Ovo je po\u010detak <mask>."}]}
|
Andrija/SRoBERTa-XL
| null |
[
"transformers",
"pytorch",
"roberta",
"fill-mask",
"masked-lm",
"hr",
"sr",
"multilingual",
"dataset:oscar",
"dataset:srwac",
"dataset:leipzig",
"dataset:cc100",
"dataset:hrwac",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
token-classification
|
transformers
|
Named Entity Recognition (Token Classification Head) for the Serbian / Croatian languages.
Abbreviation | Description
-|-
O | Outside of a named entity
B-MIS | Beginning of a miscellaneous entity right after another miscellaneous entity
I-MIS | Miscellaneous entity
B-PER | Beginning of a person’s name right after another person’s name
B-DERIV-PER | Beginning of a derivative that describes a relation to a person
I-PER | Person’s name
B-ORG | Beginning of an organization right after another organization
I-ORG | Organization
B-LOC | Beginning of a location right after another location
I-LOC | Location
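The B-/I- prefixes in this tag set follow the usual BIO convention (with B- used mainly to split adjacent same-type entities), so per-token predictions can be merged into entity spans. A minimal decoder sketch — the example tokens and tags are made up for illustration, not real model output:

```python
def bio_to_spans(tokens, tags):
    """Merge BIO-tagged tokens into (entity_type, text) spans."""
    spans, current_type, current_tokens = [], None, []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-") or (tag.startswith("I-") and tag[2:] != current_type):
            # A B- tag, or an I- tag of a new type, starts a fresh span.
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [tok]
        elif tag.startswith("I-"):
            current_tokens.append(tok)  # continue the open span
        else:  # "O" closes any open span
            if current_tokens:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_tokens:
        spans.append((current_type, " ".join(current_tokens)))
    return spans

tokens = ["Moje", "ime", "je", "Aleksandar", "i", "zivim", "u", "Beogradu"]
tags = ["O", "O", "O", "I-PER", "O", "O", "O", "I-LOC"]
print(bio_to_spans(tokens, tags))  # [('PER', 'Aleksandar'), ('LOC', 'Beogradu')]
```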
|
{"language": ["hr", "sr", "multilingual"], "license": "apache-2.0", "datasets": ["hr500k"], "widget": [{"text": "Moje ime je Aleksandar i zivim u Beogradu pored Vlade Republike Srbije"}]}
|
Andrija/SRoBERTa-base-NER
| null |
[
"transformers",
"pytorch",
"roberta",
"token-classification",
"hr",
"sr",
"multilingual",
"dataset:hr500k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
fill-mask
|
transformers
|
# Transformer language model for Croatian and Serbian
Trained for two epochs on 3 GB of data containing Croatian and Serbian text.
Leipzig and OSCAR datasets
# Dataset information
| Model | #params | Arch. | Training data |
|--------------------------------|--------------------------------|-------|-----------------------------------|
| `Andrija/SRoBERTa-base` | 80M | Second | Leipzig Corpus and OSCAR (3 GB of text) |
|
{"language": ["hr", "sr", "multilingual"], "license": "apache-2.0", "tags": ["masked-lm"], "datasets": ["oscar", "leipzig"], "widget": [{"text": "Ovo je po\u010detak <mask>."}]}
|
Andrija/SRoBERTa-base
| null |
[
"transformers",
"pytorch",
"roberta",
"fill-mask",
"masked-lm",
"hr",
"sr",
"multilingual",
"dataset:oscar",
"dataset:leipzig",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
fill-mask
|
transformers
|
# Transformer language model for Croatian and Serbian
Trained for one epoch on a 0.7 GB dataset of Croatian and Serbian text.
Dataset from the Leipzig Corpora.
# Dataset information
| Model | #params | Arch. | Training data |
|--------------------------------|--------------------------------|-------|-----------------------------------|
| `Andrija/SRoBERTa` | 120M | First | Leipzig Corpus (0.7 GB of text) |
# How to use in code
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("Andrija/SRoBERTa")
model = AutoModelForMaskedLM.from_pretrained("Andrija/SRoBERTa")
```
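Under the hood, filling the `<mask>` amounts to a softmax over the vocabulary logits at the masked position followed by a top-k pick. A minimal sketch of that selection step — the tiny vocabulary and logit values below are made-up illustrations, not outputs of the real tokenizer or model:

```python
import math

def top_k_fill(logits, vocab, k=3):
    """Softmax the logits for the masked position, return the k most likely tokens."""
    m = max(logits)
    exp = [math.exp(x - m) for x in logits]  # subtract max for numerical stability
    total = sum(exp)
    probs = [e / total for e in exp]
    ranked = sorted(zip(vocab, probs), key=lambda t: t[1], reverse=True)
    return ranked[:k]

# Toy vocabulary and logits standing in for the model's output at the <mask> position.
vocab = ["dana", "rata", "kraja", "filma", "price"]
logits = [2.1, 0.3, 1.7, 0.9, 1.2]
print(top_k_fill(logits, vocab))
```

With the real model, the logits for this step come from running the tokenized input through the `AutoModelForMaskedLM` loaded above.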
|
{"language": ["hr", "sr", "multilingual"], "license": "apache-2.0", "tags": ["masked-lm"], "datasets": ["leipzig"], "widget": [{"text": "Gde je <mask>."}]}
|
Andrija/SRoBERTa
| null |
[
"transformers",
"pytorch",
"roberta",
"fill-mask",
"masked-lm",
"hr",
"sr",
"multilingual",
"dataset:leipzig",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
fill-mask
|
transformers
|
{}
|
Andrija/SRoBERTaFastBPE-2
| null |
[
"transformers",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Andrija/SRoBERTaFastBPE
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Andry/111
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
C:\Users\andry\Desktop\Выжигание 24-12-2021.jpg
|
{}
|
Andry/1111
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
null | null |
For now, we upload only two models, for creating image and video classification demos.
More models and code can be found in our github repo: [UniFormer](https://github.com/Sense-X/UniFormer).
|
{"license": "mit"}
|
Andy1621/uniformer
| null |
[
"license:mit",
"has_space",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
null |
transformers
|
{}
|
AndyJ/clinicalBERT
| null |
[
"transformers",
"pytorch",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
fill-mask
|
transformers
|
{}
|
AndyJ/prompt_finetune
| null |
[
"transformers",
"pytorch",
"bert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
multiple-choice
|
transformers
|
{}
|
AndyyyCai/bert-base-uncased-finetuned-copa
| null |
[
"transformers",
"pytorch",
"bert",
"multiple-choice",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Ani123/Ani
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
text-classification
|
transformers
|
{}
|
Anirbanbhk/Hate-speech-Pretrained-movies
| null |
[
"transformers",
"tf",
"bert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
text-classification
|
transformers
|
{}
|
tensor-trek/distilbert-base-uncased-finetuned-emotion
| null |
[
"transformers",
"pytorch",
"safetensors",
"distilbert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Anji/roberta-base-squad2-finetuned-squad
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Ankit-11/distilbert-base-uncased-finetuned-toxic
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Ankitha/DialoGPT-small-harrypotter
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Ankitha/DialoGPT-small-harrypottery
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
token-classification
|
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-ner
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0609
- Precision: 0.9275
- Recall: 0.9365
- F1: 0.9320
- Accuracy: 0.9840
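As a sanity check, the reported F1 is the harmonic mean of the precision and recall above:

```python
precision, recall = 0.9275, 0.9365
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
print(round(f1, 4))  # 0.932, matching the reported F1
```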
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
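Assuming no warmup (none is listed), the linear scheduler decays the learning rate from 2e-05 to zero over the 2634 total steps shown in the results table (878 steps/epoch × 3 epochs); a sketch:

```python
base_lr, total_steps = 2e-05, 2634  # 3 epochs x 878 steps/epoch

def linear_lr(step):
    """Linearly decay from base_lr to 0 over training (no warmup assumed)."""
    return base_lr * max(0.0, 1 - step / total_steps)

print(linear_lr(0))     # 2e-05 at the start
print(linear_lr(1317))  # 1e-05 at the halfway point
```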
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.2527 | 1.0 | 878 | 0.0706 | 0.9120 | 0.9181 | 0.9150 | 0.9803 |
| 0.0517 | 2.0 | 1756 | 0.0603 | 0.9174 | 0.9349 | 0.9261 | 0.9830 |
| 0.031 | 3.0 | 2634 | 0.0609 | 0.9275 | 0.9365 | 0.9320 | 0.9840 |
### Framework versions
- Transformers 4.9.2
- Pytorch 1.9.0+cu102
- Datasets 1.11.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "metrics": ["precision", "recall", "f1", "accuracy"], "model_index": [{"name": "distilbert-base-uncased-finetuned-ner", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.984018301110458}}]}]}
|
Ann2020/distilbert-base-uncased-finetuned-ner
| null |
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
null | null |
{}
|
Ann2020/model-finetuned-ner
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Ann2020/rubert-base-cased-finetuned-ner
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Ann2020/rubert-base-cased-sentence-finetuned-ner
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Ann2020/rubert-base-cased-sentence-finetuned-ner_tags
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
AnnettJaeger/AnneJae
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null | null |
{}
|
Anomic/DialoGPT-medium-loki
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
text-classification
|
transformers
|
{}
|
AnonARR/qqp-bert
| null |
[
"transformers",
"pytorch",
"bert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
Pre-trained for better reasoning ability; try this model if you are working with tasks such as QA. For more details, see https://openreview.net/forum?id=cGB7CMFtrSx.
This model is based on bert-base-uncased and pre-trained for text input.
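As a feature-extraction model it returns one hidden vector per token; a single fixed-size sentence embedding is often obtained by mean-pooling those vectors. The pooling choice and the toy 4-dimensional vectors below are illustrative assumptions, not part of the release:

```python
def mean_pool(token_vectors):
    """Average per-token hidden states into one fixed-size sentence vector."""
    dim = len(token_vectors[0])
    n = len(token_vectors)
    return [sum(vec[i] for vec in token_vectors) / n for i in range(dim)]

# Toy 4-dimensional "hidden states" for a 3-token input.
hidden = [[1.0, 0.0, 2.0, 4.0],
          [3.0, 2.0, 0.0, 0.0],
          [2.0, 4.0, 1.0, 2.0]]
print(mean_pool(hidden))  # [2.0, 2.0, 1.0, 2.0]
```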
|
{}
|
Anonymous/ReasonBERT-BERT
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
feature-extraction
|
transformers
|
Pre-trained for better reasoning ability; try this model if you are working with tasks such as QA. For more details, see https://openreview.net/forum?id=cGB7CMFtrSx.
This model is based on roberta-base and pre-trained for text input.
|
{}
|
Anonymous/ReasonBERT-RoBERTa
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
feature-extraction
|
transformers
|
Pre-trained for better reasoning ability; try this model if you are working with tasks such as QA. For more details, see https://openreview.net/forum?id=cGB7CMFtrSx.
This model is based on tapas-base (no_reset) and pre-trained for table input.
|
{}
|
Anonymous/ReasonBERT-TAPAS
| null |
[
"transformers",
"pytorch",
"tapas",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
null | null |
{}
|
Anonymous0230/model_name
| null |
[
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null |
transformers
|
{}
|
AnonymousNLP/pretrained-model-1
| null |
[
"transformers",
"pytorch",
"gpt2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
null |
transformers
|
{}
|
AnonymousNLP/pretrained-model-2
| null |
[
"transformers",
"pytorch",
"gpt2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_EManuals-BERT
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_EManuals-RoBERTa
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_SDR_HF_model_base
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_bert-base-uncased
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_cline
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_consert
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_declutr
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_bert_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_bert_triplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_hier_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_hier_triplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_only_classfn_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_only_classfn_twostage_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_bert_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_bert_quadruplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_bert_triplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_bert_triplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_hier_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_hier_quadruplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_hier_triplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_hier_triplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_only_classfn_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_only_classfn_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_only_classfn_twostage_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_only_classfn_twostage_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_twostage_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_twostage_quadruplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_twostagequadruplet_hier_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_twostagequadruplet_hier_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_twostagetriplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_twostagetriplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_twostagetriplet_hier_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_roberta_twostagetriplet_hier_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_twostage_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_twostagequadruplet_hier_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_twostagetriplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_rule_based_twostagetriplet_hier_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/AR_specter
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/EManuals_BERT_copy
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
text-classification
|
transformers
|
{}
|
AnonymousSub/EManuals_BERT_copy_wikiqa
| null |
[
"transformers",
"pytorch",
"bert",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
question-answering
|
transformers
|
{}
|
AnonymousSub/EManuals_BERT_squad2.0
| null |
[
"transformers",
"pytorch",
"bert",
"question-answering",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
question-answering
|
transformers
|
{}
|
AnonymousSub/EManuals_RoBERTa_squad2.0
| null |
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
text-classification
|
transformers
|
{}
|
AnonymousSub/EManuals_RoBERTa_wikiqa
| null |
[
"transformers",
"pytorch",
"roberta",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SDR_HF_model_base
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_EManuals-BERT
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_EManuals-RoBERTa
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_SDR_HF_model_base
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_bert-base-uncased
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_cline
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_consert
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_declutr
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_bert_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_bert_triplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_hier_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_hier_triplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_only_classfn_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_only_classfn_twostage_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_bert_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_bert_quadruplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_bert_triplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_bert_triplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_hier_quadruplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_hier_quadruplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_hier_triplet_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_hier_triplet_epochs_1_shard_10
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_hier_triplet_epochs_1_shard_1_wikiqa_copy
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|
|
feature-extraction
|
transformers
|
{}
|
AnonymousSub/SR_rule_based_roberta_only_classfn_epochs_1_shard_1
| null |
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | null |
2022-03-02T23:29:04+00:00
|