---
language: []
library_name: sentence-transformers
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:6413
  - loss:BatchAllTripletLoss
base_model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
datasets: []
widget:
  - source_sentence: manufacture of charging stations for cars
    sentences:
      - ' education in public and private universities provided at the level of:  short-cycle studies, usually based on practical learning appropriate to the performance of a given profession and preparing students for entry into the labour market or other higher education programmes'
      - g
      - >-
        Działalność agentów zajmujących się sprzedażą hurtową płodów rolnych,
        żywych zwierząt, surowców dla przemysłu tekstylnego i półproduktów
  - source_sentence: >-
      manufacture of carbon and graphite fibres and products (except electrodes
      and electrical applications)
    sentences:
      - educational testing evaluation activities
      - Obróbka i wykończanie produktów z tworzyw sztucznych
      - ' manufacture of all electric motors and transformers: ac, dc and ac/dc'
  - source_sentence: ' retail sale of sports goods, including fishing gear, weapons and ammunitions, camping goods, etc'
    sentences:
      - anise, badian and fennel as spice or aromatic plant
      - ' transformers)'
      - retail sale of knitting yarn
  - source_sentence: community and neighbourhood activities
    sentences:
      - manufacture of glass eyes
      - uprawa trzciny cukrowej
      - ' revenue for the intermediation activities can include other sources of income, e'
  - source_sentence: dating and other speed networking activities
    sentences:
      - >-
        sprzedaż detaliczna prowadzona na straganach i targowiskach ryb, owoców
        morza, produktów z owoców morza i ich przetworów, sprzedaż detaliczna
        prowadzona na straganach i targowiskach alg i wodorostów
      - ' pressure, pushbutton, snap, tumbler switches)'
      - ' dializy, chemioterapia, insulinoterapia, radioterapia'
pipeline_tag: sentence-similarity
---

# SentenceTransformer based on sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2

This is a [sentence-transformers](https://www.sbert.net) model finetuned from [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2). It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description

- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2)
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 384 dimensions

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
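The Pooling module above averages the token embeddings produced by the Transformer module, weighted by the attention mask. As an illustration only (not part of the original card), the same sentence embedding can be reproduced with plain 🤗 Transformers; this sketch assumes the repository stores the underlying BertModel weights at its root, as Sentence Transformers checkpoints normally do:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("annazdr/nace-pl-v1")
bert = AutoModel.from_pretrained("annazdr/nace-pl-v1")

encoded = tokenizer(
    ["uprawa trzciny cukrowej"],  # Polish: "growing of sugar cane"
    padding=True, truncation=True, max_length=128, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = bert(**encoded).last_hidden_state  # (batch, seq_len, 384)

# Mean pooling: zero out padding positions, then average over the sequence axis.
mask = encoded["attention_mask"].unsqueeze(-1).float()    # (batch, seq_len, 1)
embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # torch.Size([1, 384])
```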

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("annazdr/nace-pl-v1")
# Run inference
sentences = [
    'dating and other speed networking activities',
    ' pressure, pushbutton, snap, tumbler switches)',
    ' dializy, chemioterapia, insulinoterapia, radioterapia',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
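For semantic search over activity descriptions, the corpus can be embedded once and each query matched against it. A minimal sketch with made-up corpus entries (the card does not ship the actual label inventory):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("annazdr/nace-pl-v1")

# Hypothetical activity descriptions standing in for a real NACE corpus.
corpus = [
    "retail sale of knitting yarn",
    "manufacture of glass eyes",
    "uprawa trzciny cukrowej",  # Polish: "growing of sugar cane"
]
corpus_embeddings = model.encode(corpus)

query_embedding = model.encode("growing of sugar cane")
scores = model.similarity(query_embedding, corpus_embeddings)  # shape [1, len(corpus)]
best = int(scores.argmax())
print(corpus[best], float(scores[0, best]))
```

Because the base model is multilingual, English queries can retrieve Polish corpus entries (and vice versa), as in the example above.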

## Training Details

### Training Dataset

#### Unnamed Dataset

  • Size: 6,413 training samples
  • Columns: sentence_0 and label
  • Approximate statistics based on the first 1000 samples:
    • sentence_0 (string): min 2 tokens, mean 17.31 tokens, max 128 tokens
    • label (int): approximate class frequencies as listed below
    • 0: ~0.10%
    • 1: ~0.30%
    • 2: ~0.40%
    • 3: ~0.20%
    • 4: ~0.40%
    • 5: ~0.10%
    • 6: ~0.30%
    • 8: ~0.30%
    • 9: ~0.20%
    • 10: ~0.10%
    • 11: ~0.40%
    • 12: ~0.30%
    • 13: ~0.10%
    • 14: ~0.10%
    • 15: ~0.20%
    • 16: ~0.10%
    • 17: ~0.30%
    • 18: ~0.20%
    • 19: ~0.20%
    • 20: ~0.40%
    • 21: ~0.30%
    • 22: ~0.10%
    • 23: ~0.30%
    • 24: ~0.20%
    • 25: ~0.20%
    • 26: ~0.10%
    • 27: ~0.30%
    • 28: ~0.30%
    • 29: ~0.20%
    • 31: ~0.10%
    • 34: ~0.20%
    • 36: ~0.10%
    • 37: ~0.30%
    • 39: ~0.20%
    • 40: ~0.60%
    • 41: ~0.10%
    • 42: ~0.30%
    • 43: ~0.20%
    • 44: ~0.60%
    • 45: ~0.50%
    • 46: ~0.20%
    • 47: ~0.10%
    • 48: ~0.10%
    • 49: ~0.20%
    • 50: ~0.20%
    • 51: ~0.20%
    • 52: ~0.20%
    • 53: ~0.60%
    • 54: ~0.10%
    • 55: ~0.20%
    • 57: ~0.20%
    • 58: ~0.10%
    • 59: ~0.10%
    • 60: ~0.20%
    • 62: ~0.10%
    • 63: ~0.20%
    • 64: ~0.80%
    • 65: ~0.60%
    • 66: ~0.70%
    • 67: ~0.10%
    • 68: ~0.20%
    • 69: ~0.30%
    • 70: ~0.70%
    • 72: ~0.20%
    • 73: ~0.90%
    • 74: ~0.40%
    • 75: ~0.10%
    • 76: ~0.40%
    • 77: ~0.10%
    • 78: ~0.30%
    • 79: ~0.20%
    • 81: ~0.10%
    • 82: ~0.60%
    • 83: ~0.20%
    • 85: ~0.20%
    • 87: ~0.30%
    • 88: ~0.20%
    • 89: ~0.10%
    • 90: ~0.50%
    • 95: ~0.20%
    • 96: ~0.10%
    • 97: ~0.40%
    • 98: ~0.30%
    • 99: ~0.70%
    • 100: ~0.60%
    • 102: ~1.00%
    • 103: ~0.30%
    • 104: ~0.10%
    • 106: ~0.20%
    • 107: ~0.10%
    • 108: ~0.20%
    • 109: ~0.20%
    • 110: ~0.30%
    • 112: ~0.10%
    • 115: ~0.10%
    • 116: ~0.30%
    • 120: ~0.40%
    • 122: ~0.20%
    • 123: ~0.20%
    • 124: ~0.10%
    • 125: ~0.30%
    • 126: ~0.50%
    • 127: ~0.40%
    • 128: ~0.70%
    • 130: ~0.10%
    • 132: ~0.10%
    • 135: ~0.20%
    • 136: ~0.10%
    • 140: ~0.10%
    • 141: ~0.10%
    • 143: ~0.10%
    • 145: ~0.10%
    • 148: ~0.30%
    • 149: ~0.20%
    • 150: ~0.10%
    • 151: ~0.40%
    • 152: ~0.40%
    • 153: ~0.20%
    • 154: ~0.50%
    • 158: ~0.20%
    • 159: ~0.10%
    • 161: ~0.10%
    • 163: ~0.10%
    • 164: ~0.10%
    • 167: ~0.10%
    • 168: ~0.20%
    • 169: ~0.10%
    • 171: ~0.10%
    • 172: ~0.10%
    • 173: ~0.10%
    • 179: ~0.10%
    • 181: ~0.40%
    • 182: ~0.50%
    • 183: ~0.20%
    • 184: ~0.10%
    • 185: ~0.30%
    • 186: ~0.20%
    • 188: ~0.10%
    • 189: ~0.40%
    • 190: ~0.20%
    • 191: ~0.20%
    • 192: ~0.60%
    • 193: ~0.20%
    • 194: ~0.30%
    • 195: ~0.40%
    • 196: ~0.10%
    • 198: ~0.10%
    • 199: ~0.40%
    • 200: ~0.20%
    • 201: ~0.20%
    • 202: ~0.30%
    • 206: ~0.10%
    • 209: ~0.10%
    • 210: ~0.10%
    • 211: ~0.20%
    • 212: ~0.10%
    • 213: ~0.10%
    • 221: ~0.20%
    • 222: ~0.10%
    • 224: ~0.20%
    • 227: ~0.10%
    • 228: ~0.10%
    • 229: ~0.40%
    • 231: ~0.30%
    • 233: ~0.10%
    • 235: ~0.10%
    • 236: ~0.40%
    • 237: ~0.30%
    • 238: ~0.10%
    • 241: ~0.20%
    • 242: ~0.30%
    • 243: ~0.60%
    • 244: ~0.30%
    • 245: ~0.10%
    • 246: ~0.20%
    • 247: ~0.20%
    • 248: ~0.10%
    • 249: ~0.10%
    • 250: ~0.20%
    • 254: ~0.30%
    • 255: ~0.10%
    • 258: ~0.10%
    • 259: ~0.10%
    • 260: ~0.50%
    • 261: ~0.10%
    • 262: ~0.20%
    • 264: ~0.20%
    • 265: ~0.20%
    • 270: ~0.20%
    • 272: ~0.10%
    • 273: ~0.10%
    • 274: ~0.20%
    • 276: ~0.10%
    • 277: ~0.30%
    • 279: ~0.10%
    • 280: ~0.10%
    • 283: ~0.10%
    • 284: ~0.10%
    • 285: ~0.40%
    • 286: ~0.20%
    • 287: ~0.20%
    • 288: ~0.10%
    • 289: ~0.40%
    • 291: ~0.10%
    • 292: ~0.40%
    • 293: ~0.40%
    • 294: ~0.10%
    • 295: ~0.30%
    • 296: ~0.30%
    • 297: ~0.20%
    • 298: ~0.20%
    • 300: ~0.20%
    • 302: ~0.20%
    • 303: ~0.30%
    • 304: ~0.20%
    • 308: ~0.30%
    • 310: ~0.30%
    • 311: ~0.50%
    • 312: ~0.20%
    • 313: ~0.20%
    • 314: ~0.30%
    • 315: ~0.10%
    • 316: ~0.20%
    • 317: ~0.10%
    • 319: ~0.20%
    • 322: ~0.10%
    • 323: ~0.10%
    • 324: ~0.30%
    • 325: ~0.30%
    • 328: ~0.20%
    • 329: ~0.30%
    • 330: ~0.10%
    • 332: ~0.20%
    • 333: ~0.30%
    • 335: ~0.20%
    • 336: ~0.60%
    • 337: ~0.40%
    • 338: ~0.10%
    • 339: ~0.10%
    • 340: ~0.10%
    • 341: ~0.10%
    • 342: ~0.10%
    • 344: ~0.20%
    • 346: ~0.10%
    • 347: ~0.30%
    • 348: ~0.10%
    • 349: ~0.30%
    • 350: ~0.20%
    • 351: ~0.10%
    • 352: ~0.40%
    • 353: ~0.30%
    • 354: ~0.20%
    • 356: ~0.20%
    • 357: ~0.40%
    • 358: ~0.40%
    • 359: ~0.40%
    • 360: ~0.20%
    • 361: ~0.40%
    • 362: ~0.20%
    • 363: ~0.10%
    • 366: ~0.10%
    • 367: ~0.10%
    • 368: ~0.70%
    • 369: ~0.20%
    • 370: ~0.30%
    • 372: ~0.30%
    • 373: ~0.20%
    • 374: ~0.40%
    • 375: ~0.10%
    • 376: ~0.10%
    • 377: ~0.10%
    • 379: ~0.20%
    • 381: ~0.30%
    • 383: ~0.40%
    • 384: ~0.20%
    • 385: ~0.20%
    • 386: ~0.20%
    • 387: ~0.20%
    • 389: ~0.10%
    • 390: ~0.30%
    • 391: ~0.20%
    • 392: ~0.20%
    • 393: ~0.20%
    • 395: ~0.10%
    • 397: ~0.40%
    • 398: ~0.20%
    • 399: ~0.30%
    • 400: ~0.40%
    • 402: ~0.10%
    • 407: ~0.10%
    • 408: ~0.20%
    • 409: ~0.30%
    • 411: ~0.20%
    • 412: ~0.20%
    • 414: ~0.20%
    • 415: ~0.20%
    • 416: ~0.10%
    • 417: ~0.30%
    • 418: ~0.10%
    • 420: ~0.20%
    • 422: ~0.50%
    • 423: ~0.10%
    • 425: ~0.40%
    • 426: ~0.10%
    • 427: ~0.10%
    • 428: ~0.40%
    • 429: ~0.20%
    • 430: ~0.10%
    • 431: ~0.10%
    • 432: ~0.10%
    • 433: ~0.20%
    • 434: ~0.30%
    • 435: ~0.20%
    • 436: ~0.40%
    • 437: ~0.10%
    • 438: ~0.40%
    • 440: ~0.80%
    • 441: ~0.20%
    • 442: ~0.50%
    • 443: ~0.20%
    • 444: ~0.30%
    • 445: ~0.30%
    • 446: ~0.10%
    • 447: ~0.20%
    • 450: ~0.30%
    • 451: ~0.20%
    • 452: ~0.20%
    • 453: ~0.10%
    • 454: ~0.20%
    • 455: ~0.30%
    • 456: ~0.10%
    • 457: ~0.10%
    • 458: ~0.20%
    • 459: ~0.20%
    • 460: ~0.10%
    • 461: ~0.10%
    • 462: ~0.10%
    • 463: ~0.40%
    • 464: ~0.30%
    • 467: ~0.10%
    • 469: ~0.10%
    • 470: ~0.10%
    • 472: ~0.10%
    • 475: ~0.50%
    • 476: ~0.30%
    • 478: ~0.10%
    • 479: ~0.20%
    • 480: ~0.10%
    • 482: ~0.30%
    • 483: ~0.50%
    • 484: ~0.30%
    • 485: ~0.40%
    • 486: ~0.20%
    • 487: ~0.20%
    • 489: ~0.10%
    • 490: ~0.20%
    • 491: ~0.10%
    • 492: ~0.40%
    • 493: ~0.40%
    • 495: ~0.10%
    • 497: ~0.10%
    • 498: ~0.10%
    • 499: ~0.30%
    • 501: ~0.20%
    • 502: ~0.20%
    • 503: ~0.10%
    • 504: ~0.30%
    • 505: ~0.10%
    • 506: ~0.10%
    • 507: ~0.10%
    • 508: ~0.20%
    • 509: ~0.10%
    • 510: ~0.10%
    • 511: ~0.10%
    • 512: ~0.50%
    • 514: ~0.20%
    • 517: ~0.40%
    • 518: ~0.10%
    • 519: ~0.60%
    • 520: ~0.90%
    • 521: ~0.60%
    • 522: ~0.10%
    • 523: ~0.10%
    • 524: ~0.10%
    • 525: ~0.10%
    • 526: ~0.10%
    • 527: ~0.10%
    • 528: ~0.30%
    • 529: ~0.40%
    • 530: ~0.60%
    • 531: ~0.20%
    • 532: ~0.10%
    • 533: ~0.30%
    • 535: ~0.50%
    • 537: ~0.20%
    • 540: ~0.10%
    • 541: ~0.10%
    • 542: ~0.20%
    • 543: ~0.30%
    • 544: ~0.20%
    • 545: ~0.50%
    • 546: ~0.30%
    • 547: ~0.10%
    • 548: ~0.50%
    • 549: ~0.10%
    • 550: ~0.30%
    • 551: ~0.30%
    • 552: ~0.10%
    • 554: ~0.60%
    • 555: ~0.20%
    • 556: ~0.10%
    • 557: ~0.10%
    • 560: ~0.20%
    • 561: ~0.10%
    • 563: ~0.20%
    • 564: ~0.20%
    • 565: ~0.20%
    • 567: ~0.10%
    • 568: ~0.20%
    • 570: ~0.20%
    • 571: ~0.10%
    • 572: ~0.20%
    • 573: ~0.20%
    • 576: ~0.20%
    • 579: ~0.10%
  • Samples:

    | sentence_0 | label |
    |:-----------|:------|
    | retail sale of wooden, cork and wickerwork goods | 202 |
    | e | 298 |
    | produkcję maszyn do obróbki miękkiej gumy lub tworzyw sztucznych oraz wytwarzania wyrobów z tych materiałów: wytłaczarek, maszyn do formowania, maszyn do produkcji lub bieżnikowania opon pneumatycznych oraz pozostałych maszyn do produkcji wyrobów z gumy lub tworzyw sztucznych | 79 |
  • Loss: BatchAllTripletLoss

### Training Hyperparameters

#### Non-Default Hyperparameters

  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 256
  • num_train_epochs: 4
  • multi_dataset_batch_sampler: round_robin
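The card does not include a training script, but a comparable run can be sketched with the Sentence Transformers v3 trainer and the hyperparameters listed above. The rows below are placeholders for the unpublished 6,413-sample dataset, and the labels are illustrative rather than the real class mapping:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import BatchAllTripletLoss

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

# Placeholder data: one text column plus an integer class label, matching the
# column layout described above. At least two rows must share a label so that
# valid triplets exist within a batch.
train_dataset = Dataset.from_dict({
    "sentence_0": [
        "retail sale of knitting yarn",
        "retail sale of wooden, cork and wickerwork goods",
        "uprawa trzciny cukrowej",
        "growing of sugar cane",
    ],
    "label": [202, 202, 40, 40],
})

# BatchAllTripletLoss forms all valid (anchor, positive, negative) triplets
# within each batch based on the class labels.
loss = BatchAllTripletLoss(model)

args = SentenceTransformerTrainingArguments(
    output_dir="nace-pl-v1",
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    num_train_epochs=4,
    multi_dataset_batch_sampler="round_robin",
)

trainer = SentenceTransformerTrainer(
    model=model, args=args, train_dataset=train_dataset, loss=loss
)
trainer.train()
```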

#### All Hyperparameters

<details><summary>Click to expand</summary>

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 256
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 4
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

</details>

### Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.3.0+cu121
  • Accelerate: 0.31.0
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1
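To check whether a local environment matches these versions before loading the model, something along these lines works (a sketch; the names are the distributions as published on PyPI):

```python
import importlib.metadata as metadata

# Print the installed version of each package the card was built with.
for dist in ("sentence-transformers", "transformers", "torch",
             "accelerate", "datasets", "tokenizers"):
    print(dist, metadata.version(dist))
```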

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### BatchAllTripletLoss

```bibtex
@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```