--- tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:753920 - loss:MultipleNegativesRankingLoss base_model: egyllm/pretrained-arabert widget: - source_sentence: ': استجابة الحادث بعد حادث كشف عن أوجه القصور في الشركة' sentences: - ': لقد وقعت حادثة مؤسفة في الشركة' - ': الحادثة التي حدثت في الشركة لم تكن غلطتهم' - ': من غير الواضح بالنسبة لي ان كانوا قد اعط كل المعلومات قبل النطق بالحكم .' - source_sentence: ': ما معنى اختصار rq؟' sentences: - ': تم نشر غلوبال هوك عسكريًا لدعم العمليات الاستثنائية منذ نوفمبر 2001. في اسم RQ-4، يشير R إلى التعيين الذي تستخدمه وزارة الدفاع للتعقب، وQ يعني نظام طيران بدون طيار. يشير الرقم 4 إلى سلسلة من أنظمة الطيران المأهولة عن بعد.' - ': برنامج تسجيل الشاشة Camtasia. Camtasia هو برنامج يستخدم لتسجيل الأنشطة على الشاشة، والصوت، والفيديو من الكاميرا، وتقديم العروض التقديمية من PowerPoint. من خلال Camtasia، يمكنك تسجيل وتحرير وإنتاج ومشاركة محتوى الدروس. تشمل ميزات التحرير إشارات التوضيح، والتحولات، والتقريب والتحريك، وتعزيزات الصوت، وغيرها. أنتج ملف الفيديو النهائي الذي يشاهده الطلاب حسب ملاءمةهم، ويمكنك تضمين جدول المحتويات للمساعدة في التنقل.' - ': تعريف أعلى. RQ. اختصار لـ ''Rage Quit''، وهو ما يحدث عندما يغادر اللاعب/المستخدم اللعبة بسبب الغضب، عادة عندما يقتل. يستخدم في الألعاب متعددة اللاعبين عبر الإنترنت أو LAN، أو الدردشة. على سبيل المثال، WC3(DOTA).' - source_sentence: ': فتاة تلعب في ساحة لعب' sentences: - ': فتاة تلعب في الفناء الأمامي لمنزلهم.' - ': واذا لم يكمل المتعاقد عمله في الوقت المناسب , فانه قد يتعين عليه تعويض الحكومة .' - ': فتاة في ساحة لعب' - source_sentence: ': ما هو كاسكوس' sentences: - ': تستخدم تفاعل بوليميراز سلسلة متعدد الأطراف الكمي (qPCR) لتحديد عدد نسخ الحمض النووي المحدد في عينة، مقارنةً بمقياس. في PCR في الوقت الحقيقي، يمكن تحديد عدد نسخ الحمض النووي بعد كل دورة من عمليات التضاعيف.' - ': كاسكوس هو تحالف متوسط قائم على كرات التداول البيضاء والبنية.' - ': تُظهر الرسوم البيانية أعلاه نشاط حالة الخدمة لكاسكوس.كو.ايد خلال آخر 10 فحوصات أوتوماتيكية. يُظهر الشريط الأزرق وقت الاستجابة، وهو أفضل عندما يكون أصغر. إذا لم يُعرض أي شريط لوقت معين، فهذا يعني أن الخدمة كانت غير متاحة وكان الموقع غير متصل بالإنترنت.' - source_sentence: ': يبدو أن الفتاة ذات الوشاح الأخضر والكلب الأبيض يلعبان.' sentences: - ': يبدو أن الفتاة والكلب يلعبان.' - ': الفتاة والكلب لا يتفاعلان' - ': فيلكيتونوريا هي اضطراب وراثي يزيد من مستويات الفينيلalanine في الدم.' 
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
model-index:
- name: SentenceTransformer based on egyllm/pretrained-arabert
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: Unknown
      type: unknown
    metrics:
    - type: cosine_accuracy@1
      value: 0.7175
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.841
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.878
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9155
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7175
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.28033333333333327
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.17560000000000003
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09155
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7175
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.841
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.878
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9155
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8172358824512647
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7856547619047611
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.7890154491139222
      name: Cosine Map@100
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts dev
      type: sts-dev
    metrics:
    - type: pearson_cosine
      value: 0.8015277726105404
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.8038248041571585
      name: Spearman Cosine
    - type: pearson_manhattan
      value: 0.7895258398435966
      name: Pearson Manhattan
    - type: spearman_manhattan
      value: 0.8012166855619245
      name: Spearman Manhattan
    - type: pearson_euclidean
      value: 0.7893816883662468
      name: Pearson Euclidean
    - type: spearman_euclidean
      value: 0.8029392819509334
      name: Spearman Euclidean
    - type: pearson_dot
      value: 0.7952010752539163
      name: Pearson Dot
    - type: spearman_dot
      value: 0.7982104142453529
      name: Spearman Dot
    - type: pearson_max
      value: 0.8015277726105404
      name: Pearson Max
    - type: spearman_max
      value: 0.8038248041571585
      name: Spearman Max
---

# SentenceTransformer based on egyllm/pretrained-arabert

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [egyllm/pretrained-arabert](https://huggingface.co/egyllm/pretrained-arabert). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
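For a quick sense of the semantic-search use case, the sketch below embeds a small corpus and ranks it against a query with this model. It only uses the standard `sentence_transformers` API; `sentence_transformers_model_id` is the same placeholder as in the Usage section below and should be replaced with the published repository id, and the corpus sentences are illustrative examples borrowed from the widget above.

```python
from sentence_transformers import SentenceTransformer, util

# Placeholder id, as in the Usage section below.
model = SentenceTransformer("sentence_transformers_model_id")

# A tiny illustrative corpus (sentences borrowed from the widget examples above).
corpus = [
    "فتاة تلعب في الفناء الأمامي لمنزلهم.",
    "برنامج تسجيل الشاشة Camtasia.",
    "يبدو أن الفتاة والكلب يلعبان.",
]
query = "فتاة تلعب في ساحة لعب"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank the corpus by cosine similarity to the query and print the top hits.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=3)[0]
for hit in hits:
    print(round(hit["score"], 3), corpus[hit["corpus_id"]])
```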
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [egyllm/pretrained-arabert](https://huggingface.co/egyllm/pretrained-arabert)
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity

### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    ': يبدو أن الفتاة ذات الوشاح الأخضر والكلب الأبيض يلعبان.',
    ': يبدو أن الفتاة والكلب يلعبان.',
    ': الفتاة والكلب لا يتفاعلان',
]
embeddings = model.encode(sentences)
print(embeddings.shape)  # [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [3, 3]
```

## Evaluation

### Metrics

#### Information Retrieval

* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value     |
|:--------------------|:----------|
| cosine_accuracy@1   | 0.7175    |
| cosine_accuracy@3   | 0.841     |
| cosine_accuracy@5   | 0.878     |
| cosine_accuracy@10  | 0.9155    |
| cosine_precision@1  | 0.7175    |
| cosine_precision@3  | 0.2803    |
| cosine_precision@5  | 0.1756    |
| cosine_precision@10 | 0.0916    |
| cosine_recall@1     | 0.7175    |
| cosine_recall@3     | 0.841     |
| cosine_recall@5     | 0.878     |
| cosine_recall@10    | 0.9155    |
| cosine_ndcg@10      | 0.8172    |
| cosine_mrr@10       | 0.7857    |
| **cosine_map@100**  | **0.789** |

#### Semantic Similarity

* Dataset: `sts-dev`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| pearson_cosine      | 0.8015     |
| **spearman_cosine** | **0.8038** |
| pearson_manhattan   | 0.7895     |
| spearman_manhattan  | 0.8012     |
| pearson_euclidean   | 0.7894     |
| spearman_euclidean  | 0.8029     |
| pearson_dot         | 0.7952     |
| spearman_dot        | 0.7982     |
| pearson_max         | 0.8015     |
| spearman_max        | 0.8038     |

## Training Details

### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `learning_rate`: 1e-05
- `num_train_epochs`: 1
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 1e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
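The non-default values above correspond to a standard `SentenceTransformerTrainer` run with the `MultipleNegativesRankingLoss` named in the tags. The sketch below is a reconstruction under stated assumptions, not the exact training script: the 753,920-pair training set is not published with this card, so a hypothetical JSONL file of (anchor, positive) pairs stands in for it, and the evaluation wiring (eval split plus the IR/STS evaluators) is only noted in a comment.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# Start from the base checkpoint; mean pooling and a 256-token limit as in the architecture above.
model = SentenceTransformer("egyllm/pretrained-arabert")
model.max_seq_length = 256

# Hypothetical stand-in for the unpublished 753,920-pair training set:
# any dataset with (anchor, positive) text columns works with this loss.
train_dataset = load_dataset("json", data_files="train_pairs.jsonl", split="train")

# In-batch negatives ranking loss, as listed in the model tags.
loss = MultipleNegativesRankingLoss(model)

args = SentenceTransformerTrainingArguments(
    output_dir="outputs/arabert-sentence-transformer",  # hypothetical path
    num_train_epochs=1,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=1e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate texts within a batch
    # The original run also used eval_strategy="steps" with a held-out split and the
    # IR/STS evaluators from the Evaluation section; both are omitted here for brevity.
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```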
### Training Logs
Click to expand | Epoch | Step | Training Loss | Validation Loss | cosine_map@100 | sts-dev_spearman_cosine | |:------:|:----:|:-------------:|:---------------:|:--------------:|:-----------------------:| | 0 | 0 | - | - | 0.6380 | 0.6561 | | 0.0008 | 10 | 3.8114 | - | - | - | | 0.0017 | 20 | 3.8901 | - | - | - | | 0.0025 | 30 | 3.598 | - | - | - | | 0.0034 | 40 | 3.8369 | - | - | - | | 0.0042 | 50 | 3.4766 | - | - | - | | 0.0051 | 60 | 3.5983 | - | - | - | | 0.0059 | 70 | 3.3285 | - | - | - | | 0.0068 | 80 | 3.1135 | - | - | - | | 0.0076 | 90 | 2.9757 | - | - | - | | 0.0085 | 100 | 3.3373 | - | - | - | | 0.0093 | 110 | 3.1236 | - | - | - | | 0.0102 | 120 | 2.7132 | - | - | - | | 0.0110 | 130 | 2.8783 | - | - | - | | 0.0119 | 140 | 2.3779 | - | - | - | | 0.0127 | 150 | 2.6556 | - | - | - | | 0.0136 | 160 | 2.2028 | - | - | - | | 0.0144 | 170 | 2.2236 | - | - | - | | 0.0153 | 180 | 2.7309 | - | - | - | | 0.0161 | 190 | 2.4107 | - | - | - | | 0.0170 | 200 | 2.3434 | - | - | - | | 0.0178 | 210 | 1.9811 | - | - | - | | 0.0187 | 220 | 2.6514 | - | - | - | | 0.0195 | 230 | 2.6114 | - | - | - | | 0.0204 | 240 | 2.5214 | - | - | - | | 0.0212 | 250 | 2.01 | - | - | - | | 0.0221 | 260 | 1.7568 | - | - | - | | 0.0229 | 270 | 2.356 | - | - | - | | 0.0238 | 280 | 2.5519 | - | - | - | | 0.0246 | 290 | 2.0232 | - | - | - | | 0.0255 | 300 | 1.6215 | - | - | - | | 0.0263 | 310 | 2.6331 | - | - | - | | 0.0272 | 320 | 2.0053 | - | - | - | | 0.0280 | 330 | 2.3054 | - | - | - | | 0.0289 | 340 | 1.9774 | - | - | - | | 0.0297 | 350 | 1.8434 | - | - | - | | 0.0306 | 360 | 1.3065 | - | - | - | | 0.0314 | 370 | 2.5697 | - | - | - | | 0.0323 | 380 | 2.3131 | - | - | - | | 0.0331 | 390 | 2.0535 | - | - | - | | 0.0340 | 400 | 1.5674 | - | - | - | | 0.0348 | 410 | 2.45 | - | - | - | | 0.0357 | 420 | 1.9994 | - | - | - | | 0.0365 | 430 | 2.6629 | - | - | - | | 0.0374 | 440 | 2.0677 | - | - | - | | 0.0382 | 450 | 1.7282 | - | - | - | | 0.0391 | 460 | 2.1117 | - | - | - | | 0.0399 | 470 | 2.374 | - | - | - | | 0.0408 | 480 | 1.7799 | - | - | - | | 0.0416 | 490 | 1.6734 | - | - | - | | 0.0425 | 500 | 1.4893 | - | - | - | | 0.0433 | 510 | 2.031 | - | - | - | | 0.0442 | 520 | 2.4175 | - | - | - | | 0.0450 | 530 | 2.2505 | - | - | - | | 0.0459 | 540 | 2.3695 | - | - | - | | 0.0467 | 550 | 2.1952 | - | - | - | | 0.0476 | 560 | 2.582 | - | - | - | | 0.0484 | 570 | 1.7935 | - | - | - | | 0.0493 | 580 | 2.156 | - | - | - | | 0.0501 | 590 | 1.5579 | - | - | - | | 0.0510 | 600 | 2.572 | - | - | - | | 0.0518 | 610 | 1.8751 | - | - | - | | 0.0527 | 620 | 2.1146 | - | - | - | | 0.0535 | 630 | 1.739 | - | - | - | | 0.0544 | 640 | 1.7652 | - | - | - | | 0.0552 | 650 | 2.3194 | - | - | - | | 0.0561 | 660 | 1.8637 | - | - | - | | 0.0569 | 670 | 1.9794 | - | - | - | | 0.0578 | 680 | 1.6374 | - | - | - | | 0.0586 | 690 | 1.4355 | - | - | - | | 0.0595 | 700 | 1.3763 | - | - | - | | 0.0603 | 710 | 2.2797 | - | - | - | | 0.0612 | 720 | 1.6895 | - | - | - | | 0.0620 | 730 | 1.6998 | - | - | - | | 0.0629 | 740 | 2.0926 | - | - | - | | 0.0637 | 750 | 2.2495 | - | - | - | | 0.0646 | 760 | 1.8361 | - | - | - | | 0.0654 | 770 | 2.0814 | - | - | - | | 0.0663 | 780 | 1.9751 | - | - | - | | 0.0671 | 790 | 1.5877 | - | - | - | | 0.0680 | 800 | 2.9411 | - | - | - | | 0.0688 | 810 | 2.466 | - | - | - | | 0.0697 | 820 | 1.8303 | - | - | - | | 0.0705 | 830 | 1.3468 | - | - | - | | 0.0714 | 840 | 1.5485 | - | - | - | | 0.0722 | 850 | 2.0856 | - | - | - | | 0.0731 | 860 | 1.9067 | - | - | - | | 0.0739 | 870 | 1.5406 | - | - | - | | 0.0748 | 880 | 2.0842 | - | 
- | - | | 0.0756 | 890 | 1.3399 | - | - | - | | 0.0765 | 900 | 1.8138 | - | - | - | | 0.0773 | 910 | 1.8355 | - | - | - | | 0.0782 | 920 | 2.2083 | - | - | - | | 0.0790 | 930 | 1.849 | - | - | - | | 0.0799 | 940 | 1.9105 | - | - | - | | 0.0807 | 950 | 1.5099 | - | - | - | | 0.0816 | 960 | 1.2589 | - | - | - | | 0.0824 | 970 | 1.5917 | - | - | - | | 0.0833 | 980 | 1.5236 | - | - | - | | 0.0841 | 990 | 1.9194 | - | - | - | | 0.0850 | 1000 | 1.6147 | 1.7406 | 0.7580 | 0.8109 | | 0.0858 | 1010 | 1.8092 | - | - | - | | 0.0867 | 1020 | 2.2912 | - | - | - | | 0.0875 | 1030 | 1.8473 | - | - | - | | 0.0884 | 1040 | 1.3879 | - | - | - | | 0.0892 | 1050 | 2.5645 | - | - | - | | 0.0901 | 1060 | 1.9847 | - | - | - | | 0.0909 | 1070 | 1.7767 | - | - | - | | 0.0918 | 1080 | 1.8132 | - | - | - | | 0.0926 | 1090 | 2.356 | - | - | - | | 0.0935 | 1100 | 1.8806 | - | - | - | | 0.0943 | 1110 | 1.7226 | - | - | - | | 0.0952 | 1120 | 1.6482 | - | - | - | | 0.0960 | 1130 | 2.5 | - | - | - | | 0.0969 | 1140 | 1.5931 | - | - | - | | 0.0977 | 1150 | 1.3899 | - | - | - | | 0.0986 | 1160 | 1.5451 | - | - | - | | 0.0994 | 1170 | 1.59 | - | - | - | | 0.1003 | 1180 | 1.8115 | - | - | - | | 0.1011 | 1190 | 2.062 | - | - | - | | 0.1020 | 1200 | 1.9508 | - | - | - | | 0.1028 | 1210 | 2.4069 | - | - | - | | 0.1037 | 1220 | 2.0273 | - | - | - | | 0.1045 | 1230 | 1.6278 | - | - | - | | 0.1054 | 1240 | 2.5481 | - | - | - | | 0.1062 | 1250 | 1.9195 | - | - | - | | 0.1071 | 1260 | 1.3667 | - | - | - | | 0.1079 | 1270 | 2.4832 | - | - | - | | 0.1088 | 1280 | 2.0343 | - | - | - | | 0.1096 | 1290 | 2.0113 | - | - | - | | 0.1105 | 1300 | 1.5492 | - | - | - | | 0.1113 | 1310 | 1.6053 | - | - | - | | 0.1122 | 1320 | 1.7595 | - | - | - | | 0.1130 | 1330 | 1.356 | - | - | - | | 0.1139 | 1340 | 1.5716 | - | - | - | | 0.1147 | 1350 | 2.1764 | - | - | - | | 0.1156 | 1360 | 1.9217 | - | - | - | | 0.1164 | 1370 | 2.1936 | - | - | - | | 0.1173 | 1380 | 1.3914 | - | - | - | | 0.1181 | 1390 | 1.9944 | - | - | - | | 0.1190 | 1400 | 2.1162 | - | - | - | | 0.1198 | 1410 | 1.7333 | - | - | - | | 0.1207 | 1420 | 2.1856 | - | - | - | | 0.1215 | 1430 | 2.1026 | - | - | - | | 0.1224 | 1440 | 1.2478 | - | - | - | | 0.1232 | 1450 | 2.1637 | - | - | - | | 0.1241 | 1460 | 1.8734 | - | - | - | | 0.1249 | 1470 | 1.8867 | - | - | - | | 0.1258 | 1480 | 2.2377 | - | - | - | | 0.1266 | 1490 | 1.6174 | - | - | - | | 0.1275 | 1500 | 1.356 | - | - | - | | 0.1283 | 1510 | 2.0684 | - | - | - | | 0.1292 | 1520 | 1.4745 | - | - | - | | 0.1300 | 1530 | 2.0965 | - | - | - | | 0.1309 | 1540 | 1.8437 | - | - | - | | 0.1317 | 1550 | 1.4531 | - | - | - | | 0.1326 | 1560 | 2.4221 | - | - | - | | 0.1334 | 1570 | 1.5201 | - | - | - | | 0.1343 | 1580 | 1.5904 | - | - | - | | 0.1351 | 1590 | 1.5357 | - | - | - | | 0.1360 | 1600 | 2.2998 | - | - | - | | 0.1368 | 1610 | 1.2875 | - | - | - | | 0.1377 | 1620 | 1.089 | - | - | - | | 0.1385 | 1630 | 2.0749 | - | - | - | | 0.1394 | 1640 | 2.2554 | - | - | - | | 0.1402 | 1650 | 1.969 | - | - | - | | 0.1411 | 1660 | 2.6012 | - | - | - | | 0.1419 | 1670 | 2.4911 | - | - | - | | 0.1428 | 1680 | 2.5227 | - | - | - | | 0.1436 | 1690 | 1.4801 | - | - | - | | 0.1445 | 1700 | 1.8368 | - | - | - | | 0.1453 | 1710 | 1.3036 | - | - | - | | 0.1462 | 1720 | 1.0037 | - | - | - | | 0.1470 | 1730 | 1.9339 | - | - | - | | 0.1479 | 1740 | 1.3418 | - | - | - | | 0.1487 | 1750 | 1.6051 | - | - | - | | 0.1496 | 1760 | 1.519 | - | - | - | | 0.1504 | 1770 | 1.7575 | - | - | - | | 0.1513 | 1780 | 2.4666 | - | - | - | | 0.1521 | 1790 | 1.6071 | - | - | - | | 
0.1530 | 1800 | 1.5381 | - | - | - | | 0.1538 | 1810 | 2.0542 | - | - | - | | 0.1547 | 1820 | 1.489 | - | - | - | | 0.1555 | 1830 | 1.6377 | - | - | - | | 0.1564 | 1840 | 1.8472 | - | - | - | | 0.1572 | 1850 | 1.1818 | - | - | - | | 0.1581 | 1860 | 1.3088 | - | - | - | | 0.1589 | 1870 | 1.7981 | - | - | - | | 0.1598 | 1880 | 1.6091 | - | - | - | | 0.1606 | 1890 | 1.9716 | - | - | - | | 0.1615 | 1900 | 1.9483 | - | - | - | | 0.1623 | 1910 | 2.0124 | - | - | - | | 0.1632 | 1920 | 1.6491 | - | - | - | | 0.1640 | 1930 | 1.7327 | - | - | - | | 0.1649 | 1940 | 2.1865 | - | - | - | | 0.1657 | 1950 | 2.169 | - | - | - | | 0.1666 | 1960 | 1.1178 | - | - | - | | 0.1674 | 1970 | 1.8374 | - | - | - | | 0.1683 | 1980 | 1.493 | - | - | - | | 0.1691 | 1990 | 1.4554 | - | - | - | | 0.1700 | 2000 | 1.5359 | 1.6272 | 0.7663 | 0.8068 | | 0.1708 | 2010 | 1.5926 | - | - | - | | 0.1717 | 2020 | 1.5631 | - | - | - | | 0.1725 | 2030 | 2.054 | - | - | - | | 0.1734 | 2040 | 1.7155 | - | - | - | | 0.1742 | 2050 | 2.2145 | - | - | - | | 0.1751 | 2060 | 1.9712 | - | - | - | | 0.1759 | 2070 | 1.2845 | - | - | - | | 0.1768 | 2080 | 1.5927 | - | - | - | | 0.1776 | 2090 | 2.0479 | - | - | - | | 0.1785 | 2100 | 1.6388 | - | - | - | | 0.1793 | 2110 | 1.4514 | - | - | - | | 0.1801 | 2120 | 1.5075 | - | - | - | | 0.1810 | 2130 | 1.3573 | - | - | - | | 0.1818 | 2140 | 1.6252 | - | - | - | | 0.1827 | 2150 | 1.73 | - | - | - | | 0.1835 | 2160 | 1.6867 | - | - | - | | 0.1844 | 2170 | 1.4409 | - | - | - | | 0.1852 | 2180 | 1.0126 | - | - | - | | 0.1861 | 2190 | 1.5874 | - | - | - | | 0.1869 | 2200 | 1.5113 | - | - | - | | 0.1878 | 2210 | 2.129 | - | - | - | | 0.1886 | 2220 | 1.2366 | - | - | - | | 0.1895 | 2230 | 2.0757 | - | - | - | | 0.1903 | 2240 | 1.8596 | - | - | - | | 0.1912 | 2250 | 2.1074 | - | - | - | | 0.1920 | 2260 | 1.5711 | - | - | - | | 0.1929 | 2270 | 1.3869 | - | - | - | | 0.1937 | 2280 | 1.7303 | - | - | - | | 0.1946 | 2290 | 1.8375 | - | - | - | | 0.1954 | 2300 | 1.6658 | - | - | - | | 0.1963 | 2310 | 2.4472 | - | - | - | | 0.1971 | 2320 | 1.1964 | - | - | - | | 0.1980 | 2330 | 2.1802 | - | - | - | | 0.1988 | 2340 | 2.2913 | - | - | - | | 0.1997 | 2350 | 1.7305 | - | - | - | | 0.2005 | 2360 | 1.2718 | - | - | - | | 0.2014 | 2370 | 2.1567 | - | - | - | | 0.2022 | 2380 | 1.4862 | - | - | - | | 0.2031 | 2390 | 1.8498 | - | - | - | | 0.2039 | 2400 | 2.0407 | - | - | - | | 0.2048 | 2410 | 1.9914 | - | - | - | | 0.2056 | 2420 | 1.7447 | - | - | - | | 0.2065 | 2430 | 1.944 | - | - | - | | 0.2073 | 2440 | 1.7682 | - | - | - | | 0.2082 | 2450 | 2.0332 | - | - | - | | 0.2090 | 2460 | 2.4602 | - | - | - | | 0.2099 | 2470 | 1.6737 | - | - | - | | 0.2107 | 2480 | 1.2002 | - | - | - | | 0.2116 | 2490 | 2.0536 | - | - | - | | 0.2124 | 2500 | 1.2564 | - | - | - | | 0.2133 | 2510 | 1.7968 | - | - | - | | 0.2141 | 2520 | 1.7934 | - | - | - | | 0.2150 | 2530 | 1.3855 | - | - | - | | 0.2158 | 2540 | 1.5086 | - | - | - | | 0.2167 | 2550 | 2.3278 | - | - | - | | 0.2175 | 2560 | 1.62 | - | - | - | | 0.2184 | 2570 | 2.0118 | - | - | - | | 0.2192 | 2580 | 1.7665 | - | - | - | | 0.2201 | 2590 | 1.4106 | - | - | - | | 0.2209 | 2600 | 2.0529 | - | - | - | | 0.2218 | 2610 | 1.5266 | - | - | - | | 0.2226 | 2620 | 2.2004 | - | - | - | | 0.2235 | 2630 | 1.2109 | - | - | - | | 0.2243 | 2640 | 1.4509 | - | - | - | | 0.2252 | 2650 | 1.494 | - | - | - | | 0.2260 | 2660 | 1.5459 | - | - | - | | 0.2269 | 2670 | 2.0089 | - | - | - | | 0.2277 | 2680 | 1.9762 | - | - | - | | 0.2286 | 2690 | 1.3596 | - | - | - | | 0.2294 | 2700 | 1.5094 | - | - | - | | 
0.2303 | 2710 | 1.7427 | - | - | - | | 0.2311 | 2720 | 1.354 | - | - | - | | 0.2320 | 2730 | 1.9882 | - | - | - | | 0.2328 | 2740 | 1.3848 | - | - | - | | 0.2337 | 2750 | 1.6313 | - | - | - | | 0.2345 | 2760 | 1.7722 | - | - | - | | 0.2354 | 2770 | 1.2339 | - | - | - | | 0.2362 | 2780 | 1.3144 | - | - | - | | 0.2371 | 2790 | 1.7124 | - | - | - | | 0.2379 | 2800 | 1.8489 | - | - | - | | 0.2388 | 2810 | 1.4535 | - | - | - | | 0.2396 | 2820 | 1.6224 | - | - | - | | 0.2405 | 2830 | 1.6815 | - | - | - | | 0.2413 | 2840 | 1.2336 | - | - | - | | 0.2422 | 2850 | 1.4843 | - | - | - | | 0.2430 | 2860 | 1.295 | - | - | - | | 0.2439 | 2870 | 1.6095 | - | - | - | | 0.2447 | 2880 | 1.7894 | - | - | - | | 0.2456 | 2890 | 1.6503 | - | - | - | | 0.2464 | 2900 | 1.6089 | - | - | - | | 0.2473 | 2910 | 1.8407 | - | - | - | | 0.2481 | 2920 | 1.5631 | - | - | - | | 0.2490 | 2930 | 1.4495 | - | - | - | | 0.2498 | 2940 | 2.0262 | - | - | - | | 0.2507 | 2950 | 1.7444 | - | - | - | | 0.2515 | 2960 | 1.1065 | - | - | - | | 0.2524 | 2970 | 2.1085 | - | - | - | | 0.2532 | 2980 | 1.8828 | - | - | - | | 0.2541 | 2990 | 1.9617 | - | - | - | | 0.2549 | 3000 | 2.1222 | 1.5225 | 0.7716 | 0.7985 | | 0.2558 | 3010 | 1.8215 | - | - | - | | 0.2566 | 3020 | 2.3271 | - | - | - | | 0.2575 | 3030 | 1.3244 | - | - | - | | 0.2583 | 3040 | 1.5012 | - | - | - | | 0.2592 | 3050 | 1.7094 | - | - | - | | 0.2600 | 3060 | 1.7635 | - | - | - | | 0.2609 | 3070 | 1.4024 | - | - | - | | 0.2617 | 3080 | 1.8977 | - | - | - | | 0.2626 | 3090 | 1.4965 | - | - | - | | 0.2634 | 3100 | 1.986 | - | - | - | | 0.2643 | 3110 | 1.6921 | - | - | - | | 0.2651 | 3120 | 1.1191 | - | - | - | | 0.2660 | 3130 | 1.5588 | - | - | - | | 0.2668 | 3140 | 2.2996 | - | - | - | | 0.2677 | 3150 | 1.3422 | - | - | - | | 0.2685 | 3160 | 1.9579 | - | - | - | | 0.2694 | 3170 | 1.0521 | - | - | - | | 0.2702 | 3180 | 1.8859 | - | - | - | | 0.2711 | 3190 | 1.6077 | - | - | - | | 0.2719 | 3200 | 1.0576 | - | - | - | | 0.2728 | 3210 | 1.527 | - | - | - | | 0.2736 | 3220 | 1.2154 | - | - | - | | 0.2745 | 3230 | 1.6487 | - | - | - | | 0.2753 | 3240 | 1.918 | - | - | - | | 0.2762 | 3250 | 1.8735 | - | - | - | | 0.2770 | 3260 | 2.508 | - | - | - | | 0.2779 | 3270 | 1.5813 | - | - | - | | 0.2787 | 3280 | 1.3501 | - | - | - | | 0.2796 | 3290 | 1.364 | - | - | - | | 0.2804 | 3300 | 1.5669 | - | - | - | | 0.2813 | 3310 | 1.2687 | - | - | - | | 0.2821 | 3320 | 1.9495 | - | - | - | | 0.2830 | 3330 | 1.1315 | - | - | - | | 0.2838 | 3340 | 0.9636 | - | - | - | | 0.2847 | 3350 | 1.3071 | - | - | - | | 0.2855 | 3360 | 1.3237 | - | - | - | | 0.2864 | 3370 | 2.1571 | - | - | - | | 0.2872 | 3380 | 1.5394 | - | - | - | | 0.2881 | 3390 | 1.493 | - | - | - | | 0.2889 | 3400 | 1.8023 | - | - | - | | 0.2898 | 3410 | 1.9951 | - | - | - | | 0.2906 | 3420 | 1.4618 | - | - | - | | 0.2915 | 3430 | 1.5207 | - | - | - | | 0.2923 | 3440 | 1.8013 | - | - | - | | 0.2932 | 3450 | 1.4841 | - | - | - | | 0.2940 | 3460 | 2.1567 | - | - | - | | 0.2949 | 3470 | 1.7638 | - | - | - | | 0.2957 | 3480 | 1.4507 | - | - | - | | 0.2966 | 3490 | 2.1364 | - | - | - | | 0.2974 | 3500 | 1.3655 | - | - | - | | 0.2983 | 3510 | 1.147 | - | - | - | | 0.2991 | 3520 | 1.8986 | - | - | - | | 0.3000 | 3530 | 1.6014 | - | - | - | | 0.3008 | 3540 | 1.2619 | - | - | - | | 0.3017 | 3550 | 1.3716 | - | - | - | | 0.3025 | 3560 | 1.5904 | - | - | - | | 0.3034 | 3570 | 1.726 | - | - | - | | 0.3042 | 3580 | 1.6235 | - | - | - | | 0.3051 | 3590 | 1.7598 | - | - | - | | 0.3059 | 3600 | 1.8795 | - | - | - | | 0.3068 | 3610 | 1.6107 | - | - | - | | 
0.3076 | 3620 | 1.3525 | - | - | - | | 0.3085 | 3630 | 1.8275 | - | - | - | | 0.3093 | 3640 | 1.333 | - | - | - | | 0.3102 | 3650 | 1.6917 | - | - | - | | 0.3110 | 3660 | 1.6108 | - | - | - | | 0.3119 | 3670 | 1.6899 | - | - | - | | 0.3127 | 3680 | 1.2133 | - | - | - | | 0.3136 | 3690 | 1.4407 | - | - | - | | 0.3144 | 3700 | 1.8746 | - | - | - | | 0.3153 | 3710 | 1.6211 | - | - | - | | 0.3161 | 3720 | 1.5504 | - | - | - | | 0.3170 | 3730 | 1.8787 | - | - | - | | 0.3178 | 3740 | 2.0654 | - | - | - | | 0.3187 | 3750 | 1.4762 | - | - | - | | 0.3195 | 3760 | 1.7039 | - | - | - | | 0.3204 | 3770 | 1.8382 | - | - | - | | 0.3212 | 3780 | 1.684 | - | - | - | | 0.3221 | 3790 | 1.5044 | - | - | - | | 0.3229 | 3800 | 1.9366 | - | - | - | | 0.3238 | 3810 | 1.3692 | - | - | - | | 0.3246 | 3820 | 1.9425 | - | - | - | | 0.3255 | 3830 | 1.9457 | - | - | - | | 0.3263 | 3840 | 2.0349 | - | - | - | | 0.3272 | 3850 | 2.2629 | - | - | - | | 0.3280 | 3860 | 1.782 | - | - | - | | 0.3289 | 3870 | 1.1131 | - | - | - | | 0.3297 | 3880 | 1.6522 | - | - | - | | 0.3306 | 3890 | 1.4468 | - | - | - | | 0.3314 | 3900 | 1.2263 | - | - | - | | 0.3323 | 3910 | 1.4744 | - | - | - | | 0.3331 | 3920 | 1.346 | - | - | - | | 0.3340 | 3930 | 1.6235 | - | - | - | | 0.3348 | 3940 | 1.5373 | - | - | - | | 0.3357 | 3950 | 1.9912 | - | - | - | | 0.3365 | 3960 | 1.5235 | - | - | - | | 0.3374 | 3970 | 1.2973 | - | - | - | | 0.3382 | 3980 | 1.8943 | - | - | - | | 0.3391 | 3990 | 1.796 | - | - | - | | 0.3399 | 4000 | 1.4485 | 1.4988 | 0.7767 | 0.8003 | | 0.3408 | 4010 | 1.4139 | - | - | - | | 0.3416 | 4020 | 1.5104 | - | - | - | | 0.3425 | 4030 | 1.4306 | - | - | - | | 0.3433 | 4040 | 2.0212 | - | - | - | | 0.3442 | 4050 | 1.4815 | - | - | - | | 0.3450 | 4060 | 1.0738 | - | - | - | | 0.3459 | 4070 | 0.9565 | - | - | - | | 0.3467 | 4080 | 1.0451 | - | - | - | | 0.3476 | 4090 | 1.5975 | - | - | - | | 0.3484 | 4100 | 1.8642 | - | - | - | | 0.3493 | 4110 | 1.8995 | - | - | - | | 0.3501 | 4120 | 1.8488 | - | - | - | | 0.3510 | 4130 | 1.1606 | - | - | - | | 0.3518 | 4140 | 1.8689 | - | - | - | | 0.3527 | 4150 | 1.2646 | - | - | - | | 0.3535 | 4160 | 0.8987 | - | - | - | | 0.3544 | 4170 | 1.4526 | - | - | - | | 0.3552 | 4180 | 1.8155 | - | - | - | | 0.3561 | 4190 | 1.4764 | - | - | - | | 0.3569 | 4200 | 1.2846 | - | - | - | | 0.3577 | 4210 | 1.7014 | - | - | - | | 0.3586 | 4220 | 1.2782 | - | - | - | | 0.3594 | 4230 | 1.4259 | - | - | - | | 0.3603 | 4240 | 1.6493 | - | - | - | | 0.3611 | 4250 | 2.1898 | - | - | - | | 0.3620 | 4260 | 2.011 | - | - | - | | 0.3628 | 4270 | 1.4618 | - | - | - | | 0.3637 | 4280 | 1.4918 | - | - | - | | 0.3645 | 4290 | 1.203 | - | - | - | | 0.3654 | 4300 | 2.0598 | - | - | - | | 0.3662 | 4310 | 1.2831 | - | - | - | | 0.3671 | 4320 | 1.6989 | - | - | - | | 0.3679 | 4330 | 1.5319 | - | - | - | | 0.3688 | 4340 | 1.7994 | - | - | - | | 0.3696 | 4350 | 1.9254 | - | - | - | | 0.3705 | 4360 | 1.373 | - | - | - | | 0.3713 | 4370 | 1.7809 | - | - | - | | 0.3722 | 4380 | 1.5119 | - | - | - | | 0.3730 | 4390 | 0.9275 | - | - | - | | 0.3739 | 4400 | 1.9906 | - | - | - | | 0.3747 | 4410 | 1.6756 | - | - | - | | 0.3756 | 4420 | 1.8964 | - | - | - | | 0.3764 | 4430 | 1.3878 | - | - | - | | 0.3773 | 4440 | 2.1686 | - | - | - | | 0.3781 | 4450 | 1.7287 | - | - | - | | 0.3790 | 4460 | 1.4491 | - | - | - | | 0.3798 | 4470 | 1.2374 | - | - | - | | 0.3807 | 4480 | 1.7013 | - | - | - | | 0.3815 | 4490 | 1.511 | - | - | - | | 0.3824 | 4500 | 1.7912 | - | - | - | | 0.3832 | 4510 | 1.3491 | - | - | - | | 0.3841 | 4520 | 1.1391 | - | - | - | | 
0.3849 | 4530 | 2.2409 | - | - | - | | 0.3858 | 4540 | 1.1876 | - | - | - | | 0.3866 | 4550 | 1.6563 | - | - | - | | 0.3875 | 4560 | 1.4501 | - | - | - | | 0.3883 | 4570 | 1.4546 | - | - | - | | 0.3892 | 4580 | 1.8082 | - | - | - | | 0.3900 | 4590 | 1.6279 | - | - | - | | 0.3909 | 4600 | 1.6263 | - | - | - | | 0.3917 | 4610 | 1.3064 | - | - | - | | 0.3926 | 4620 | 1.3364 | - | - | - | | 0.3934 | 4630 | 1.3731 | - | - | - | | 0.3943 | 4640 | 1.6393 | - | - | - | | 0.3951 | 4650 | 1.5386 | - | - | - | | 0.3960 | 4660 | 1.3492 | - | - | - | | 0.3968 | 4670 | 1.3999 | - | - | - | | 0.3977 | 4680 | 1.6538 | - | - | - | | 0.3985 | 4690 | 1.1034 | - | - | - | | 0.3994 | 4700 | 1.2209 | - | - | - | | 0.4002 | 4710 | 1.2475 | - | - | - | | 0.4011 | 4720 | 1.4437 | - | - | - | | 0.4019 | 4730 | 1.3123 | - | - | - | | 0.4028 | 4740 | 1.3572 | - | - | - | | 0.4036 | 4750 | 1.7064 | - | - | - | | 0.4045 | 4760 | 1.1078 | - | - | - | | 0.4053 | 4770 | 1.5242 | - | - | - | | 0.4062 | 4780 | 1.9819 | - | - | - | | 0.4070 | 4790 | 1.2159 | - | - | - | | 0.4079 | 4800 | 0.9277 | - | - | - | | 0.4087 | 4810 | 1.7686 | - | - | - | | 0.4096 | 4820 | 1.2682 | - | - | - | | 0.4104 | 4830 | 1.4559 | - | - | - | | 0.4113 | 4840 | 1.6704 | - | - | - | | 0.4121 | 4850 | 1.8827 | - | - | - | | 0.4130 | 4860 | 1.8031 | - | - | - | | 0.4138 | 4870 | 1.5041 | - | - | - | | 0.4147 | 4880 | 1.7433 | - | - | - | | 0.4155 | 4890 | 1.1801 | - | - | - | | 0.4164 | 4900 | 1.7493 | - | - | - | | 0.4172 | 4910 | 1.3221 | - | - | - | | 0.4181 | 4920 | 1.5274 | - | - | - | | 0.4189 | 4930 | 1.2865 | - | - | - | | 0.4198 | 4940 | 1.1829 | - | - | - | | 0.4206 | 4950 | 1.6341 | - | - | - | | 0.4215 | 4960 | 1.7116 | - | - | - | | 0.4223 | 4970 | 2.116 | - | - | - | | 0.4232 | 4980 | 1.0212 | - | - | - | | 0.4240 | 4990 | 1.6326 | - | - | - | | 0.4249 | 5000 | 1.5782 | 1.4283 | 0.7817 | 0.8030 | | 0.4257 | 5010 | 1.1953 | - | - | - | | 0.4266 | 5020 | 1.2725 | - | - | - | | 0.4274 | 5030 | 1.1633 | - | - | - | | 0.4283 | 5040 | 1.4567 | - | - | - | | 0.4291 | 5050 | 1.5835 | - | - | - | | 0.4300 | 5060 | 1.7031 | - | - | - | | 0.4308 | 5070 | 1.8205 | - | - | - | | 0.4317 | 5080 | 1.7956 | - | - | - | | 0.4325 | 5090 | 1.4548 | - | - | - | | 0.4334 | 5100 | 1.3128 | - | - | - | | 0.4342 | 5110 | 1.4953 | - | - | - | | 0.4351 | 5120 | 1.2878 | - | - | - | | 0.4359 | 5130 | 1.2808 | - | - | - | | 0.4368 | 5140 | 1.6998 | - | - | - | | 0.4376 | 5150 | 1.5072 | - | - | - | | 0.4385 | 5160 | 2.1685 | - | - | - | | 0.4393 | 5170 | 1.5449 | - | - | - | | 0.4402 | 5180 | 1.5365 | - | - | - | | 0.4410 | 5190 | 2.8665 | - | - | - | | 0.4419 | 5200 | 1.3293 | - | - | - | | 0.4427 | 5210 | 1.9454 | - | - | - | | 0.4436 | 5220 | 2.1613 | - | - | - | | 0.4444 | 5230 | 1.8404 | - | - | - | | 0.4453 | 5240 | 1.7808 | - | - | - | | 0.4461 | 5250 | 1.2141 | - | - | - | | 0.4470 | 5260 | 1.3211 | - | - | - | | 0.4478 | 5270 | 2.0617 | - | - | - | | 0.4487 | 5280 | 2.0629 | - | - | - | | 0.4495 | 5290 | 1.2651 | - | - | - | | 0.4504 | 5300 | 1.9326 | - | - | - | | 0.4512 | 5310 | 1.455 | - | - | - | | 0.4521 | 5320 | 2.0163 | - | - | - | | 0.4529 | 5330 | 1.3844 | - | - | - | | 0.4538 | 5340 | 2.1358 | - | - | - | | 0.4546 | 5350 | 1.6149 | - | - | - | | 0.4555 | 5360 | 1.5739 | - | - | - | | 0.4563 | 5370 | 1.365 | - | - | - | | 0.4572 | 5380 | 1.4386 | - | - | - | | 0.4580 | 5390 | 1.8719 | - | - | - | | 0.4589 | 5400 | 1.357 | - | - | - | | 0.4597 | 5410 | 1.5401 | - | - | - | | 0.4606 | 5420 | 1.6023 | - | - | - | | 0.4614 | 5430 | 1.277 | - | - | - 
| | 0.4623 | 5440 | 1.5706 | - | - | - | | 0.4631 | 5450 | 1.7458 | - | - | - | | 0.4640 | 5460 | 1.2394 | - | - | - | | 0.4648 | 5470 | 1.1898 | - | - | - | | 0.4657 | 5480 | 1.6555 | - | - | - | | 0.4665 | 5490 | 2.1313 | - | - | - | | 0.4674 | 5500 | 1.5389 | - | - | - | | 0.4682 | 5510 | 1.8014 | - | - | - | | 0.4691 | 5520 | 0.8131 | - | - | - | | 0.4699 | 5530 | 1.9825 | - | - | - | | 0.4708 | 5540 | 1.1446 | - | - | - | | 0.4716 | 5550 | 1.6029 | - | - | - | | 0.4725 | 5560 | 0.8073 | - | - | - | | 0.4733 | 5570 | 1.4648 | - | - | - | | 0.4742 | 5580 | 1.4102 | - | - | - | | 0.4750 | 5590 | 1.3797 | - | - | - | | 0.4759 | 5600 | 1.5279 | - | - | - | | 0.4767 | 5610 | 1.5366 | - | - | - | | 0.4776 | 5620 | 1.7663 | - | - | - | | 0.4784 | 5630 | 1.4334 | - | - | - | | 0.4793 | 5640 | 1.7049 | - | - | - | | 0.4801 | 5650 | 1.9447 | - | - | - | | 0.4810 | 5660 | 1.3648 | - | - | - | | 0.4818 | 5670 | 1.7867 | - | - | - | | 0.4827 | 5680 | 1.6188 | - | - | - | | 0.4835 | 5690 | 1.7816 | - | - | - | | 0.4844 | 5700 | 1.4414 | - | - | - | | 0.4852 | 5710 | 1.1949 | - | - | - | | 0.4861 | 5720 | 1.9432 | - | - | - | | 0.4869 | 5730 | 1.6184 | - | - | - | | 0.4878 | 5740 | 1.5613 | - | - | - | | 0.4886 | 5750 | 1.7348 | - | - | - | | 0.4895 | 5760 | 1.3744 | - | - | - | | 0.4903 | 5770 | 1.9828 | - | - | - | | 0.4912 | 5780 | 1.7423 | - | - | - | | 0.4920 | 5790 | 1.3677 | - | - | - | | 0.4929 | 5800 | 1.1892 | - | - | - | | 0.4937 | 5810 | 1.588 | - | - | - | | 0.4946 | 5820 | 1.5046 | - | - | - | | 0.4954 | 5830 | 1.5982 | - | - | - | | 0.4963 | 5840 | 1.492 | - | - | - | | 0.4971 | 5850 | 1.7543 | - | - | - | | 0.4980 | 5860 | 1.9768 | - | - | - | | 0.4988 | 5870 | 1.5444 | - | - | - | | 0.4997 | 5880 | 1.3143 | - | - | - | | 0.5005 | 5890 | 1.0762 | - | - | - | | 0.5014 | 5900 | 1.9283 | - | - | - | | 0.5022 | 5910 | 1.9011 | - | - | - | | 0.5031 | 5920 | 1.6025 | - | - | - | | 0.5039 | 5930 | 1.5606 | - | - | - | | 0.5048 | 5940 | 1.2376 | - | - | - | | 0.5056 | 5950 | 1.322 | - | - | - | | 0.5065 | 5960 | 1.2843 | - | - | - | | 0.5073 | 5970 | 1.3481 | - | - | - | | 0.5082 | 5980 | 1.0269 | - | - | - | | 0.5090 | 5990 | 1.204 | - | - | - | | 0.5099 | 6000 | 1.6248 | 1.4044 | 0.7823 | 0.8081 | | 0.5107 | 6010 | 1.3755 | - | - | - | | 0.5116 | 6020 | 0.9876 | - | - | - | | 0.5124 | 6030 | 1.5123 | - | - | - | | 0.5133 | 6040 | 1.4224 | - | - | - | | 0.5141 | 6050 | 1.5319 | - | - | - | | 0.5150 | 6060 | 1.6707 | - | - | - | | 0.5158 | 6070 | 1.7906 | - | - | - | | 0.5167 | 6080 | 1.0413 | - | - | - | | 0.5175 | 6090 | 1.3346 | - | - | - | | 0.5184 | 6100 | 1.8298 | - | - | - | | 0.5192 | 6110 | 1.4339 | - | - | - | | 0.5201 | 6120 | 1.6045 | - | - | - | | 0.5209 | 6130 | 1.5257 | - | - | - | | 0.5218 | 6140 | 1.4627 | - | - | - | | 0.5226 | 6150 | 1.8083 | - | - | - | | 0.5235 | 6160 | 1.1072 | - | - | - | | 0.5243 | 6170 | 1.3782 | - | - | - | | 0.5252 | 6180 | 1.539 | - | - | - | | 0.5260 | 6190 | 1.3758 | - | - | - | | 0.5269 | 6200 | 2.0819 | - | - | - | | 0.5277 | 6210 | 1.2339 | - | - | - | | 0.5286 | 6220 | 1.346 | - | - | - | | 0.5294 | 6230 | 1.6628 | - | - | - | | 0.5303 | 6240 | 2.0857 | - | - | - | | 0.5311 | 6250 | 1.3907 | - | - | - | | 0.5320 | 6260 | 1.3082 | - | - | - | | 0.5328 | 6270 | 1.8005 | - | - | - | | 0.5337 | 6280 | 2.1571 | - | - | - | | 0.5345 | 6290 | 1.9294 | - | - | - | | 0.5354 | 6300 | 2.2004 | - | - | - | | 0.5362 | 6310 | 1.5136 | - | - | - | | 0.5370 | 6320 | 1.6803 | - | - | - | | 0.5379 | 6330 | 1.3923 | - | - | - | | 0.5387 | 6340 | 2.4211 | - | - 
| - | | 0.5396 | 6350 | 1.4678 | - | - | - | | 0.5404 | 6360 | 1.6661 | - | - | - | | 0.5413 | 6370 | 0.9979 | - | - | - | | 0.5421 | 6380 | 1.1718 | - | - | - | | 0.5430 | 6390 | 1.9122 | - | - | - | | 0.5438 | 6400 | 1.7934 | - | - | - | | 0.5447 | 6410 | 1.6539 | - | - | - | | 0.5455 | 6420 | 1.8081 | - | - | - | | 0.5464 | 6430 | 1.8629 | - | - | - | | 0.5472 | 6440 | 1.3883 | - | - | - | | 0.5481 | 6450 | 1.3248 | - | - | - | | 0.5489 | 6460 | 1.6304 | - | - | - | | 0.5498 | 6470 | 0.9951 | - | - | - | | 0.5506 | 6480 | 0.9729 | - | - | - | | 0.5515 | 6490 | 2.2003 | - | - | - | | 0.5523 | 6500 | 0.9242 | - | - | - | | 0.5532 | 6510 | 1.6794 | - | - | - | | 0.5540 | 6520 | 1.2956 | - | - | - | | 0.5549 | 6530 | 1.4456 | - | - | - | | 0.5557 | 6540 | 1.1975 | - | - | - | | 0.5566 | 6550 | 2.0751 | - | - | - | | 0.5574 | 6560 | 1.5858 | - | - | - | | 0.5583 | 6570 | 1.8451 | - | - | - | | 0.5591 | 6580 | 0.9895 | - | - | - | | 0.5600 | 6590 | 1.5388 | - | - | - | | 0.5608 | 6600 | 1.443 | - | - | - | | 0.5617 | 6610 | 1.4455 | - | - | - | | 0.5625 | 6620 | 1.5491 | - | - | - | | 0.5634 | 6630 | 1.2772 | - | - | - | | 0.5642 | 6640 | 1.566 | - | - | - | | 0.5651 | 6650 | 1.1092 | - | - | - | | 0.5659 | 6660 | 1.4266 | - | - | - | | 0.5668 | 6670 | 1.9267 | - | - | - | | 0.5676 | 6680 | 1.4297 | - | - | - | | 0.5685 | 6690 | 1.4397 | - | - | - | | 0.5693 | 6700 | 1.4476 | - | - | - | | 0.5702 | 6710 | 1.6113 | - | - | - | | 0.5710 | 6720 | 0.8579 | - | - | - | | 0.5719 | 6730 | 2.1762 | - | - | - | | 0.5727 | 6740 | 1.7159 | - | - | - | | 0.5736 | 6750 | 1.247 | - | - | - | | 0.5744 | 6760 | 1.4467 | - | - | - | | 0.5753 | 6770 | 1.8219 | - | - | - | | 0.5761 | 6780 | 1.729 | - | - | - | | 0.5770 | 6790 | 1.58 | - | - | - | | 0.5778 | 6800 | 1.5089 | - | - | - | | 0.5787 | 6810 | 1.2977 | - | - | - | | 0.5795 | 6820 | 1.6302 | - | - | - | | 0.5804 | 6830 | 1.7185 | - | - | - | | 0.5812 | 6840 | 1.1584 | - | - | - | | 0.5821 | 6850 | 1.6683 | - | - | - | | 0.5829 | 6860 | 1.1037 | - | - | - | | 0.5838 | 6870 | 1.7633 | - | - | - | | 0.5846 | 6880 | 1.4152 | - | - | - | | 0.5855 | 6890 | 1.8851 | - | - | - | | 0.5863 | 6900 | 1.6294 | - | - | - | | 0.5872 | 6910 | 1.2872 | - | - | - | | 0.5880 | 6920 | 1.3789 | - | - | - | | 0.5889 | 6930 | 1.6389 | - | - | - | | 0.5897 | 6940 | 2.172 | - | - | - | | 0.5906 | 6950 | 1.2677 | - | - | - | | 0.5914 | 6960 | 1.5623 | - | - | - | | 0.5923 | 6970 | 1.993 | - | - | - | | 0.5931 | 6980 | 0.9549 | - | - | - | | 0.5940 | 6990 | 1.3705 | - | - | - | | 0.5948 | 7000 | 1.0568 | 1.3680 | 0.7842 | 0.8020 | | 0.5957 | 7010 | 1.2301 | - | - | - | | 0.5965 | 7020 | 1.7126 | - | - | - | | 0.5974 | 7030 | 1.5412 | - | - | - | | 0.5982 | 7040 | 1.1385 | - | - | - | | 0.5991 | 7050 | 1.2436 | - | - | - | | 0.5999 | 7060 | 1.323 | - | - | - | | 0.6008 | 7070 | 1.4247 | - | - | - | | 0.6016 | 7080 | 1.6796 | - | - | - | | 0.6025 | 7090 | 1.4213 | - | - | - | | 0.6033 | 7100 | 0.9983 | - | - | - | | 0.6042 | 7110 | 1.5862 | - | - | - | | 0.6050 | 7120 | 1.118 | - | - | - | | 0.6059 | 7130 | 1.6444 | - | - | - | | 0.6067 | 7140 | 1.7763 | - | - | - | | 0.6076 | 7150 | 1.8345 | - | - | - | | 0.6084 | 7160 | 1.6835 | - | - | - | | 0.6093 | 7170 | 1.0519 | - | - | - | | 0.6101 | 7180 | 1.6993 | - | - | - | | 0.6110 | 7190 | 1.8109 | - | - | - | | 0.6118 | 7200 | 1.7157 | - | - | - | | 0.6127 | 7210 | 1.5706 | - | - | - | | 0.6135 | 7220 | 1.5365 | - | - | - | | 0.6144 | 7230 | 1.4711 | - | - | - | | 0.6152 | 7240 | 1.5818 | - | - | - | | 0.6161 | 7250 | 1.3997 | - | - 
| - | | 0.6169 | 7260 | 1.044 | - | - | - | | 0.6178 | 7270 | 1.6471 | - | - | - | | 0.6186 | 7280 | 1.2558 | - | - | - | | 0.6195 | 7290 | 1.0215 | - | - | - | | 0.6203 | 7300 | 1.6653 | - | - | - | | 0.6212 | 7310 | 1.2894 | - | - | - | | 0.6220 | 7320 | 1.6529 | - | - | - | | 0.6229 | 7330 | 1.7363 | - | - | - | | 0.6237 | 7340 | 0.8245 | - | - | - | | 0.6246 | 7350 | 2.1902 | - | - | - | | 0.6254 | 7360 | 1.1631 | - | - | - | | 0.6263 | 7370 | 1.735 | - | - | - | | 0.6271 | 7380 | 1.4256 | - | - | - | | 0.6280 | 7390 | 1.6377 | - | - | - | | 0.6288 | 7400 | 1.5828 | - | - | - | | 0.6297 | 7410 | 1.4463 | - | - | - | | 0.6305 | 7420 | 0.9314 | - | - | - | | 0.6314 | 7430 | 1.1351 | - | - | - | | 0.6322 | 7440 | 1.3325 | - | - | - | | 0.6331 | 7450 | 1.8632 | - | - | - | | 0.6339 | 7460 | 1.014 | - | - | - | | 0.6348 | 7470 | 1.4796 | - | - | - | | 0.6356 | 7480 | 1.8911 | - | - | - | | 0.6365 | 7490 | 1.6274 | - | - | - | | 0.6373 | 7500 | 1.2259 | - | - | - | | 0.6382 | 7510 | 1.1066 | - | - | - | | 0.6390 | 7520 | 1.3845 | - | - | - | | 0.6399 | 7530 | 1.4874 | - | - | - | | 0.6407 | 7540 | 1.5912 | - | - | - | | 0.6416 | 7550 | 1.4071 | - | - | - | | 0.6424 | 7560 | 1.2559 | - | - | - | | 0.6433 | 7570 | 1.2858 | - | - | - | | 0.6441 | 7580 | 1.5097 | - | - | - | | 0.6450 | 7590 | 1.1406 | - | - | - | | 0.6458 | 7600 | 1.6047 | - | - | - | | 0.6467 | 7610 | 1.2911 | - | - | - | | 0.6475 | 7620 | 1.4758 | - | - | - | | 0.6484 | 7630 | 1.4608 | - | - | - | | 0.6492 | 7640 | 1.4307 | - | - | - | | 0.6501 | 7650 | 1.1705 | - | - | - | | 0.6509 | 7660 | 1.1394 | - | - | - | | 0.6518 | 7670 | 1.133 | - | - | - | | 0.6526 | 7680 | 1.8461 | - | - | - | | 0.6535 | 7690 | 1.6305 | - | - | - | | 0.6543 | 7700 | 1.3304 | - | - | - | | 0.6552 | 7710 | 0.9695 | - | - | - | | 0.6560 | 7720 | 1.3937 | - | - | - | | 0.6569 | 7730 | 1.4486 | - | - | - | | 0.6577 | 7740 | 1.3141 | - | - | - | | 0.6586 | 7750 | 1.1174 | - | - | - | | 0.6594 | 7760 | 1.0358 | - | - | - | | 0.6603 | 7770 | 1.4542 | - | - | - | | 0.6611 | 7780 | 1.3459 | - | - | - | | 0.6620 | 7790 | 1.3809 | - | - | - | | 0.6628 | 7800 | 1.1335 | - | - | - | | 0.6637 | 7810 | 2.2354 | - | - | - | | 0.6645 | 7820 | 1.9021 | - | - | - | | 0.6654 | 7830 | 1.4453 | - | - | - | | 0.6662 | 7840 | 1.621 | - | - | - | | 0.6671 | 7850 | 1.3936 | - | - | - | | 0.6679 | 7860 | 1.5465 | - | - | - | | 0.6688 | 7870 | 1.4917 | - | - | - | | 0.6696 | 7880 | 1.9427 | - | - | - | | 0.6705 | 7890 | 1.2764 | - | - | - | | 0.6713 | 7900 | 1.8721 | - | - | - | | 0.6722 | 7910 | 1.6532 | - | - | - | | 0.6730 | 7920 | 0.9971 | - | - | - | | 0.6739 | 7930 | 1.4542 | - | - | - | | 0.6747 | 7940 | 1.5839 | - | - | - | | 0.6756 | 7950 | 1.6431 | - | - | - | | 0.6764 | 7960 | 1.8941 | - | - | - | | 0.6773 | 7970 | 1.0336 | - | - | - | | 0.6781 | 7980 | 1.7703 | - | - | - | | 0.6790 | 7990 | 1.1059 | - | - | - | | 0.6798 | 8000 | 1.7855 | 1.3473 | 0.7890 | 0.8038 |
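The last two columns of the log above come from the evaluators listed under Evaluation: `cosine_map@100` is produced by the `InformationRetrievalEvaluator` and `sts-dev_spearman_cosine` by the `EmbeddingSimilarityEvaluator`. A minimal sketch of re-running the latter is shown below; the sentence pairs and gold scores are placeholders rather than the actual sts-dev split, and the model id is the same placeholder used in the Usage section.

```python
from sentence_transformers import SentenceTransformer, SimilarityFunction
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

# Placeholder id, as in the Usage section above.
model = SentenceTransformer("sentence_transformers_model_id")

# Placeholder STS-style dev data: sentence pairs with gold similarity scores in [0, 1].
sentences1 = ["فتاة تلعب في ساحة لعب", "يبدو أن الفتاة والكلب يلعبان."]
sentences2 = ["فتاة في ساحة لعب", "الفتاة والكلب لا يتفاعلان"]
gold_scores = [0.9, 0.2]

evaluator = EmbeddingSimilarityEvaluator(
    sentences1,
    sentences2,
    gold_scores,
    main_similarity=SimilarityFunction.COSINE,
    name="sts-dev",
)
# Returns a dict of metrics, including the Spearman correlation of the cosine scores.
print(evaluator(model))
```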
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.2.1
- Transformers: 4.45.2
- PyTorch: 2.1.0+cu118
- Accelerate: 1.0.1
- Datasets: 3.0.2
- Tokenizers: 0.20.3

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```