Shashwat13333 committed
Commit 70cb6c2 (verified)
1 Parent(s): ea27bd2

Model save

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": true,
  "pooling_mode_mean_tokens": false,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
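
This pooling configuration enables only CLS-token pooling, so the sentence embedding is the 768-dimensional hidden state of the first token. As an illustrative sketch (not part of the commit), an equivalent module can be constructed with `sentence_transformers.models.Pooling`:

```python
from sentence_transformers import models

# Illustrative only: mirrors 1_Pooling/config.json above.
pooling = models.Pooling(
    word_embedding_dimension=768,
    pooling_mode_cls_token=True,
    pooling_mode_mean_tokens=False,
    pooling_mode_max_tokens=False,
)
print(pooling.get_pooling_mode_str())  # expected: "cls"
```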
README.md ADDED
@@ -0,0 +1,896 @@
---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:150
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
widget:
- source_sentence: What services does Techchefz Digital offer for AI adoption?
  sentences:
  - 'We are a New breed of innovative digital transformation agency, redefining storytelling
    for an always-on world.

    With roots dating back to 2017, we started as a pocket size team of enthusiasts
    with a goal of helping traditional businesses transform and create dynamic, digital
    cultures through disruptive strategies and agile deployment of innovative solutions.'
  - "At Techchefz Digital, we specialize in guiding companies through the complexities\
    \ of adopting and integrating Artificial Intelligence and Machine Learning technologies.\
    \ Our consultancy services are designed to enhance your operational efficiency\
    \ and decision-making capabilities across all sectors. With a global network of\
    \ AI/ML experts and a commitment to excellence, we are your partners in transforming\
    \ innovative possibilities into real-world achievements. \
    \ \
    \ \n DATA INTELLIGENCE PLATFORMS we\
    \ specialize in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone\""
  - 'How can we get started with your DevOps solutions?

    Getting started is easy. Contact us through our website. We''ll schedule a consultation
    to discuss your needs, evaluate your current infrastructure, and propose a customized
    DevOps solution designed to achieve your goals.'
- source_sentence: Hav you made any services for schools and students?
  sentences:
  - 'How do we do Custom Development ?

    We follow below process to develop custom web or mobile Application on Agile Methodology,
    breaking requirements in pieces and developing and shipping them with considering
    utmost quality:

    Requirements Analysis

    We begin by understanding the client's needs and objectives for the website.
    Identify key features, functionality, and any specific design preferences.


    Project Planning

    Then create a detailed project plan outlining the scope, timeline, and milestones.
    Define the technology stack and development tools suitable for the project.


    User Experience Design

    Then comes the stage of Developing wireframes or prototypes to visualize the website's
    structure and layout. We create a custom design that aligns with the brand identity
    and user experience goals.


    Development

    After getting Sign-off on Design from Client, we break the requirements into Sprints
    on Agile Methodology, and start developing them.'
  - 'This is our Portfolio

    Introducing the world of Housing Finance& Banking Firm.

    Corporate Website with 10 regional languages in India with analytics and user
    personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage
    the Builder Requests, approve/deny Properties, manage visits and appointments,
    manage leads, etc.



    Introducing the world of Global Automotive Brand.We have implemented a Multi Locale
    Multilingual Omnichannel platform for Royal Enfield. The platform supports public
    websites, customer portals, internal portals, business applications for over 35+
    different locations all over the world.


    Developed Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML
    in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops &
    Data Governance

    Managing cloud provisioning and modernization alongside automated infrastructure,
    event-driven microservices, containerization, DevOps, cybersecurity, and 24x7
    monitoring support ensures efficient, secure, and responsive IT operations.'
  - "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize\
    \ in comprehensive website audits that provide valuable insights and recommendations\
    \ to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital\
    \ roadmap that transform your digital enterprise and produce a return on investment,\
    \ basis our discovery framework, brainstorming sessions & current state analysis.\n\
    \nPlatform Selection\nHelping you select the optimal digital experience, commerce,\
    \ cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying\
    \ next-gen scalable and agile enterprise digital platforms, along with multi-platform\
    \ integrations. \nProduct Builds\nHelp you ideate, strategize, and engineer\
    \ your product with help of our enterprise frameworks\nInfrastructure\nSpecialize\
    \ in multi-cloud infrastructure helping you put forward the right cloud infrastructure\
    \ and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical\
    \ applications, data, and IT workloads, along with Application maintenance and\
    \ operations.\nTeam Augmentation\nHelp you scale up and augment your existing\
    \ team to solve your hiring challenges with our easy to deploy staff augmentation\
    \ offerings.\""
- source_sentence: How did TechChefz evolve from its early days?
  sentences:
  - 'Why do we need Microservices ?

    Instead of building a monolithic application where all functionalities are tightly
    integrated, microservices break down the system into modular and loosely coupled
    services.


    Scalability

    Flexibility and Agility

    Resilience and Fault Isolation

    Technology Diversity

    Continuous Delivery'
  - 'After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal
    decision to depart from the corporate ladder in December 2016. Fueled by a clear
    vision to revolutionize the digital landscape, Mayank set out to leverage the
    best technology ingredients, crafting custom applications and digital ecosystems
    tailored to clients'' specific needs, limitations, and budgets.


    However, this solo journey was not without its challenges. Mayank had to initiate
    the revenue engine by offering corporate trainings and conducting online batches
    for tech training across the USA. He also undertook small projects and subcontracted
    modules of larger projects for clients in the US, UK, and India. It was only after
    this initial groundwork that Mayank was able to hire a group of interns, whom
    he meticulously trained and groomed to prepare them for handling Enterprise Level
    Applications. This journey reflects Mayank''s resilience, determination, and entrepreneurial
    spirit in building TechChefz Digital from the ground up.


    With a passion for innovation and a relentless drive for excellence, Mayank has
    steered TechChefz Digital through strategic partnerships, groundbreaking projects,
    and exponential growth. His leadership has been instrumental in shaping TechChefz
    Digital into a leading force in the digital transformation arena, inspiring a
    culture of innovation and excellence that continues to propel the company forward.'
  - 'In what ways can machine learning optimize our operations?

    Machine learning algorithms can analyze operational data to identify inefficiencies,
    predict maintenance needs, optimize supply chains, and automate repetitive tasks,
    significantly improving operational efficiency and reducing costs.'
- source_sentence: What kind of data do you leverage for AI solutions?
  sentences:
  - 'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions
    for Complex Problems and delieverd a comprehensive Website Development, Production
    Support & Managed Services, we optimized customer journeys, integrate analytics,
    CRM, ERP, and third-party applications, and implement cutting-edge technologies
    for enhanced performance and efficiency

    and achievied 200% Reduction in operational time & effort managing content & experience,
    70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion
    & Retention'
  - 'Our Solutions

    Strategy & Digital Transformation

    Innovate via digital transformation, modernize tech, craft product strategies,
    enhance customer experiences, optimize data analytics, transition to cloud for
    growth and efficiency


    Product Engineering & Custom Development

    Providing product development, enterprise web and mobile development, microservices
    integrations, quality engineering, and application support services to drive innovation
    and enhance operational efficiency.'
  - Our AI/ML services pave the way for transformative change across industries, embodying
    a client-focused approach that integrates seamlessly with human-centric innovation.
    Our collaborative teams are dedicated to fostering growth, leveraging data, and
    harnessing the predictive power of artificial intelligence to forge the next wave
    of software excellence. We don't just deliver AI; we deliver the future.
- source_sentence: What managed services does TechChefz provide ?
  sentences:
  - " What we do\n\nDigital Strategy\nCreating digital frameworks that transform\
    \ your digital enterprise and produce a return on investment.\n\nPlatform Selection\n\
    Helping you select the optimal digital experience, commerce, cloud and marketing\
    \ platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable\
    \ and agile enterprise digital platforms, along with multi-platform integrations.\n\
    \nProduct Builds\nHelp you ideate, strategize, and engineer your product with\
    \ help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and\
    \ augment your existing team to solve your hiring challenges with our easy to\
    \ deploy staff augmentation offerings .\nManaged Services\nOperate and monitor\
    \ your business-critical applications, data, and IT workloads, along with Application\
    \ maintenance and operations\n"
  - 'What makes your DevOps solutions stand out from the competition?

    Our DevOps solutions stand out due to our personalized approach, extensive expertise,
    and commitment to innovation. We focus on delivering measurable results, such
    as reduced deployment times, improved system reliability, and enhanced security,
    ensuring you get the maximum benefit from our services.'
  - 'Introducing the world of General Insurance Firm

    In this project, we implemented Digital Solution and Implementation with Headless
    Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the
    following features:

    PWA & AMP based Web Pages

    Page Speed Optimization

    Reusable and scalable React JS / Next JS Templates and Components

    Headless Drupal CMS with Content & Experience management, approval workflows,
    etc for seamless collaboration between the business and marketing teams

    Minimalistic Buy and Renewal Journeys for various products, with API integrations
    and adherence to data compliances


    We achieved 250% Reduction in Operational Time and Effort in managing the Content
    & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during
    buy and renewal journeys, 300% Reduction in bounce rate on policy landing and
    campaign pages'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: BGE base Financial Matryoshka
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 768
      type: dim_768
    metrics:
    - type: cosine_accuracy@1
      value: 0.17333333333333334
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5466666666666666
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.6933333333333334
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.17333333333333334
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.1822222222222222
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.12
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06933333333333333
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.17333333333333334
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.5466666666666666
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.6
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.6933333333333334
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.43705488094312567
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.3539576719576719
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.3663753684578632
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 512
      type: dim_512
    metrics:
    - type: cosine_accuracy@1
      value: 0.17333333333333334
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5333333333333333
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6266666666666667
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.6933333333333334
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.17333333333333334
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.17777777777777776
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.12533333333333332
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06933333333333333
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.17333333333333334
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.5333333333333333
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.6266666666666667
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.6933333333333334
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.43324477959330543
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.3495185185185184
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.359896266319179
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 256
      type: dim_256
    metrics:
    - type: cosine_accuracy@1
      value: 0.22666666666666666
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.49333333333333335
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.56
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.68
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.22666666666666666
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.16444444444444445
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.11199999999999997
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06799999999999998
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.22666666666666666
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.49333333333333335
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.56
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.68
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.4383628839300849
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.36210582010582004
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.3731640827722892
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 128
      type: dim_128
    metrics:
    - type: cosine_accuracy@1
      value: 0.24
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.48
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.56
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.6933333333333334
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.24
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.16
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.11199999999999997
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06933333333333332
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.24
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.48
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.56
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.6933333333333334
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.4443870388298522
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.36651322751322746
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.37546675549059694
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 64
      type: dim_64
    metrics:
    - type: cosine_accuracy@1
      value: 0.08
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.3466666666666667
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.49333333333333335
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.56
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.08
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.11555555555555555
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.09866666666666667
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.05599999999999999
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.08
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.3466666666666667
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.49333333333333335
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.56
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.3120295466486537
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.23260846560846554
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.24731947636993173
      name: Cosine Map@100
---

# BGE base Financial Matryoshka

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
- **Language:** en
- **License:** apache-2.0

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
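
Because pooling is CLS-token only and the final module L2-normalizes the output, roughly equivalent embeddings can also be produced with plain `transformers`. This is an illustrative sketch, assuming the tokenizer and weights are available from the same repository:

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "Shashwat13333/bge-base-en-v1.5"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

batch = tokenizer(
    ["What managed services does TechChefz provide ?"],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, 768)

cls_embedding = token_embeddings[:, 0]                   # CLS-token pooling
sentence_embedding = torch.nn.functional.normalize(cls_embedding, p=2, dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```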

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5")
# Run inference
sentences = [
    'What managed services does TechChefz provide ?',
    ' What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n',
    'Introducing the world of General Insurance Firm\nIn this project, we implemented Digital Solution and Implementation with Headless Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the following features:\nPWA & AMP based Web Pages\nPage Speed Optimization\nReusable and scalable React JS / Next JS Templates and Components\nHeadless Drupal CMS with Content & Experience management, approval workflows, etc for seamless collaboration between the business and marketing teams\nMinimalistic Buy and Renewal Journeys for various products, with API integrations and adherence to data compliances\n\nWe achieved 250% Reduction in Operational Time and Effort in managing the Content & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during buy and renewal journeys, 300% Reduction in bounce rate on policy landing and campaign pages',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
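
For retrieval-style use (ranking a handful of passages against a query), the embeddings can also be fed to `sentence_transformers.util.semantic_search`. A small sketch with placeholder documents:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5")

query = "How can we get started with your DevOps solutions?"
docs = [
    "Getting started is easy. Contact us through our website to schedule a consultation.",
    "Our MarTech capabilities include personalization and marketing automation.",
]

query_emb = model.encode(query, convert_to_tensor=True)
doc_embs = model.encode(docs, convert_to_tensor=True)

# Top-k passages per query, ranked by cosine similarity
hits = util.semantic_search(query_emb, doc_embs, top_k=2)[0]
for hit in hits:
    print(round(hit["score"], 3), docs[hit["corpus_id"]])
```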

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | dim_768    | dim_512    | dim_256    | dim_128    | dim_64    |
|:--------------------|:-----------|:-----------|:-----------|:-----------|:----------|
| cosine_accuracy@1   | 0.1733     | 0.1733     | 0.2267     | 0.24       | 0.08      |
| cosine_accuracy@3   | 0.5467     | 0.5333     | 0.4933     | 0.48       | 0.3467    |
| cosine_accuracy@5   | 0.6        | 0.6267     | 0.56       | 0.56       | 0.4933    |
| cosine_accuracy@10  | 0.6933     | 0.6933     | 0.68       | 0.6933     | 0.56      |
| cosine_precision@1  | 0.1733     | 0.1733     | 0.2267     | 0.24       | 0.08      |
| cosine_precision@3  | 0.1822     | 0.1778     | 0.1644     | 0.16       | 0.1156    |
| cosine_precision@5  | 0.12       | 0.1253     | 0.112      | 0.112      | 0.0987    |
| cosine_precision@10 | 0.0693     | 0.0693     | 0.068      | 0.0693     | 0.056     |
| cosine_recall@1     | 0.1733     | 0.1733     | 0.2267     | 0.24       | 0.08      |
| cosine_recall@3     | 0.5467     | 0.5333     | 0.4933     | 0.48       | 0.3467    |
| cosine_recall@5     | 0.6        | 0.6267     | 0.56       | 0.56       | 0.4933    |
| cosine_recall@10    | 0.6933     | 0.6933     | 0.68       | 0.6933     | 0.56      |
| **cosine_ndcg@10**  | **0.4371** | **0.4332** | **0.4384** | **0.4444** | **0.312** |
| cosine_mrr@10       | 0.354      | 0.3495     | 0.3621     | 0.3665     | 0.2326    |
| cosine_map@100      | 0.3664     | 0.3599     | 0.3732     | 0.3755     | 0.2473    |
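
The columns above correspond to evaluating the same model with its embeddings truncated to smaller Matryoshka dimensions. A truncated size can be requested directly at load time; a brief sketch (the dimension value is just an example):

```python
from sentence_transformers import SentenceTransformer

# Keep only the first 256 embedding dimensions (Matryoshka truncation)
model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5", truncate_dim=256)

emb = model.encode("What services does Techchefz Digital offer for AI adoption?")
print(emb.shape)  # (256,)
```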

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 150 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 150 samples:
  |         | anchor | positive |
  |:--------|:-------|:---------|
  | type    | string | string |
  | details | <ul><li>min: 7 tokens</li><li>mean: 12.4 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 126.17 tokens</li><li>max: 378 tokens</li></ul> |
* Samples:
  | anchor | positive |
  |:-------|:---------|
  | <code>Is it hard to move old systems to the cloud?</code> | <code>We offer custom software development, digital marketing strategies, and tailored solutions to drive tangible results for your business. Our expert team combines technical prowess with industry insights to propel your business forward in the digital landscape.<br><br>"Engage, analyze & target your customers<br>Digital transformation enables you to interact with customers across multiple channels, providing personalized experiences. This could include social media engagement, interactive websites, and mobile apps." "Empower your employees & partners<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Optimize & automate your operations<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Transform your products<br>The push for digi...</code> |
  | <code>What benefits does marketing automation offer for time management?</code> | <code>Our MarTech capabilities<br><br>Personalization<br>Involves tailoring marketing messages and experiences to individual customers. It enhances customer engagement, loyalty, and ultimately, conversion rates.<br><br>Marketing Automation<br>Marketing automation streamlines repetitive tasks such as email marketing, lead nurturing, and social media posting. It improves efficiency, saves time, and ensures timely communication with customers.<br><br>Customer Relationship Management<br>CRM systems help manage interactions with current and potential customers. They store customer data, track interactions, and facilitate communication, improving customer retention.</code> |
  | <code>How can your recommendation engines improve our business?</code> | <code>How can your recommendation engines improve our business?<br>Our recommendation engines are designed to analyze customer behavior and preferences to deliver personalized suggestions, enhancing user experience, increasing sales, and boosting customer retention.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesRankingLoss",
      "matryoshka_dims": [
          768,
          512,
          256,
          128,
          64
      ],
      "matryoshka_weights": [
          1,
          1,
          1,
          1,
          1
      ],
      "n_dims_per_step": -1
  }
  ```
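
For reference, this configuration corresponds roughly to the following loss setup in Sentence Transformers (a sketch; variable names are illustrative):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# In-batch negatives loss, applied to the full 768-dim embeddings as well as
# to their 512/256/128/64-dim truncations, each weighted equally.
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
)
```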

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: epoch
- `gradient_accumulation_steps`: 4
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `fp16`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `push_to_hub`: True
- `hub_model_id`: Shashwat13333/bge-base-en-v1.5
- `push_to_hub_model_id`: bge-base-en-v1.5
- `batch_sampler`: no_duplicates
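
Expressed as code, the non-default values above map approximately onto `SentenceTransformerTrainingArguments` (a sketch; the output directory is a placeholder):

```python
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-en-v1.5-finetuned",  # placeholder path
    num_train_epochs=4,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=4,
    learning_rate=1e-5,
    weight_decay=0.01,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="epoch",
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    push_to_hub=True,
    hub_model_id="Shashwat13333/bge-base-en-v1.5",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```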

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 8
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 4
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: Shashwat13333/bge-base-en-v1.5
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: bge-base-en-v1.5
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
| Epoch      | Step   | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
|:----------:|:------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
| 0.2105     | 1      | 4.4608        | -                      | -                      | -                      | -                      | -                     |
| 0.8421     | 4      | -             | 0.3891                 | 0.3727                 | 0.4175                 | 0.3876                 | 0.2956                |
| 1.2105     | 5      | 4.2215        | -                      | -                      | -                      | -                      | -                     |
| 1.8421     | 8      | -             | 0.4088                 | 0.4351                 | 0.4034                 | 0.4052                 | 0.3167                |
| 2.4211     | 10     | 3.397         | -                      | -                      | -                      | -                      | -                     |
| 2.8421     | 12     | -             | 0.4440                 | 0.4252                 | 0.4133                 | 0.4284                 | 0.3024                |
| 3.6316     | 15     | 2.87          | -                      | -                      | -                      | -                      | -                     |
| **3.8421** | **16** | **-**         | **0.4371**             | **0.4332**             | **0.4384**             | **0.4444**             | **0.312**             |

* The bold row denotes the saved checkpoint.

### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu124
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "3.3.1",
    "transformers": "4.47.1",
    "pytorch": "2.5.1+cu124"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
modules.json ADDED
@@ -0,0 +1,20 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  },
  {
    "idx": 2,
    "name": "2",
    "path": "2_Normalize",
    "type": "sentence_transformers.models.Normalize"
  }
]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": true
}