pj-mathematician committed on
Commit 8fd808f · verified · 1 Parent(s): d7b417e

Add files using upload-large-folder tool

Files changed (41)
  1. .gitattributes +2 -0
  2. README.md +1400 -3
  3. checkpoint-200/1_Pooling/config.json +10 -0
  4. checkpoint-200/README.md +1398 -0
  5. checkpoint-200/config.json +49 -0
  6. checkpoint-200/config_sentence_transformers.json +10 -0
  7. checkpoint-200/model.safetensors +3 -0
  8. checkpoint-200/modules.json +20 -0
  9. checkpoint-200/optimizer.pt +3 -0
  10. checkpoint-200/rng_state_0.pth +3 -0
  11. checkpoint-200/scaler.pt +3 -0
  12. checkpoint-200/scheduler.pt +3 -0
  13. checkpoint-200/sentence_bert_config.json +4 -0
  14. checkpoint-200/special_tokens_map.json +51 -0
  15. checkpoint-200/tokenizer.json +3 -0
  16. checkpoint-200/tokenizer_config.json +55 -0
  17. checkpoint-200/trainer_state.json +322 -0
  18. checkpoint-200/training_args.bin +3 -0
  19. checkpoint-400/1_Pooling/config.json +10 -0
  20. checkpoint-400/README.md +1400 -0
  21. checkpoint-400/config.json +49 -0
  22. checkpoint-400/config_sentence_transformers.json +10 -0
  23. checkpoint-400/model.safetensors +3 -0
  24. checkpoint-400/modules.json +20 -0
  25. checkpoint-400/optimizer.pt +3 -0
  26. checkpoint-400/rng_state_0.pth +3 -0
  27. checkpoint-400/scaler.pt +3 -0
  28. checkpoint-400/scheduler.pt +3 -0
  29. checkpoint-400/sentence_bert_config.json +4 -0
  30. checkpoint-400/special_tokens_map.json +51 -0
  31. checkpoint-400/tokenizer.json +3 -0
  32. checkpoint-400/tokenizer_config.json +55 -0
  33. checkpoint-400/trainer_state.json +603 -0
  34. checkpoint-400/training_args.bin +3 -0
  35. eval/Information-Retrieval_evaluation_full_de_results.csv +4 -0
  36. eval/Information-Retrieval_evaluation_full_en_results.csv +4 -0
  37. eval/Information-Retrieval_evaluation_full_es_results.csv +4 -0
  38. eval/Information-Retrieval_evaluation_full_zh_results.csv +4 -0
  39. eval/Information-Retrieval_evaluation_mix_de_results.csv +4 -0
  40. eval/Information-Retrieval_evaluation_mix_es_results.csv +4 -0
  41. eval/Information-Retrieval_evaluation_mix_zh_results.csv +4 -0
.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ checkpoint-400/tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ checkpoint-200/tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -1,3 +1,1400 @@
- ---
- license: apache-2.0
- ---
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:124788
+ - loss:GISTEmbedLoss
+ base_model: Alibaba-NLP/gte-multilingual-base
+ widget:
+ - source_sentence: 其他机械、设备和有形货物租赁服务代表
+   sentences:
+   - 其他机械和设备租赁服务工作人员
+   - 电子和电信设备及零部件物流经理
+   - 工业主厨
+ - source_sentence: 公交车司机
+   sentences:
+   - 表演灯光设计师
+   - 乙烯基地板安装工
+   - 国际巴士司机
+ - source_sentence: online communication manager
+   sentences:
+   - trades union official
+   - social media manager
+   - budget manager
+ - source_sentence: Projektmanagerin
+   sentences:
+   - Projektmanager/Projektmanagerin
+   - Category-Manager
+   - Infanterist
+ - source_sentence: Volksvertreter
+   sentences:
+   - Parlamentarier
+   - Oberbürgermeister
+   - Konsul
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy@1
+ - cosine_accuracy@20
+ - cosine_accuracy@50
+ - cosine_accuracy@100
+ - cosine_accuracy@150
+ - cosine_accuracy@200
+ - cosine_precision@1
+ - cosine_precision@20
+ - cosine_precision@50
+ - cosine_precision@100
+ - cosine_precision@150
+ - cosine_precision@200
+ - cosine_recall@1
+ - cosine_recall@20
+ - cosine_recall@50
+ - cosine_recall@100
+ - cosine_recall@150
+ - cosine_recall@200
+ - cosine_ndcg@1
+ - cosine_ndcg@20
+ - cosine_ndcg@50
+ - cosine_ndcg@100
+ - cosine_ndcg@150
+ - cosine_ndcg@200
+ - cosine_mrr@1
+ - cosine_mrr@20
+ - cosine_mrr@50
+ - cosine_mrr@100
+ - cosine_mrr@150
+ - cosine_mrr@200
+ - cosine_map@1
+ - cosine_map@20
+ - cosine_map@50
+ - cosine_map@100
+ - cosine_map@150
+ - cosine_map@200
+ - cosine_map@500
+ model-index:
+ - name: SentenceTransformer based on Alibaba-NLP/gte-multilingual-base
+   results:
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full en
+       type: full_en
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6666666666666666
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9904761904761905
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9904761904761905
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9904761904761905
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9904761904761905
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9904761904761905
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6666666666666666
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.5147619047619048
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.31999999999999995
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.19047619047619047
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.1361904761904762
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.10542857142857143
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.06854687410617222
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.5491240579458434
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.7553654907661455
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.8503209224897438
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.8994749092946579
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.9207884118691805
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6666666666666666
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6952098522285352
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.7229572913271685
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7732532874348539
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.7947334799125039
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.8038564389556094
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6666666666666666
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.8182539682539683
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.8182539682539683
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.8182539682539683
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.8182539682539683
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.8182539682539683
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6666666666666666
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.5566401101002375
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.55344017265156
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.5852249415484134
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.5943042662925763
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.5975837437975446
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.6015742986218369
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full es
+       type: full_es
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.12432432432432433
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 1.0
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 1.0
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 1.0
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 1.0
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 1.0
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.12432432432432433
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.575945945945946
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.3923243243243244
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.2565945945945946
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.19282882882882882
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.1527837837837838
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.0036138931714884822
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.3852888120551914
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.5659574514538841
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.6898678629281393
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.7540209165372845
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.7858170054407897
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.12432432432432433
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6168674053047035
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.5913690595071309
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.62350509928888
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.6556716735369459
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.6716557949894583
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.12432432432432433
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.5581081081081081
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.5581081081081081
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.5581081081081081
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.5581081081081081
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.5581081081081081
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.12432432432432433
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.48407152706202555
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.43043374125481026
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.43735327570764515
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.45269435912524697
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.45930097680668164
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.47204219228541466
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full de
+       type: full_de
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.2955665024630542
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9753694581280788
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9852216748768473
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9901477832512315
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9901477832512315
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9901477832512315
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.2955665024630542
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.5103448275862069
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.36935960591133016
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.23965517241379314
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.1807881773399015
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.1461576354679803
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.01108543831680986
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.3207974783481294
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.5042046446720455
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.6172666777909689
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.6848138831682932
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.7253195006357535
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.2955665024630542
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.537849085734973
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.5288037060639387
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.5551941695921919
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.5887611959940118
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.6092219717029682
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.2955665024630542
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.5164773875147672
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.5167647438366063
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.5168213657719442
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.5168213657719442
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.5168213657719442
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.2955665024630542
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.398398563122481
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.36032758502543594
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.3632259128424842
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.37822275477623696
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.3863148456840816
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.399227009561676
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: full zh
+       type: full_zh
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6796116504854369
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9805825242718447
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9902912621359223
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9902912621359223
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9902912621359223
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9902912621359223
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6796116504854369
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.488349514563107
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.29631067961165053
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.17883495145631062
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.12776699029126212
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.09990291262135924
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.06931865009287731
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.5250914458143515
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.7082715439925011
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.8169166539243944
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.8613232254521018
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.8898175710074696
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6796116504854369
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.6680745295820606
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.6856578240865067
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7378907298421352
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.7576651805692517
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.7696718049970358
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6796116504854369
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.8158576051779936
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.816279724215562
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.816279724215562
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.816279724215562
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.816279724215562
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6796116504854369
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.522177160195635
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.5082601209392789
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.5371705298206915
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.5454012672534121
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.5494570875591636
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.5542116087189223
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix es
+       type: mix_es
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.7087883515340614
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9552782111284451
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9802392095683827
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9901196047841914
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9937597503900156
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9958398335933437
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.7087883515340614
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.12158086323452937
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.05122204888195529
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.026125845033801356
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.017548968625411682
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.013239729589183572
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.2737959042171211
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.8990032934650719
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.9459438377535101
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.9650979372508233
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.9731582596637198
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.979086496793205
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.7087883515340614
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.7814741332820433
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.7944033394497885
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7986024294603647
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.8001222520801115
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.801183843730514
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.7087883515340614
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.7804158804359833
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.7812547046826683
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.7813961782842836
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.7814280971923943
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.7814392363829243
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.7087883515340614
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.7070596364024803
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.7106867578203881
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.7112928928384499
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.7114314004578745
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.711504950521157
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.7116431478000537
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix de
+       type: mix_de
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6484659386375455
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9323972958918356
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.968278731149246
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.984919396775871
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9885595423816953
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9937597503900156
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.6484659386375455
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.12093083723348932
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.05140925637025482
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.02647425897035882
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.017892182353960822
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.013530941237649509
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.2435517420696828
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.87873114924597
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.9319899462645173
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.9596117178020455
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.9718322066215982
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.9799791991679667
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.6484659386375455
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.7448150588358
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.7595232400510039
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.7656851368194345
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.7681576326024331
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.7696474672652458
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.6484659386375455
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.7323691045739125
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.733538875120878
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.733776247038599
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.7338087409764548
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.7338398642058079
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.6484659386375455
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.6646138211839377
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.6683657128313888
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.6692634410264182
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.669518875077899
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.6696171599377958
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.6697127210085475
+       name: Cosine Map@500
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: mix zh
+       type: mix_zh
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.7667014613778705
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@20
+       value: 0.9843423799582464
+       name: Cosine Accuracy@20
+     - type: cosine_accuracy@50
+       value: 0.9932150313152401
+       name: Cosine Accuracy@50
+     - type: cosine_accuracy@100
+       value: 0.9958246346555324
+       name: Cosine Accuracy@100
+     - type: cosine_accuracy@150
+       value: 0.9973903966597077
+       name: Cosine Accuracy@150
+     - type: cosine_accuracy@200
+       value: 0.9979123173277662
+       name: Cosine Accuracy@200
+     - type: cosine_precision@1
+       value: 0.7667014613778705
+       name: Cosine Precision@1
+     - type: cosine_precision@20
+       value: 0.13870041753653445
+       name: Cosine Precision@20
+     - type: cosine_precision@50
+       value: 0.05810020876826725
+       name: Cosine Precision@50
+     - type: cosine_precision@100
+       value: 0.029598121085595
+       name: Cosine Precision@100
+     - type: cosine_precision@150
+       value: 0.01986778009742519
+       name: Cosine Precision@150
+     - type: cosine_precision@200
+       value: 0.014945198329853866
+       name: Cosine Precision@200
+     - type: cosine_recall@1
+       value: 0.25692041952480366
+       name: Cosine Recall@1
+     - type: cosine_recall@20
+       value: 0.9156576200417536
+       name: Cosine Recall@20
+     - type: cosine_recall@50
+       value: 0.9582637439109255
+       name: Cosine Recall@50
+     - type: cosine_recall@100
+       value: 0.9765483646485734
+       name: Cosine Recall@100
+     - type: cosine_recall@150
+       value: 0.9833768267223383
+       name: Cosine Recall@150
+     - type: cosine_recall@200
+       value: 0.986464857341684
+       name: Cosine Recall@200
+     - type: cosine_ndcg@1
+       value: 0.7667014613778705
+       name: Cosine Ndcg@1
+     - type: cosine_ndcg@20
+       value: 0.8002168358295473
+       name: Cosine Ndcg@20
+     - type: cosine_ndcg@50
+       value: 0.8125113081884888
+       name: Cosine Ndcg@50
+     - type: cosine_ndcg@100
+       value: 0.8167350090334409
+       name: Cosine Ndcg@100
+     - type: cosine_ndcg@150
+       value: 0.8181122471507385
+       name: Cosine Ndcg@150
+     - type: cosine_ndcg@200
+       value: 0.8186874070081017
+       name: Cosine Ndcg@200
+     - type: cosine_mrr@1
+       value: 0.7667014613778705
+       name: Cosine Mrr@1
+     - type: cosine_mrr@20
+       value: 0.8421752732824312
+       name: Cosine Mrr@20
+     - type: cosine_mrr@50
+       value: 0.8424954415974232
+       name: Cosine Mrr@50
+     - type: cosine_mrr@100
+       value: 0.8425358910333786
+       name: Cosine Mrr@100
+     - type: cosine_mrr@150
+       value: 0.8425483391786986
+       name: Cosine Mrr@150
+     - type: cosine_mrr@200
+       value: 0.8425515411459873
+       name: Cosine Mrr@200
+     - type: cosine_map@1
+       value: 0.7667014613778705
+       name: Cosine Map@1
+     - type: cosine_map@20
+       value: 0.7007206423896271
+       name: Cosine Map@20
+     - type: cosine_map@50
+       value: 0.7046277360194696
+       name: Cosine Map@50
+     - type: cosine_map@100
+       value: 0.7053668771050886
+       name: Cosine Map@100
+     - type: cosine_map@150
+       value: 0.7055166914145262
+       name: Cosine Map@150
+     - type: cosine_map@200
+       value: 0.7055658329670217
+       name: Cosine Map@200
+     - type: cosine_map@500
+       value: 0.7056512281794008
+       name: Cosine Map@500
905
+ ---
906
+
907
+ # Job - Job matching Alibaba-NLP/gte-multilingual-base (v2)
+
+ Top-performing model on [TalentCLEF 2025](https://talentclef.github.io/talentclef/) Task A. Use it for multilingual job-title matching.
910
+
911
+ ## Model Details
912
+
913
+ ### Model Description
914
+ - **Model Type:** Sentence Transformer
915
+ - **Base model:** [Alibaba-NLP/gte-multilingual-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) <!-- at revision 9fdd4ee8bba0e2808a34e0e739576f6740d2b225 -->
916
+ - **Maximum Sequence Length:** 512 tokens
917
+ - **Output Dimensionality:** 768 dimensions
918
+ - **Similarity Function:** Cosine Similarity
919
+ - **Training Datasets:**
920
+ - full_en
921
+ - full_de
922
+ - full_es
923
+ - full_zh
924
+ - mix
925
+ <!-- - **Language:** Unknown -->
926
+ <!-- - **License:** Unknown -->
927
+
928
+ ### Model Sources
929
+
930
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
931
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
932
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
933
+
934
+ ### Full Model Architecture
935
+
936
+ ```
937
+ SentenceTransformer(
938
+ (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel
939
+ (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
940
+ (2): Normalize()
941
+ )
942
+ ```
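The architecture above produces a sentence vector by taking the transformer's `[CLS]` token (`pooling_mode_cls_token: True`) and L2-normalizing it, so dot products between outputs equal cosine similarities. A minimal numpy sketch of those last two stages (the function name and toy token values are illustrative, not real model outputs):

```python
import numpy as np

def cls_pool_and_normalize(token_embeddings: np.ndarray) -> np.ndarray:
    """Illustrative CLS pooling followed by L2 normalization.

    token_embeddings: (seq_len, hidden_dim) token vectors from the
    transformer; the first row is the [CLS] token.
    """
    cls = token_embeddings[0]            # Pooling with pooling_mode_cls_token=True
    return cls / np.linalg.norm(cls)     # Normalize() module: unit-length output

# Tiny demo with fake token embeddings (2 tokens, 4 dims)
tokens = np.array([[3.0, 0.0, 4.0, 0.0],
                   [1.0, 1.0, 1.0, 1.0]])
vec = cls_pool_and_normalize(tokens)
print(vec)  # [0.6, 0.0, 0.8, 0.0] — unit norm
```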
943
+
944
+ ## Usage
945
+
946
+ ### Direct Usage (Sentence Transformers)
947
+
948
+ First install the Sentence Transformers library:
949
+
950
+ ```bash
951
+ pip install -U sentence-transformers
952
+ ```
953
+
954
+ Then you can load this model and run inference.
955
+ ```python
956
+ from sentence_transformers import SentenceTransformer
957
+
958
+ # Download from the 🤗 Hub
959
+ model = SentenceTransformer("pj-mathematician/JobGTE-multilingual-base-v2")
960
+ # Run inference
961
+ sentences = [
962
+ 'Volksvertreter',
963
+ 'Parlamentarier',
964
+ 'Oberbürgermeister',
965
+ ]
966
+ embeddings = model.encode(sentences)
967
+ print(embeddings.shape)
968
+ # [3, 768]
969
+
970
+ # Get the similarity scores for the embeddings
971
+ similarities = model.similarity(embeddings, embeddings)
972
+ print(similarities.shape)
973
+ # [3, 3]
974
+ ```
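Because the model emits unit-normalized embeddings, matching one job title against a candidate list reduces to dot products followed by a sort. A minimal sketch with made-up vectors (the titles mirror the example above; the embeddings here are not real model outputs):

```python
import numpy as np

# Hypothetical, precomputed unit-length embeddings (e.g. from model.encode);
# the numeric values are invented for illustration.
query = np.array([0.6, 0.8, 0.0])  # embedding of "Volksvertreter"
candidates = {
    "Parlamentarier":    np.array([0.6, 0.8, 0.0]),
    "Oberbürgermeister": np.array([0.0, 0.6, 0.8]),
    "Konsul":            np.array([0.8, 0.0, 0.6]),
}

# With normalized vectors, cosine similarity is just a dot product.
scores = {title: float(query @ vec) for title, vec in candidates.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking[0])  # "Parlamentarier" — the identical embedding scores 1.0
```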
975
+
976
+ <!--
977
+ ### Direct Usage (Transformers)
978
+
979
+ <details><summary>Click to see the direct usage in Transformers</summary>
980
+
981
+ </details>
982
+ -->
983
+
984
+ <!--
985
+ ### Downstream Usage (Sentence Transformers)
986
+
987
+ You can finetune this model on your own dataset.
988
+
989
+ <details><summary>Click to expand</summary>
990
+
991
+ </details>
992
+ -->
993
+
994
+ <!--
995
+ ### Out-of-Scope Use
996
+
997
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
998
+ -->
999
+
1000
+ ## Evaluation
1001
+
1002
+ ### Metrics
1003
+
1004
+ #### Information Retrieval
1005
+
1006
+ * Datasets: `full_en`, `full_es`, `full_de`, `full_zh`, `mix_es`, `mix_de` and `mix_zh`
1007
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
1008
+
1009
+ | Metric | full_en | full_es | full_de | full_zh | mix_es | mix_de | mix_zh |
1010
+ |:---------------------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|
1011
+ | cosine_accuracy@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
1012
+ | cosine_accuracy@20 | 0.9905 | 1.0 | 0.9754 | 0.9806 | 0.9553 | 0.9324 | 0.9843 |
1013
+ | cosine_accuracy@50 | 0.9905 | 1.0 | 0.9852 | 0.9903 | 0.9802 | 0.9683 | 0.9932 |
1014
+ | cosine_accuracy@100 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9901 | 0.9849 | 0.9958 |
1015
+ | cosine_accuracy@150 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9938 | 0.9886 | 0.9974 |
1016
+ | cosine_accuracy@200 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9958 | 0.9938 | 0.9979 |
1017
+ | cosine_precision@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
1018
+ | cosine_precision@20 | 0.5148 | 0.5759 | 0.5103 | 0.4883 | 0.1216 | 0.1209 | 0.1387 |
1019
+ | cosine_precision@50 | 0.32 | 0.3923 | 0.3694 | 0.2963 | 0.0512 | 0.0514 | 0.0581 |
1020
+ | cosine_precision@100 | 0.1905 | 0.2566 | 0.2397 | 0.1788 | 0.0261 | 0.0265 | 0.0296 |
1021
+ | cosine_precision@150 | 0.1362 | 0.1928 | 0.1808 | 0.1278 | 0.0175 | 0.0179 | 0.0199 |
1022
+ | cosine_precision@200 | 0.1054 | 0.1528 | 0.1462 | 0.0999 | 0.0132 | 0.0135 | 0.0149 |
1023
+ | cosine_recall@1 | 0.0685 | 0.0036 | 0.0111 | 0.0693 | 0.2738 | 0.2436 | 0.2569 |
1024
+ | cosine_recall@20 | 0.5491 | 0.3853 | 0.3208 | 0.5251 | 0.899 | 0.8787 | 0.9157 |
1025
+ | cosine_recall@50 | 0.7554 | 0.566 | 0.5042 | 0.7083 | 0.9459 | 0.932 | 0.9583 |
1026
+ | cosine_recall@100 | 0.8503 | 0.6899 | 0.6173 | 0.8169 | 0.9651 | 0.9596 | 0.9765 |
1027
+ | cosine_recall@150 | 0.8995 | 0.754 | 0.6848 | 0.8613 | 0.9732 | 0.9718 | 0.9834 |
1028
+ | cosine_recall@200 | 0.9208 | 0.7858 | 0.7253 | 0.8898 | 0.9791 | 0.98 | 0.9865 |
1029
+ | cosine_ndcg@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
1030
+ | cosine_ndcg@20 | 0.6952 | 0.6169 | 0.5378 | 0.6681 | 0.7815 | 0.7448 | 0.8002 |
1031
+ | cosine_ndcg@50 | 0.723 | 0.5914 | 0.5288 | 0.6857 | 0.7944 | 0.7595 | 0.8125 |
1032
+ | cosine_ndcg@100 | 0.7733 | 0.6235 | 0.5552 | 0.7379 | 0.7986 | 0.7657 | 0.8167 |
1033
+ | cosine_ndcg@150 | 0.7947 | 0.6557 | 0.5888 | 0.7577 | 0.8001 | 0.7682 | 0.8181 |
1034
+ | **cosine_ndcg@200** | **0.8039** | **0.6717** | **0.6092** | **0.7697** | **0.8012** | **0.7696** | **0.8187** |
1035
+ | cosine_mrr@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
1036
+ | cosine_mrr@20 | 0.8183 | 0.5581 | 0.5165 | 0.8159 | 0.7804 | 0.7324 | 0.8422 |
1037
+ | cosine_mrr@50 | 0.8183 | 0.5581 | 0.5168 | 0.8163 | 0.7813 | 0.7335 | 0.8425 |
1038
+ | cosine_mrr@100 | 0.8183 | 0.5581 | 0.5168 | 0.8163 | 0.7814 | 0.7338 | 0.8425 |
1039
+ | cosine_mrr@150 | 0.8183 | 0.5581 | 0.5168 | 0.8163 | 0.7814 | 0.7338 | 0.8425 |
1040
+ | cosine_mrr@200 | 0.8183 | 0.5581 | 0.5168 | 0.8163 | 0.7814 | 0.7338 | 0.8426 |
1041
+ | cosine_map@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
1042
+ | cosine_map@20 | 0.5566 | 0.4841 | 0.3984 | 0.5222 | 0.7071 | 0.6646 | 0.7007 |
1043
+ | cosine_map@50 | 0.5534 | 0.4304 | 0.3603 | 0.5083 | 0.7107 | 0.6684 | 0.7046 |
1044
+ | cosine_map@100 | 0.5852 | 0.4374 | 0.3632 | 0.5372 | 0.7113 | 0.6693 | 0.7054 |
1045
+ | cosine_map@150 | 0.5943 | 0.4527 | 0.3782 | 0.5454 | 0.7114 | 0.6695 | 0.7055 |
1046
+ | cosine_map@200 | 0.5976 | 0.4593 | 0.3863 | 0.5495 | 0.7115 | 0.6696 | 0.7056 |
1047
+ | cosine_map@500 | 0.6016 | 0.472 | 0.3992 | 0.5542 | 0.7116 | 0.6697 | 0.7057 |
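The recall@k and MRR@k columns above follow the usual definitions, averaged over queries by the evaluator. A toy per-query sketch (function names and data are illustrative):

```python
def recall_at_k(ranked: list, relevant: set, k: int) -> float:
    """Fraction of relevant items that appear in the top-k ranking."""
    return len(set(ranked[:k]) & relevant) / len(relevant)

def mrr_at_k(ranked: list, relevant: set, k: int) -> float:
    """Reciprocal rank of the first relevant item within the top k, else 0."""
    for i, item in enumerate(ranked[:k], start=1):
        if item in relevant:
            return 1.0 / i
    return 0.0

# Toy ranking for a single query
ranked = ["a", "b", "c", "d"]
relevant = {"b", "d"}
print(recall_at_k(ranked, relevant, 2))  # 0.5 — only "b" retrieved in top 2
print(mrr_at_k(ranked, relevant, 4))     # 0.5 — first relevant hit at rank 2
```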
1048
+
1049
+ <!--
1050
+ ## Bias, Risks and Limitations
1051
+
1052
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
1053
+ -->
1054
+
1055
+ <!--
1056
+ ### Recommendations
1057
+
1058
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
1059
+ -->
1060
+
1061
+ ## Training Details
1062
+
1063
+ ### Training Datasets
1064
+ <details><summary>full_en</summary>
1065
+
1066
+ #### full_en
1067
+
1068
+ * Dataset: full_en
1069
+ * Size: 28,880 training samples
1070
+ * Columns: <code>anchor</code> and <code>positive</code>
1071
+ * Approximate statistics based on the first 1000 samples:
1072
+ | | anchor | positive |
1073
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1074
+ | type | string | string |
1075
+ | details | <ul><li>min: 3 tokens</li><li>mean: 5.68 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.76 tokens</li><li>max: 12 tokens</li></ul> |
1076
+ * Samples:
1077
+ | anchor | positive |
1078
+ |:-----------------------------------------|:-----------------------------------------|
1079
+ | <code>air commodore</code> | <code>flight lieutenant</code> |
1080
+ | <code>command and control officer</code> | <code>flight officer</code> |
1081
+ | <code>air commodore</code> | <code>command and control officer</code> |
1082
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1083
+ ```json
1084
+ {'guide': SentenceTransformer(
1085
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1086
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1087
+ (2): Normalize()
1088
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1089
+ ```
1090
+ </details>
1091
+ <details><summary>full_de</summary>
1092
+
1093
+ #### full_de
1094
+
1095
+ * Dataset: full_de
1096
+ * Size: 23,023 training samples
1097
+ * Columns: <code>anchor</code> and <code>positive</code>
1098
+ * Approximate statistics based on the first 1000 samples:
1099
+ | | anchor | positive |
1100
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1101
+ | type | string | string |
1102
+ | details | <ul><li>min: 3 tokens</li><li>mean: 7.99 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.19 tokens</li><li>max: 30 tokens</li></ul> |
1103
+ * Samples:
1104
+ | anchor | positive |
1105
+ |:----------------------------------|:-----------------------------------------------------|
1106
+ | <code>Staffelkommandantin</code> | <code>Kommodore</code> |
1107
+ | <code>Luftwaffenoffizierin</code> | <code>Luftwaffenoffizier/Luftwaffenoffizierin</code> |
1108
+ | <code>Staffelkommandantin</code> | <code>Luftwaffenoffizierin</code> |
1109
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1110
+ ```json
1111
+ {'guide': SentenceTransformer(
1112
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1113
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1114
+ (2): Normalize()
1115
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1116
+ ```
1117
+ </details>
1118
+ <details><summary>full_es</summary>
1119
+
1120
+ #### full_es
1121
+
1122
+ * Dataset: full_es
1123
+ * Size: 20,724 training samples
1124
+ * Columns: <code>anchor</code> and <code>positive</code>
1125
+ * Approximate statistics based on the first 1000 samples:
1126
+ | | anchor | positive |
1127
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1128
+ | type | string | string |
1129
+ | details | <ul><li>min: 3 tokens</li><li>mean: 9.13 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.84 tokens</li><li>max: 32 tokens</li></ul> |
1130
+ * Samples:
1131
+ | anchor | positive |
1132
+ |:------------------------------------|:-------------------------------------------|
1133
+ | <code>jefe de escuadrón</code> | <code>instructor</code> |
1134
+ | <code>comandante de aeronave</code> | <code>instructor de simulador</code> |
1135
+ | <code>instructor</code> | <code>oficial del Ejército del Aire</code> |
1136
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1137
+ ```json
1138
+ {'guide': SentenceTransformer(
1139
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1140
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1141
+ (2): Normalize()
1142
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1143
+ ```
1144
+ </details>
1145
+ <details><summary>full_zh</summary>
1146
+
1147
+ #### full_zh
1148
+
1149
+ * Dataset: full_zh
1150
+ * Size: 30,401 training samples
1151
+ * Columns: <code>anchor</code> and <code>positive</code>
1152
+ * Approximate statistics based on the first 1000 samples:
1153
+ | | anchor | positive |
1154
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1155
+ | type | string | string |
1156
+ | details | <ul><li>min: 5 tokens</li><li>mean: 7.15 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 7.46 tokens</li><li>max: 21 tokens</li></ul> |
1157
+ * Samples:
1158
+ | anchor | positive |
1159
+ |:------------------|:---------------------|
1160
+ | <code>技术总监</code> | <code>技术和运营总监</code> |
1161
+ | <code>技术总监</code> | <code>技术主管</code> |
1162
+ | <code>技术总监</code> | <code>技术艺术总监</code> |
1163
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1164
+ ```json
1165
+ {'guide': SentenceTransformer(
1166
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1167
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1168
+ (2): Normalize()
1169
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1170
+ ```
1171
+ </details>
1172
+ <details><summary>mix</summary>
1173
+
1174
+ #### mix
1175
+
1176
+ * Dataset: mix
1177
+ * Size: 21,760 training samples
1178
+ * Columns: <code>anchor</code> and <code>positive</code>
1179
+ * Approximate statistics based on the first 1000 samples:
1180
+ | | anchor | positive |
1181
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1182
+ | type | string | string |
1183
+ | details | <ul><li>min: 2 tokens</li><li>mean: 6.71 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 7.69 tokens</li><li>max: 19 tokens</li></ul> |
1184
+ * Samples:
1185
+ | anchor | positive |
1186
+ |:------------------------------------------|:----------------------------------------------------------------|
1187
+ | <code>technical manager</code> | <code>Technischer Direktor für Bühne, Film und Fernsehen</code> |
1188
+ | <code>head of technical</code> | <code>directora técnica</code> |
1189
+ | <code>head of technical department</code> | <code>技术艺术总监</code> |
1190
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1191
+ ```json
1192
+ {'guide': SentenceTransformer(
1193
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1194
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1195
+ (2): Normalize()
1196
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1197
+ ```
1198
+ </details>
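All five datasets are trained with GISTEmbedLoss, which scores every in-batch negative with the small guide model shown in the configs and discards candidates the guide rates at least as similar to the anchor as the labeled positive (`margin_strategy: 'absolute'`, `margin: 0.0`). A rough numpy sketch of that filtering idea (names and numbers are illustrative, not the library's internals):

```python
import numpy as np

def gist_negative_mask(guide_ap: np.ndarray, guide_an: np.ndarray,
                       margin: float = 0.0) -> np.ndarray:
    """Boolean mask of in-batch negatives to KEEP, 'absolute' margin strategy.

    guide_ap: (batch,) guide-model similarity of each anchor to its own positive.
    guide_an: (batch, batch) guide-model similarity of each anchor to every
              in-batch candidate; candidates scoring above (positive - margin)
              are treated as likely false negatives and dropped.
    """
    keep = guide_an < (guide_ap[:, None] - margin)
    np.fill_diagonal(keep, False)  # an anchor's own positive is never a negative
    return keep

guide_ap = np.array([0.9, 0.7])
guide_an = np.array([[0.9, 0.95],   # candidate outscores the positive -> dropped
                     [0.3, 0.7]])
mask = gist_negative_mask(guide_ap, guide_an)
print(mask)  # [[False, False], [True, False]]
```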
1199
+
1200
+ ### Training Hyperparameters
1201
+ #### Non-Default Hyperparameters
1202
+
1203
+ - `eval_strategy`: steps
1204
+ - `per_device_train_batch_size`: 128
1205
+ - `per_device_eval_batch_size`: 128
1206
+ - `gradient_accumulation_steps`: 2
1207
+ - `num_train_epochs`: 5
1208
+ - `warmup_ratio`: 0.05
1209
+ - `log_on_each_node`: False
1210
+ - `fp16`: True
1211
+ - `dataloader_num_workers`: 4
1212
+ - `ddp_find_unused_parameters`: True
1213
+ - `batch_sampler`: no_duplicates
1214
+
1215
+ #### All Hyperparameters
1216
+ <details><summary>Click to expand</summary>
1217
+
1218
+ - `overwrite_output_dir`: False
1219
+ - `do_predict`: False
1220
+ - `eval_strategy`: steps
1221
+ - `prediction_loss_only`: True
1222
+ - `per_device_train_batch_size`: 128
1223
+ - `per_device_eval_batch_size`: 128
1224
+ - `per_gpu_train_batch_size`: None
1225
+ - `per_gpu_eval_batch_size`: None
1226
+ - `gradient_accumulation_steps`: 2
1227
+ - `eval_accumulation_steps`: None
1228
+ - `torch_empty_cache_steps`: None
1229
+ - `learning_rate`: 5e-05
1230
+ - `weight_decay`: 0.0
1231
+ - `adam_beta1`: 0.9
1232
+ - `adam_beta2`: 0.999
1233
+ - `adam_epsilon`: 1e-08
1234
+ - `max_grad_norm`: 1.0
1235
+ - `num_train_epochs`: 5
1236
+ - `max_steps`: -1
1237
+ - `lr_scheduler_type`: linear
1238
+ - `lr_scheduler_kwargs`: {}
1239
+ - `warmup_ratio`: 0.05
1240
+ - `warmup_steps`: 0
1241
+ - `log_level`: passive
1242
+ - `log_level_replica`: warning
1243
+ - `log_on_each_node`: False
1244
+ - `logging_nan_inf_filter`: True
1245
+ - `save_safetensors`: True
1246
+ - `save_on_each_node`: False
1247
+ - `save_only_model`: False
1248
+ - `restore_callback_states_from_checkpoint`: False
1249
+ - `no_cuda`: False
1250
+ - `use_cpu`: False
1251
+ - `use_mps_device`: False
1252
+ - `seed`: 42
1253
+ - `data_seed`: None
1254
+ - `jit_mode_eval`: False
1255
+ - `use_ipex`: False
1256
+ - `bf16`: False
1257
+ - `fp16`: True
1258
+ - `fp16_opt_level`: O1
1259
+ - `half_precision_backend`: auto
1260
+ - `bf16_full_eval`: False
1261
+ - `fp16_full_eval`: False
1262
+ - `tf32`: None
1263
+ - `local_rank`: 0
1264
+ - `ddp_backend`: None
1265
+ - `tpu_num_cores`: None
1266
+ - `tpu_metrics_debug`: False
1267
+ - `debug`: []
1268
+ - `dataloader_drop_last`: True
1269
+ - `dataloader_num_workers`: 4
1270
+ - `dataloader_prefetch_factor`: None
1271
+ - `past_index`: -1
1272
+ - `disable_tqdm`: False
1273
+ - `remove_unused_columns`: True
1274
+ - `label_names`: None
1275
+ - `load_best_model_at_end`: False
1276
+ - `ignore_data_skip`: False
1277
+ - `fsdp`: []
1278
+ - `fsdp_min_num_params`: 0
1279
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
1280
+ - `tp_size`: 0
1281
+ - `fsdp_transformer_layer_cls_to_wrap`: None
1282
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
1283
+ - `deepspeed`: None
1284
+ - `label_smoothing_factor`: 0.0
1285
+ - `optim`: adamw_torch
1286
+ - `optim_args`: None
1287
+ - `adafactor`: False
1288
+ - `group_by_length`: False
1289
+ - `length_column_name`: length
1290
+ - `ddp_find_unused_parameters`: True
1291
+ - `ddp_bucket_cap_mb`: None
1292
+ - `ddp_broadcast_buffers`: False
1293
+ - `dataloader_pin_memory`: True
1294
+ - `dataloader_persistent_workers`: False
1295
+ - `skip_memory_metrics`: True
1296
+ - `use_legacy_prediction_loop`: False
1297
+ - `push_to_hub`: False
1298
+ - `resume_from_checkpoint`: None
1299
+ - `hub_model_id`: None
1300
+ - `hub_strategy`: every_save
1301
+ - `hub_private_repo`: None
1302
+ - `hub_always_push`: False
1303
+ - `gradient_checkpointing`: False
1304
+ - `gradient_checkpointing_kwargs`: None
1305
+ - `include_inputs_for_metrics`: False
1306
+ - `include_for_metrics`: []
1307
+ - `eval_do_concat_batches`: True
1308
+ - `fp16_backend`: auto
1309
+ - `push_to_hub_model_id`: None
1310
+ - `push_to_hub_organization`: None
1311
+ - `mp_parameters`:
1312
+ - `auto_find_batch_size`: False
1313
+ - `full_determinism`: False
1314
+ - `torchdynamo`: None
1315
+ - `ray_scope`: last
1316
+ - `ddp_timeout`: 1800
1317
+ - `torch_compile`: False
1318
+ - `torch_compile_backend`: None
1319
+ - `torch_compile_mode`: None
1320
+ - `include_tokens_per_second`: False
1321
+ - `include_num_input_tokens_seen`: False
1322
+ - `neftune_noise_alpha`: None
1323
+ - `optim_target_modules`: None
1324
+ - `batch_eval_metrics`: False
1325
+ - `eval_on_start`: False
1326
+ - `use_liger_kernel`: False
1327
+ - `eval_use_gather_object`: False
1328
+ - `average_tokens_across_devices`: False
1329
+ - `prompts`: None
1330
+ - `batch_sampler`: no_duplicates
1331
+ - `multi_dataset_batch_sampler`: proportional
1332
+
1333
+ </details>
1334
+
1335
+ ### Training Logs
1336
+ | Epoch | Step | Training Loss | full_en_cosine_ndcg@200 | full_es_cosine_ndcg@200 | full_de_cosine_ndcg@200 | full_zh_cosine_ndcg@200 | mix_es_cosine_ndcg@200 | mix_de_cosine_ndcg@200 | mix_zh_cosine_ndcg@200 |
1337
+ |:------:|:----:|:-------------:|:-----------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|
1338
+ | -1 | -1 | - | 0.7447 | 0.6125 | 0.5378 | 0.7240 | 0.7029 | 0.6345 | 0.7437 |
1339
+ | 0.0082 | 1 | 4.3088 | - | - | - | - | - | - | - |
1340
+ | 0.8230 | 100 | 1.9026 | - | - | - | - | - | - | - |
1341
+ | 1.6502 | 200 | 0.9336 | 0.8024 | 0.6703 | 0.6109 | 0.7695 | 0.7914 | 0.7594 | 0.8136 |
1342
+ | 2.4774 | 300 | 0.161 | - | - | - | - | - | - | - |
1343
+ | 3.3045 | 400 | 0.1398 | 0.8039 | 0.6717 | 0.6092 | 0.7697 | 0.8012 | 0.7696 | 0.8187 |
1344
+
1345
+
1346
+ ### Framework Versions
1347
+ - Python: 3.11.11
1348
+ - Sentence Transformers: 4.1.0
1349
+ - Transformers: 4.51.2
1350
+ - PyTorch: 2.6.0+cu124
1351
+ - Accelerate: 1.6.0
1352
+ - Datasets: 3.5.0
1353
+ - Tokenizers: 0.21.1
1354
+
1355
+ ## Citation
1356
+
1357
+ ### BibTeX
1358
+
1359
+ #### Sentence Transformers
1360
+ ```bibtex
1361
+ @inproceedings{reimers-2019-sentence-bert,
1362
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1363
+ author = "Reimers, Nils and Gurevych, Iryna",
1364
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1365
+ month = "11",
1366
+ year = "2019",
1367
+ publisher = "Association for Computational Linguistics",
1368
+ url = "https://arxiv.org/abs/1908.10084",
1369
+ }
1370
+ ```
1371
+
1372
+ #### GISTEmbedLoss
1373
+ ```bibtex
1374
+ @misc{solatorio2024gistembed,
1375
+ title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
1376
+ author={Aivin V. Solatorio},
1377
+ year={2024},
1378
+ eprint={2402.16829},
1379
+ archivePrefix={arXiv},
1380
+ primaryClass={cs.LG}
1381
+ }
1382
+ ```
1383
+
1384
+ <!--
1385
+ ## Glossary
1386
+
1387
+ *Clearly define terms in order to be accessible across audiences.*
1388
+ -->
1389
+
1390
+ <!--
1391
+ ## Model Card Authors
1392
+
1393
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1394
+ -->
1395
+
1396
+ <!--
1397
+ ## Model Card Contact
1398
+
1399
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1400
+ -->
checkpoint-200/1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "word_embedding_dimension": 768,
3
+ "pooling_mode_cls_token": true,
4
+ "pooling_mode_mean_tokens": false,
5
+ "pooling_mode_max_tokens": false,
6
+ "pooling_mode_mean_sqrt_len_tokens": false,
7
+ "pooling_mode_weightedmean_tokens": false,
8
+ "pooling_mode_lasttoken": false,
9
+ "include_prompt": true
10
+ }
checkpoint-200/README.md ADDED
@@ -0,0 +1,1398 @@
1
+ ---
2
+ tags:
3
+ - sentence-transformers
4
+ - sentence-similarity
5
+ - feature-extraction
6
+ - generated_from_trainer
7
+ - dataset_size:124788
8
+ - loss:GISTEmbedLoss
9
+ base_model: Alibaba-NLP/gte-multilingual-base
10
+ widget:
11
+ - source_sentence: 其他机械、设备和有形货物租赁服务代表
12
+ sentences:
13
+ - 其他机械和设备租赁服务工作人员
14
+ - 电子和电信设备及零部件物流经理
15
+ - 工业主厨
16
+ - source_sentence: 公交车司机
17
+ sentences:
18
+ - 表演灯光设计师
19
+ - 乙烯基地板安装工
20
+ - 国际巴士司机
21
+ - source_sentence: online communication manager
22
+ sentences:
23
+ - trades union official
24
+ - social media manager
25
+ - budget manager
26
+ - source_sentence: Projektmanagerin
27
+ sentences:
28
+ - Projektmanager/Projektmanagerin
29
+ - Category-Manager
30
+ - Infanterist
31
+ - source_sentence: Volksvertreter
32
+ sentences:
33
+ - Parlamentarier
34
+ - Oberbürgermeister
35
+ - Konsul
36
+ pipeline_tag: sentence-similarity
37
+ library_name: sentence-transformers
38
+ metrics:
39
+ - cosine_accuracy@1
40
+ - cosine_accuracy@20
41
+ - cosine_accuracy@50
42
+ - cosine_accuracy@100
43
+ - cosine_accuracy@150
44
+ - cosine_accuracy@200
45
+ - cosine_precision@1
46
+ - cosine_precision@20
47
+ - cosine_precision@50
48
+ - cosine_precision@100
49
+ - cosine_precision@150
50
+ - cosine_precision@200
51
+ - cosine_recall@1
52
+ - cosine_recall@20
53
+ - cosine_recall@50
54
+ - cosine_recall@100
55
+ - cosine_recall@150
56
+ - cosine_recall@200
57
+ - cosine_ndcg@1
58
+ - cosine_ndcg@20
59
+ - cosine_ndcg@50
60
+ - cosine_ndcg@100
61
+ - cosine_ndcg@150
62
+ - cosine_ndcg@200
63
+ - cosine_mrr@1
64
+ - cosine_mrr@20
65
+ - cosine_mrr@50
66
+ - cosine_mrr@100
67
+ - cosine_mrr@150
68
+ - cosine_mrr@200
69
+ - cosine_map@1
70
+ - cosine_map@20
71
+ - cosine_map@50
72
+ - cosine_map@100
73
+ - cosine_map@150
74
+ - cosine_map@200
75
+ - cosine_map@500
76
+ model-index:
77
+ - name: SentenceTransformer based on Alibaba-NLP/gte-multilingual-base
78
+ results:
79
+ - task:
80
+ type: information-retrieval
81
+ name: Information Retrieval
82
+ dataset:
83
+ name: full en
84
+ type: full_en
85
+ metrics:
86
+ - type: cosine_accuracy@1
87
+ value: 0.6571428571428571
88
+ name: Cosine Accuracy@1
89
+ - type: cosine_accuracy@20
90
+ value: 0.9904761904761905
91
+ name: Cosine Accuracy@20
92
+ - type: cosine_accuracy@50
93
+ value: 0.9904761904761905
94
+ name: Cosine Accuracy@50
95
+ - type: cosine_accuracy@100
96
+ value: 0.9904761904761905
97
+ name: Cosine Accuracy@100
98
+ - type: cosine_accuracy@150
99
+ value: 0.9904761904761905
100
+ name: Cosine Accuracy@150
101
+ - type: cosine_accuracy@200
102
+ value: 0.9904761904761905
103
+ name: Cosine Accuracy@200
104
+ - type: cosine_precision@1
105
+ value: 0.6571428571428571
106
+ name: Cosine Precision@1
107
+ - type: cosine_precision@20
108
+ value: 0.5133333333333332
109
+ name: Cosine Precision@20
110
+ - type: cosine_precision@50
111
+ value: 0.3169523809523809
112
+ name: Cosine Precision@50
113
+ - type: cosine_precision@100
114
+ value: 0.18971428571428572
115
+ name: Cosine Precision@100
116
+ - type: cosine_precision@150
117
+ value: 0.13619047619047617
118
+ name: Cosine Precision@150
119
+ - type: cosine_precision@200
120
+ value: 0.10561904761904761
121
+ name: Cosine Precision@200
122
+ - type: cosine_recall@1
123
+ value: 0.06695957251887064
124
+ name: Cosine Recall@1
125
+ - type: cosine_recall@20
126
+ value: 0.5478306503729546
127
+ name: Cosine Recall@20
128
+ - type: cosine_recall@50
129
+ value: 0.7470276357469449
130
+ name: Cosine Recall@50
131
+ - type: cosine_recall@100
132
+ value: 0.8467073011936345
133
+ name: Cosine Recall@100
134
+ - type: cosine_recall@150
135
+ value: 0.9010846211520122
136
+ name: Cosine Recall@150
137
+ - type: cosine_recall@200
138
+ value: 0.9256595392715059
139
+ name: Cosine Recall@200
140
+ - type: cosine_ndcg@1
141
+ value: 0.6571428571428571
142
+ name: Cosine Ndcg@1
143
+ - type: cosine_ndcg@20
144
+ value: 0.6923506957704934
145
+ name: Cosine Ndcg@20
146
+ - type: cosine_ndcg@50
147
+ value: 0.7170311913169547
148
+ name: Cosine Ndcg@50
149
+ - type: cosine_ndcg@100
150
+ value: 0.7690946845916871
151
+ name: Cosine Ndcg@100
152
+ - type: cosine_ndcg@150
153
+ value: 0.7923061459636489
154
+ name: Cosine Ndcg@150
155
+ - type: cosine_ndcg@200
156
+ value: 0.8023952171736648
157
+ name: Cosine Ndcg@200
158
+ - type: cosine_mrr@1
159
+ value: 0.6571428571428571
160
+ name: Cosine Mrr@1
161
+ - type: cosine_mrr@20
162
+ value: 0.8111111111111111
163
+ name: Cosine Mrr@20
164
+ - type: cosine_mrr@50
165
+ value: 0.8111111111111111
166
+ name: Cosine Mrr@50
167
+ - type: cosine_mrr@100
168
+ value: 0.8111111111111111
169
+ name: Cosine Mrr@100
170
+ - type: cosine_mrr@150
171
+ value: 0.8111111111111111
172
+ name: Cosine Mrr@150
173
+ - type: cosine_mrr@200
174
+ value: 0.8111111111111111
175
+ name: Cosine Mrr@200
176
+ - type: cosine_map@1
177
+ value: 0.6571428571428571
178
+ name: Cosine Map@1
179
+ - type: cosine_map@20
180
+ value: 0.5516314386214587
181
+ name: Cosine Map@20
182
+ - type: cosine_map@50
183
+ value: 0.5474217433291914
184
+ name: Cosine Map@50
185
+ - type: cosine_map@100
186
+ value: 0.5799091076338031
187
+ name: Cosine Map@100
188
+ - type: cosine_map@150
189
+ value: 0.5895042547793764
190
+ name: Cosine Map@150
191
+ - type: cosine_map@200
192
+ value: 0.5930550248640567
193
+ name: Cosine Map@200
194
+ - type: cosine_map@500
195
+ value: 0.5967311945998978
196
+ name: Cosine Map@500
197
+ - task:
198
+ type: information-retrieval
199
+ name: Information Retrieval
200
+ dataset:
201
+ name: full es
202
+ type: full_es
203
+ metrics:
204
+ - type: cosine_accuracy@1
205
+ value: 0.11891891891891893
206
+ name: Cosine Accuracy@1
207
+ - type: cosine_accuracy@20
208
+ value: 1.0
209
+ name: Cosine Accuracy@20
210
+ - type: cosine_accuracy@50
211
+ value: 1.0
212
+ name: Cosine Accuracy@50
213
+ - type: cosine_accuracy@100
214
+ value: 1.0
215
+ name: Cosine Accuracy@100
216
+ - type: cosine_accuracy@150
217
+ value: 1.0
218
+ name: Cosine Accuracy@150
219
+ - type: cosine_accuracy@200
220
+ value: 1.0
221
+ name: Cosine Accuracy@200
222
+ - type: cosine_precision@1
223
+ value: 0.11891891891891893
224
+ name: Cosine Precision@1
225
+ - type: cosine_precision@20
226
+ value: 0.5767567567567567
227
+ name: Cosine Precision@20
228
+ - type: cosine_precision@50
229
+ value: 0.3907027027027027
230
+ name: Cosine Precision@50
231
+ - type: cosine_precision@100
232
+ value: 0.2541621621621622
233
+ name: Cosine Precision@100
234
+ - type: cosine_precision@150
235
+ value: 0.19225225225225226
236
+ name: Cosine Precision@150
237
+ - type: cosine_precision@200
238
+ value: 0.15264864864864866
239
+ name: Cosine Precision@200
240
+ - type: cosine_recall@1
241
+ value: 0.0035436931012884127
242
+ name: Cosine Recall@1
243
+ - type: cosine_recall@20
244
+ value: 0.3862419782331355
245
+ name: Cosine Recall@20
246
+ - type: cosine_recall@50
247
+ value: 0.5625768407738393
248
+ name: Cosine Recall@50
249
+ - type: cosine_recall@100
250
+ value: 0.6836436316189977
251
+ name: Cosine Recall@100
252
+ - type: cosine_recall@150
253
+ value: 0.7496865406970199
254
+ name: Cosine Recall@150
255
+ - type: cosine_recall@200
256
+ value: 0.7852629043380305
257
+ name: Cosine Recall@200
258
+ - type: cosine_ndcg@1
259
+ value: 0.11891891891891893
260
+ name: Cosine Ndcg@1
261
+ - type: cosine_ndcg@20
262
+ value: 0.6158554243812342
263
+ name: Cosine Ndcg@20
264
+ - type: cosine_ndcg@50
265
+ value: 0.5886857089260162
266
+ name: Cosine Ndcg@50
267
+ - type: cosine_ndcg@100
268
+ value: 0.6196114606257926
269
+ name: Cosine Ndcg@100
270
+ - type: cosine_ndcg@150
271
+ value: 0.6530674955405338
272
+ name: Cosine Ndcg@150
273
+ - type: cosine_ndcg@200
274
+ value: 0.670287400819268
275
+ name: Cosine Ndcg@200
276
+ - type: cosine_mrr@1
277
+ value: 0.11891891891891893
278
+ name: Cosine Mrr@1
279
+ - type: cosine_mrr@20
280
+ value: 0.5554054054054054
281
+ name: Cosine Mrr@20
282
+ - type: cosine_mrr@50
283
+ value: 0.5554054054054054
284
+ name: Cosine Mrr@50
285
+ - type: cosine_mrr@100
286
+ value: 0.5554054054054054
287
+ name: Cosine Mrr@100
288
+ - type: cosine_mrr@150
289
+ value: 0.5554054054054054
290
+ name: Cosine Mrr@150
291
+ - type: cosine_mrr@200
292
+ value: 0.5554054054054054
293
+ name: Cosine Mrr@200
294
+ - type: cosine_map@1
295
+ value: 0.11891891891891893
296
+ name: Cosine Map@1
297
+ - type: cosine_map@20
298
+ value: 0.4839539531842883
299
+ name: Cosine Map@20
300
+ - type: cosine_map@50
301
+ value: 0.4288206349412292
302
+ name: Cosine Map@50
303
+ - type: cosine_map@100
304
+ value: 0.43522297182400527
305
+ name: Cosine Map@100
306
+ - type: cosine_map@150
307
+ value: 0.4511056582755023
308
+ name: Cosine Map@150
309
+ - type: cosine_map@200
310
+ value: 0.45802493743471273
311
+ name: Cosine Map@200
312
+ - type: cosine_map@500
313
+ value: 0.47075604946048677
314
+ name: Cosine Map@500
315
+ - task:
316
+ type: information-retrieval
317
+ name: Information Retrieval
318
+ dataset:
319
+ name: full de
320
+ type: full_de
321
+ metrics:
322
+ - type: cosine_accuracy@1
323
+ value: 0.2955665024630542
324
+ name: Cosine Accuracy@1
325
+ - type: cosine_accuracy@20
326
+ value: 0.9852216748768473
327
+ name: Cosine Accuracy@20
328
+ - type: cosine_accuracy@50
329
+ value: 0.9852216748768473
330
+ name: Cosine Accuracy@50
331
+ - type: cosine_accuracy@100
332
+ value: 0.9901477832512315
333
+ name: Cosine Accuracy@100
334
+ - type: cosine_accuracy@150
335
+ value: 0.9901477832512315
336
+ name: Cosine Accuracy@150
337
+ - type: cosine_accuracy@200
338
+ value: 0.9901477832512315
339
+ name: Cosine Accuracy@200
340
+ - type: cosine_precision@1
341
+ value: 0.2955665024630542
342
+ name: Cosine Precision@1
343
+ - type: cosine_precision@20
344
+ value: 0.5073891625615764
345
+ name: Cosine Precision@20
346
+ - type: cosine_precision@50
347
+ value: 0.3681773399014779
348
+ name: Cosine Precision@50
349
+ - type: cosine_precision@100
350
+ value: 0.24177339901477832
351
+ name: Cosine Precision@100
352
+ - type: cosine_precision@150
353
+ value: 0.18187192118226603
354
+ name: Cosine Precision@150
355
+ - type: cosine_precision@200
356
+ value: 0.1470689655172414
357
+ name: Cosine Precision@200
358
+ - type: cosine_recall@1
359
+ value: 0.01108543831680986
360
+ name: Cosine Recall@1
361
+ - type: cosine_recall@20
362
+ value: 0.31887986522832873
363
+ name: Cosine Recall@20
364
+ - type: cosine_recall@50
365
+ value: 0.5004342335550164
366
+ name: Cosine Recall@50
367
+ - type: cosine_recall@100
368
+ value: 0.6244233924508789
369
+ name: Cosine Recall@100
370
+ - type: cosine_recall@150
371
+ value: 0.687486468465792
372
+ name: Cosine Recall@150
373
+ - type: cosine_recall@200
374
+ value: 0.7334854348170513
375
+ name: Cosine Recall@200
376
+ - type: cosine_ndcg@1
377
+ value: 0.2955665024630542
378
+ name: Cosine Ndcg@1
379
+ - type: cosine_ndcg@20
380
+ value: 0.5342874353496432
381
+ name: Cosine Ndcg@20
382
+ - type: cosine_ndcg@50
383
+ value: 0.5251712513704461
384
+ name: Cosine Ndcg@50
385
+ - type: cosine_ndcg@100
386
+ value: 0.5568512483442096
387
+ name: Cosine Ndcg@100
388
+ - type: cosine_ndcg@150
389
+ value: 0.5891923427833955
390
+ name: Cosine Ndcg@150
391
+ - type: cosine_ndcg@200
392
+ value: 0.6108910915140433
393
+ name: Cosine Ndcg@200
394
+ - type: cosine_mrr@1
395
+ value: 0.2955665024630542
396
+ name: Cosine Mrr@1
397
+ - type: cosine_mrr@20
398
+ value: 0.5140785596450613
399
+ name: Cosine Mrr@20
400
+ - type: cosine_mrr@50
401
+ value: 0.5140785596450613
402
+ name: Cosine Mrr@50
403
+ - type: cosine_mrr@100
404
+ value: 0.5141580130059386
405
+ name: Cosine Mrr@100
406
+ - type: cosine_mrr@150
407
+ value: 0.5141580130059386
408
+ name: Cosine Mrr@150
409
+ - type: cosine_mrr@200
410
+ value: 0.5141580130059386
411
+ name: Cosine Mrr@200
412
+ - type: cosine_map@1
413
+ value: 0.2955665024630542
414
+ name: Cosine Map@1
415
+ - type: cosine_map@20
416
+ value: 0.3952556219642319
417
+ name: Cosine Map@20
418
+ - type: cosine_map@50
419
+ value: 0.3565895386599598
420
+ name: Cosine Map@50
421
+ - type: cosine_map@100
422
+ value: 0.3618162919366333
423
+ name: Cosine Map@100
424
+ - type: cosine_map@150
425
+ value: 0.37673093284239206
426
+ name: Cosine Map@150
427
+ - type: cosine_map@200
428
+ value: 0.3850375691141728
429
+ name: Cosine Map@200
430
+ - type: cosine_map@500
431
+ value: 0.3976475131909832
432
+ name: Cosine Map@500
433
+ - task:
434
+ type: information-retrieval
435
+ name: Information Retrieval
436
+ dataset:
437
+ name: full zh
438
+ type: full_zh
439
+ metrics:
440
+ - type: cosine_accuracy@1
441
+ value: 0.6601941747572816
442
+ name: Cosine Accuracy@1
443
+ - type: cosine_accuracy@20
444
+ value: 0.9902912621359223
445
+ name: Cosine Accuracy@20
446
+ - type: cosine_accuracy@50
447
+ value: 0.9902912621359223
448
+ name: Cosine Accuracy@50
449
+ - type: cosine_accuracy@100
450
+ value: 0.9902912621359223
451
+ name: Cosine Accuracy@100
452
+ - type: cosine_accuracy@150
453
+ value: 0.9902912621359223
454
+ name: Cosine Accuracy@150
455
+ - type: cosine_accuracy@200
456
+ value: 0.9902912621359223
457
+ name: Cosine Accuracy@200
458
+ - type: cosine_precision@1
459
+ value: 0.6601941747572816
460
+ name: Cosine Precision@1
461
+ - type: cosine_precision@20
462
+ value: 0.4868932038834952
463
+ name: Cosine Precision@20
464
+ - type: cosine_precision@50
465
+ value: 0.2959223300970874
466
+ name: Cosine Precision@50
467
+ - type: cosine_precision@100
468
+ value: 0.17902912621359218
469
+ name: Cosine Precision@100
470
+ - type: cosine_precision@150
471
+ value: 0.12912621359223303
472
+ name: Cosine Precision@150
473
+ - type: cosine_precision@200
474
+ value: 0.10063106796116503
475
+ name: Cosine Precision@200
476
+ - type: cosine_recall@1
477
+ value: 0.06669332811942774
478
+ name: Cosine Recall@1
479
+ - type: cosine_recall@20
480
+ value: 0.52040897323663
481
+ name: Cosine Recall@20
482
+ - type: cosine_recall@50
483
+ value: 0.7067236634036261
484
+ name: Cosine Recall@50
485
+ - type: cosine_recall@100
486
+ value: 0.813864097315397
487
+ name: Cosine Recall@100
488
+ - type: cosine_recall@150
489
+ value: 0.8683619147921042
490
+ name: Cosine Recall@150
491
+ - type: cosine_recall@200
492
+ value: 0.8964210248615742
493
+ name: Cosine Recall@200
494
+ - type: cosine_ndcg@1
495
+ value: 0.6601941747572816
496
+ name: Cosine Ndcg@1
497
+ - type: cosine_ndcg@20
498
+ value: 0.6629898844211244
499
+ name: Cosine Ndcg@20
500
+ - type: cosine_ndcg@50
501
+ value: 0.682216395408567
502
+ name: Cosine Ndcg@50
503
+ - type: cosine_ndcg@100
504
+ value: 0.7344118850318737
505
+ name: Cosine Ndcg@100
506
+ - type: cosine_ndcg@150
507
+ value: 0.7580048379992059
508
+ name: Cosine Ndcg@150
509
+ - type: cosine_ndcg@200
510
+ value: 0.769464510105362
511
+ name: Cosine Ndcg@200
512
+ - type: cosine_mrr@1
513
+ value: 0.6601941747572816
514
+ name: Cosine Mrr@1
515
+ - type: cosine_mrr@20
516
+ value: 0.8068423485899215
517
+ name: Cosine Mrr@20
518
+ - type: cosine_mrr@50
519
+ value: 0.8068423485899215
520
+ name: Cosine Mrr@50
521
+ - type: cosine_mrr@100
522
+ value: 0.8068423485899215
523
+ name: Cosine Mrr@100
524
+ - type: cosine_mrr@150
525
+ value: 0.8068423485899215
526
+ name: Cosine Mrr@150
527
+ - type: cosine_mrr@200
528
+ value: 0.8068423485899215
529
+ name: Cosine Mrr@200
530
+ - type: cosine_map@1
531
+ value: 0.6601941747572816
532
+ name: Cosine Map@1
533
+ - type: cosine_map@20
534
+ value: 0.5176817014415404
535
+ name: Cosine Map@20
536
+ - type: cosine_map@50
537
+ value: 0.5050961591489588
538
+ name: Cosine Map@50
539
+ - type: cosine_map@100
540
+ value: 0.5346277197767966
541
+ name: Cosine Map@100
542
+ - type: cosine_map@150
543
+ value: 0.5441006347287816
544
+ name: Cosine Map@150
545
+ - type: cosine_map@200
546
+ value: 0.547804939644668
547
+ name: Cosine Map@200
548
+ - type: cosine_map@500
549
+ value: 0.5524877228701637
550
+ name: Cosine Map@500
551
+ - task:
552
+ type: information-retrieval
553
+ name: Information Retrieval
554
+ dataset:
555
+ name: mix es
556
+ type: mix_es
557
+ metrics:
558
+ - type: cosine_accuracy@1
559
+ value: 0.7009880395215808
560
+ name: Cosine Accuracy@1
561
+ - type: cosine_accuracy@20
562
+ value: 0.9474778991159646
563
+ name: Cosine Accuracy@20
564
+ - type: cosine_accuracy@50
565
+ value: 0.9776391055642226
566
+ name: Cosine Accuracy@50
567
+ - type: cosine_accuracy@100
568
+ value: 0.9885595423816953
569
+ name: Cosine Accuracy@100
570
+ - type: cosine_accuracy@150
571
+ value: 0.9921996879875195
572
+ name: Cosine Accuracy@150
573
+ - type: cosine_accuracy@200
574
+ value: 0.9932397295891836
575
+ name: Cosine Accuracy@200
576
+ - type: cosine_precision@1
577
+ value: 0.7009880395215808
578
+ name: Cosine Precision@1
579
+ - type: cosine_precision@20
580
+ value: 0.11968278731149247
581
+ name: Cosine Precision@20
582
+ - type: cosine_precision@50
583
+ value: 0.05085803432137287
584
+ name: Cosine Precision@50
585
+ - type: cosine_precision@100
586
+ value: 0.02598543941757671
587
+ name: Cosine Precision@100
588
+ - type: cosine_precision@150
589
+ value: 0.017493499739989597
590
+ name: Cosine Precision@150
591
+ - type: cosine_precision@200
592
+ value: 0.013198127925117008
593
+ name: Cosine Precision@200
594
+ - type: cosine_recall@1
595
+ value: 0.27067577941212884
596
+ name: Cosine Recall@1
597
+ - type: cosine_recall@20
598
+ value: 0.8850840700294678
599
+ name: Cosine Recall@20
600
+ - type: cosine_recall@50
601
+ value: 0.9390968972092216
602
+ name: Cosine Recall@50
603
+ - type: cosine_recall@100
604
+ value: 0.9599497313225862
605
+ name: Cosine Recall@100
606
+ - type: cosine_recall@150
607
+ value: 0.9695527821112844
608
+ name: Cosine Recall@150
609
+ - type: cosine_recall@200
610
+ value: 0.9758970358814353
611
+ name: Cosine Recall@200
612
+ - type: cosine_ndcg@1
613
+ value: 0.7009880395215808
614
+ name: Cosine Ndcg@1
615
+ - type: cosine_ndcg@20
616
+ value: 0.7690336236998598
617
+ name: Cosine Ndcg@20
618
+ - type: cosine_ndcg@50
619
+ value: 0.7838732562697655
620
+ name: Cosine Ndcg@50
621
+ - type: cosine_ndcg@100
622
+ value: 0.7884317468596705
623
+ name: Cosine Ndcg@100
624
+ - type: cosine_ndcg@150
625
+ value: 0.7902844804245556
626
+ name: Cosine Ndcg@150
627
+ - type: cosine_ndcg@200
628
+ value: 0.7913994944724545
629
+ name: Cosine Ndcg@200
630
+ - type: cosine_mrr@1
631
+ value: 0.7009880395215808
632
+ name: Cosine Mrr@1
633
+ - type: cosine_mrr@20
634
+ value: 0.7712491671812917
635
+ name: Cosine Mrr@20
636
+ - type: cosine_mrr@50
637
+ value: 0.7722842539435679
638
+ name: Cosine Mrr@50
639
+ - type: cosine_mrr@100
640
+ value: 0.7724347923967887
641
+ name: Cosine Mrr@100
642
+ - type: cosine_mrr@150
643
+ value: 0.7724644404043258
644
+ name: Cosine Mrr@150
645
+ - type: cosine_mrr@200
646
+ value: 0.7724705526191206
647
+ name: Cosine Mrr@200
648
+ - type: cosine_map@1
649
+ value: 0.7009880395215808
650
+ name: Cosine Map@1
651
+ - type: cosine_map@20
652
+ value: 0.6938173897965141
653
+ name: Cosine Map@20
654
+ - type: cosine_map@50
655
+ value: 0.6978248868009254
656
+ name: Cosine Map@50
657
+ - type: cosine_map@100
658
+ value: 0.6984889579958145
659
+ name: Cosine Map@100
660
+ - type: cosine_map@150
661
+ value: 0.6986621032108891
662
+ name: Cosine Map@150
663
+ - type: cosine_map@200
664
+ value: 0.6987465392575996
665
+ name: Cosine Map@200
666
+ - type: cosine_map@500
667
+ value: 0.6988876342368443
668
+ name: Cosine Map@500
669
+ - task:
670
+ type: information-retrieval
671
+ name: Information Retrieval
672
+ dataset:
673
+ name: mix de
674
+ type: mix_de
675
+ metrics:
676
+ - type: cosine_accuracy@1
677
+ value: 0.642225689027561
678
+ name: Cosine Accuracy@1
679
+ - type: cosine_accuracy@20
680
+ value: 0.9240769630785232
681
+ name: Cosine Accuracy@20
682
+ - type: cosine_accuracy@50
683
+ value: 0.9635985439417577
684
+ name: Cosine Accuracy@50
685
+ - type: cosine_accuracy@100
686
+ value: 0.9771190847633905
687
+ name: Cosine Accuracy@100
688
+ - type: cosine_accuracy@150
689
+ value: 0.984919396775871
690
+ name: Cosine Accuracy@150
691
+ - type: cosine_accuracy@200
692
+ value: 0.9901196047841914
693
+ name: Cosine Accuracy@200
694
+ - type: cosine_precision@1
695
+ value: 0.642225689027561
696
+ name: Cosine Precision@1
697
+ - type: cosine_precision@20
698
+ value: 0.11911076443057722
699
+ name: Cosine Precision@20
700
+ - type: cosine_precision@50
701
+ value: 0.05086843473738951
702
+ name: Cosine Precision@50
703
+ - type: cosine_precision@100
704
+ value: 0.0261622464898596
705
+ name: Cosine Precision@100
706
+ - type: cosine_precision@150
707
+ value: 0.01770844167100017
708
+ name: Cosine Precision@150
709
+ - type: cosine_precision@200
710
+ value: 0.013424336973478942
711
+ name: Cosine Precision@200
712
+ - type: cosine_recall@1
713
+ value: 0.2405616224648986
714
+ name: Cosine Recall@1
715
+ - type: cosine_recall@20
716
+ value: 0.8650459351707401
717
+ name: Cosine Recall@20
718
+ - type: cosine_recall@50
719
+ value: 0.9226295718495406
720
+ name: Cosine Recall@50
721
+ - type: cosine_recall@100
722
+ value: 0.9480412549835328
723
+ name: Cosine Recall@100
724
+ - type: cosine_recall@150
725
+ value: 0.9618651412723176
726
+ name: Cosine Recall@150
727
+ - type: cosine_recall@200
728
+ value: 0.9720922170220142
729
+ name: Cosine Recall@200
730
+ - type: cosine_ndcg@1
731
+ value: 0.642225689027561
732
+ name: Cosine Ndcg@1
733
+ - type: cosine_ndcg@20
734
+ value: 0.7332013199323174
735
+ name: Cosine Ndcg@20
736
+ - type: cosine_ndcg@50
737
+ value: 0.7490333180034867
738
+ name: Cosine Ndcg@50
739
+ - type: cosine_ndcg@100
740
+ value: 0.7547612967303503
741
+ name: Cosine Ndcg@100
742
+ - type: cosine_ndcg@150
743
+ value: 0.7575184392863841
744
+ name: Cosine Ndcg@150
745
+ - type: cosine_ndcg@200
746
+ value: 0.7593986816807992
747
+ name: Cosine Ndcg@200
748
+ - type: cosine_mrr@1
749
+ value: 0.642225689027561
750
+ name: Cosine Mrr@1
751
+ - type: cosine_mrr@20
752
+ value: 0.7246816496840639
753
+ name: Cosine Mrr@20
754
+ - type: cosine_mrr@50
755
+ value: 0.7260235454700952
756
+ name: Cosine Mrr@50
757
+ - type: cosine_mrr@100
758
+ value: 0.7262168772880452
759
+ name: Cosine Mrr@100
760
+ - type: cosine_mrr@150
761
+ value: 0.7262822017289415
762
+ name: Cosine Mrr@150
763
+ - type: cosine_mrr@200
764
+ value: 0.7263128860080087
765
+ name: Cosine Mrr@200
766
+ - type: cosine_map@1
767
+ value: 0.642225689027561
768
+ name: Cosine Map@1
769
+ - type: cosine_map@20
770
+ value: 0.6521189972338849
771
+ name: Cosine Map@20
772
+ - type: cosine_map@50
773
+ value: 0.6561813596290409
774
+ name: Cosine Map@50
775
+ - type: cosine_map@100
776
+ value: 0.6570111325791598
777
+ name: Cosine Map@100
778
+ - type: cosine_map@150
779
+ value: 0.6572712744212402
780
+ name: Cosine Map@150
781
+ - type: cosine_map@200
782
+ value: 0.6574012324541948
783
+ name: Cosine Map@200
784
+ - type: cosine_map@500
785
+ value: 0.6575399010277455
786
+ name: Cosine Map@500
787
+ - task:
788
+ type: information-retrieval
789
+ name: Information Retrieval
790
+ dataset:
791
+ name: mix zh
792
+ type: mix_zh
793
+ metrics:
794
+ - type: cosine_accuracy@1
795
+ value: 0.7713987473903967
796
+ name: Cosine Accuracy@1
797
+ - type: cosine_accuracy@20
798
+ value: 0.9806889352818372
799
+ name: Cosine Accuracy@20
800
+ - type: cosine_accuracy@50
801
+ value: 0.9916492693110647
802
+ name: Cosine Accuracy@50
803
+ - type: cosine_accuracy@100
804
+ value: 0.9947807933194155
805
+ name: Cosine Accuracy@100
806
+ - type: cosine_accuracy@150
807
+ value: 0.9963465553235908
808
+ name: Cosine Accuracy@150
809
+ - type: cosine_accuracy@200
810
+ value: 0.9973903966597077
811
+ name: Cosine Accuracy@200
812
+ - type: cosine_precision@1
813
+ value: 0.7713987473903967
814
+ name: Cosine Precision@1
815
+ - type: cosine_precision@20
816
+ value: 0.13656054279749477
817
+ name: Cosine Precision@20
818
+ - type: cosine_precision@50
819
+ value: 0.05762004175365346
820
+ name: Cosine Precision@50
821
+ - type: cosine_precision@100
822
+ value: 0.02944676409185805
823
+ name: Cosine Precision@100
824
+ - type: cosine_precision@150
825
+ value: 0.0197633959638135
826
+ name: Cosine Precision@150
827
+ - type: cosine_precision@200
828
+ value: 0.014877348643006268
829
+ name: Cosine Precision@200
830
+ - type: cosine_recall@1
831
+ value: 0.2585731683069888
832
+ name: Cosine Recall@1
833
+ - type: cosine_recall@20
834
+ value: 0.9014352818371607
835
+ name: Cosine Recall@20
836
+ - type: cosine_recall@50
837
+ value: 0.950347947112039
838
+ name: Cosine Recall@50
839
+ - type: cosine_recall@100
840
+ value: 0.9715031315240084
841
+ name: Cosine Recall@100
842
+ - type: cosine_recall@150
843
+ value: 0.9781576200417537
844
+ name: Cosine Recall@150
845
+ - type: cosine_recall@200
846
+ value: 0.9818110647181629
847
+ name: Cosine Recall@200
848
+ - type: cosine_ndcg@1
849
+ value: 0.7713987473903967
850
+ name: Cosine Ndcg@1
851
+ - type: cosine_ndcg@20
852
+ value: 0.7926986810043013
853
+ name: Cosine Ndcg@20
854
+ - type: cosine_ndcg@50
855
+ value: 0.8066848794942646
856
+ name: Cosine Ndcg@50
857
+ - type: cosine_ndcg@100
858
+ value: 0.8115576206060865
859
+ name: Cosine Ndcg@100
860
+ - type: cosine_ndcg@150
861
+ value: 0.8129087269558002
862
+ name: Cosine Ndcg@150
863
+ - type: cosine_ndcg@200
864
+ value: 0.8135973837485255
865
+ name: Cosine Ndcg@200
866
+ - type: cosine_mrr@1
867
+ value: 0.7713987473903967
868
+ name: Cosine Mrr@1
869
+ - type: cosine_mrr@20
870
+ value: 0.8432785828350186
871
+ name: Cosine Mrr@20
872
+ - type: cosine_mrr@50
873
+ value: 0.8436385628906108
874
+ name: Cosine Mrr@50
875
+ - type: cosine_mrr@100
876
+ value: 0.8436803457907981
877
+ name: Cosine Mrr@100
878
+ - type: cosine_mrr@150
879
+ value: 0.8436922193976949
880
+ name: Cosine Mrr@150
881
+ - type: cosine_mrr@200
882
+ value: 0.8436986631082636
883
+ name: Cosine Mrr@200
884
+ - type: cosine_map@1
885
+ value: 0.7713987473903967
886
+ name: Cosine Map@1
887
+ - type: cosine_map@20
888
+ value: 0.6929147114308428
889
+ name: Cosine Map@20
890
+ - type: cosine_map@50
891
+ value: 0.6972607407491801
892
+ name: Cosine Map@50
893
+ - type: cosine_map@100
894
+ value: 0.6981100717727863
895
+ name: Cosine Map@100
896
+ - type: cosine_map@150
897
+ value: 0.6982601227257159
898
+ name: Cosine Map@150
899
+ - type: cosine_map@200
900
+ value: 0.6983171494463136
901
+ name: Cosine Map@200
902
+ - type: cosine_map@500
903
+ value: 0.6984116893552017
904
+ name: Cosine Map@500
905
---

# SentenceTransformer based on Alibaba-NLP/gte-multilingual-base

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-multilingual-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) on the full_en, full_de, full_es, full_zh and mix datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-multilingual-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) <!-- at revision 9fdd4ee8bba0e2808a34e0e739576f6740d2b225 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
    - full_en
    - full_de
    - full_es
    - full_zh
    - mix
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
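Because the final `Normalize()` module rescales every embedding to unit length, the dot product of two embeddings equals their cosine similarity. A minimal NumPy sketch of that equivalence, using small hypothetical vectors in place of the model's 768-dimensional outputs:

```python
import numpy as np

# Hypothetical raw embeddings standing in for transformer outputs;
# the actual model produces 768-dimensional vectors.
raw = np.array([[3.0, 4.0], [1.0, 0.0]])

# The Normalize() module rescales each embedding to unit length.
unit = raw / np.linalg.norm(raw, axis=1, keepdims=True)

# For unit-length vectors, the plain dot product is the cosine similarity.
dot = unit @ unit.T
cos = (raw @ raw.T) / (
    np.linalg.norm(raw, axis=1, keepdims=True) * np.linalg.norm(raw, axis=1)
)
print(np.allclose(dot, cos))  # True
```

This is why the card lists "Cosine Similarity" as the similarity function: after normalization, cosine scores can be computed with a cheap matrix product.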

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'Volksvertreter',
    'Parlamentarier',
    'Oberbürgermeister',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
+
976
+ <!--
977
+ ### Direct Usage (Transformers)
978
+
979
+ <details><summary>Click to see the direct usage in Transformers</summary>
980
+
981
+ </details>
982
+ -->
983
+
984
+ <!--
985
+ ### Downstream Usage (Sentence Transformers)
986
+
987
+ You can finetune this model on your own dataset.
988
+
989
+ <details><summary>Click to expand</summary>
990
+
991
+ </details>
992
+ -->
993
+
994
+ <!--
995
+ ### Out-of-Scope Use
996
+
997
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
998
+ -->
999
+
1000
## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `full_en`, `full_es`, `full_de`, `full_zh`, `mix_es`, `mix_de` and `mix_zh`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric               | full_en    | full_es    | full_de    | full_zh    | mix_es     | mix_de     | mix_zh     |
|:---------------------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1    | 0.6571     | 0.1189     | 0.2956     | 0.6602     | 0.701      | 0.6422     | 0.7714     |
| cosine_accuracy@20   | 0.9905     | 1.0        | 0.9852     | 0.9903     | 0.9475     | 0.9241     | 0.9807     |
| cosine_accuracy@50   | 0.9905     | 1.0        | 0.9852     | 0.9903     | 0.9776     | 0.9636     | 0.9916     |
| cosine_accuracy@100  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9886     | 0.9771     | 0.9948     |
| cosine_accuracy@150  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9922     | 0.9849     | 0.9963     |
| cosine_accuracy@200  | 0.9905     | 1.0        | 0.9901     | 0.9903     | 0.9932     | 0.9901     | 0.9974     |
| cosine_precision@1   | 0.6571     | 0.1189     | 0.2956     | 0.6602     | 0.701      | 0.6422     | 0.7714     |
| cosine_precision@20  | 0.5133     | 0.5768     | 0.5074     | 0.4869     | 0.1197     | 0.1191     | 0.1366     |
| cosine_precision@50  | 0.317      | 0.3907     | 0.3682     | 0.2959     | 0.0509     | 0.0509     | 0.0576     |
| cosine_precision@100 | 0.1897     | 0.2542     | 0.2418     | 0.179      | 0.026      | 0.0262     | 0.0294     |
| cosine_precision@150 | 0.1362     | 0.1923     | 0.1819     | 0.1291     | 0.0175     | 0.0177     | 0.0198     |
| cosine_precision@200 | 0.1056     | 0.1526     | 0.1471     | 0.1006     | 0.0132     | 0.0134     | 0.0149     |
| cosine_recall@1      | 0.067      | 0.0035     | 0.0111     | 0.0667     | 0.2707     | 0.2406     | 0.2586     |
| cosine_recall@20     | 0.5478     | 0.3862     | 0.3189     | 0.5204     | 0.8851     | 0.865      | 0.9014     |
| cosine_recall@50     | 0.747      | 0.5626     | 0.5004     | 0.7067     | 0.9391     | 0.9226     | 0.9503     |
| cosine_recall@100    | 0.8467     | 0.6836     | 0.6244     | 0.8139     | 0.9599     | 0.948      | 0.9715     |
| cosine_recall@150    | 0.9011     | 0.7497     | 0.6875     | 0.8684     | 0.9696     | 0.9619     | 0.9782     |
| cosine_recall@200    | 0.9257     | 0.7853     | 0.7335     | 0.8964     | 0.9759     | 0.9721     | 0.9818     |
| cosine_ndcg@1        | 0.6571     | 0.1189     | 0.2956     | 0.6602     | 0.701      | 0.6422     | 0.7714     |
| cosine_ndcg@20       | 0.6924     | 0.6159     | 0.5343     | 0.663      | 0.769      | 0.7332     | 0.7927     |
| cosine_ndcg@50       | 0.717      | 0.5887     | 0.5252     | 0.6822     | 0.7839     | 0.749      | 0.8067     |
| cosine_ndcg@100      | 0.7691     | 0.6196     | 0.5569     | 0.7344     | 0.7884     | 0.7548     | 0.8116     |
| cosine_ndcg@150      | 0.7923     | 0.6531     | 0.5892     | 0.758      | 0.7903     | 0.7575     | 0.8129     |
| **cosine_ndcg@200**  | **0.8024** | **0.6703** | **0.6109** | **0.7695** | **0.7914** | **0.7594** | **0.8136** |
| cosine_mrr@1         | 0.6571     | 0.1189     | 0.2956     | 0.6602     | 0.701      | 0.6422     | 0.7714     |
| cosine_mrr@20        | 0.8111     | 0.5554     | 0.5141     | 0.8068     | 0.7712     | 0.7247     | 0.8433     |
| cosine_mrr@50        | 0.8111     | 0.5554     | 0.5141     | 0.8068     | 0.7723     | 0.726      | 0.8436     |
| cosine_mrr@100       | 0.8111     | 0.5554     | 0.5142     | 0.8068     | 0.7724     | 0.7262     | 0.8437     |
| cosine_mrr@150       | 0.8111     | 0.5554     | 0.5142     | 0.8068     | 0.7725     | 0.7263     | 0.8437     |
| cosine_mrr@200       | 0.8111     | 0.5554     | 0.5142     | 0.8068     | 0.7725     | 0.7263     | 0.8437     |
| cosine_map@1         | 0.6571     | 0.1189     | 0.2956     | 0.6602     | 0.701      | 0.6422     | 0.7714     |
| cosine_map@20        | 0.5516     | 0.484      | 0.3953     | 0.5177     | 0.6938     | 0.6521     | 0.6929     |
| cosine_map@50        | 0.5474     | 0.4288     | 0.3566     | 0.5051     | 0.6978     | 0.6562     | 0.6973     |
| cosine_map@100       | 0.5799     | 0.4352     | 0.3618     | 0.5346     | 0.6985     | 0.657      | 0.6981     |
| cosine_map@150       | 0.5895     | 0.4511     | 0.3767     | 0.5441     | 0.6987     | 0.6573     | 0.6983     |
| cosine_map@200       | 0.5931     | 0.458      | 0.385      | 0.5478     | 0.6987     | 0.6574     | 0.6983     |
| cosine_map@500       | 0.5967     | 0.4708     | 0.3976     | 0.5525     | 0.6989     | 0.6575     | 0.6984     |
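The Accuracy@k, Recall@k and MRR@k figures in the table follow the standard information-retrieval definitions used by the evaluator. A minimal sketch of how these three are computed, on toy ranked lists rather than the card's actual evaluation corpora:

```python
import numpy as np

# Toy retrieval run: for each query, a ranked list of doc ids plus the
# set of relevant ids. Illustrative data only, not the card's corpora.
ranked = {
    "q1": ["d1", "d3", "d2", "d5"],
    "q2": ["d4", "d1", "d6", "d2"],
}
relevant = {"q1": {"d3", "d2"}, "q2": {"d9"}}

def accuracy_at_k(k):
    # 1 if any relevant doc appears in the top-k, averaged over queries.
    return np.mean([
        any(d in relevant[q] for d in docs[:k]) for q, docs in ranked.items()
    ])

def recall_at_k(k):
    # Fraction of each query's relevant docs found in the top-k.
    return np.mean([
        len(relevant[q] & set(docs[:k])) / len(relevant[q])
        for q, docs in ranked.items()
    ])

def mrr_at_k(k):
    # Reciprocal rank of the first relevant doc within the top-k, else 0.
    out = []
    for q, docs in ranked.items():
        rr = 0.0
        for rank, d in enumerate(docs[:k], start=1):
            if d in relevant[q]:
                rr = 1.0 / rank
                break
        out.append(rr)
    return np.mean(out)

print(accuracy_at_k(2), recall_at_k(2), mrr_at_k(2))  # 0.5 0.25 0.25
```

This also explains why MRR@k plateaus across the table's cutoffs: once the first relevant document sits inside the top 20, raising k cannot change the reciprocal rank.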

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Datasets
<details><summary>full_en</summary>

#### full_en

* Dataset: full_en
* Size: 28,880 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                           | positive                                                                         |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           |
  | details | <ul><li>min: 3 tokens</li><li>mean: 5.68 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.76 tokens</li><li>max: 12 tokens</li></ul> |
* Samples:
  | anchor                                   | positive                                 |
  |:-----------------------------------------|:-----------------------------------------|
  | <code>air commodore</code>               | <code>flight lieutenant</code>           |
  | <code>command and control officer</code> | <code>flight officer</code>              |
  | <code>air commodore</code>               | <code>command and control officer</code> |
* Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
  ```
</details>
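GISTEmbedLoss uses the small guide model listed in the parameters above to filter false negatives out of the in-batch negatives: with `margin_strategy: 'absolute'` and `margin: 0.0`, any in-batch candidate the guide scores above the labeled positive (minus the margin) is excluded from the contrastive denominator. A minimal NumPy sketch of that masking idea, with made-up guide scores rather than real model outputs:

```python
import numpy as np

# Cosine similarities from the *guide* model for a batch of 3 anchors
# against 3 in-batch candidates (diagonal = labeled positives).
# Illustrative numbers, not real guide-model scores.
guide_sim = np.array([
    [0.9, 0.2, 0.95],   # candidate 2 looks like a false negative for anchor 0
    [0.1, 0.8, 0.3],
    [0.4, 0.2, 0.7],
])

positive_sim = np.diag(guide_sim)  # guide score of each labeled positive

# margin_strategy='absolute', margin=0.0: mask any candidate the guide
# rates above (positive score - margin), never the positive itself.
margin = 0.0
mask = guide_sim > (positive_sim[:, None] - margin)
np.fill_diagonal(mask, False)  # always keep the positives

# In the actual loss, masked candidates have their logits suppressed
# before the softmax, so they never act as negatives.
print(mask)
```

Only the suspiciously similar candidate (row 0, column 2) ends up masked here; the low temperature of 0.01 then sharpens the contrast among the surviving negatives.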
+ <details><summary>full_de</summary>
1092
+
1093
+ #### full_de
1094
+
1095
+ * Dataset: full_de
1096
+ * Size: 23,023 training samples
1097
+ * Columns: <code>anchor</code> and <code>positive</code>
1098
+ * Approximate statistics based on the first 1000 samples:
1099
+ | | anchor | positive |
1100
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1101
+ | type | string | string |
1102
+ | details | <ul><li>min: 3 tokens</li><li>mean: 7.99 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.19 tokens</li><li>max: 30 tokens</li></ul> |
1103
+ * Samples:
1104
+ | anchor | positive |
1105
+ |:----------------------------------|:-----------------------------------------------------|
1106
+ | <code>Staffelkommandantin</code> | <code>Kommodore</code> |
1107
+ | <code>Luftwaffenoffizierin</code> | <code>Luftwaffenoffizier/Luftwaffenoffizierin</code> |
1108
+ | <code>Staffelkommandantin</code> | <code>Luftwaffenoffizierin</code> |
1109
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1110
+ ```python
1111
+ {'guide': SentenceTransformer(
1112
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1113
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1114
+ (2): Normalize()
1115
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1116
+ ```
1117
+ </details>
1118
+ <details><summary>full_es</summary>
1119
+
1120
+ #### full_es
1121
+
1122
+ * Dataset: full_es
1123
+ * Size: 20,724 training samples
1124
+ * Columns: <code>anchor</code> and <code>positive</code>
1125
+ * Approximate statistics based on the first 1000 samples:
1126
+ | | anchor | positive |
1127
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1128
+ | type | string | string |
1129
+ | details | <ul><li>min: 3 tokens</li><li>mean: 9.13 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.84 tokens</li><li>max: 32 tokens</li></ul> |
1130
+ * Samples:
1131
+ | anchor | positive |
1132
+ |:------------------------------------|:-------------------------------------------|
1133
+ | <code>jefe de escuadrón</code> | <code>instructor</code> |
1134
+ | <code>comandante de aeronave</code> | <code>instructor de simulador</code> |
1135
+ | <code>instructor</code> | <code>oficial del Ejército del Aire</code> |
1136
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1137
+ ```python
1138
+ {'guide': SentenceTransformer(
1139
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1140
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1141
+ (2): Normalize()
1142
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1143
+ ```
1144
+ </details>
1145
+ <details><summary>full_zh</summary>
1146
+
1147
+ #### full_zh
1148
+
1149
+ * Dataset: full_zh
1150
+ * Size: 30,401 training samples
1151
+ * Columns: <code>anchor</code> and <code>positive</code>
1152
+ * Approximate statistics based on the first 1000 samples:
1153
+ | | anchor | positive |
1154
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1155
+ | type | string | string |
1156
+ | details | <ul><li>min: 5 tokens</li><li>mean: 7.15 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 7.46 tokens</li><li>max: 21 tokens</li></ul> |
1157
+ * Samples:
1158
+ | anchor | positive |
1159
+ |:------------------|:---------------------|
1160
+ | <code>技术总监</code> | <code>技术和运营总监</code> |
1161
+ | <code>技术总监</code> | <code>技术主管</code> |
1162
+ | <code>技术总监</code> | <code>技术艺术总监</code> |
1163
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1164
+ ```python
1165
+ {'guide': SentenceTransformer(
1166
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1167
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1168
+ (2): Normalize()
1169
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1170
+ ```
1171
+ </details>
1172
+ <details><summary>mix</summary>
1173
+
1174
+ #### mix
1175
+
1176
+ * Dataset: mix
1177
+ * Size: 21,760 training samples
1178
+ * Columns: <code>anchor</code> and <code>positive</code>
1179
+ * Approximate statistics based on the first 1000 samples:
1180
+ | | anchor | positive |
1181
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1182
+ | type | string | string |
1183
+ | details | <ul><li>min: 2 tokens</li><li>mean: 6.71 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 7.69 tokens</li><li>max: 19 tokens</li></ul> |
1184
+ * Samples:
1185
+ | anchor | positive |
1186
+ |:------------------------------------------|:----------------------------------------------------------------|
1187
+ | <code>technical manager</code> | <code>Technischer Direktor für Bühne, Film und Fernsehen</code> |
1188
+ | <code>head of technical</code> | <code>directora técnica</code> |
1189
+ | <code>head of technical department</code> | <code>技术艺术总监</code> |
1190
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1191
+ ```python
1192
+ {'guide': SentenceTransformer(
1193
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1194
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1195
+ (2): Normalize()
1196
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1197
+ ```
1198
+ </details>
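Each dataset above uses the same `GISTEmbedLoss` configuration: a mean-pooling guide model, temperature 0.01, and an `absolute` margin strategy with margin 0.0. The guide's role can be sketched in plain NumPy (toy similarity values below are illustrative, not taken from training): an in-batch candidate is dropped as a negative whenever the guide scores it more similar to the anchor than the anchor's own positive, and the surviving similarities are scaled by the temperature before the contrastive softmax.

```python
import numpy as np

def gist_keep_mask(guide_ap, guide_an, margin=0.0):
    """Sketch of GIST-style guided negative filtering ('absolute' strategy).

    guide_ap: (B,) guide similarity of each anchor to its own positive.
    guide_an: (B, B) guide similarity of each anchor to every in-batch candidate.
    Candidate j is masked out for anchor i when guide_an[i, j] exceeds
    guide_ap[i] - margin; the diagonal (true positive) is always kept.
    Returns a boolean (B, B) matrix: True = keep as negative, False = mask.
    """
    keep = guide_an <= (guide_ap[:, None] - margin)
    idx = np.arange(len(guide_ap))
    keep[idx, idx] = True  # never mask the anchor's own positive
    return keep

# Toy batch of 2: anchor 0's positive scores 0.9 under the guide, but
# candidate 1 scores 0.95 -- a probable false negative, so it is masked.
guide_ap = np.array([0.9, 0.8])
guide_an = np.array([[0.90, 0.95],
                     [0.30, 0.80]])
keep = gist_keep_mask(guide_ap, guide_an, margin=0.0)
```

With margin 0.0 only candidates that outscore the true positive are masked; a larger margin would also mask candidates that merely come close to the positive's guide score.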
1199
+
1200
+ ### Training Hyperparameters
1201
+ #### Non-Default Hyperparameters
1202
+
1203
+ - `eval_strategy`: steps
1204
+ - `per_device_train_batch_size`: 128
1205
+ - `per_device_eval_batch_size`: 128
1206
+ - `gradient_accumulation_steps`: 2
1207
+ - `num_train_epochs`: 5
1208
+ - `warmup_ratio`: 0.05
1209
+ - `log_on_each_node`: False
1210
+ - `fp16`: True
1211
+ - `dataloader_num_workers`: 4
1212
+ - `ddp_find_unused_parameters`: True
1213
+ - `batch_sampler`: no_duplicates
1214
+
1215
+ #### All Hyperparameters
1216
+ <details><summary>Click to expand</summary>
1217
+
1218
+ - `overwrite_output_dir`: False
1219
+ - `do_predict`: False
1220
+ - `eval_strategy`: steps
1221
+ - `prediction_loss_only`: True
1222
+ - `per_device_train_batch_size`: 128
1223
+ - `per_device_eval_batch_size`: 128
1224
+ - `per_gpu_train_batch_size`: None
1225
+ - `per_gpu_eval_batch_size`: None
1226
+ - `gradient_accumulation_steps`: 2
1227
+ - `eval_accumulation_steps`: None
1228
+ - `torch_empty_cache_steps`: None
1229
+ - `learning_rate`: 5e-05
1230
+ - `weight_decay`: 0.0
1231
+ - `adam_beta1`: 0.9
1232
+ - `adam_beta2`: 0.999
1233
+ - `adam_epsilon`: 1e-08
1234
+ - `max_grad_norm`: 1.0
1235
+ - `num_train_epochs`: 5
1236
+ - `max_steps`: -1
1237
+ - `lr_scheduler_type`: linear
1238
+ - `lr_scheduler_kwargs`: {}
1239
+ - `warmup_ratio`: 0.05
1240
+ - `warmup_steps`: 0
1241
+ - `log_level`: passive
1242
+ - `log_level_replica`: warning
1243
+ - `log_on_each_node`: False
1244
+ - `logging_nan_inf_filter`: True
1245
+ - `save_safetensors`: True
1246
+ - `save_on_each_node`: False
1247
+ - `save_only_model`: False
1248
+ - `restore_callback_states_from_checkpoint`: False
1249
+ - `no_cuda`: False
1250
+ - `use_cpu`: False
1251
+ - `use_mps_device`: False
1252
+ - `seed`: 42
1253
+ - `data_seed`: None
1254
+ - `jit_mode_eval`: False
1255
+ - `use_ipex`: False
1256
+ - `bf16`: False
1257
+ - `fp16`: True
1258
+ - `fp16_opt_level`: O1
1259
+ - `half_precision_backend`: auto
1260
+ - `bf16_full_eval`: False
1261
+ - `fp16_full_eval`: False
1262
+ - `tf32`: None
1263
+ - `local_rank`: 0
1264
+ - `ddp_backend`: None
1265
+ - `tpu_num_cores`: None
1266
+ - `tpu_metrics_debug`: False
1267
+ - `debug`: []
1268
+ - `dataloader_drop_last`: True
1269
+ - `dataloader_num_workers`: 4
1270
+ - `dataloader_prefetch_factor`: None
1271
+ - `past_index`: -1
1272
+ - `disable_tqdm`: False
1273
+ - `remove_unused_columns`: True
1274
+ - `label_names`: None
1275
+ - `load_best_model_at_end`: False
1276
+ - `ignore_data_skip`: False
1277
+ - `fsdp`: []
1278
+ - `fsdp_min_num_params`: 0
1279
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
1280
+ - `tp_size`: 0
1281
+ - `fsdp_transformer_layer_cls_to_wrap`: None
1282
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
1283
+ - `deepspeed`: None
1284
+ - `label_smoothing_factor`: 0.0
1285
+ - `optim`: adamw_torch
1286
+ - `optim_args`: None
1287
+ - `adafactor`: False
1288
+ - `group_by_length`: False
1289
+ - `length_column_name`: length
1290
+ - `ddp_find_unused_parameters`: True
1291
+ - `ddp_bucket_cap_mb`: None
1292
+ - `ddp_broadcast_buffers`: False
1293
+ - `dataloader_pin_memory`: True
1294
+ - `dataloader_persistent_workers`: False
1295
+ - `skip_memory_metrics`: True
1296
+ - `use_legacy_prediction_loop`: False
1297
+ - `push_to_hub`: False
1298
+ - `resume_from_checkpoint`: None
1299
+ - `hub_model_id`: None
1300
+ - `hub_strategy`: every_save
1301
+ - `hub_private_repo`: None
1302
+ - `hub_always_push`: False
1303
+ - `gradient_checkpointing`: False
1304
+ - `gradient_checkpointing_kwargs`: None
1305
+ - `include_inputs_for_metrics`: False
1306
+ - `include_for_metrics`: []
1307
+ - `eval_do_concat_batches`: True
1308
+ - `fp16_backend`: auto
1309
+ - `push_to_hub_model_id`: None
1310
+ - `push_to_hub_organization`: None
1311
+ - `mp_parameters`:
1312
+ - `auto_find_batch_size`: False
1313
+ - `full_determinism`: False
1314
+ - `torchdynamo`: None
1315
+ - `ray_scope`: last
1316
+ - `ddp_timeout`: 1800
1317
+ - `torch_compile`: False
1318
+ - `torch_compile_backend`: None
1319
+ - `torch_compile_mode`: None
1320
+ - `include_tokens_per_second`: False
1321
+ - `include_num_input_tokens_seen`: False
1322
+ - `neftune_noise_alpha`: None
1323
+ - `optim_target_modules`: None
1324
+ - `batch_eval_metrics`: False
1325
+ - `eval_on_start`: False
1326
+ - `use_liger_kernel`: False
1327
+ - `eval_use_gather_object`: False
1328
+ - `average_tokens_across_devices`: False
1329
+ - `prompts`: None
1330
+ - `batch_sampler`: no_duplicates
1331
+ - `multi_dataset_batch_sampler`: proportional
1332
+
1333
+ </details>
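The five training datasets total 28,880 + 23,023 + 20,724 + 30,401 + 21,760 = 124,788 pairs, and the hyperparameters above (per-device batch 128, gradient accumulation 2, `dataloader_drop_last: True`) fix the effective batch size and optimizer steps per epoch once the device count is known. A small sketch, assuming a world size of 4 GPUs (an inference from the ~121.5 steps per epoch implied by the training-log epochs below; the actual device count is not stated in this card):

```python
def effective_batch(per_device_batch, grad_accum, world_size):
    """Number of pairs contributing to each optimizer update."""
    return per_device_batch * grad_accum * world_size

def optimizer_steps_per_epoch(num_samples, per_device_batch, grad_accum, world_size):
    """Dataloader drops the last partial batch (drop_last=True)."""
    batches = num_samples // (per_device_batch * world_size)
    return batches // grad_accum

# The five datasets listed above.
total_pairs = 28_880 + 23_023 + 20_724 + 30_401 + 21_760

eb = effective_batch(128, 2, 4)                                 # assumed world_size=4
spe = optimizer_steps_per_epoch(total_pairs, 128, 2, 4)
```

Under that assumption each update sees 1,024 pairs and an epoch takes roughly 121 optimizer steps.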
1334
+
1335
+ ### Training Logs
1336
+ | Epoch | Step | Training Loss | full_en_cosine_ndcg@200 | full_es_cosine_ndcg@200 | full_de_cosine_ndcg@200 | full_zh_cosine_ndcg@200 | mix_es_cosine_ndcg@200 | mix_de_cosine_ndcg@200 | mix_zh_cosine_ndcg@200 |
1337
+ |:------:|:----:|:-------------:|:-----------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|
1338
+ | -1 | -1 | - | 0.7447 | 0.6125 | 0.5378 | 0.7240 | 0.7029 | 0.6345 | 0.7437 |
1339
+ | 0.0082 | 1 | 4.3088 | - | - | - | - | - | - | - |
1340
+ | 0.8230 | 100 | 1.9026 | - | - | - | - | - | - | - |
1341
+ | 1.6502 | 200 | 0.9336 | 0.8024 | 0.6703 | 0.6109 | 0.7695 | 0.7914 | 0.7594 | 0.8136 |
1342
+
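The evaluation columns report `cosine_ndcg@200`: normalized discounted cumulative gain over the top 200 retrieved items under cosine similarity. For binary relevance it reduces to the DCG of the produced ranking divided by the DCG of the ideal ranking, as in this toy sketch (illustrative data, not evaluation output):

```python
import math

def ndcg_at_k(relevances, k):
    """NDCG@k for binary relevance flags listed in ranked order (best first)."""
    def dcg(rels):
        # Gain of a relevant item at rank i (0-based) is 1 / log2(i + 2).
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels))
    ideal = sorted(relevances, reverse=True)
    denom = dcg(ideal[:k])
    return dcg(relevances[:k]) / denom if denom > 0 else 0.0

# Toy ranking: relevant items retrieved at ranks 1 and 3 of 4.
score = ndcg_at_k([1, 0, 1, 0], k=4)
```

A perfect ranking scores 1.0; pushing relevant items down the list discounts their gain logarithmically.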
1343
+
1344
+ ### Framework Versions
1345
+ - Python: 3.11.11
1346
+ - Sentence Transformers: 4.1.0
1347
+ - Transformers: 4.51.2
1348
+ - PyTorch: 2.6.0+cu124
1349
+ - Accelerate: 1.6.0
1350
+ - Datasets: 3.5.0
1351
+ - Tokenizers: 0.21.1
1352
+
1353
+ ## Citation
1354
+
1355
+ ### BibTeX
1356
+
1357
+ #### Sentence Transformers
1358
+ ```bibtex
1359
+ @inproceedings{reimers-2019-sentence-bert,
1360
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1361
+ author = "Reimers, Nils and Gurevych, Iryna",
1362
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1363
+ month = "11",
1364
+ year = "2019",
1365
+ publisher = "Association for Computational Linguistics",
1366
+ url = "https://arxiv.org/abs/1908.10084",
1367
+ }
1368
+ ```
1369
+
1370
+ #### GISTEmbedLoss
1371
+ ```bibtex
1372
+ @misc{solatorio2024gistembed,
1373
+ title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
1374
+ author={Aivin V. Solatorio},
1375
+ year={2024},
1376
+ eprint={2402.16829},
1377
+ archivePrefix={arXiv},
1378
+ primaryClass={cs.LG}
1379
+ }
1380
+ ```
1381
+
1382
+ <!--
1383
+ ## Glossary
1384
+
1385
+ *Clearly define terms in order to be accessible across audiences.*
1386
+ -->
1387
+
1388
+ <!--
1389
+ ## Model Card Authors
1390
+
1391
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1392
+ -->
1393
+
1394
+ <!--
1395
+ ## Model Card Contact
1396
+
1397
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1398
+ -->
checkpoint-200/config.json ADDED
@@ -0,0 +1,49 @@
1
+ {
2
+ "architectures": [
3
+ "NewModel"
4
+ ],
5
+ "attention_probs_dropout_prob": 0.0,
6
+ "auto_map": {
7
+ "AutoConfig": "Alibaba-NLP/new-impl--configuration.NewConfig",
8
+ "AutoModel": "Alibaba-NLP/new-impl--modeling.NewModel",
9
+ "AutoModelForMaskedLM": "Alibaba-NLP/new-impl--modeling.NewForMaskedLM",
10
+ "AutoModelForMultipleChoice": "Alibaba-NLP/new-impl--modeling.NewForMultipleChoice",
11
+ "AutoModelForQuestionAnswering": "Alibaba-NLP/new-impl--modeling.NewForQuestionAnswering",
12
+ "AutoModelForSequenceClassification": "Alibaba-NLP/new-impl--modeling.NewForSequenceClassification",
13
+ "AutoModelForTokenClassification": "Alibaba-NLP/new-impl--modeling.NewForTokenClassification"
14
+ },
15
+ "classifier_dropout": 0.0,
16
+ "hidden_act": "gelu",
17
+ "hidden_dropout_prob": 0.1,
18
+ "hidden_size": 768,
19
+ "id2label": {
20
+ "0": "LABEL_0"
21
+ },
22
+ "initializer_range": 0.02,
23
+ "intermediate_size": 3072,
24
+ "label2id": {
25
+ "LABEL_0": 0
26
+ },
27
+ "layer_norm_eps": 1e-12,
28
+ "layer_norm_type": "layer_norm",
29
+ "logn_attention_clip1": false,
30
+ "logn_attention_scale": false,
31
+ "max_position_embeddings": 8192,
32
+ "model_type": "new",
33
+ "num_attention_heads": 12,
34
+ "num_hidden_layers": 12,
35
+ "pack_qkv": true,
36
+ "pad_token_id": 1,
37
+ "position_embedding_type": "rope",
38
+ "rope_scaling": {
39
+ "factor": 8.0,
40
+ "type": "ntk"
41
+ },
42
+ "rope_theta": 20000,
43
+ "torch_dtype": "float32",
44
+ "transformers_version": "4.51.2",
45
+ "type_vocab_size": 1,
46
+ "unpad_inputs": false,
47
+ "use_memory_efficient_attention": false,
48
+ "vocab_size": 250048
49
+ }
checkpoint-200/config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "4.1.0",
4
+ "transformers": "4.51.2",
5
+ "pytorch": "2.6.0+cu124"
6
+ },
7
+ "prompts": {},
8
+ "default_prompt_name": null,
9
+ "similarity_fn_name": "cosine"
10
+ }
checkpoint-200/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:b5739aacbad9a359e1ff72bd23d43eda3a3610e34ac3061d3c0d19c24042df52
3
+ size 1221487872
checkpoint-200/modules.json ADDED
@@ -0,0 +1,20 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ },
14
+ {
15
+ "idx": 2,
16
+ "name": "2",
17
+ "path": "2_Normalize",
18
+ "type": "sentence_transformers.models.Normalize"
19
+ }
20
+ ]
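The `modules.json` above declares the model's three-stage pipeline: a Transformer encoder, a pooling layer, and L2 normalization, which is why cosine similarity between outputs reduces to a plain dot product. A NumPy sketch of the two post-encoder stages, assuming mean pooling (the guide model in the loss config uses it; this checkpoint's own `1_Pooling` config is not shown in this diff) and toy token embeddings:

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token vectors per sentence, ignoring padding positions."""
    mask = attention_mask[:, :, None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)
    return summed / counts

def l2_normalize(x):
    """Scale each row to unit length so dot product equals cosine similarity."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

# Toy batch: one sentence, three tokens (the last is padding), 2-dim vectors.
tok = np.array([[[1.0, 0.0], [0.0, 1.0], [9.0, 9.0]]])
mask = np.array([[1, 1, 0]])
emb = l2_normalize(mean_pool(tok, mask))
```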
checkpoint-200/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:4490d323ccfa091660768e94e1be37784f3171607a6403582274d38e52ff32f7
3
+ size 2443060986
checkpoint-200/rng_state_0.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e8d62bb1f8508718b92a5fecab4fb7d55821121383ff7cb094985aaff6fccbb8
3
+ size 15984
checkpoint-200/scaler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:986f83da3f32a0b13555fd4d7fcd98b07983eae746b7dafddcf4aff0739363a0
3
+ size 988
checkpoint-200/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:5fea9571d262cb96d9a56307ab5a0db02609a153cb4165e4f3766dad960e60bd
3
+ size 1064
checkpoint-200/sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
1
+ {
2
+ "max_seq_length": 512,
3
+ "do_lower_case": false
4
+ }
checkpoint-200/special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<s>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "cls_token": {
10
+ "content": "<s>",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "eos_token": {
17
+ "content": "</s>",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "mask_token": {
24
+ "content": "<mask>",
25
+ "lstrip": true,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "pad_token": {
31
+ "content": "<pad>",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ },
37
+ "sep_token": {
38
+ "content": "</s>",
39
+ "lstrip": false,
40
+ "normalized": false,
41
+ "rstrip": false,
42
+ "single_word": false
43
+ },
44
+ "unk_token": {
45
+ "content": "<unk>",
46
+ "lstrip": false,
47
+ "normalized": false,
48
+ "rstrip": false,
49
+ "single_word": false
50
+ }
51
+ }
checkpoint-200/tokenizer.json ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:883b037111086fd4dfebbbc9b7cee11e1517b5e0c0514879478661440f137085
3
+ size 17082987
checkpoint-200/tokenizer_config.json ADDED
@@ -0,0 +1,55 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "<s>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "1": {
12
+ "content": "<pad>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "2": {
20
+ "content": "</s>",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "3": {
28
+ "content": "<unk>",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "250001": {
36
+ "content": "<mask>",
37
+ "lstrip": true,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ }
43
+ },
44
+ "bos_token": "<s>",
45
+ "clean_up_tokenization_spaces": true,
46
+ "cls_token": "<s>",
47
+ "eos_token": "</s>",
48
+ "extra_special_tokens": {},
49
+ "mask_token": "<mask>",
50
+ "model_max_length": 8192,
51
+ "pad_token": "<pad>",
52
+ "sep_token": "</s>",
53
+ "tokenizer_class": "XLMRobertaTokenizerFast",
54
+ "unk_token": "<unk>"
55
+ }
checkpoint-200/trainer_state.json ADDED
@@ -0,0 +1,322 @@
1
+ {
2
+ "best_global_step": null,
3
+ "best_metric": null,
4
+ "best_model_checkpoint": null,
5
+ "epoch": 1.6502057613168724,
6
+ "eval_steps": 200,
7
+ "global_step": 200,
8
+ "is_hyper_param_search": false,
9
+ "is_local_process_zero": true,
10
+ "is_world_process_zero": true,
11
+ "log_history": [
12
+ {
13
+ "epoch": 0.00823045267489712,
14
+ "grad_norm": Infinity,
15
+ "learning_rate": 0.0,
16
+ "loss": 4.3088,
17
+ "step": 1
18
+ },
19
+ {
20
+ "epoch": 0.823045267489712,
21
+ "grad_norm": 3.1721928119659424,
22
+ "learning_rate": 4.4163763066202094e-05,
23
+ "loss": 1.9026,
24
+ "step": 100
25
+ },
26
+ {
27
+ "epoch": 1.6502057613168724,
28
+ "grad_norm": 2.9604015350341797,
29
+ "learning_rate": 3.5452961672473864e-05,
30
+ "loss": 0.9336,
31
+ "step": 200
32
+ },
33
+ {
34
+ "epoch": 1.6502057613168724,
35
+ "eval_full_de_cosine_accuracy@1": 0.2955665024630542,
36
+ "eval_full_de_cosine_accuracy@100": 0.9901477832512315,
37
+ "eval_full_de_cosine_accuracy@150": 0.9901477832512315,
38
+ "eval_full_de_cosine_accuracy@20": 0.9852216748768473,
39
+ "eval_full_de_cosine_accuracy@200": 0.9901477832512315,
40
+ "eval_full_de_cosine_accuracy@50": 0.9852216748768473,
41
+ "eval_full_de_cosine_map@1": 0.2955665024630542,
42
+ "eval_full_de_cosine_map@100": 0.3618162919366333,
43
+ "eval_full_de_cosine_map@150": 0.37673093284239206,
44
+ "eval_full_de_cosine_map@20": 0.3952556219642319,
45
+ "eval_full_de_cosine_map@200": 0.3850375691141728,
46
+ "eval_full_de_cosine_map@50": 0.3565895386599598,
47
+ "eval_full_de_cosine_map@500": 0.3976475131909832,
48
+ "eval_full_de_cosine_mrr@1": 0.2955665024630542,
49
+ "eval_full_de_cosine_mrr@100": 0.5141580130059386,
50
+ "eval_full_de_cosine_mrr@150": 0.5141580130059386,
51
+ "eval_full_de_cosine_mrr@20": 0.5140785596450613,
52
+ "eval_full_de_cosine_mrr@200": 0.5141580130059386,
53
+ "eval_full_de_cosine_mrr@50": 0.5140785596450613,
54
+ "eval_full_de_cosine_ndcg@1": 0.2955665024630542,
55
+ "eval_full_de_cosine_ndcg@100": 0.5568512483442096,
56
+ "eval_full_de_cosine_ndcg@150": 0.5891923427833955,
57
+ "eval_full_de_cosine_ndcg@20": 0.5342874353496432,
58
+ "eval_full_de_cosine_ndcg@200": 0.6108910915140433,
59
+ "eval_full_de_cosine_ndcg@50": 0.5251712513704461,
60
+ "eval_full_de_cosine_precision@1": 0.2955665024630542,
61
+ "eval_full_de_cosine_precision@100": 0.24177339901477832,
62
+ "eval_full_de_cosine_precision@150": 0.18187192118226603,
63
+ "eval_full_de_cosine_precision@20": 0.5073891625615764,
64
+ "eval_full_de_cosine_precision@200": 0.1470689655172414,
65
+ "eval_full_de_cosine_precision@50": 0.3681773399014779,
66
+ "eval_full_de_cosine_recall@1": 0.01108543831680986,
67
+ "eval_full_de_cosine_recall@100": 0.6244233924508789,
68
+ "eval_full_de_cosine_recall@150": 0.687486468465792,
69
+ "eval_full_de_cosine_recall@20": 0.31887986522832873,
70
+ "eval_full_de_cosine_recall@200": 0.7334854348170513,
71
+ "eval_full_de_cosine_recall@50": 0.5004342335550164,
72
+ "eval_full_en_cosine_accuracy@1": 0.6571428571428571,
73
+ "eval_full_en_cosine_accuracy@100": 0.9904761904761905,
74
+ "eval_full_en_cosine_accuracy@150": 0.9904761904761905,
75
+ "eval_full_en_cosine_accuracy@20": 0.9904761904761905,
76
+ "eval_full_en_cosine_accuracy@200": 0.9904761904761905,
77
+ "eval_full_en_cosine_accuracy@50": 0.9904761904761905,
78
+ "eval_full_en_cosine_map@1": 0.6571428571428571,
79
+ "eval_full_en_cosine_map@100": 0.5799091076338031,
80
+ "eval_full_en_cosine_map@150": 0.5895042547793764,
81
+ "eval_full_en_cosine_map@20": 0.5516314386214587,
82
+ "eval_full_en_cosine_map@200": 0.5930550248640567,
83
+ "eval_full_en_cosine_map@50": 0.5474217433291914,
84
+ "eval_full_en_cosine_map@500": 0.5967311945998978,
85
+ "eval_full_en_cosine_mrr@1": 0.6571428571428571,
86
+ "eval_full_en_cosine_mrr@100": 0.8111111111111111,
87
+ "eval_full_en_cosine_mrr@150": 0.8111111111111111,
88
+ "eval_full_en_cosine_mrr@20": 0.8111111111111111,
89
+ "eval_full_en_cosine_mrr@200": 0.8111111111111111,
90
+ "eval_full_en_cosine_mrr@50": 0.8111111111111111,
91
+ "eval_full_en_cosine_ndcg@1": 0.6571428571428571,
92
+ "eval_full_en_cosine_ndcg@100": 0.7690946845916871,
93
+ "eval_full_en_cosine_ndcg@150": 0.7923061459636489,
94
+ "eval_full_en_cosine_ndcg@20": 0.6923506957704934,
95
+ "eval_full_en_cosine_ndcg@200": 0.8023952171736648,
96
+ "eval_full_en_cosine_ndcg@50": 0.7170311913169547,
97
+ "eval_full_en_cosine_precision@1": 0.6571428571428571,
98
+ "eval_full_en_cosine_precision@100": 0.18971428571428572,
99
+ "eval_full_en_cosine_precision@150": 0.13619047619047617,
100
+ "eval_full_en_cosine_precision@20": 0.5133333333333332,
101
+ "eval_full_en_cosine_precision@200": 0.10561904761904761,
102
+ "eval_full_en_cosine_precision@50": 0.3169523809523809,
103
+ "eval_full_en_cosine_recall@1": 0.06695957251887064,
104
+ "eval_full_en_cosine_recall@100": 0.8467073011936345,
105
+ "eval_full_en_cosine_recall@150": 0.9010846211520122,
106
+ "eval_full_en_cosine_recall@20": 0.5478306503729546,
107
+ "eval_full_en_cosine_recall@200": 0.9256595392715059,
108
+ "eval_full_en_cosine_recall@50": 0.7470276357469449,
109
+ "eval_full_es_cosine_accuracy@1": 0.11891891891891893,
110
+ "eval_full_es_cosine_accuracy@100": 1.0,
111
+ "eval_full_es_cosine_accuracy@150": 1.0,
112
+ "eval_full_es_cosine_accuracy@20": 1.0,
113
+ "eval_full_es_cosine_accuracy@200": 1.0,
114
+ "eval_full_es_cosine_accuracy@50": 1.0,
115
+ "eval_full_es_cosine_map@1": 0.11891891891891893,
116
+ "eval_full_es_cosine_map@100": 0.43522297182400527,
117
+ "eval_full_es_cosine_map@150": 0.4511056582755023,
118
+ "eval_full_es_cosine_map@20": 0.4839539531842883,
119
+ "eval_full_es_cosine_map@200": 0.45802493743471273,
120
+ "eval_full_es_cosine_map@50": 0.4288206349412292,
121
+ "eval_full_es_cosine_map@500": 0.47075604946048677,
122
+ "eval_full_es_cosine_mrr@1": 0.11891891891891893,
123
+ "eval_full_es_cosine_mrr@100": 0.5554054054054054,
124
+ "eval_full_es_cosine_mrr@150": 0.5554054054054054,
125
+ "eval_full_es_cosine_mrr@20": 0.5554054054054054,
126
+ "eval_full_es_cosine_mrr@200": 0.5554054054054054,
127
+ "eval_full_es_cosine_mrr@50": 0.5554054054054054,
128
+ "eval_full_es_cosine_ndcg@1": 0.11891891891891893,
129
+ "eval_full_es_cosine_ndcg@100": 0.6196114606257926,
130
+ "eval_full_es_cosine_ndcg@150": 0.6530674955405338,
131
+ "eval_full_es_cosine_ndcg@20": 0.6158554243812342,
132
+ "eval_full_es_cosine_ndcg@200": 0.670287400819268,
133
+ "eval_full_es_cosine_ndcg@50": 0.5886857089260162,
134
+ "eval_full_es_cosine_precision@1": 0.11891891891891893,
135
+ "eval_full_es_cosine_precision@100": 0.2541621621621622,
136
+ "eval_full_es_cosine_precision@150": 0.19225225225225226,
137
+ "eval_full_es_cosine_precision@20": 0.5767567567567567,
138
+ "eval_full_es_cosine_precision@200": 0.15264864864864866,
139
+ "eval_full_es_cosine_precision@50": 0.3907027027027027,
140
+ "eval_full_es_cosine_recall@1": 0.0035436931012884127,
141
+ "eval_full_es_cosine_recall@100": 0.6836436316189977,
142
+ "eval_full_es_cosine_recall@150": 0.7496865406970199,
143
+ "eval_full_es_cosine_recall@20": 0.3862419782331355,
144
+ "eval_full_es_cosine_recall@200": 0.7852629043380305,
145
+ "eval_full_es_cosine_recall@50": 0.5625768407738393,
146
+ "eval_full_zh_cosine_accuracy@1": 0.6601941747572816,
147
+ "eval_full_zh_cosine_accuracy@100": 0.9902912621359223,
148
+ "eval_full_zh_cosine_accuracy@150": 0.9902912621359223,
149
+ "eval_full_zh_cosine_accuracy@20": 0.9902912621359223,
150
+ "eval_full_zh_cosine_accuracy@200": 0.9902912621359223,
151
+ "eval_full_zh_cosine_accuracy@50": 0.9902912621359223,
152
+ "eval_full_zh_cosine_map@1": 0.6601941747572816,
153
+ "eval_full_zh_cosine_map@100": 0.5346277197767966,
154
+ "eval_full_zh_cosine_map@150": 0.5441006347287816,
155
+ "eval_full_zh_cosine_map@20": 0.5176817014415404,
156
+ "eval_full_zh_cosine_map@200": 0.547804939644668,
157
+ "eval_full_zh_cosine_map@50": 0.5050961591489588,
158
+ "eval_full_zh_cosine_map@500": 0.5524877228701637,
159
+ "eval_full_zh_cosine_mrr@1": 0.6601941747572816,
160
+ "eval_full_zh_cosine_mrr@100": 0.8068423485899215,
161
+ "eval_full_zh_cosine_mrr@150": 0.8068423485899215,
162
+ "eval_full_zh_cosine_mrr@20": 0.8068423485899215,
163
+ "eval_full_zh_cosine_mrr@200": 0.8068423485899215,
164
+ "eval_full_zh_cosine_mrr@50": 0.8068423485899215,
165
+ "eval_full_zh_cosine_ndcg@1": 0.6601941747572816,
166
+ "eval_full_zh_cosine_ndcg@100": 0.7344118850318737,
167
+ "eval_full_zh_cosine_ndcg@150": 0.7580048379992059,
168
+ "eval_full_zh_cosine_ndcg@20": 0.6629898844211244,
169
+ "eval_full_zh_cosine_ndcg@200": 0.769464510105362,
170
+ "eval_full_zh_cosine_ndcg@50": 0.682216395408567,
171
+ "eval_full_zh_cosine_precision@1": 0.6601941747572816,
172
+ "eval_full_zh_cosine_precision@100": 0.17902912621359218,
173
+ "eval_full_zh_cosine_precision@150": 0.12912621359223303,
174
+ "eval_full_zh_cosine_precision@20": 0.4868932038834952,
175
+ "eval_full_zh_cosine_precision@200": 0.10063106796116503,
176
+ "eval_full_zh_cosine_precision@50": 0.2959223300970874,
177
+ "eval_full_zh_cosine_recall@1": 0.06669332811942774,
178
+ "eval_full_zh_cosine_recall@100": 0.813864097315397,
179
+ "eval_full_zh_cosine_recall@150": 0.8683619147921042,
180
+ "eval_full_zh_cosine_recall@20": 0.52040897323663,
181
+ "eval_full_zh_cosine_recall@200": 0.8964210248615742,
182
+ "eval_full_zh_cosine_recall@50": 0.7067236634036261,
183
+ "eval_mix_de_cosine_accuracy@1": 0.642225689027561,
184
+ "eval_mix_de_cosine_accuracy@100": 0.9771190847633905,
185
+ "eval_mix_de_cosine_accuracy@150": 0.984919396775871,
186
+ "eval_mix_de_cosine_accuracy@20": 0.9240769630785232,
187
+ "eval_mix_de_cosine_accuracy@200": 0.9901196047841914,
188
+ "eval_mix_de_cosine_accuracy@50": 0.9635985439417577,
189
+ "eval_mix_de_cosine_map@1": 0.642225689027561,
190
+ "eval_mix_de_cosine_map@100": 0.6570111325791598,
191
+ "eval_mix_de_cosine_map@150": 0.6572712744212402,
192
+ "eval_mix_de_cosine_map@20": 0.6521189972338849,
193
+ "eval_mix_de_cosine_map@200": 0.6574012324541948,
194
+ "eval_mix_de_cosine_map@50": 0.6561813596290409,
195
+ "eval_mix_de_cosine_map@500": 0.6575399010277455,
196
+ "eval_mix_de_cosine_mrr@1": 0.642225689027561,
197
+ "eval_mix_de_cosine_mrr@100": 0.7262168772880452,
198
+ "eval_mix_de_cosine_mrr@150": 0.7262822017289415,
199
+ "eval_mix_de_cosine_mrr@20": 0.7246816496840639,
200
+ "eval_mix_de_cosine_mrr@200": 0.7263128860080087,
201
+ "eval_mix_de_cosine_mrr@50": 0.7260235454700952,
202
+ "eval_mix_de_cosine_ndcg@1": 0.642225689027561,
203
+ "eval_mix_de_cosine_ndcg@100": 0.7547612967303503,
204
+ "eval_mix_de_cosine_ndcg@150": 0.7575184392863841,
205
+ "eval_mix_de_cosine_ndcg@20": 0.7332013199323174,
206
+ "eval_mix_de_cosine_ndcg@200": 0.7593986816807992,
207
+ "eval_mix_de_cosine_ndcg@50": 0.7490333180034867,
208
+ "eval_mix_de_cosine_precision@1": 0.642225689027561,
209
+ "eval_mix_de_cosine_precision@100": 0.0261622464898596,
210
+ "eval_mix_de_cosine_precision@150": 0.01770844167100017,
211
+ "eval_mix_de_cosine_precision@20": 0.11911076443057722,
212
+ "eval_mix_de_cosine_precision@200": 0.013424336973478942,
213
+ "eval_mix_de_cosine_precision@50": 0.05086843473738951,
214
+ "eval_mix_de_cosine_recall@1": 0.2405616224648986,
215
+ "eval_mix_de_cosine_recall@100": 0.9480412549835328,
216
+ "eval_mix_de_cosine_recall@150": 0.9618651412723176,
217
+ "eval_mix_de_cosine_recall@20": 0.8650459351707401,
218
+ "eval_mix_de_cosine_recall@200": 0.9720922170220142,
219
+ "eval_mix_de_cosine_recall@50": 0.9226295718495406,
220
+ "eval_mix_es_cosine_accuracy@1": 0.7009880395215808,
221
+ "eval_mix_es_cosine_accuracy@100": 0.9885595423816953,
222
+ "eval_mix_es_cosine_accuracy@150": 0.9921996879875195,
223
+ "eval_mix_es_cosine_accuracy@20": 0.9474778991159646,
224
+ "eval_mix_es_cosine_accuracy@200": 0.9932397295891836,
225
+ "eval_mix_es_cosine_accuracy@50": 0.9776391055642226,
226
+ "eval_mix_es_cosine_map@1": 0.7009880395215808,
227
+ "eval_mix_es_cosine_map@100": 0.6984889579958145,
228
+ "eval_mix_es_cosine_map@150": 0.6986621032108891,
229
+ "eval_mix_es_cosine_map@20": 0.6938173897965141,
230
+ "eval_mix_es_cosine_map@200": 0.6987465392575996,
231
+ "eval_mix_es_cosine_map@50": 0.6978248868009254,
232
+ "eval_mix_es_cosine_map@500": 0.6988876342368443,
233
+ "eval_mix_es_cosine_mrr@1": 0.7009880395215808,
234
+ "eval_mix_es_cosine_mrr@100": 0.7724347923967887,
235
+ "eval_mix_es_cosine_mrr@150": 0.7724644404043258,
236
+ "eval_mix_es_cosine_mrr@20": 0.7712491671812917,
237
+ "eval_mix_es_cosine_mrr@200": 0.7724705526191206,
238
+ "eval_mix_es_cosine_mrr@50": 0.7722842539435679,
239
+ "eval_mix_es_cosine_ndcg@1": 0.7009880395215808,
240
+ "eval_mix_es_cosine_ndcg@100": 0.7884317468596705,
241
+ "eval_mix_es_cosine_ndcg@150": 0.7902844804245556,
242
+ "eval_mix_es_cosine_ndcg@20": 0.7690336236998598,
243
+ "eval_mix_es_cosine_ndcg@200": 0.7913994944724545,
244
+ "eval_mix_es_cosine_ndcg@50": 0.7838732562697655,
245
+ "eval_mix_es_cosine_precision@1": 0.7009880395215808,
246
+ "eval_mix_es_cosine_precision@100": 0.02598543941757671,
247
+ "eval_mix_es_cosine_precision@150": 0.017493499739989597,
248
+ "eval_mix_es_cosine_precision@20": 0.11968278731149247,
249
+ "eval_mix_es_cosine_precision@200": 0.013198127925117008,
250
+ "eval_mix_es_cosine_precision@50": 0.05085803432137287,
251
+ "eval_mix_es_cosine_recall@1": 0.27067577941212884,
252
+ "eval_mix_es_cosine_recall@100": 0.9599497313225862,
253
+ "eval_mix_es_cosine_recall@150": 0.9695527821112844,
254
+ "eval_mix_es_cosine_recall@20": 0.8850840700294678,
255
+ "eval_mix_es_cosine_recall@200": 0.9758970358814353,
256
+ "eval_mix_es_cosine_recall@50": 0.9390968972092216,
257
+ "eval_mix_zh_cosine_accuracy@1": 0.7713987473903967,
258
+ "eval_mix_zh_cosine_accuracy@100": 0.9947807933194155,
259
+ "eval_mix_zh_cosine_accuracy@150": 0.9963465553235908,
260
+ "eval_mix_zh_cosine_accuracy@20": 0.9806889352818372,
261
+ "eval_mix_zh_cosine_accuracy@200": 0.9973903966597077,
262
+ "eval_mix_zh_cosine_accuracy@50": 0.9916492693110647,
263
+ "eval_mix_zh_cosine_map@1": 0.7713987473903967,
264
+ "eval_mix_zh_cosine_map@100": 0.6981100717727863,
265
+ "eval_mix_zh_cosine_map@150": 0.6982601227257159,
266
+ "eval_mix_zh_cosine_map@20": 0.6929147114308428,
267
+ "eval_mix_zh_cosine_map@200": 0.6983171494463136,
268
+ "eval_mix_zh_cosine_map@50": 0.6972607407491801,
269
+ "eval_mix_zh_cosine_map@500": 0.6984116893552017,
270
+ "eval_mix_zh_cosine_mrr@1": 0.7713987473903967,
271
+ "eval_mix_zh_cosine_mrr@100": 0.8436803457907981,
272
+ "eval_mix_zh_cosine_mrr@150": 0.8436922193976949,
273
+ "eval_mix_zh_cosine_mrr@20": 0.8432785828350186,
274
+ "eval_mix_zh_cosine_mrr@200": 0.8436986631082636,
275
+ "eval_mix_zh_cosine_mrr@50": 0.8436385628906108,
276
+ "eval_mix_zh_cosine_ndcg@1": 0.7713987473903967,
277
+ "eval_mix_zh_cosine_ndcg@100": 0.8115576206060865,
278
+ "eval_mix_zh_cosine_ndcg@150": 0.8129087269558002,
279
+ "eval_mix_zh_cosine_ndcg@20": 0.7926986810043013,
280
+ "eval_mix_zh_cosine_ndcg@200": 0.8135973837485255,
281
+ "eval_mix_zh_cosine_ndcg@50": 0.8066848794942646,
282
+ "eval_mix_zh_cosine_precision@1": 0.7713987473903967,
283
+ "eval_mix_zh_cosine_precision@100": 0.02944676409185805,
284
+ "eval_mix_zh_cosine_precision@150": 0.0197633959638135,
285
+ "eval_mix_zh_cosine_precision@20": 0.13656054279749477,
286
+ "eval_mix_zh_cosine_precision@200": 0.014877348643006268,
287
+ "eval_mix_zh_cosine_precision@50": 0.05762004175365346,
288
+ "eval_mix_zh_cosine_recall@1": 0.2585731683069888,
289
+ "eval_mix_zh_cosine_recall@100": 0.9715031315240084,
290
+ "eval_mix_zh_cosine_recall@150": 0.9781576200417537,
291
+ "eval_mix_zh_cosine_recall@20": 0.9014352818371607,
292
+ "eval_mix_zh_cosine_recall@200": 0.9818110647181629,
293
+ "eval_mix_zh_cosine_recall@50": 0.950347947112039,
294
+ "eval_runtime": 11.3241,
295
+ "eval_samples_per_second": 0.0,
296
+ "eval_sequential_score": 0.8135973837485255,
297
+ "eval_steps_per_second": 0.0,
298
+ "step": 200
299
+ }
300
+ ],
301
+ "logging_steps": 100,
302
+ "max_steps": 605,
303
+ "num_input_tokens_seen": 0,
304
+ "num_train_epochs": 5,
305
+ "save_steps": 200,
306
+ "stateful_callbacks": {
307
+ "TrainerControl": {
308
+ "args": {
309
+ "should_epoch_stop": false,
310
+ "should_evaluate": false,
311
+ "should_log": false,
312
+ "should_save": true,
313
+ "should_training_stop": false
314
+ },
315
+ "attributes": {}
316
+ }
317
+ },
318
+ "total_flos": 4.5625697398559293e+18,
319
+ "train_batch_size": 128,
320
+ "trial_name": null,
321
+ "trial_params": null
322
+ }
checkpoint-200/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:bf4d5229cda7218c285b125b9db305e5b03ef7807e7bcdbf89afc0ac543dd982
3
+ size 5624
checkpoint-400/1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "word_embedding_dimension": 768,
3
+ "pooling_mode_cls_token": true,
4
+ "pooling_mode_mean_tokens": false,
5
+ "pooling_mode_max_tokens": false,
6
+ "pooling_mode_mean_sqrt_len_tokens": false,
7
+ "pooling_mode_weightedmean_tokens": false,
8
+ "pooling_mode_lasttoken": false,
9
+ "include_prompt": true
10
+ }
checkpoint-400/README.md ADDED
@@ -0,0 +1,1400 @@
1
+ ---
2
+ tags:
3
+ - sentence-transformers
4
+ - sentence-similarity
5
+ - feature-extraction
6
+ - generated_from_trainer
7
+ - dataset_size:124788
8
+ - loss:GISTEmbedLoss
9
+ base_model: Alibaba-NLP/gte-multilingual-base
10
+ widget:
11
+ - source_sentence: 其他机械、设备和有形货物租赁服务代表
12
+ sentences:
13
+ - 其他机械和设备租赁服务工作人员
14
+ - 电子和电信设备及零部件物流经理
15
+ - 工业主厨
16
+ - source_sentence: 公交车司机
17
+ sentences:
18
+ - 表演灯光设计师
19
+ - 乙烯基地板安装工
20
+ - 国际巴士司机
21
+ - source_sentence: online communication manager
22
+ sentences:
23
+ - trades union official
24
+ - social media manager
25
+ - budget manager
26
+ - source_sentence: Projektmanagerin
27
+ sentences:
28
+ - Projektmanager/Projektmanagerin
29
+ - Category-Manager
30
+ - Infanterist
31
+ - source_sentence: Volksvertreter
32
+ sentences:
33
+ - Parlamentarier
34
+ - Oberbürgermeister
35
+ - Konsul
36
+ pipeline_tag: sentence-similarity
37
+ library_name: sentence-transformers
38
+ metrics:
39
+ - cosine_accuracy@1
40
+ - cosine_accuracy@20
41
+ - cosine_accuracy@50
42
+ - cosine_accuracy@100
43
+ - cosine_accuracy@150
44
+ - cosine_accuracy@200
45
+ - cosine_precision@1
46
+ - cosine_precision@20
47
+ - cosine_precision@50
48
+ - cosine_precision@100
49
+ - cosine_precision@150
50
+ - cosine_precision@200
51
+ - cosine_recall@1
52
+ - cosine_recall@20
53
+ - cosine_recall@50
54
+ - cosine_recall@100
55
+ - cosine_recall@150
56
+ - cosine_recall@200
57
+ - cosine_ndcg@1
58
+ - cosine_ndcg@20
59
+ - cosine_ndcg@50
60
+ - cosine_ndcg@100
61
+ - cosine_ndcg@150
62
+ - cosine_ndcg@200
63
+ - cosine_mrr@1
64
+ - cosine_mrr@20
65
+ - cosine_mrr@50
66
+ - cosine_mrr@100
67
+ - cosine_mrr@150
68
+ - cosine_mrr@200
69
+ - cosine_map@1
70
+ - cosine_map@20
71
+ - cosine_map@50
72
+ - cosine_map@100
73
+ - cosine_map@150
74
+ - cosine_map@200
75
+ - cosine_map@500
76
+ model-index:
77
+ - name: SentenceTransformer based on Alibaba-NLP/gte-multilingual-base
78
+ results:
79
+ - task:
80
+ type: information-retrieval
81
+ name: Information Retrieval
82
+ dataset:
83
+ name: full en
84
+ type: full_en
85
+ metrics:
86
+ - type: cosine_accuracy@1
87
+ value: 0.6666666666666666
88
+ name: Cosine Accuracy@1
89
+ - type: cosine_accuracy@20
90
+ value: 0.9904761904761905
91
+ name: Cosine Accuracy@20
92
+ - type: cosine_accuracy@50
93
+ value: 0.9904761904761905
94
+ name: Cosine Accuracy@50
95
+ - type: cosine_accuracy@100
96
+ value: 0.9904761904761905
97
+ name: Cosine Accuracy@100
98
+ - type: cosine_accuracy@150
99
+ value: 0.9904761904761905
100
+ name: Cosine Accuracy@150
101
+ - type: cosine_accuracy@200
102
+ value: 0.9904761904761905
103
+ name: Cosine Accuracy@200
104
+ - type: cosine_precision@1
105
+ value: 0.6666666666666666
106
+ name: Cosine Precision@1
107
+ - type: cosine_precision@20
108
+ value: 0.5147619047619048
109
+ name: Cosine Precision@20
110
+ - type: cosine_precision@50
111
+ value: 0.31999999999999995
112
+ name: Cosine Precision@50
113
+ - type: cosine_precision@100
114
+ value: 0.19047619047619047
115
+ name: Cosine Precision@100
116
+ - type: cosine_precision@150
117
+ value: 0.1361904761904762
118
+ name: Cosine Precision@150
119
+ - type: cosine_precision@200
120
+ value: 0.10542857142857143
121
+ name: Cosine Precision@200
122
+ - type: cosine_recall@1
123
+ value: 0.06854687410617222
124
+ name: Cosine Recall@1
125
+ - type: cosine_recall@20
126
+ value: 0.5491240579458434
127
+ name: Cosine Recall@20
128
+ - type: cosine_recall@50
129
+ value: 0.7553654907661455
130
+ name: Cosine Recall@50
131
+ - type: cosine_recall@100
132
+ value: 0.8503209224897438
133
+ name: Cosine Recall@100
134
+ - type: cosine_recall@150
135
+ value: 0.8994749092946579
136
+ name: Cosine Recall@150
137
+ - type: cosine_recall@200
138
+ value: 0.9207884118691805
139
+ name: Cosine Recall@200
140
+ - type: cosine_ndcg@1
141
+ value: 0.6666666666666666
142
+ name: Cosine Ndcg@1
143
+ - type: cosine_ndcg@20
144
+ value: 0.6952098522285352
145
+ name: Cosine Ndcg@20
146
+ - type: cosine_ndcg@50
147
+ value: 0.7229572913271685
148
+ name: Cosine Ndcg@50
149
+ - type: cosine_ndcg@100
150
+ value: 0.7732532874348539
151
+ name: Cosine Ndcg@100
152
+ - type: cosine_ndcg@150
153
+ value: 0.7947334799125039
154
+ name: Cosine Ndcg@150
155
+ - type: cosine_ndcg@200
156
+ value: 0.8038564389556094
157
+ name: Cosine Ndcg@200
158
+ - type: cosine_mrr@1
159
+ value: 0.6666666666666666
160
+ name: Cosine Mrr@1
161
+ - type: cosine_mrr@20
162
+ value: 0.8182539682539683
163
+ name: Cosine Mrr@20
164
+ - type: cosine_mrr@50
165
+ value: 0.8182539682539683
166
+ name: Cosine Mrr@50
167
+ - type: cosine_mrr@100
168
+ value: 0.8182539682539683
169
+ name: Cosine Mrr@100
170
+ - type: cosine_mrr@150
171
+ value: 0.8182539682539683
172
+ name: Cosine Mrr@150
173
+ - type: cosine_mrr@200
174
+ value: 0.8182539682539683
175
+ name: Cosine Mrr@200
176
+ - type: cosine_map@1
177
+ value: 0.6666666666666666
178
+ name: Cosine Map@1
179
+ - type: cosine_map@20
180
+ value: 0.5566401101002375
181
+ name: Cosine Map@20
182
+ - type: cosine_map@50
183
+ value: 0.55344017265156
184
+ name: Cosine Map@50
185
+ - type: cosine_map@100
186
+ value: 0.5852249415484134
187
+ name: Cosine Map@100
188
+ - type: cosine_map@150
189
+ value: 0.5943042662925763
190
+ name: Cosine Map@150
191
+ - type: cosine_map@200
192
+ value: 0.5975837437975446
193
+ name: Cosine Map@200
194
+ - type: cosine_map@500
195
+ value: 0.6015742986218369
196
+ name: Cosine Map@500
197
+ - task:
198
+ type: information-retrieval
199
+ name: Information Retrieval
200
+ dataset:
201
+ name: full es
202
+ type: full_es
203
+ metrics:
204
+ - type: cosine_accuracy@1
205
+ value: 0.12432432432432433
206
+ name: Cosine Accuracy@1
207
+ - type: cosine_accuracy@20
208
+ value: 1.0
209
+ name: Cosine Accuracy@20
210
+ - type: cosine_accuracy@50
211
+ value: 1.0
212
+ name: Cosine Accuracy@50
213
+ - type: cosine_accuracy@100
214
+ value: 1.0
215
+ name: Cosine Accuracy@100
216
+ - type: cosine_accuracy@150
217
+ value: 1.0
218
+ name: Cosine Accuracy@150
219
+ - type: cosine_accuracy@200
220
+ value: 1.0
221
+ name: Cosine Accuracy@200
222
+ - type: cosine_precision@1
223
+ value: 0.12432432432432433
224
+ name: Cosine Precision@1
225
+ - type: cosine_precision@20
226
+ value: 0.575945945945946
227
+ name: Cosine Precision@20
228
+ - type: cosine_precision@50
229
+ value: 0.3923243243243244
230
+ name: Cosine Precision@50
231
+ - type: cosine_precision@100
232
+ value: 0.2565945945945946
233
+ name: Cosine Precision@100
234
+ - type: cosine_precision@150
235
+ value: 0.19282882882882882
236
+ name: Cosine Precision@150
237
+ - type: cosine_precision@200
238
+ value: 0.1527837837837838
239
+ name: Cosine Precision@200
240
+ - type: cosine_recall@1
241
+ value: 0.0036138931714884822
242
+ name: Cosine Recall@1
243
+ - type: cosine_recall@20
244
+ value: 0.3852888120551914
245
+ name: Cosine Recall@20
246
+ - type: cosine_recall@50
247
+ value: 0.5659574514538841
248
+ name: Cosine Recall@50
249
+ - type: cosine_recall@100
250
+ value: 0.6898678629281393
251
+ name: Cosine Recall@100
252
+ - type: cosine_recall@150
253
+ value: 0.7540209165372845
254
+ name: Cosine Recall@150
255
+ - type: cosine_recall@200
256
+ value: 0.7858170054407897
257
+ name: Cosine Recall@200
258
+ - type: cosine_ndcg@1
259
+ value: 0.12432432432432433
260
+ name: Cosine Ndcg@1
261
+ - type: cosine_ndcg@20
262
+ value: 0.6168674053047035
263
+ name: Cosine Ndcg@20
264
+ - type: cosine_ndcg@50
265
+ value: 0.5913690595071309
266
+ name: Cosine Ndcg@50
267
+ - type: cosine_ndcg@100
268
+ value: 0.62350509928888
269
+ name: Cosine Ndcg@100
270
+ - type: cosine_ndcg@150
271
+ value: 0.6556716735369459
272
+ name: Cosine Ndcg@150
273
+ - type: cosine_ndcg@200
274
+ value: 0.6716557949894583
275
+ name: Cosine Ndcg@200
276
+ - type: cosine_mrr@1
277
+ value: 0.12432432432432433
278
+ name: Cosine Mrr@1
279
+ - type: cosine_mrr@20
280
+ value: 0.5581081081081081
281
+ name: Cosine Mrr@20
282
+ - type: cosine_mrr@50
283
+ value: 0.5581081081081081
284
+ name: Cosine Mrr@50
285
+ - type: cosine_mrr@100
286
+ value: 0.5581081081081081
287
+ name: Cosine Mrr@100
288
+ - type: cosine_mrr@150
289
+ value: 0.5581081081081081
290
+ name: Cosine Mrr@150
291
+ - type: cosine_mrr@200
292
+ value: 0.5581081081081081
293
+ name: Cosine Mrr@200
294
+ - type: cosine_map@1
295
+ value: 0.12432432432432433
296
+ name: Cosine Map@1
297
+ - type: cosine_map@20
298
+ value: 0.48407152706202555
299
+ name: Cosine Map@20
300
+ - type: cosine_map@50
301
+ value: 0.43043374125481026
302
+ name: Cosine Map@50
303
+ - type: cosine_map@100
304
+ value: 0.43735327570764515
305
+ name: Cosine Map@100
306
+ - type: cosine_map@150
307
+ value: 0.45269435912524697
308
+ name: Cosine Map@150
309
+ - type: cosine_map@200
310
+ value: 0.45930097680668164
311
+ name: Cosine Map@200
312
+ - type: cosine_map@500
313
+ value: 0.47204219228541466
314
+ name: Cosine Map@500
315
+ - task:
316
+ type: information-retrieval
317
+ name: Information Retrieval
318
+ dataset:
319
+ name: full de
320
+ type: full_de
321
+ metrics:
322
+ - type: cosine_accuracy@1
323
+ value: 0.2955665024630542
324
+ name: Cosine Accuracy@1
325
+ - type: cosine_accuracy@20
326
+ value: 0.9753694581280788
327
+ name: Cosine Accuracy@20
328
+ - type: cosine_accuracy@50
329
+ value: 0.9852216748768473
330
+ name: Cosine Accuracy@50
331
+ - type: cosine_accuracy@100
332
+ value: 0.9901477832512315
333
+ name: Cosine Accuracy@100
334
+ - type: cosine_accuracy@150
335
+ value: 0.9901477832512315
336
+ name: Cosine Accuracy@150
337
+ - type: cosine_accuracy@200
338
+ value: 0.9901477832512315
339
+ name: Cosine Accuracy@200
340
+ - type: cosine_precision@1
341
+ value: 0.2955665024630542
342
+ name: Cosine Precision@1
343
+ - type: cosine_precision@20
344
+ value: 0.5103448275862069
345
+ name: Cosine Precision@20
346
+ - type: cosine_precision@50
347
+ value: 0.36935960591133016
348
+ name: Cosine Precision@50
349
+ - type: cosine_precision@100
350
+ value: 0.23965517241379314
351
+ name: Cosine Precision@100
352
+ - type: cosine_precision@150
353
+ value: 0.1807881773399015
354
+ name: Cosine Precision@150
355
+ - type: cosine_precision@200
356
+ value: 0.1461576354679803
357
+ name: Cosine Precision@200
358
+ - type: cosine_recall@1
359
+ value: 0.01108543831680986
360
+ name: Cosine Recall@1
361
+ - type: cosine_recall@20
362
+ value: 0.3207974783481294
363
+ name: Cosine Recall@20
364
+ - type: cosine_recall@50
365
+ value: 0.5042046446720455
366
+ name: Cosine Recall@50
367
+ - type: cosine_recall@100
368
+ value: 0.6172666777909689
369
+ name: Cosine Recall@100
370
+ - type: cosine_recall@150
371
+ value: 0.6848138831682932
372
+ name: Cosine Recall@150
373
+ - type: cosine_recall@200
374
+ value: 0.7253195006357535
375
+ name: Cosine Recall@200
376
+ - type: cosine_ndcg@1
377
+ value: 0.2955665024630542
378
+ name: Cosine Ndcg@1
379
+ - type: cosine_ndcg@20
380
+ value: 0.537849085734973
381
+ name: Cosine Ndcg@20
382
+ - type: cosine_ndcg@50
383
+ value: 0.5288037060639387
384
+ name: Cosine Ndcg@50
385
+ - type: cosine_ndcg@100
386
+ value: 0.5551941695921919
387
+ name: Cosine Ndcg@100
388
+ - type: cosine_ndcg@150
389
+ value: 0.5887611959940118
390
+ name: Cosine Ndcg@150
391
+ - type: cosine_ndcg@200
392
+ value: 0.6092219717029682
393
+ name: Cosine Ndcg@200
394
+ - type: cosine_mrr@1
395
+ value: 0.2955665024630542
396
+ name: Cosine Mrr@1
397
+ - type: cosine_mrr@20
398
+ value: 0.5164773875147672
399
+ name: Cosine Mrr@20
400
+ - type: cosine_mrr@50
401
+ value: 0.5167647438366063
402
+ name: Cosine Mrr@50
403
+ - type: cosine_mrr@100
404
+ value: 0.5168213657719442
405
+ name: Cosine Mrr@100
406
+ - type: cosine_mrr@150
407
+ value: 0.5168213657719442
408
+ name: Cosine Mrr@150
409
+ - type: cosine_mrr@200
410
+ value: 0.5168213657719442
411
+ name: Cosine Mrr@200
412
+ - type: cosine_map@1
413
+ value: 0.2955665024630542
414
+ name: Cosine Map@1
415
+ - type: cosine_map@20
416
+ value: 0.398398563122481
417
+ name: Cosine Map@20
418
+ - type: cosine_map@50
419
+ value: 0.36032758502543594
420
+ name: Cosine Map@50
421
+ - type: cosine_map@100
422
+ value: 0.3632259128424842
423
+ name: Cosine Map@100
424
+ - type: cosine_map@150
425
+ value: 0.37822275477623696
426
+ name: Cosine Map@150
427
+ - type: cosine_map@200
428
+ value: 0.3863148456840816
429
+ name: Cosine Map@200
430
+ - type: cosine_map@500
431
+ value: 0.399227009561676
432
+ name: Cosine Map@500
433
+ - task:
434
+ type: information-retrieval
435
+ name: Information Retrieval
436
+ dataset:
437
+ name: full zh
438
+ type: full_zh
439
+ metrics:
440
+ - type: cosine_accuracy@1
441
+ value: 0.6796116504854369
442
+ name: Cosine Accuracy@1
443
+ - type: cosine_accuracy@20
444
+ value: 0.9805825242718447
445
+ name: Cosine Accuracy@20
446
+ - type: cosine_accuracy@50
447
+ value: 0.9902912621359223
448
+ name: Cosine Accuracy@50
449
+ - type: cosine_accuracy@100
450
+ value: 0.9902912621359223
451
+ name: Cosine Accuracy@100
452
+ - type: cosine_accuracy@150
453
+ value: 0.9902912621359223
454
+ name: Cosine Accuracy@150
455
+ - type: cosine_accuracy@200
456
+ value: 0.9902912621359223
457
+ name: Cosine Accuracy@200
458
+ - type: cosine_precision@1
459
+ value: 0.6796116504854369
460
+ name: Cosine Precision@1
461
+ - type: cosine_precision@20
462
+ value: 0.488349514563107
463
+ name: Cosine Precision@20
464
+ - type: cosine_precision@50
465
+ value: 0.29631067961165053
466
+ name: Cosine Precision@50
467
+ - type: cosine_precision@100
468
+ value: 0.17883495145631062
469
+ name: Cosine Precision@100
470
+ - type: cosine_precision@150
471
+ value: 0.12776699029126212
472
+ name: Cosine Precision@150
473
+ - type: cosine_precision@200
474
+ value: 0.09990291262135924
475
+ name: Cosine Precision@200
476
+ - type: cosine_recall@1
477
+ value: 0.06931865009287731
478
+ name: Cosine Recall@1
479
+ - type: cosine_recall@20
480
+ value: 0.5250914458143515
481
+ name: Cosine Recall@20
482
+ - type: cosine_recall@50
483
+ value: 0.7082715439925011
484
+ name: Cosine Recall@50
485
+ - type: cosine_recall@100
486
+ value: 0.8169166539243944
487
+ name: Cosine Recall@100
488
+ - type: cosine_recall@150
489
+ value: 0.8613232254521018
490
+ name: Cosine Recall@150
491
+ - type: cosine_recall@200
492
+ value: 0.8898175710074696
493
+ name: Cosine Recall@200
494
+ - type: cosine_ndcg@1
495
+ value: 0.6796116504854369
496
+ name: Cosine Ndcg@1
497
+ - type: cosine_ndcg@20
498
+ value: 0.6680745295820606
499
+ name: Cosine Ndcg@20
500
+ - type: cosine_ndcg@50
501
+ value: 0.6856578240865067
502
+ name: Cosine Ndcg@50
503
+ - type: cosine_ndcg@100
504
+ value: 0.7378907298421352
505
+ name: Cosine Ndcg@100
506
+ - type: cosine_ndcg@150
507
+ value: 0.7576651805692517
508
+ name: Cosine Ndcg@150
509
+ - type: cosine_ndcg@200
510
+ value: 0.7696718049970358
511
+ name: Cosine Ndcg@200
512
+ - type: cosine_mrr@1
513
+ value: 0.6796116504854369
514
+ name: Cosine Mrr@1
515
+ - type: cosine_mrr@20
516
+ value: 0.8158576051779936
517
+ name: Cosine Mrr@20
518
+ - type: cosine_mrr@50
519
+ value: 0.816279724215562
520
+ name: Cosine Mrr@50
521
+ - type: cosine_mrr@100
522
+ value: 0.816279724215562
523
+ name: Cosine Mrr@100
524
+ - type: cosine_mrr@150
525
+ value: 0.816279724215562
526
+ name: Cosine Mrr@150
527
+ - type: cosine_mrr@200
528
+ value: 0.816279724215562
529
+ name: Cosine Mrr@200
530
+ - type: cosine_map@1
531
+ value: 0.6796116504854369
532
+ name: Cosine Map@1
533
+ - type: cosine_map@20
534
+ value: 0.522177160195635
535
+ name: Cosine Map@20
536
+ - type: cosine_map@50
537
+ value: 0.5082601209392789
538
+ name: Cosine Map@50
539
+ - type: cosine_map@100
540
+ value: 0.5371705298206915
541
+ name: Cosine Map@100
542
+ - type: cosine_map@150
543
+ value: 0.5454012672534121
544
+ name: Cosine Map@150
545
+ - type: cosine_map@200
546
+ value: 0.5494570875591636
547
+ name: Cosine Map@200
548
+ - type: cosine_map@500
549
+ value: 0.5542116087189223
550
+ name: Cosine Map@500
551
+ - task:
552
+ type: information-retrieval
553
+ name: Information Retrieval
554
+ dataset:
555
+ name: mix es
556
+ type: mix_es
557
+ metrics:
558
+ - type: cosine_accuracy@1
559
+ value: 0.7087883515340614
560
+ name: Cosine Accuracy@1
561
+ - type: cosine_accuracy@20
562
+ value: 0.9552782111284451
563
+ name: Cosine Accuracy@20
564
+ - type: cosine_accuracy@50
565
+ value: 0.9802392095683827
566
+ name: Cosine Accuracy@50
567
+ - type: cosine_accuracy@100
568
+ value: 0.9901196047841914
569
+ name: Cosine Accuracy@100
570
+ - type: cosine_accuracy@150
571
+ value: 0.9937597503900156
572
+ name: Cosine Accuracy@150
573
+ - type: cosine_accuracy@200
574
+ value: 0.9958398335933437
575
+ name: Cosine Accuracy@200
576
+ - type: cosine_precision@1
577
+ value: 0.7087883515340614
578
+ name: Cosine Precision@1
579
+ - type: cosine_precision@20
580
+ value: 0.12158086323452937
581
+ name: Cosine Precision@20
582
+ - type: cosine_precision@50
583
+ value: 0.05122204888195529
584
+ name: Cosine Precision@50
585
+ - type: cosine_precision@100
586
+ value: 0.026125845033801356
587
+ name: Cosine Precision@100
588
+ - type: cosine_precision@150
589
+ value: 0.017548968625411682
590
+ name: Cosine Precision@150
591
+ - type: cosine_precision@200
592
+ value: 0.013239729589183572
593
+ name: Cosine Precision@200
594
+ - type: cosine_recall@1
595
+ value: 0.2737959042171211
596
+ name: Cosine Recall@1
597
+ - type: cosine_recall@20
598
+ value: 0.8990032934650719
599
+ name: Cosine Recall@20
600
+ - type: cosine_recall@50
601
+ value: 0.9459438377535101
602
+ name: Cosine Recall@50
603
+ - type: cosine_recall@100
604
+ value: 0.9650979372508233
605
+ name: Cosine Recall@100
606
+ - type: cosine_recall@150
607
+ value: 0.9731582596637198
608
+ name: Cosine Recall@150
609
+ - type: cosine_recall@200
610
+ value: 0.979086496793205
611
+ name: Cosine Recall@200
612
+ - type: cosine_ndcg@1
613
+ value: 0.7087883515340614
614
+ name: Cosine Ndcg@1
615
+ - type: cosine_ndcg@20
616
+ value: 0.7814741332820433
617
+ name: Cosine Ndcg@20
618
+ - type: cosine_ndcg@50
619
+ value: 0.7944033394497885
620
+ name: Cosine Ndcg@50
621
+ - type: cosine_ndcg@100
622
+ value: 0.7986024294603647
623
+ name: Cosine Ndcg@100
624
+ - type: cosine_ndcg@150
625
+ value: 0.8001222520801115
626
+ name: Cosine Ndcg@150
627
+ - type: cosine_ndcg@200
628
+ value: 0.801183843730514
629
+ name: Cosine Ndcg@200
630
+ - type: cosine_mrr@1
631
+ value: 0.7087883515340614
632
+ name: Cosine Mrr@1
633
+ - type: cosine_mrr@20
634
+ value: 0.7804158804359833
635
+ name: Cosine Mrr@20
636
+ - type: cosine_mrr@50
637
+ value: 0.7812547046826683
638
+ name: Cosine Mrr@50
639
+ - type: cosine_mrr@100
640
+ value: 0.7813961782842836
641
+ name: Cosine Mrr@100
642
+ - type: cosine_mrr@150
643
+ value: 0.7814280971923943
644
+ name: Cosine Mrr@150
645
+ - type: cosine_mrr@200
646
+ value: 0.7814392363829243
647
+ name: Cosine Mrr@200
648
+ - type: cosine_map@1
649
+ value: 0.7087883515340614
650
+ name: Cosine Map@1
651
+ - type: cosine_map@20
652
+ value: 0.7070596364024803
653
+ name: Cosine Map@20
654
+ - type: cosine_map@50
655
+ value: 0.7106867578203881
656
+ name: Cosine Map@50
657
+ - type: cosine_map@100
658
+ value: 0.7112928928384499
659
+ name: Cosine Map@100
660
+ - type: cosine_map@150
661
+ value: 0.7114314004578745
662
+ name: Cosine Map@150
663
+ - type: cosine_map@200
664
+ value: 0.711504950521157
665
+ name: Cosine Map@200
666
+ - type: cosine_map@500
667
+ value: 0.7116431478000537
668
+ name: Cosine Map@500
669
+ - task:
670
+ type: information-retrieval
671
+ name: Information Retrieval
672
+ dataset:
673
+ name: mix de
674
+ type: mix_de
675
+ metrics:
676
+ - type: cosine_accuracy@1
677
+ value: 0.6484659386375455
678
+ name: Cosine Accuracy@1
679
+ - type: cosine_accuracy@20
680
+ value: 0.9323972958918356
681
+ name: Cosine Accuracy@20
682
+ - type: cosine_accuracy@50
683
+ value: 0.968278731149246
684
+ name: Cosine Accuracy@50
685
+ - type: cosine_accuracy@100
686
+ value: 0.984919396775871
687
+ name: Cosine Accuracy@100
688
+ - type: cosine_accuracy@150
689
+ value: 0.9885595423816953
690
+ name: Cosine Accuracy@150
691
+ - type: cosine_accuracy@200
692
+ value: 0.9937597503900156
693
+ name: Cosine Accuracy@200
694
+ - type: cosine_precision@1
695
+ value: 0.6484659386375455
696
+ name: Cosine Precision@1
697
+ - type: cosine_precision@20
698
+ value: 0.12093083723348932
699
+ name: Cosine Precision@20
700
+ - type: cosine_precision@50
701
+ value: 0.05140925637025482
702
+ name: Cosine Precision@50
703
+ - type: cosine_precision@100
704
+ value: 0.02647425897035882
705
+ name: Cosine Precision@100
706
+ - type: cosine_precision@150
707
+ value: 0.017892182353960822
708
+ name: Cosine Precision@150
709
+ - type: cosine_precision@200
710
+ value: 0.013530941237649509
711
+ name: Cosine Precision@200
712
+ - type: cosine_recall@1
713
+ value: 0.2435517420696828
714
+ name: Cosine Recall@1
715
+ - type: cosine_recall@20
716
+ value: 0.87873114924597
717
+ name: Cosine Recall@20
718
+ - type: cosine_recall@50
719
+ value: 0.9319899462645173
720
+ name: Cosine Recall@50
721
+ - type: cosine_recall@100
722
+ value: 0.9596117178020455
723
+ name: Cosine Recall@100
724
+ - type: cosine_recall@150
725
+ value: 0.9718322066215982
726
+ name: Cosine Recall@150
727
+ - type: cosine_recall@200
728
+ value: 0.9799791991679667
729
+ name: Cosine Recall@200
730
+ - type: cosine_ndcg@1
731
+ value: 0.6484659386375455
732
+ name: Cosine Ndcg@1
733
+ - type: cosine_ndcg@20
734
+ value: 0.7448150588358
735
+ name: Cosine Ndcg@20
736
+ - type: cosine_ndcg@50
737
+ value: 0.7595232400510039
738
+ name: Cosine Ndcg@50
739
+ - type: cosine_ndcg@100
740
+ value: 0.7656851368194345
741
+ name: Cosine Ndcg@100
742
+ - type: cosine_ndcg@150
743
+ value: 0.7681576326024331
744
+ name: Cosine Ndcg@150
745
+ - type: cosine_ndcg@200
746
+ value: 0.7696474672652458
747
+ name: Cosine Ndcg@200
748
+ - type: cosine_mrr@1
749
+ value: 0.6484659386375455
750
+ name: Cosine Mrr@1
751
+ - type: cosine_mrr@20
752
+ value: 0.7323691045739125
753
+ name: Cosine Mrr@20
754
+ - type: cosine_mrr@50
755
+ value: 0.733538875120878
756
+ name: Cosine Mrr@50
757
+ - type: cosine_mrr@100
758
+ value: 0.733776247038599
759
+ name: Cosine Mrr@100
760
+ - type: cosine_mrr@150
761
+ value: 0.7338087409764548
762
+ name: Cosine Mrr@150
763
+ - type: cosine_mrr@200
764
+ value: 0.7338398642058079
765
+ name: Cosine Mrr@200
766
+ - type: cosine_map@1
767
+ value: 0.6484659386375455
768
+ name: Cosine Map@1
769
+ - type: cosine_map@20
770
+ value: 0.6646138211839377
771
+ name: Cosine Map@20
772
+ - type: cosine_map@50
773
+ value: 0.6683657128313888
774
+ name: Cosine Map@50
775
+ - type: cosine_map@100
776
+ value: 0.6692634410264182
777
+ name: Cosine Map@100
778
+ - type: cosine_map@150
779
+ value: 0.669518875077899
780
+ name: Cosine Map@150
781
+ - type: cosine_map@200
782
+ value: 0.6696171599377958
783
+ name: Cosine Map@200
784
+ - type: cosine_map@500
785
+ value: 0.6697127210085475
786
+ name: Cosine Map@500
787
+ - task:
788
+ type: information-retrieval
789
+ name: Information Retrieval
790
+ dataset:
791
+ name: mix zh
792
+ type: mix_zh
793
+ metrics:
794
+ - type: cosine_accuracy@1
795
+ value: 0.7667014613778705
796
+ name: Cosine Accuracy@1
797
+ - type: cosine_accuracy@20
798
+ value: 0.9843423799582464
799
+ name: Cosine Accuracy@20
800
+ - type: cosine_accuracy@50
801
+ value: 0.9932150313152401
802
+ name: Cosine Accuracy@50
803
+ - type: cosine_accuracy@100
804
+ value: 0.9958246346555324
805
+ name: Cosine Accuracy@100
806
+ - type: cosine_accuracy@150
807
+ value: 0.9973903966597077
808
+ name: Cosine Accuracy@150
809
+ - type: cosine_accuracy@200
810
+ value: 0.9979123173277662
811
+ name: Cosine Accuracy@200
812
+ - type: cosine_precision@1
813
+ value: 0.7667014613778705
814
+ name: Cosine Precision@1
815
+ - type: cosine_precision@20
816
+ value: 0.13870041753653445
817
+ name: Cosine Precision@20
818
+ - type: cosine_precision@50
819
+ value: 0.05810020876826725
820
+ name: Cosine Precision@50
821
+ - type: cosine_precision@100
822
+ value: 0.029598121085595
823
+ name: Cosine Precision@100
824
+ - type: cosine_precision@150
825
+ value: 0.01986778009742519
826
+ name: Cosine Precision@150
827
+ - type: cosine_precision@200
828
+ value: 0.014945198329853866
829
+ name: Cosine Precision@200
830
+ - type: cosine_recall@1
831
+ value: 0.25692041952480366
832
+ name: Cosine Recall@1
833
+ - type: cosine_recall@20
834
+ value: 0.9156576200417536
835
+ name: Cosine Recall@20
836
+ - type: cosine_recall@50
837
+ value: 0.9582637439109255
838
+ name: Cosine Recall@50
839
+ - type: cosine_recall@100
840
+ value: 0.9765483646485734
841
+ name: Cosine Recall@100
842
+ - type: cosine_recall@150
843
+ value: 0.9833768267223383
844
+ name: Cosine Recall@150
845
+ - type: cosine_recall@200
846
+ value: 0.986464857341684
847
+ name: Cosine Recall@200
848
+ - type: cosine_ndcg@1
849
+ value: 0.7667014613778705
850
+ name: Cosine Ndcg@1
851
+ - type: cosine_ndcg@20
852
+ value: 0.8002168358295473
853
+ name: Cosine Ndcg@20
854
+ - type: cosine_ndcg@50
855
+ value: 0.8125113081884888
856
+ name: Cosine Ndcg@50
857
+ - type: cosine_ndcg@100
858
+ value: 0.8167350090334409
859
+ name: Cosine Ndcg@100
860
+ - type: cosine_ndcg@150
861
+ value: 0.8181122471507385
862
+ name: Cosine Ndcg@150
863
+ - type: cosine_ndcg@200
864
+ value: 0.8186874070081017
865
+ name: Cosine Ndcg@200
866
+ - type: cosine_mrr@1
867
+ value: 0.7667014613778705
868
+ name: Cosine Mrr@1
869
+ - type: cosine_mrr@20
870
+ value: 0.8421752732824312
871
+ name: Cosine Mrr@20
872
+ - type: cosine_mrr@50
873
+ value: 0.8424954415974232
874
+ name: Cosine Mrr@50
875
+ - type: cosine_mrr@100
876
+ value: 0.8425358910333786
877
+ name: Cosine Mrr@100
878
+ - type: cosine_mrr@150
879
+ value: 0.8425483391786986
880
+ name: Cosine Mrr@150
881
+ - type: cosine_mrr@200
882
+ value: 0.8425515411459873
883
+ name: Cosine Mrr@200
884
+ - type: cosine_map@1
885
+ value: 0.7667014613778705
886
+ name: Cosine Map@1
887
+ - type: cosine_map@20
888
+ value: 0.7007206423896271
889
+ name: Cosine Map@20
890
+ - type: cosine_map@50
891
+ value: 0.7046277360194696
892
+ name: Cosine Map@50
893
+ - type: cosine_map@100
894
+ value: 0.7053668771050886
895
+ name: Cosine Map@100
896
+ - type: cosine_map@150
897
+ value: 0.7055166914145262
898
+ name: Cosine Map@150
899
+ - type: cosine_map@200
900
+ value: 0.7055658329670217
901
+ name: Cosine Map@200
902
+ - type: cosine_map@500
903
+ value: 0.7056512281794008
904
+ name: Cosine Map@500
905
---

# Job - Job matching Alibaba-NLP/gte-multilingual-base (v2)

The top-performing model on [TalentCLEF 2025](https://talentclef.github.io/talentclef/) Task A. Use it for multilingual job title matching and retrieval.

911
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-multilingual-base](https://huggingface.co/Alibaba-NLP/gte-multilingual-base) <!-- at revision 9fdd4ee8bba0e2808a34e0e739576f6740d2b225 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
    - full_en
    - full_de
    - full_es
    - full_zh
    - mix
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
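Because the stack ends with a `Normalize()` module, every embedding is L2-normalized, so cosine similarity reduces to a plain dot product. A tiny numpy sketch of that identity (toy 2-d vectors stand in for real 768-d embeddings):

```python
import numpy as np

# Toy stand-ins for two embeddings; the identity holds for any nonzero vectors.
a = np.array([3.0, 4.0])
b = np.array([1.0, 2.0])

# What the Normalize() module does: divide each vector by its L2 norm.
a_n = a / np.linalg.norm(a)
b_n = b / np.linalg.norm(b)

# Cosine similarity of the raw vectors...
cosine = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
# ...equals the dot product of the normalized vectors.
dot = float(a_n @ b_n)

print(round(cosine, 6), round(dot, 6))
assert abs(cosine - dot) < 1e-9
```

This is why the model card declares cosine as the similarity function: after normalization, dot-product search backends give identical rankings.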
943

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference:
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("pj-mathematician/JobGTE-multilingual-base-v2")
# Run inference
sentences = [
    'Volksvertreter',
    'Parlamentarier',
    'Oberbürgermeister',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
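For job-title retrieval (as in the TalentCLEF setup) you would embed the corpus once and rank it per query. A minimal sketch of the top-k ranking step; the toy 2-d vectors below are stand-ins for `model.encode(...)` outputs:

```python
import numpy as np

def top_k(query_emb, corpus_embs, k=2):
    """Return (index, score) pairs of the k most cosine-similar corpus rows."""
    q = query_emb / np.linalg.norm(query_emb)
    c = corpus_embs / np.linalg.norm(corpus_embs, axis=1, keepdims=True)
    scores = c @ q
    idx = np.argsort(-scores)[:k]
    return [(int(i), float(scores[i])) for i in idx]

# In practice: query_emb = model.encode("site manager"), corpus_embs = model.encode(corpus)
corpus = ["construction supervisor", "pastry chef", "building site manager"]
corpus_embs = np.array([[0.9, 0.1], [0.0, 1.0], [1.0, 0.05]])  # toy embeddings
query_emb = np.array([1.0, 0.1])

for i, score in top_k(query_emb, corpus_embs, k=2):
    print(f"{score:.3f}  {corpus[i]}")
```

Since the model's embeddings are already normalized, the per-query normalization above is redundant with real outputs but harmless.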
975
+
976
+ <!--
977
+ ### Direct Usage (Transformers)
978
+
979
+ <details><summary>Click to see the direct usage in Transformers</summary>
980
+
981
+ </details>
982
+ -->
983
+
984
+ <!--
985
+ ### Downstream Usage (Sentence Transformers)
986
+
987
+ You can finetune this model on your own dataset.
988
+
989
+ <details><summary>Click to expand</summary>
990
+
991
+ </details>
992
+ -->
993
+
994
+ <!--
995
+ ### Out-of-Scope Use
996
+
997
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
998
+ -->
999
+
1000
## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `full_en`, `full_es`, `full_de`, `full_zh`, `mix_es`, `mix_de` and `mix_zh`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric | full_en | full_es | full_de | full_zh | mix_es | mix_de | mix_zh |
|:---------------------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
| cosine_accuracy@20 | 0.9905 | 1.0 | 0.9754 | 0.9806 | 0.9553 | 0.9324 | 0.9843 |
| cosine_accuracy@50 | 0.9905 | 1.0 | 0.9852 | 0.9903 | 0.9802 | 0.9683 | 0.9932 |
| cosine_accuracy@100 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9901 | 0.9849 | 0.9958 |
| cosine_accuracy@150 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9938 | 0.9886 | 0.9974 |
| cosine_accuracy@200 | 0.9905 | 1.0 | 0.9901 | 0.9903 | 0.9958 | 0.9938 | 0.9979 |
| cosine_precision@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
| cosine_precision@20 | 0.5148 | 0.5759 | 0.5103 | 0.4883 | 0.1216 | 0.1209 | 0.1387 |
| cosine_precision@50 | 0.32 | 0.3923 | 0.3694 | 0.2963 | 0.0512 | 0.0514 | 0.0581 |
| cosine_precision@100 | 0.1905 | 0.2566 | 0.2397 | 0.1788 | 0.0261 | 0.0265 | 0.0296 |
| cosine_precision@150 | 0.1362 | 0.1928 | 0.1808 | 0.1278 | 0.0175 | 0.0179 | 0.0199 |
| cosine_precision@200 | 0.1054 | 0.1528 | 0.1462 | 0.0999 | 0.0132 | 0.0135 | 0.0149 |
| cosine_recall@1 | 0.0685 | 0.0036 | 0.0111 | 0.0693 | 0.2738 | 0.2436 | 0.2569 |
| cosine_recall@20 | 0.5491 | 0.3853 | 0.3208 | 0.5251 | 0.899 | 0.8787 | 0.9157 |
| cosine_recall@50 | 0.7554 | 0.566 | 0.5042 | 0.7083 | 0.9459 | 0.932 | 0.9583 |
| cosine_recall@100 | 0.8503 | 0.6899 | 0.6173 | 0.8169 | 0.9651 | 0.9596 | 0.9765 |
| cosine_recall@150 | 0.8995 | 0.754 | 0.6848 | 0.8613 | 0.9732 | 0.9718 | 0.9834 |
| cosine_recall@200 | 0.9208 | 0.7858 | 0.7253 | 0.8898 | 0.9791 | 0.98 | 0.9865 |
| cosine_ndcg@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
| cosine_ndcg@20 | 0.6952 | 0.6169 | 0.5378 | 0.6681 | 0.7815 | 0.7448 | 0.8002 |
| cosine_ndcg@50 | 0.723 | 0.5914 | 0.5288 | 0.6857 | 0.7944 | 0.7595 | 0.8125 |
| cosine_ndcg@100 | 0.7733 | 0.6235 | 0.5552 | 0.7379 | 0.7986 | 0.7657 | 0.8167 |
| cosine_ndcg@150 | 0.7947 | 0.6557 | 0.5888 | 0.7577 | 0.8001 | 0.7682 | 0.8181 |
| **cosine_ndcg@200** | **0.8039** | **0.6717** | **0.6092** | **0.7697** | **0.8012** | **0.7696** | **0.8187** |
| cosine_mrr@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
| cosine_mrr@20 | 0.8183 | 0.5581 | 0.5165 | 0.8159 | 0.7804 | 0.7324 | 0.8422 |
| cosine_mrr@50 | 0.8183 | 0.5581 | 0.5168 | 0.8163 | 0.7813 | 0.7335 | 0.8425 |
| cosine_mrr@100 | 0.8183 | 0.5581 | 0.5168 | 0.8163 | 0.7814 | 0.7338 | 0.8425 |
| cosine_mrr@150 | 0.8183 | 0.5581 | 0.5168 | 0.8163 | 0.7814 | 0.7338 | 0.8425 |
| cosine_mrr@200 | 0.8183 | 0.5581 | 0.5168 | 0.8163 | 0.7814 | 0.7338 | 0.8426 |
| cosine_map@1 | 0.6667 | 0.1243 | 0.2956 | 0.6796 | 0.7088 | 0.6485 | 0.7667 |
| cosine_map@20 | 0.5566 | 0.4841 | 0.3984 | 0.5222 | 0.7071 | 0.6646 | 0.7007 |
| cosine_map@50 | 0.5534 | 0.4304 | 0.3603 | 0.5083 | 0.7107 | 0.6684 | 0.7046 |
| cosine_map@100 | 0.5852 | 0.4374 | 0.3632 | 0.5372 | 0.7113 | 0.6693 | 0.7054 |
| cosine_map@150 | 0.5943 | 0.4527 | 0.3782 | 0.5454 | 0.7114 | 0.6695 | 0.7055 |
| cosine_map@200 | 0.5976 | 0.4593 | 0.3863 | 0.5495 | 0.7115 | 0.6696 | 0.7056 |
| cosine_map@500 | 0.6016 | 0.472 | 0.3992 | 0.5542 | 0.7116 | 0.6697 | 0.7057 |
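For reference, the accuracy@k, precision@k, recall@k and MRR@k columns follow the standard per-query definitions, averaged over all queries. A small illustrative sketch for a single query (the ranking and relevance judgments below are invented):

```python
def metrics_at_k(ranked_ids, relevant_ids, k):
    """Standard IR metrics for one query given a ranked result list."""
    top = ranked_ids[:k]
    hits = [d for d in top if d in relevant_ids]
    accuracy = 1.0 if hits else 0.0          # any relevant doc in the top k
    precision = len(hits) / k                # fraction of the top k that is relevant
    recall = len(hits) / len(relevant_ids)   # fraction of relevant docs retrieved
    rr = 0.0                                 # reciprocal rank of the first hit
    for rank, d in enumerate(top, start=1):
        if d in relevant_ids:
            rr = 1.0 / rank
            break
    return accuracy, precision, recall, rr

ranked = ["d3", "d7", "d1", "d9", "d4"]      # system ranking for one query
relevant = {"d1", "d4", "d8"}                # gold relevant documents
print(metrics_at_k(ranked, relevant, k=5))
# (1.0, 0.4, 0.6666666666666666, 0.3333333333333333)
```

This also explains why precision@k falls as k grows while recall@k and accuracy@k rise: the denominator of precision is k itself.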
1048
+
1049
+ <!--
1050
+ ## Bias, Risks and Limitations
1051
+
1052
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
1053
+ -->
1054
+
1055
+ <!--
1056
+ ### Recommendations
1057
+
1058
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
1059
+ -->
1060
+
1061
+ ## Training Details
1062
+
1063
+ ### Training Datasets
1064
+ <details><summary>full_en</summary>
1065
+
1066
+ #### full_en
1067
+
1068
+ * Dataset: full_en
1069
+ * Size: 28,880 training samples
1070
+ * Columns: <code>anchor</code> and <code>positive</code>
1071
+ * Approximate statistics based on the first 1000 samples:
1072
+ | | anchor | positive |
1073
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1074
+ | type | string | string |
1075
+ | details | <ul><li>min: 3 tokens</li><li>mean: 5.68 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 5.76 tokens</li><li>max: 12 tokens</li></ul> |
1076
+ * Samples:
1077
+ | anchor | positive |
1078
+ |:-----------------------------------------|:-----------------------------------------|
1079
+ | <code>air commodore</code> | <code>flight lieutenant</code> |
1080
+ | <code>command and control officer</code> | <code>flight officer</code> |
1081
+ | <code>air commodore</code> | <code>command and control officer</code> |
1082
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1083
+ ```json
1084
+ {'guide': SentenceTransformer(
1085
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1086
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1087
+ (2): Normalize()
1088
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1089
+ ```
1090
+ </details>
1091
+ <details><summary>full_de</summary>
1092
+
1093
+ #### full_de
1094
+
1095
+ * Dataset: full_de
1096
+ * Size: 23,023 training samples
1097
+ * Columns: <code>anchor</code> and <code>positive</code>
1098
+ * Approximate statistics based on the first 1000 samples:
1099
+ | | anchor | positive |
1100
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1101
+ | type | string | string |
1102
+ | details | <ul><li>min: 3 tokens</li><li>mean: 7.99 tokens</li><li>max: 30 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.19 tokens</li><li>max: 30 tokens</li></ul> |
1103
+ * Samples:
1104
+ | anchor | positive |
1105
+ |:----------------------------------|:-----------------------------------------------------|
1106
+ | <code>Staffelkommandantin</code> | <code>Kommodore</code> |
1107
+ | <code>Luftwaffenoffizierin</code> | <code>Luftwaffenoffizier/Luftwaffenoffizierin</code> |
1108
+ | <code>Staffelkommandantin</code> | <code>Luftwaffenoffizierin</code> |
1109
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1110
+ ```json
1111
+ {'guide': SentenceTransformer(
1112
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1113
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1114
+ (2): Normalize()
1115
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1116
+ ```
1117
+ </details>
1118
+ <details><summary>full_es</summary>
1119
+
1120
+ #### full_es
1121
+
1122
+ * Dataset: full_es
1123
+ * Size: 20,724 training samples
1124
+ * Columns: <code>anchor</code> and <code>positive</code>
1125
+ * Approximate statistics based on the first 1000 samples:
1126
+ | | anchor | positive |
1127
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1128
+ | type | string | string |
1129
+ | details | <ul><li>min: 3 tokens</li><li>mean: 9.13 tokens</li><li>max: 32 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 8.84 tokens</li><li>max: 32 tokens</li></ul> |
1130
+ * Samples:
1131
+ | anchor | positive |
1132
+ |:------------------------------------|:-------------------------------------------|
1133
+ | <code>jefe de escuadrón</code> | <code>instructor</code> |
1134
+ | <code>comandante de aeronave</code> | <code>instructor de simulador</code> |
1135
+ | <code>instructor</code> | <code>oficial del Ejército del Aire</code> |
1136
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1137
+ ```json
1138
+ {'guide': SentenceTransformer(
1139
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1140
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1141
+ (2): Normalize()
1142
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1143
+ ```
1144
+ </details>
1145
+ <details><summary>full_zh</summary>
1146
+
1147
+ #### full_zh
1148
+
1149
+ * Dataset: full_zh
1150
+ * Size: 30,401 training samples
1151
+ * Columns: <code>anchor</code> and <code>positive</code>
1152
+ * Approximate statistics based on the first 1000 samples:
1153
+ | | anchor | positive |
1154
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1155
+ | type | string | string |
1156
+ | details | <ul><li>min: 5 tokens</li><li>mean: 7.15 tokens</li><li>max: 14 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 7.46 tokens</li><li>max: 21 tokens</li></ul> |
1157
+ * Samples:
1158
+ | anchor | positive |
1159
+ |:------------------|:---------------------|
1160
+ | <code>技术总监</code> | <code>技术和运营总监</code> |
1161
+ | <code>技术总监</code> | <code>技术主管</code> |
1162
+ | <code>技术总监</code> | <code>技术艺术总监</code> |
1163
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1164
+ ```json
1165
+ {'guide': SentenceTransformer(
1166
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1167
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1168
+ (2): Normalize()
1169
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1170
+ ```
1171
+ </details>
1172
+ <details><summary>mix</summary>
1173
+
1174
+ #### mix
1175
+
1176
+ * Dataset: mix
1177
+ * Size: 21,760 training samples
1178
+ * Columns: <code>anchor</code> and <code>positive</code>
1179
+ * Approximate statistics based on the first 1000 samples:
1180
+ | | anchor | positive |
1181
+ |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
1182
+ | type | string | string |
1183
+ | details | <ul><li>min: 2 tokens</li><li>mean: 6.71 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 7.69 tokens</li><li>max: 19 tokens</li></ul> |
1184
+ * Samples:
1185
+ | anchor | positive |
1186
+ |:------------------------------------------|:----------------------------------------------------------------|
1187
+ | <code>technical manager</code> | <code>Technischer Direktor für Bühne, Film und Fernsehen</code> |
1188
+ | <code>head of technical</code> | <code>directora técnica</code> |
1189
+ | <code>head of technical department</code> | <code>技术艺术总监</code> |
1190
+ * Loss: [<code>GISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#gistembedloss) with these parameters:
1191
+ ```json
1192
+ {'guide': SentenceTransformer(
1193
+ (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
1194
+ (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
1195
+ (2): Normalize()
1196
+ ), 'temperature': 0.01, 'margin_strategy': 'absolute', 'margin': 0.0}
1197
+ ```
1198
+ </details>
1199
+
1200
### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `gradient_accumulation_steps`: 2
- `num_train_epochs`: 5
- `warmup_ratio`: 0.05
- `log_on_each_node`: False
- `fp16`: True
- `dataloader_num_workers`: 4
- `ddp_find_unused_parameters`: True
- `batch_sampler`: no_duplicates
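Two derived quantities are worth noting. With gradient accumulation, the effective batch size is `per_device_train_batch_size × gradient_accumulation_steps × n_gpus`, and the linear warmup spans `warmup_ratio × total optimizer steps`. A quick sanity-check sketch; the GPU count and step total are assumptions inferred from the training logs (~121 optimizer steps per epoch), not values stored in this card:

```python
per_device_batch = 128   # per_device_train_batch_size
grad_accum = 2           # gradient_accumulation_steps
n_gpus = 4               # assumption consistent with ~121 logged steps/epoch
warmup_ratio = 0.05
total_steps = 605        # assumption: ~121 optimizer steps/epoch * 5 epochs

effective_batch = per_device_batch * grad_accum * n_gpus
warmup_steps = int(warmup_ratio * total_steps)
print(effective_batch, warmup_steps)
# 1024 30
```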
1214
+
1215
+ #### All Hyperparameters
1216
+ <details><summary>Click to expand</summary>
1217
+
1218
+ - `overwrite_output_dir`: False
1219
+ - `do_predict`: False
1220
+ - `eval_strategy`: steps
1221
+ - `prediction_loss_only`: True
1222
+ - `per_device_train_batch_size`: 128
1223
+ - `per_device_eval_batch_size`: 128
1224
+ - `per_gpu_train_batch_size`: None
1225
+ - `per_gpu_eval_batch_size`: None
1226
+ - `gradient_accumulation_steps`: 2
1227
+ - `eval_accumulation_steps`: None
1228
+ - `torch_empty_cache_steps`: None
1229
+ - `learning_rate`: 5e-05
1230
+ - `weight_decay`: 0.0
1231
+ - `adam_beta1`: 0.9
1232
+ - `adam_beta2`: 0.999
1233
+ - `adam_epsilon`: 1e-08
1234
+ - `max_grad_norm`: 1.0
1235
+ - `num_train_epochs`: 5
1236
+ - `max_steps`: -1
1237
+ - `lr_scheduler_type`: linear
1238
+ - `lr_scheduler_kwargs`: {}
1239
+ - `warmup_ratio`: 0.05
1240
+ - `warmup_steps`: 0
1241
+ - `log_level`: passive
1242
+ - `log_level_replica`: warning
1243
+ - `log_on_each_node`: False
1244
+ - `logging_nan_inf_filter`: True
1245
+ - `save_safetensors`: True
1246
+ - `save_on_each_node`: False
1247
+ - `save_only_model`: False
1248
+ - `restore_callback_states_from_checkpoint`: False
1249
+ - `no_cuda`: False
1250
+ - `use_cpu`: False
1251
+ - `use_mps_device`: False
1252
+ - `seed`: 42
1253
+ - `data_seed`: None
1254
+ - `jit_mode_eval`: False
1255
+ - `use_ipex`: False
1256
+ - `bf16`: False
1257
+ - `fp16`: True
1258
+ - `fp16_opt_level`: O1
1259
+ - `half_precision_backend`: auto
1260
+ - `bf16_full_eval`: False
1261
+ - `fp16_full_eval`: False
1262
+ - `tf32`: None
1263
+ - `local_rank`: 0
1264
+ - `ddp_backend`: None
1265
+ - `tpu_num_cores`: None
1266
+ - `tpu_metrics_debug`: False
1267
+ - `debug`: []
1268
+ - `dataloader_drop_last`: True
1269
+ - `dataloader_num_workers`: 4
1270
+ - `dataloader_prefetch_factor`: None
1271
+ - `past_index`: -1
1272
+ - `disable_tqdm`: False
1273
+ - `remove_unused_columns`: True
1274
+ - `label_names`: None
1275
+ - `load_best_model_at_end`: False
1276
+ - `ignore_data_skip`: False
1277
+ - `fsdp`: []
1278
+ - `fsdp_min_num_params`: 0
1279
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
1280
+ - `tp_size`: 0
1281
+ - `fsdp_transformer_layer_cls_to_wrap`: None
1282
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
1283
+ - `deepspeed`: None
1284
+ - `label_smoothing_factor`: 0.0
1285
+ - `optim`: adamw_torch
1286
+ - `optim_args`: None
1287
+ - `adafactor`: False
1288
+ - `group_by_length`: False
1289
+ - `length_column_name`: length
1290
+ - `ddp_find_unused_parameters`: True
1291
+ - `ddp_bucket_cap_mb`: None
1292
+ - `ddp_broadcast_buffers`: False
1293
+ - `dataloader_pin_memory`: True
1294
+ - `dataloader_persistent_workers`: False
1295
+ - `skip_memory_metrics`: True
1296
+ - `use_legacy_prediction_loop`: False
1297
+ - `push_to_hub`: False
1298
+ - `resume_from_checkpoint`: None
1299
+ - `hub_model_id`: None
1300
+ - `hub_strategy`: every_save
1301
+ - `hub_private_repo`: None
1302
+ - `hub_always_push`: False
1303
+ - `gradient_checkpointing`: False
1304
+ - `gradient_checkpointing_kwargs`: None
1305
+ - `include_inputs_for_metrics`: False
1306
+ - `include_for_metrics`: []
1307
+ - `eval_do_concat_batches`: True
1308
+ - `fp16_backend`: auto
1309
+ - `push_to_hub_model_id`: None
1310
+ - `push_to_hub_organization`: None
1311
+ - `mp_parameters`:
1312
+ - `auto_find_batch_size`: False
1313
+ - `full_determinism`: False
1314
+ - `torchdynamo`: None
1315
+ - `ray_scope`: last
1316
+ - `ddp_timeout`: 1800
1317
+ - `torch_compile`: False
1318
+ - `torch_compile_backend`: None
1319
+ - `torch_compile_mode`: None
1320
+ - `include_tokens_per_second`: False
1321
+ - `include_num_input_tokens_seen`: False
1322
+ - `neftune_noise_alpha`: None
1323
+ - `optim_target_modules`: None
1324
+ - `batch_eval_metrics`: False
1325
+ - `eval_on_start`: False
1326
+ - `use_liger_kernel`: False
1327
+ - `eval_use_gather_object`: False
1328
+ - `average_tokens_across_devices`: False
1329
+ - `prompts`: None
1330
+ - `batch_sampler`: no_duplicates
1331
+ - `multi_dataset_batch_sampler`: proportional
1332
+
1333
+ </details>
1334
+
1335
### Training Logs
| Epoch | Step | Training Loss | full_en_cosine_ndcg@200 | full_es_cosine_ndcg@200 | full_de_cosine_ndcg@200 | full_zh_cosine_ndcg@200 | mix_es_cosine_ndcg@200 | mix_de_cosine_ndcg@200 | mix_zh_cosine_ndcg@200 |
|:------:|:----:|:-------------:|:-----------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|
| -1 | -1 | - | 0.7447 | 0.6125 | 0.5378 | 0.7240 | 0.7029 | 0.6345 | 0.7437 |
| 0.0082 | 1 | 4.3088 | - | - | - | - | - | - | - |
| 0.8230 | 100 | 1.9026 | - | - | - | - | - | - | - |
| 1.6502 | 200 | 0.9336 | 0.8024 | 0.6703 | 0.6109 | 0.7695 | 0.7914 | 0.7594 | 0.8136 |
| 2.4774 | 300 | 0.161 | - | - | - | - | - | - | - |
| 3.3045 | 400 | 0.1398 | 0.8039 | 0.6717 | 0.6092 | 0.7697 | 0.8012 | 0.7696 | 0.8187 |


### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 4.1.0
- Transformers: 4.51.2
- PyTorch: 2.6.0+cu124
- Accelerate: 1.6.0
- Datasets: 3.5.0
- Tokenizers: 0.21.1
1354
+
1355
## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```
1383

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
checkpoint-400/config.json ADDED
@@ -0,0 +1,49 @@
1
+ {
2
+ "architectures": [
3
+ "NewModel"
4
+ ],
5
+ "attention_probs_dropout_prob": 0.0,
6
+ "auto_map": {
7
+ "AutoConfig": "Alibaba-NLP/new-impl--configuration.NewConfig",
8
+ "AutoModel": "Alibaba-NLP/new-impl--modeling.NewModel",
9
+ "AutoModelForMaskedLM": "Alibaba-NLP/new-impl--modeling.NewForMaskedLM",
10
+ "AutoModelForMultipleChoice": "Alibaba-NLP/new-impl--modeling.NewForMultipleChoice",
11
+ "AutoModelForQuestionAnswering": "Alibaba-NLP/new-impl--modeling.NewForQuestionAnswering",
12
+ "AutoModelForSequenceClassification": "Alibaba-NLP/new-impl--modeling.NewForSequenceClassification",
13
+ "AutoModelForTokenClassification": "Alibaba-NLP/new-impl--modeling.NewForTokenClassification"
14
+ },
15
+ "classifier_dropout": 0.0,
16
+ "hidden_act": "gelu",
17
+ "hidden_dropout_prob": 0.1,
18
+ "hidden_size": 768,
19
+ "id2label": {
20
+ "0": "LABEL_0"
21
+ },
22
+ "initializer_range": 0.02,
23
+ "intermediate_size": 3072,
24
+ "label2id": {
25
+ "LABEL_0": 0
26
+ },
27
+ "layer_norm_eps": 1e-12,
28
+ "layer_norm_type": "layer_norm",
29
+ "logn_attention_clip1": false,
30
+ "logn_attention_scale": false,
31
+ "max_position_embeddings": 8192,
32
+ "model_type": "new",
33
+ "num_attention_heads": 12,
34
+ "num_hidden_layers": 12,
35
+ "pack_qkv": true,
36
+ "pad_token_id": 1,
37
+ "position_embedding_type": "rope",
38
+ "rope_scaling": {
39
+ "factor": 8.0,
40
+ "type": "ntk"
41
+ },
42
+ "rope_theta": 20000,
43
+ "torch_dtype": "float32",
44
+ "transformers_version": "4.51.2",
45
+ "type_vocab_size": 1,
46
+ "unpad_inputs": false,
47
+ "use_memory_efficient_attention": false,
48
+ "vocab_size": 250048
49
+ }
checkpoint-400/config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "4.1.0",
4
+ "transformers": "4.51.2",
5
+ "pytorch": "2.6.0+cu124"
6
+ },
7
+ "prompts": {},
8
+ "default_prompt_name": null,
9
+ "similarity_fn_name": "cosine"
10
+ }
checkpoint-400/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:84cab8228685eca48b6209d6aae0921b2fb6c6bdeabab0c5cf7564116101b8dd
3
+ size 1221487872
checkpoint-400/modules.json ADDED
@@ -0,0 +1,20 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ },
14
+ {
15
+ "idx": 2,
16
+ "name": "2",
17
+ "path": "2_Normalize",
18
+ "type": "sentence_transformers.models.Normalize"
19
+ }
20
+ ]
checkpoint-400/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:b5fc1a48a09160159b9530c6fbda09eecc540c5b5596d0ea5f59bd2226de9941
3
+ size 2443060986
checkpoint-400/rng_state_0.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0344e55875506625799735a316cda354931058883283953c51ab59ea9e0f9513
3
+ size 15984
checkpoint-400/scaler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:002b6099b44b3d44ff246afe69f1991b8acc7b6b5f748a83f7331047e05a0a74
3
+ size 988
checkpoint-400/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:4d9c9d0b49c5f9c5e2d588479db1eb5e584a17c1703878b85a9f944eeeba9372
3
+ size 1064
checkpoint-400/sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "max_seq_length": 512,
+ "do_lower_case": false
+ }
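One detail worth noting about `sentence_bert_config.json`: its `max_seq_length` of 512 is what the Sentence Transformers `Transformer` module truncates inputs to, even though the tokenizer shipped in this checkpoint advertises a much larger `model_max_length` of 8192 in `tokenizer_config.json`. The effective limit is the smaller of the two, a relationship that can be sketched as:

```python
# Hedged sketch: values copied from the two config files in this checkpoint.
sentence_bert_max_seq_length = 512  # sentence_bert_config.json -> "max_seq_length"
tokenizer_model_max_length = 8192   # tokenizer_config.json -> "model_max_length"

# SentenceTransformer truncates to the wrapper's max_seq_length, so the
# effective input length is the minimum of the two limits.
effective_limit = min(sentence_bert_max_seq_length, tokenizer_model_max_length)
print(effective_limit)  # 512
```

Longer inputs are silently truncated at encode time unless `max_seq_length` is raised on the loaded model.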
checkpoint-400/special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+ "bos_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "cls_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "mask_token": {
+ "content": "<mask>",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "sep_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
checkpoint-400/tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:883b037111086fd4dfebbbc9b7cee11e1517b5e0c0514879478661440f137085
+ size 17082987
checkpoint-400/tokenizer_config.json ADDED
@@ -0,0 +1,55 @@
+ {
+ "added_tokens_decoder": {
+ "0": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "1": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "2": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "3": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "250001": {
+ "content": "<mask>",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<s>",
+ "clean_up_tokenization_spaces": true,
+ "cls_token": "<s>",
+ "eos_token": "</s>",
+ "extra_special_tokens": {},
+ "mask_token": "<mask>",
+ "model_max_length": 8192,
+ "pad_token": "<pad>",
+ "sep_token": "</s>",
+ "tokenizer_class": "XLMRobertaTokenizerFast",
+ "unk_token": "<unk>"
+ }
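The `added_tokens_decoder` map above pins the special tokens of this `XLMRobertaTokenizerFast` to fixed ids: `<s>`, `<pad>`, `</s>`, `<unk>` occupy ids 0-3, while `<mask>` sits at 250001, near the top of the 250048-entry vocabulary from `config.json`. A small sketch of inverting that map (the literal below keeps only the id-to-content pairs from the config, dropping the per-token flags):

```python
import json

# Simplified mirror of "added_tokens_decoder" from tokenizer_config.json:
# only id -> token content, without the lstrip/normalized/... flags.
ADDED_TOKENS = json.loads("""
{"0": "<s>", "1": "<pad>", "2": "</s>", "3": "<unk>", "250001": "<mask>"}
""")

# Invert to look up a token id by its string, as the tokenizer does internally.
token_to_id = {tok: int(idx) for idx, tok in ADDED_TOKENS.items()}
print(token_to_id["<mask>"])  # 250001
```

This placement of `<mask>` as the last added id is the usual XLM-RoBERTa convention, which the fine-tuned checkpoint inherits unchanged.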
checkpoint-400/trainer_state.json ADDED
@@ -0,0 +1,603 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "best_global_step": null,
3
+ "best_metric": null,
4
+ "best_model_checkpoint": null,
5
+ "epoch": 3.3045267489711936,
6
+ "eval_steps": 200,
7
+ "global_step": 400,
8
+ "is_hyper_param_search": false,
9
+ "is_local_process_zero": true,
10
+ "is_world_process_zero": true,
11
+ "log_history": [
12
+ {
13
+ "epoch": 0.00823045267489712,
14
+ "grad_norm": Infinity,
15
+ "learning_rate": 0.0,
16
+ "loss": 4.3088,
17
+ "step": 1
18
+ },
19
+ {
20
+ "epoch": 0.823045267489712,
21
+ "grad_norm": 3.1721928119659424,
22
+ "learning_rate": 4.4163763066202094e-05,
23
+ "loss": 1.9026,
24
+ "step": 100
25
+ },
26
+ {
27
+ "epoch": 1.6502057613168724,
28
+ "grad_norm": 2.9604015350341797,
29
+ "learning_rate": 3.5452961672473864e-05,
30
+ "loss": 0.9336,
31
+ "step": 200
32
+ },
33
+ {
34
+ "epoch": 1.6502057613168724,
35
+ "eval_full_de_cosine_accuracy@1": 0.2955665024630542,
36
+ "eval_full_de_cosine_accuracy@100": 0.9901477832512315,
37
+ "eval_full_de_cosine_accuracy@150": 0.9901477832512315,
38
+ "eval_full_de_cosine_accuracy@20": 0.9852216748768473,
39
+ "eval_full_de_cosine_accuracy@200": 0.9901477832512315,
40
+ "eval_full_de_cosine_accuracy@50": 0.9852216748768473,
41
+ "eval_full_de_cosine_map@1": 0.2955665024630542,
42
+ "eval_full_de_cosine_map@100": 0.3618162919366333,
43
+ "eval_full_de_cosine_map@150": 0.37673093284239206,
44
+ "eval_full_de_cosine_map@20": 0.3952556219642319,
45
+ "eval_full_de_cosine_map@200": 0.3850375691141728,
46
+ "eval_full_de_cosine_map@50": 0.3565895386599598,
47
+ "eval_full_de_cosine_map@500": 0.3976475131909832,
48
+ "eval_full_de_cosine_mrr@1": 0.2955665024630542,
49
+ "eval_full_de_cosine_mrr@100": 0.5141580130059386,
50
+ "eval_full_de_cosine_mrr@150": 0.5141580130059386,
51
+ "eval_full_de_cosine_mrr@20": 0.5140785596450613,
52
+ "eval_full_de_cosine_mrr@200": 0.5141580130059386,
53
+ "eval_full_de_cosine_mrr@50": 0.5140785596450613,
54
+ "eval_full_de_cosine_ndcg@1": 0.2955665024630542,
55
+ "eval_full_de_cosine_ndcg@100": 0.5568512483442096,
56
+ "eval_full_de_cosine_ndcg@150": 0.5891923427833955,
57
+ "eval_full_de_cosine_ndcg@20": 0.5342874353496432,
58
+ "eval_full_de_cosine_ndcg@200": 0.6108910915140433,
59
+ "eval_full_de_cosine_ndcg@50": 0.5251712513704461,
60
+ "eval_full_de_cosine_precision@1": 0.2955665024630542,
61
+ "eval_full_de_cosine_precision@100": 0.24177339901477832,
62
+ "eval_full_de_cosine_precision@150": 0.18187192118226603,
63
+ "eval_full_de_cosine_precision@20": 0.5073891625615764,
64
+ "eval_full_de_cosine_precision@200": 0.1470689655172414,
65
+ "eval_full_de_cosine_precision@50": 0.3681773399014779,
66
+ "eval_full_de_cosine_recall@1": 0.01108543831680986,
67
+ "eval_full_de_cosine_recall@100": 0.6244233924508789,
68
+ "eval_full_de_cosine_recall@150": 0.687486468465792,
69
+ "eval_full_de_cosine_recall@20": 0.31887986522832873,
70
+ "eval_full_de_cosine_recall@200": 0.7334854348170513,
71
+ "eval_full_de_cosine_recall@50": 0.5004342335550164,
72
+ "eval_full_en_cosine_accuracy@1": 0.6571428571428571,
73
+ "eval_full_en_cosine_accuracy@100": 0.9904761904761905,
74
+ "eval_full_en_cosine_accuracy@150": 0.9904761904761905,
75
+ "eval_full_en_cosine_accuracy@20": 0.9904761904761905,
76
+ "eval_full_en_cosine_accuracy@200": 0.9904761904761905,
77
+ "eval_full_en_cosine_accuracy@50": 0.9904761904761905,
78
+ "eval_full_en_cosine_map@1": 0.6571428571428571,
79
+ "eval_full_en_cosine_map@100": 0.5799091076338031,
80
+ "eval_full_en_cosine_map@150": 0.5895042547793764,
81
+ "eval_full_en_cosine_map@20": 0.5516314386214587,
82
+ "eval_full_en_cosine_map@200": 0.5930550248640567,
83
+ "eval_full_en_cosine_map@50": 0.5474217433291914,
84
+ "eval_full_en_cosine_map@500": 0.5967311945998978,
85
+ "eval_full_en_cosine_mrr@1": 0.6571428571428571,
86
+ "eval_full_en_cosine_mrr@100": 0.8111111111111111,
87
+ "eval_full_en_cosine_mrr@150": 0.8111111111111111,
88
+ "eval_full_en_cosine_mrr@20": 0.8111111111111111,
89
+ "eval_full_en_cosine_mrr@200": 0.8111111111111111,
90
+ "eval_full_en_cosine_mrr@50": 0.8111111111111111,
91
+ "eval_full_en_cosine_ndcg@1": 0.6571428571428571,
92
+ "eval_full_en_cosine_ndcg@100": 0.7690946845916871,
93
+ "eval_full_en_cosine_ndcg@150": 0.7923061459636489,
94
+ "eval_full_en_cosine_ndcg@20": 0.6923506957704934,
95
+ "eval_full_en_cosine_ndcg@200": 0.8023952171736648,
96
+ "eval_full_en_cosine_ndcg@50": 0.7170311913169547,
97
+ "eval_full_en_cosine_precision@1": 0.6571428571428571,
98
+ "eval_full_en_cosine_precision@100": 0.18971428571428572,
99
+ "eval_full_en_cosine_precision@150": 0.13619047619047617,
100
+ "eval_full_en_cosine_precision@20": 0.5133333333333332,
101
+ "eval_full_en_cosine_precision@200": 0.10561904761904761,
102
+ "eval_full_en_cosine_precision@50": 0.3169523809523809,
103
+ "eval_full_en_cosine_recall@1": 0.06695957251887064,
104
+ "eval_full_en_cosine_recall@100": 0.8467073011936345,
105
+ "eval_full_en_cosine_recall@150": 0.9010846211520122,
106
+ "eval_full_en_cosine_recall@20": 0.5478306503729546,
107
+ "eval_full_en_cosine_recall@200": 0.9256595392715059,
108
+ "eval_full_en_cosine_recall@50": 0.7470276357469449,
109
+ "eval_full_es_cosine_accuracy@1": 0.11891891891891893,
110
+ "eval_full_es_cosine_accuracy@100": 1.0,
111
+ "eval_full_es_cosine_accuracy@150": 1.0,
112
+ "eval_full_es_cosine_accuracy@20": 1.0,
113
+ "eval_full_es_cosine_accuracy@200": 1.0,
114
+ "eval_full_es_cosine_accuracy@50": 1.0,
115
+ "eval_full_es_cosine_map@1": 0.11891891891891893,
116
+ "eval_full_es_cosine_map@100": 0.43522297182400527,
117
+ "eval_full_es_cosine_map@150": 0.4511056582755023,
118
+ "eval_full_es_cosine_map@20": 0.4839539531842883,
119
+ "eval_full_es_cosine_map@200": 0.45802493743471273,
120
+ "eval_full_es_cosine_map@50": 0.4288206349412292,
121
+ "eval_full_es_cosine_map@500": 0.47075604946048677,
122
+ "eval_full_es_cosine_mrr@1": 0.11891891891891893,
123
+ "eval_full_es_cosine_mrr@100": 0.5554054054054054,
124
+ "eval_full_es_cosine_mrr@150": 0.5554054054054054,
125
+ "eval_full_es_cosine_mrr@20": 0.5554054054054054,
126
+ "eval_full_es_cosine_mrr@200": 0.5554054054054054,
127
+ "eval_full_es_cosine_mrr@50": 0.5554054054054054,
128
+ "eval_full_es_cosine_ndcg@1": 0.11891891891891893,
129
+ "eval_full_es_cosine_ndcg@100": 0.6196114606257926,
130
+ "eval_full_es_cosine_ndcg@150": 0.6530674955405338,
131
+ "eval_full_es_cosine_ndcg@20": 0.6158554243812342,
132
+ "eval_full_es_cosine_ndcg@200": 0.670287400819268,
133
+ "eval_full_es_cosine_ndcg@50": 0.5886857089260162,
134
+ "eval_full_es_cosine_precision@1": 0.11891891891891893,
135
+ "eval_full_es_cosine_precision@100": 0.2541621621621622,
136
+ "eval_full_es_cosine_precision@150": 0.19225225225225226,
137
+ "eval_full_es_cosine_precision@20": 0.5767567567567567,
138
+ "eval_full_es_cosine_precision@200": 0.15264864864864866,
139
+ "eval_full_es_cosine_precision@50": 0.3907027027027027,
140
+ "eval_full_es_cosine_recall@1": 0.0035436931012884127,
141
+ "eval_full_es_cosine_recall@100": 0.6836436316189977,
142
+ "eval_full_es_cosine_recall@150": 0.7496865406970199,
143
+ "eval_full_es_cosine_recall@20": 0.3862419782331355,
144
+ "eval_full_es_cosine_recall@200": 0.7852629043380305,
145
+ "eval_full_es_cosine_recall@50": 0.5625768407738393,
146
+ "eval_full_zh_cosine_accuracy@1": 0.6601941747572816,
147
+ "eval_full_zh_cosine_accuracy@100": 0.9902912621359223,
148
+ "eval_full_zh_cosine_accuracy@150": 0.9902912621359223,
149
+ "eval_full_zh_cosine_accuracy@20": 0.9902912621359223,
150
+ "eval_full_zh_cosine_accuracy@200": 0.9902912621359223,
151
+ "eval_full_zh_cosine_accuracy@50": 0.9902912621359223,
152
+ "eval_full_zh_cosine_map@1": 0.6601941747572816,
153
+ "eval_full_zh_cosine_map@100": 0.5346277197767966,
154
+ "eval_full_zh_cosine_map@150": 0.5441006347287816,
155
+ "eval_full_zh_cosine_map@20": 0.5176817014415404,
156
+ "eval_full_zh_cosine_map@200": 0.547804939644668,
157
+ "eval_full_zh_cosine_map@50": 0.5050961591489588,
158
+ "eval_full_zh_cosine_map@500": 0.5524877228701637,
159
+ "eval_full_zh_cosine_mrr@1": 0.6601941747572816,
160
+ "eval_full_zh_cosine_mrr@100": 0.8068423485899215,
161
+ "eval_full_zh_cosine_mrr@150": 0.8068423485899215,
162
+ "eval_full_zh_cosine_mrr@20": 0.8068423485899215,
163
+ "eval_full_zh_cosine_mrr@200": 0.8068423485899215,
164
+ "eval_full_zh_cosine_mrr@50": 0.8068423485899215,
165
+ "eval_full_zh_cosine_ndcg@1": 0.6601941747572816,
166
+ "eval_full_zh_cosine_ndcg@100": 0.7344118850318737,
167
+ "eval_full_zh_cosine_ndcg@150": 0.7580048379992059,
168
+ "eval_full_zh_cosine_ndcg@20": 0.6629898844211244,
169
+ "eval_full_zh_cosine_ndcg@200": 0.769464510105362,
170
+ "eval_full_zh_cosine_ndcg@50": 0.682216395408567,
171
+ "eval_full_zh_cosine_precision@1": 0.6601941747572816,
172
+ "eval_full_zh_cosine_precision@100": 0.17902912621359218,
173
+ "eval_full_zh_cosine_precision@150": 0.12912621359223303,
174
+ "eval_full_zh_cosine_precision@20": 0.4868932038834952,
175
+ "eval_full_zh_cosine_precision@200": 0.10063106796116503,
176
+ "eval_full_zh_cosine_precision@50": 0.2959223300970874,
177
+ "eval_full_zh_cosine_recall@1": 0.06669332811942774,
178
+ "eval_full_zh_cosine_recall@100": 0.813864097315397,
179
+ "eval_full_zh_cosine_recall@150": 0.8683619147921042,
180
+ "eval_full_zh_cosine_recall@20": 0.52040897323663,
181
+ "eval_full_zh_cosine_recall@200": 0.8964210248615742,
182
+ "eval_full_zh_cosine_recall@50": 0.7067236634036261,
183
+ "eval_mix_de_cosine_accuracy@1": 0.642225689027561,
184
+ "eval_mix_de_cosine_accuracy@100": 0.9771190847633905,
185
+ "eval_mix_de_cosine_accuracy@150": 0.984919396775871,
186
+ "eval_mix_de_cosine_accuracy@20": 0.9240769630785232,
187
+ "eval_mix_de_cosine_accuracy@200": 0.9901196047841914,
188
+ "eval_mix_de_cosine_accuracy@50": 0.9635985439417577,
189
+ "eval_mix_de_cosine_map@1": 0.642225689027561,
190
+ "eval_mix_de_cosine_map@100": 0.6570111325791598,
191
+ "eval_mix_de_cosine_map@150": 0.6572712744212402,
192
+ "eval_mix_de_cosine_map@20": 0.6521189972338849,
193
+ "eval_mix_de_cosine_map@200": 0.6574012324541948,
194
+ "eval_mix_de_cosine_map@50": 0.6561813596290409,
195
+ "eval_mix_de_cosine_map@500": 0.6575399010277455,
196
+ "eval_mix_de_cosine_mrr@1": 0.642225689027561,
197
+ "eval_mix_de_cosine_mrr@100": 0.7262168772880452,
198
+ "eval_mix_de_cosine_mrr@150": 0.7262822017289415,
199
+ "eval_mix_de_cosine_mrr@20": 0.7246816496840639,
200
+ "eval_mix_de_cosine_mrr@200": 0.7263128860080087,
201
+ "eval_mix_de_cosine_mrr@50": 0.7260235454700952,
202
+ "eval_mix_de_cosine_ndcg@1": 0.642225689027561,
203
+ "eval_mix_de_cosine_ndcg@100": 0.7547612967303503,
204
+ "eval_mix_de_cosine_ndcg@150": 0.7575184392863841,
205
+ "eval_mix_de_cosine_ndcg@20": 0.7332013199323174,
206
+ "eval_mix_de_cosine_ndcg@200": 0.7593986816807992,
207
+ "eval_mix_de_cosine_ndcg@50": 0.7490333180034867,
208
+ "eval_mix_de_cosine_precision@1": 0.642225689027561,
209
+ "eval_mix_de_cosine_precision@100": 0.0261622464898596,
210
+ "eval_mix_de_cosine_precision@150": 0.01770844167100017,
211
+ "eval_mix_de_cosine_precision@20": 0.11911076443057722,
212
+ "eval_mix_de_cosine_precision@200": 0.013424336973478942,
213
+ "eval_mix_de_cosine_precision@50": 0.05086843473738951,
214
+ "eval_mix_de_cosine_recall@1": 0.2405616224648986,
215
+ "eval_mix_de_cosine_recall@100": 0.9480412549835328,
216
+ "eval_mix_de_cosine_recall@150": 0.9618651412723176,
217
+ "eval_mix_de_cosine_recall@20": 0.8650459351707401,
218
+ "eval_mix_de_cosine_recall@200": 0.9720922170220142,
219
+ "eval_mix_de_cosine_recall@50": 0.9226295718495406,
220
+ "eval_mix_es_cosine_accuracy@1": 0.7009880395215808,
221
+ "eval_mix_es_cosine_accuracy@100": 0.9885595423816953,
222
+ "eval_mix_es_cosine_accuracy@150": 0.9921996879875195,
223
+ "eval_mix_es_cosine_accuracy@20": 0.9474778991159646,
224
+ "eval_mix_es_cosine_accuracy@200": 0.9932397295891836,
225
+ "eval_mix_es_cosine_accuracy@50": 0.9776391055642226,
226
+ "eval_mix_es_cosine_map@1": 0.7009880395215808,
227
+ "eval_mix_es_cosine_map@100": 0.6984889579958145,
228
+ "eval_mix_es_cosine_map@150": 0.6986621032108891,
229
+ "eval_mix_es_cosine_map@20": 0.6938173897965141,
230
+ "eval_mix_es_cosine_map@200": 0.6987465392575996,
231
+ "eval_mix_es_cosine_map@50": 0.6978248868009254,
232
+ "eval_mix_es_cosine_map@500": 0.6988876342368443,
233
+ "eval_mix_es_cosine_mrr@1": 0.7009880395215808,
234
+ "eval_mix_es_cosine_mrr@100": 0.7724347923967887,
235
+ "eval_mix_es_cosine_mrr@150": 0.7724644404043258,
236
+ "eval_mix_es_cosine_mrr@20": 0.7712491671812917,
237
+ "eval_mix_es_cosine_mrr@200": 0.7724705526191206,
238
+ "eval_mix_es_cosine_mrr@50": 0.7722842539435679,
239
+ "eval_mix_es_cosine_ndcg@1": 0.7009880395215808,
240
+ "eval_mix_es_cosine_ndcg@100": 0.7884317468596705,
241
+ "eval_mix_es_cosine_ndcg@150": 0.7902844804245556,
242
+ "eval_mix_es_cosine_ndcg@20": 0.7690336236998598,
243
+ "eval_mix_es_cosine_ndcg@200": 0.7913994944724545,
244
+ "eval_mix_es_cosine_ndcg@50": 0.7838732562697655,
245
+ "eval_mix_es_cosine_precision@1": 0.7009880395215808,
246
+ "eval_mix_es_cosine_precision@100": 0.02598543941757671,
247
+ "eval_mix_es_cosine_precision@150": 0.017493499739989597,
248
+ "eval_mix_es_cosine_precision@20": 0.11968278731149247,
249
+ "eval_mix_es_cosine_precision@200": 0.013198127925117008,
250
+ "eval_mix_es_cosine_precision@50": 0.05085803432137287,
251
+ "eval_mix_es_cosine_recall@1": 0.27067577941212884,
252
+ "eval_mix_es_cosine_recall@100": 0.9599497313225862,
253
+ "eval_mix_es_cosine_recall@150": 0.9695527821112844,
254
+ "eval_mix_es_cosine_recall@20": 0.8850840700294678,
255
+ "eval_mix_es_cosine_recall@200": 0.9758970358814353,
256
+ "eval_mix_es_cosine_recall@50": 0.9390968972092216,
257
+ "eval_mix_zh_cosine_accuracy@1": 0.7713987473903967,
258
+ "eval_mix_zh_cosine_accuracy@100": 0.9947807933194155,
259
+ "eval_mix_zh_cosine_accuracy@150": 0.9963465553235908,
260
+ "eval_mix_zh_cosine_accuracy@20": 0.9806889352818372,
261
+ "eval_mix_zh_cosine_accuracy@200": 0.9973903966597077,
262
+ "eval_mix_zh_cosine_accuracy@50": 0.9916492693110647,
263
+ "eval_mix_zh_cosine_map@1": 0.7713987473903967,
264
+ "eval_mix_zh_cosine_map@100": 0.6981100717727863,
265
+ "eval_mix_zh_cosine_map@150": 0.6982601227257159,
266
+ "eval_mix_zh_cosine_map@20": 0.6929147114308428,
267
+ "eval_mix_zh_cosine_map@200": 0.6983171494463136,
268
+ "eval_mix_zh_cosine_map@50": 0.6972607407491801,
269
+ "eval_mix_zh_cosine_map@500": 0.6984116893552017,
270
+ "eval_mix_zh_cosine_mrr@1": 0.7713987473903967,
271
+ "eval_mix_zh_cosine_mrr@100": 0.8436803457907981,
272
+ "eval_mix_zh_cosine_mrr@150": 0.8436922193976949,
273
+ "eval_mix_zh_cosine_mrr@20": 0.8432785828350186,
274
+ "eval_mix_zh_cosine_mrr@200": 0.8436986631082636,
275
+ "eval_mix_zh_cosine_mrr@50": 0.8436385628906108,
276
+ "eval_mix_zh_cosine_ndcg@1": 0.7713987473903967,
277
+ "eval_mix_zh_cosine_ndcg@100": 0.8115576206060865,
278
+ "eval_mix_zh_cosine_ndcg@150": 0.8129087269558002,
279
+ "eval_mix_zh_cosine_ndcg@20": 0.7926986810043013,
280
+ "eval_mix_zh_cosine_ndcg@200": 0.8135973837485255,
281
+ "eval_mix_zh_cosine_ndcg@50": 0.8066848794942646,
282
+ "eval_mix_zh_cosine_precision@1": 0.7713987473903967,
283
+ "eval_mix_zh_cosine_precision@100": 0.02944676409185805,
284
+ "eval_mix_zh_cosine_precision@150": 0.0197633959638135,
285
+ "eval_mix_zh_cosine_precision@20": 0.13656054279749477,
286
+ "eval_mix_zh_cosine_precision@200": 0.014877348643006268,
287
+ "eval_mix_zh_cosine_precision@50": 0.05762004175365346,
288
+ "eval_mix_zh_cosine_recall@1": 0.2585731683069888,
289
+ "eval_mix_zh_cosine_recall@100": 0.9715031315240084,
290
+ "eval_mix_zh_cosine_recall@150": 0.9781576200417537,
291
+ "eval_mix_zh_cosine_recall@20": 0.9014352818371607,
292
+ "eval_mix_zh_cosine_recall@200": 0.9818110647181629,
293
+ "eval_mix_zh_cosine_recall@50": 0.950347947112039,
294
+ "eval_runtime": 11.3241,
295
+ "eval_samples_per_second": 0.0,
296
+ "eval_sequential_score": 0.8135973837485255,
297
+ "eval_steps_per_second": 0.0,
298
+ "step": 200
299
+ },
300
+ {
301
+ "epoch": 2.477366255144033,
302
+ "grad_norm": 2.7903037071228027,
303
+ "learning_rate": 2.6742160278745648e-05,
304
+ "loss": 0.161,
305
+ "step": 300
306
+ },
307
+ {
308
+ "epoch": 3.3045267489711936,
309
+ "grad_norm": 2.511436700820923,
310
+ "learning_rate": 1.803135888501742e-05,
311
+ "loss": 0.1398,
312
+ "step": 400
313
+ },
314
+ {
315
+ "epoch": 3.3045267489711936,
316
+ "eval_full_de_cosine_accuracy@1": 0.2955665024630542,
317
+ "eval_full_de_cosine_accuracy@100": 0.9901477832512315,
318
+ "eval_full_de_cosine_accuracy@150": 0.9901477832512315,
319
+ "eval_full_de_cosine_accuracy@20": 0.9753694581280788,
320
+ "eval_full_de_cosine_accuracy@200": 0.9901477832512315,
321
+ "eval_full_de_cosine_accuracy@50": 0.9852216748768473,
322
+ "eval_full_de_cosine_map@1": 0.2955665024630542,
323
+ "eval_full_de_cosine_map@100": 0.3632259128424842,
324
+ "eval_full_de_cosine_map@150": 0.37822275477623696,
325
+ "eval_full_de_cosine_map@20": 0.398398563122481,
326
+ "eval_full_de_cosine_map@200": 0.3863148456840816,
327
+ "eval_full_de_cosine_map@50": 0.36032758502543594,
328
+ "eval_full_de_cosine_map@500": 0.399227009561676,
329
+ "eval_full_de_cosine_mrr@1": 0.2955665024630542,
330
+ "eval_full_de_cosine_mrr@100": 0.5168213657719442,
331
+ "eval_full_de_cosine_mrr@150": 0.5168213657719442,
332
+ "eval_full_de_cosine_mrr@20": 0.5164773875147672,
333
+ "eval_full_de_cosine_mrr@200": 0.5168213657719442,
334
+ "eval_full_de_cosine_mrr@50": 0.5167647438366063,
335
+ "eval_full_de_cosine_ndcg@1": 0.2955665024630542,
336
+ "eval_full_de_cosine_ndcg@100": 0.5551941695921919,
337
+ "eval_full_de_cosine_ndcg@150": 0.5887611959940118,
338
+ "eval_full_de_cosine_ndcg@20": 0.537849085734973,
339
+ "eval_full_de_cosine_ndcg@200": 0.6092219717029682,
340
+ "eval_full_de_cosine_ndcg@50": 0.5288037060639387,
341
+ "eval_full_de_cosine_precision@1": 0.2955665024630542,
342
+ "eval_full_de_cosine_precision@100": 0.23965517241379314,
343
+ "eval_full_de_cosine_precision@150": 0.1807881773399015,
344
+ "eval_full_de_cosine_precision@20": 0.5103448275862069,
345
+ "eval_full_de_cosine_precision@200": 0.1461576354679803,
346
+ "eval_full_de_cosine_precision@50": 0.36935960591133016,
347
+ "eval_full_de_cosine_recall@1": 0.01108543831680986,
348
+ "eval_full_de_cosine_recall@100": 0.6172666777909689,
349
+ "eval_full_de_cosine_recall@150": 0.6848138831682932,
350
+ "eval_full_de_cosine_recall@20": 0.3207974783481294,
351
+ "eval_full_de_cosine_recall@200": 0.7253195006357535,
352
+ "eval_full_de_cosine_recall@50": 0.5042046446720455,
353
+ "eval_full_en_cosine_accuracy@1": 0.6666666666666666,
354
+ "eval_full_en_cosine_accuracy@100": 0.9904761904761905,
355
+ "eval_full_en_cosine_accuracy@150": 0.9904761904761905,
356
+ "eval_full_en_cosine_accuracy@20": 0.9904761904761905,
357
+ "eval_full_en_cosine_accuracy@200": 0.9904761904761905,
358
+ "eval_full_en_cosine_accuracy@50": 0.9904761904761905,
359
+ "eval_full_en_cosine_map@1": 0.6666666666666666,
360
+ "eval_full_en_cosine_map@100": 0.5852249415484134,
361
+ "eval_full_en_cosine_map@150": 0.5943042662925763,
362
+ "eval_full_en_cosine_map@20": 0.5566401101002375,
363
+ "eval_full_en_cosine_map@200": 0.5975837437975446,
364
+ "eval_full_en_cosine_map@50": 0.55344017265156,
365
+ "eval_full_en_cosine_map@500": 0.6015742986218369,
366
+ "eval_full_en_cosine_mrr@1": 0.6666666666666666,
367
+ "eval_full_en_cosine_mrr@100": 0.8182539682539683,
368
+ "eval_full_en_cosine_mrr@150": 0.8182539682539683,
369
+ "eval_full_en_cosine_mrr@20": 0.8182539682539683,
370
+ "eval_full_en_cosine_mrr@200": 0.8182539682539683,
371
+ "eval_full_en_cosine_mrr@50": 0.8182539682539683,
372
+ "eval_full_en_cosine_ndcg@1": 0.6666666666666666,
373
+ "eval_full_en_cosine_ndcg@100": 0.7732532874348539,
374
+ "eval_full_en_cosine_ndcg@150": 0.7947334799125039,
375
+ "eval_full_en_cosine_ndcg@20": 0.6952098522285352,
376
+ "eval_full_en_cosine_ndcg@200": 0.8038564389556094,
377
+ "eval_full_en_cosine_ndcg@50": 0.7229572913271685,
378
+ "eval_full_en_cosine_precision@1": 0.6666666666666666,
379
+ "eval_full_en_cosine_precision@100": 0.19047619047619047,
380
+ "eval_full_en_cosine_precision@150": 0.1361904761904762,
381
+ "eval_full_en_cosine_precision@20": 0.5147619047619048,
382
+ "eval_full_en_cosine_precision@200": 0.10542857142857143,
383
+ "eval_full_en_cosine_precision@50": 0.31999999999999995,
384
+ "eval_full_en_cosine_recall@1": 0.06854687410617222,
385
+ "eval_full_en_cosine_recall@100": 0.8503209224897438,
386
+ "eval_full_en_cosine_recall@150": 0.8994749092946579,
387
+ "eval_full_en_cosine_recall@20": 0.5491240579458434,
388
+ "eval_full_en_cosine_recall@200": 0.9207884118691805,
389
+ "eval_full_en_cosine_recall@50": 0.7553654907661455,
390
+ "eval_full_es_cosine_accuracy@1": 0.12432432432432433,
391
+ "eval_full_es_cosine_accuracy@100": 1.0,
392
+ "eval_full_es_cosine_accuracy@150": 1.0,
393
+ "eval_full_es_cosine_accuracy@20": 1.0,
394
+ "eval_full_es_cosine_accuracy@200": 1.0,
395
+ "eval_full_es_cosine_accuracy@50": 1.0,
396
+ "eval_full_es_cosine_map@1": 0.12432432432432433,
397
+ "eval_full_es_cosine_map@100": 0.43735327570764515,
398
+ "eval_full_es_cosine_map@150": 0.45269435912524697,
399
+ "eval_full_es_cosine_map@20": 0.48407152706202555,
400
+ "eval_full_es_cosine_map@200": 0.45930097680668164,
401
+ "eval_full_es_cosine_map@50": 0.43043374125481026,
402
+ "eval_full_es_cosine_map@500": 0.47204219228541466,
403
+ "eval_full_es_cosine_mrr@1": 0.12432432432432433,
404
+ "eval_full_es_cosine_mrr@100": 0.5581081081081081,
405
+ "eval_full_es_cosine_mrr@150": 0.5581081081081081,
406
+ "eval_full_es_cosine_mrr@20": 0.5581081081081081,
407
+ "eval_full_es_cosine_mrr@200": 0.5581081081081081,
408
+ "eval_full_es_cosine_mrr@50": 0.5581081081081081,
409
+ "eval_full_es_cosine_ndcg@1": 0.12432432432432433,
410
+ "eval_full_es_cosine_ndcg@100": 0.62350509928888,
411
+ "eval_full_es_cosine_ndcg@150": 0.6556716735369459,
412
+ "eval_full_es_cosine_ndcg@20": 0.6168674053047035,
413
+ "eval_full_es_cosine_ndcg@200": 0.6716557949894583,
414
+ "eval_full_es_cosine_ndcg@50": 0.5913690595071309,
415
+ "eval_full_es_cosine_precision@1": 0.12432432432432433,
416
+ "eval_full_es_cosine_precision@100": 0.2565945945945946,
417
+ "eval_full_es_cosine_precision@150": 0.19282882882882882,
418
+ "eval_full_es_cosine_precision@20": 0.575945945945946,
419
+ "eval_full_es_cosine_precision@200": 0.1527837837837838,
420
+ "eval_full_es_cosine_precision@50": 0.3923243243243244,
421
+ "eval_full_es_cosine_recall@1": 0.0036138931714884822,
422
+ "eval_full_es_cosine_recall@100": 0.6898678629281393,
423
+ "eval_full_es_cosine_recall@150": 0.7540209165372845,
424
+ "eval_full_es_cosine_recall@20": 0.3852888120551914,
425
+ "eval_full_es_cosine_recall@200": 0.7858170054407897,
426
+ "eval_full_es_cosine_recall@50": 0.5659574514538841,
427
+ "eval_full_zh_cosine_accuracy@1": 0.6796116504854369,
428
+ "eval_full_zh_cosine_accuracy@100": 0.9902912621359223,
429
+ "eval_full_zh_cosine_accuracy@150": 0.9902912621359223,
430
+ "eval_full_zh_cosine_accuracy@20": 0.9805825242718447,
431
+ "eval_full_zh_cosine_accuracy@200": 0.9902912621359223,
432
+ "eval_full_zh_cosine_accuracy@50": 0.9902912621359223,
433
+ "eval_full_zh_cosine_map@1": 0.6796116504854369,
434
+ "eval_full_zh_cosine_map@100": 0.5371705298206915,
435
+ "eval_full_zh_cosine_map@150": 0.5454012672534121,
436
+ "eval_full_zh_cosine_map@20": 0.522177160195635,
437
+ "eval_full_zh_cosine_map@200": 0.5494570875591636,
438
+ "eval_full_zh_cosine_map@50": 0.5082601209392789,
439
+ "eval_full_zh_cosine_map@500": 0.5542116087189223,
440
+ "eval_full_zh_cosine_mrr@1": 0.6796116504854369,
441
+ "eval_full_zh_cosine_mrr@100": 0.816279724215562,
442
+ "eval_full_zh_cosine_mrr@150": 0.816279724215562,
443
+ "eval_full_zh_cosine_mrr@20": 0.8158576051779936,
444
+ "eval_full_zh_cosine_mrr@200": 0.816279724215562,
445
+ "eval_full_zh_cosine_mrr@50": 0.816279724215562,
446
+ "eval_full_zh_cosine_ndcg@1": 0.6796116504854369,
447
+ "eval_full_zh_cosine_ndcg@100": 0.7378907298421352,
448
+ "eval_full_zh_cosine_ndcg@150": 0.7576651805692517,
449
+ "eval_full_zh_cosine_ndcg@20": 0.6680745295820606,
450
+ "eval_full_zh_cosine_ndcg@200": 0.7696718049970358,
451
+ "eval_full_zh_cosine_ndcg@50": 0.6856578240865067,
452
+ "eval_full_zh_cosine_precision@1": 0.6796116504854369,
453
+ "eval_full_zh_cosine_precision@100": 0.17883495145631062,
454
+ "eval_full_zh_cosine_precision@150": 0.12776699029126212,
455
+ "eval_full_zh_cosine_precision@20": 0.488349514563107,
456
+ "eval_full_zh_cosine_precision@200": 0.09990291262135924,
457
+ "eval_full_zh_cosine_precision@50": 0.29631067961165053,
458
+ "eval_full_zh_cosine_recall@1": 0.06931865009287731,
459
+ "eval_full_zh_cosine_recall@100": 0.8169166539243944,
460
+ "eval_full_zh_cosine_recall@150": 0.8613232254521018,
461
+ "eval_full_zh_cosine_recall@20": 0.5250914458143515,
462
+ "eval_full_zh_cosine_recall@200": 0.8898175710074696,
463
+ "eval_full_zh_cosine_recall@50": 0.7082715439925011,
464
+ "eval_mix_de_cosine_accuracy@1": 0.6484659386375455,
465
+ "eval_mix_de_cosine_accuracy@100": 0.984919396775871,
466
+ "eval_mix_de_cosine_accuracy@150": 0.9885595423816953,
467
+ "eval_mix_de_cosine_accuracy@20": 0.9323972958918356,
468
+ "eval_mix_de_cosine_accuracy@200": 0.9937597503900156,
469
+ "eval_mix_de_cosine_accuracy@50": 0.968278731149246,
470
+ "eval_mix_de_cosine_map@1": 0.6484659386375455,
471
+ "eval_mix_de_cosine_map@100": 0.6692634410264182,
472
+ "eval_mix_de_cosine_map@150": 0.669518875077899,
473
+ "eval_mix_de_cosine_map@20": 0.6646138211839377,
474
+ "eval_mix_de_cosine_map@200": 0.6696171599377958,
475
+ "eval_mix_de_cosine_map@50": 0.6683657128313888,
476
+ "eval_mix_de_cosine_map@500": 0.6697127210085475,
477
+ "eval_mix_de_cosine_mrr@1": 0.6484659386375455,
478
+ "eval_mix_de_cosine_mrr@100": 0.733776247038599,
479
+ "eval_mix_de_cosine_mrr@150": 0.7338087409764548,
480
+ "eval_mix_de_cosine_mrr@20": 0.7323691045739125,
481
+ "eval_mix_de_cosine_mrr@200": 0.7338398642058079,
482
+ "eval_mix_de_cosine_mrr@50": 0.733538875120878,
483
+ "eval_mix_de_cosine_ndcg@1": 0.6484659386375455,
484
+ "eval_mix_de_cosine_ndcg@100": 0.7656851368194345,
485
+ "eval_mix_de_cosine_ndcg@150": 0.7681576326024331,
486
+ "eval_mix_de_cosine_ndcg@20": 0.7448150588358,
487
+ "eval_mix_de_cosine_ndcg@200": 0.7696474672652458,
488
+ "eval_mix_de_cosine_ndcg@50": 0.7595232400510039,
489
+ "eval_mix_de_cosine_precision@1": 0.6484659386375455,
490
+ "eval_mix_de_cosine_precision@100": 0.02647425897035882,
491
+ "eval_mix_de_cosine_precision@150": 0.017892182353960822,
492
+ "eval_mix_de_cosine_precision@20": 0.12093083723348932,
493
+ "eval_mix_de_cosine_precision@200": 0.013530941237649509,
494
+ "eval_mix_de_cosine_precision@50": 0.05140925637025482,
495
+ "eval_mix_de_cosine_recall@1": 0.2435517420696828,
496
+ "eval_mix_de_cosine_recall@100": 0.9596117178020455,
497
+ "eval_mix_de_cosine_recall@150": 0.9718322066215982,
498
+ "eval_mix_de_cosine_recall@20": 0.87873114924597,
499
+ "eval_mix_de_cosine_recall@200": 0.9799791991679667,
500
+ "eval_mix_de_cosine_recall@50": 0.9319899462645173,
501
+ "eval_mix_es_cosine_accuracy@1": 0.7087883515340614,
502
+ "eval_mix_es_cosine_accuracy@100": 0.9901196047841914,
503
+ "eval_mix_es_cosine_accuracy@150": 0.9937597503900156,
504
+ "eval_mix_es_cosine_accuracy@20": 0.9552782111284451,
505
+ "eval_mix_es_cosine_accuracy@200": 0.9958398335933437,
506
+ "eval_mix_es_cosine_accuracy@50": 0.9802392095683827,
507
+ "eval_mix_es_cosine_map@1": 0.7087883515340614,
508
+ "eval_mix_es_cosine_map@100": 0.7112928928384499,
509
+ "eval_mix_es_cosine_map@150": 0.7114314004578745,
510
+ "eval_mix_es_cosine_map@20": 0.7070596364024803,
511
+ "eval_mix_es_cosine_map@200": 0.711504950521157,
+ "eval_mix_es_cosine_map@50": 0.7106867578203881,
+ "eval_mix_es_cosine_map@500": 0.7116431478000537,
+ "eval_mix_es_cosine_mrr@1": 0.7087883515340614,
+ "eval_mix_es_cosine_mrr@100": 0.7813961782842836,
+ "eval_mix_es_cosine_mrr@150": 0.7814280971923943,
+ "eval_mix_es_cosine_mrr@20": 0.7804158804359833,
+ "eval_mix_es_cosine_mrr@200": 0.7814392363829243,
+ "eval_mix_es_cosine_mrr@50": 0.7812547046826683,
+ "eval_mix_es_cosine_ndcg@1": 0.7087883515340614,
+ "eval_mix_es_cosine_ndcg@100": 0.7986024294603647,
+ "eval_mix_es_cosine_ndcg@150": 0.8001222520801115,
+ "eval_mix_es_cosine_ndcg@20": 0.7814741332820433,
+ "eval_mix_es_cosine_ndcg@200": 0.801183843730514,
+ "eval_mix_es_cosine_ndcg@50": 0.7944033394497885,
+ "eval_mix_es_cosine_precision@1": 0.7087883515340614,
+ "eval_mix_es_cosine_precision@100": 0.026125845033801356,
+ "eval_mix_es_cosine_precision@150": 0.017548968625411682,
+ "eval_mix_es_cosine_precision@20": 0.12158086323452937,
+ "eval_mix_es_cosine_precision@200": 0.013239729589183572,
+ "eval_mix_es_cosine_precision@50": 0.05122204888195529,
+ "eval_mix_es_cosine_recall@1": 0.2737959042171211,
+ "eval_mix_es_cosine_recall@100": 0.9650979372508233,
+ "eval_mix_es_cosine_recall@150": 0.9731582596637198,
+ "eval_mix_es_cosine_recall@20": 0.8990032934650719,
+ "eval_mix_es_cosine_recall@200": 0.979086496793205,
+ "eval_mix_es_cosine_recall@50": 0.9459438377535101,
+ "eval_mix_zh_cosine_accuracy@1": 0.7667014613778705,
+ "eval_mix_zh_cosine_accuracy@100": 0.9958246346555324,
+ "eval_mix_zh_cosine_accuracy@150": 0.9973903966597077,
+ "eval_mix_zh_cosine_accuracy@20": 0.9843423799582464,
+ "eval_mix_zh_cosine_accuracy@200": 0.9979123173277662,
+ "eval_mix_zh_cosine_accuracy@50": 0.9932150313152401,
+ "eval_mix_zh_cosine_map@1": 0.7667014613778705,
+ "eval_mix_zh_cosine_map@100": 0.7053668771050886,
+ "eval_mix_zh_cosine_map@150": 0.7055166914145262,
+ "eval_mix_zh_cosine_map@20": 0.7007206423896271,
+ "eval_mix_zh_cosine_map@200": 0.7055658329670217,
+ "eval_mix_zh_cosine_map@50": 0.7046277360194696,
+ "eval_mix_zh_cosine_map@500": 0.7056512281794008,
+ "eval_mix_zh_cosine_mrr@1": 0.7667014613778705,
+ "eval_mix_zh_cosine_mrr@100": 0.8425358910333786,
+ "eval_mix_zh_cosine_mrr@150": 0.8425483391786986,
+ "eval_mix_zh_cosine_mrr@20": 0.8421752732824312,
+ "eval_mix_zh_cosine_mrr@200": 0.8425515411459873,
+ "eval_mix_zh_cosine_mrr@50": 0.8424954415974232,
+ "eval_mix_zh_cosine_ndcg@1": 0.7667014613778705,
+ "eval_mix_zh_cosine_ndcg@100": 0.8167350090334409,
+ "eval_mix_zh_cosine_ndcg@150": 0.8181122471507385,
+ "eval_mix_zh_cosine_ndcg@20": 0.8002168358295473,
+ "eval_mix_zh_cosine_ndcg@200": 0.8186874070081017,
+ "eval_mix_zh_cosine_ndcg@50": 0.8125113081884888,
+ "eval_mix_zh_cosine_precision@1": 0.7667014613778705,
+ "eval_mix_zh_cosine_precision@100": 0.029598121085595,
+ "eval_mix_zh_cosine_precision@150": 0.01986778009742519,
+ "eval_mix_zh_cosine_precision@20": 0.13870041753653445,
+ "eval_mix_zh_cosine_precision@200": 0.014945198329853866,
+ "eval_mix_zh_cosine_precision@50": 0.05810020876826725,
+ "eval_mix_zh_cosine_recall@1": 0.25692041952480366,
+ "eval_mix_zh_cosine_recall@100": 0.9765483646485734,
+ "eval_mix_zh_cosine_recall@150": 0.9833768267223383,
+ "eval_mix_zh_cosine_recall@20": 0.9156576200417536,
+ "eval_mix_zh_cosine_recall@200": 0.986464857341684,
+ "eval_mix_zh_cosine_recall@50": 0.9582637439109255,
+ "eval_runtime": 11.6521,
+ "eval_samples_per_second": 0.0,
+ "eval_sequential_score": 0.8186874070081017,
+ "eval_steps_per_second": 0.0,
+ "step": 400
+ }
+ ],
+ "logging_steps": 100,
+ "max_steps": 605,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 5,
+ "save_steps": 200,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 9.180921003089096e+18,
+ "train_batch_size": 128,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-400/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bf4d5229cda7218c285b125b9db305e5b03ef7807e7bcdbf89afc0ac543dd982
+ size 5624
eval/Information-Retrieval_evaluation_full_de_results.csv ADDED
@@ -0,0 +1,4 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 1.6502057613168724,200,0.2955665024630542,0.9852216748768473,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5073891625615764,0.31887986522832873,0.3681773399014779,0.5004342335550164,0.24177339901477832,0.6244233924508789,0.18187192118226603,0.687486468465792,0.1470689655172414,0.7334854348170513,0.2955665024630542,0.5140785596450613,0.5140785596450613,0.5141580130059386,0.5141580130059386,0.5141580130059386,0.2955665024630542,0.5342874353496432,0.5251712513704461,0.5568512483442096,0.5891923427833955,0.6108910915140433,0.2955665024630542,0.3952556219642319,0.3565895386599598,0.3618162919366333,0.37673093284239206,0.3850375691141728,0.3976475131909832
+ 3.3045267489711936,400,0.2955665024630542,0.9753694581280788,0.9852216748768473,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.5103448275862069,0.3207974783481294,0.36935960591133016,0.5042046446720455,0.23965517241379314,0.6172666777909689,0.1807881773399015,0.6848138831682932,0.1461576354679803,0.7253195006357535,0.2955665024630542,0.5164773875147672,0.5167647438366063,0.5168213657719442,0.5168213657719442,0.5168213657719442,0.2955665024630542,0.537849085734973,0.5288037060639387,0.5551941695921919,0.5887611959940118,0.6092219717029682,0.2955665024630542,0.398398563122481,0.36032758502543594,0.3632259128424842,0.37822275477623696,0.3863148456840816,0.399227009561676
+ 4.954732510288066,600,0.2955665024630542,0.9753694581280788,0.9802955665024631,0.9901477832512315,0.9901477832512315,0.9901477832512315,0.2955665024630542,0.01108543831680986,0.512807881773399,0.3237955971074792,0.36807881773399015,0.5030929796037713,0.24029556650246306,0.6172851336056435,0.18111658456486043,0.6855584980130183,0.14665024630541873,0.7279125666778575,0.2955665024630542,0.5145867542419259,0.5147235905856588,0.5148895627540307,0.5148895627540307,0.5148895627540307,0.2955665024630542,0.5399334191840455,0.5279243511614918,0.5555133187133253,0.5891570214465959,0.6103222576369508,0.2955665024630542,0.40047661522515754,0.3595621891952358,0.3631770989458169,0.37785921428477304,0.38624613510168,0.3990985450532001
eval/Information-Retrieval_evaluation_full_en_results.csv ADDED
@@ -0,0 +1,4 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 1.6502057613168724,200,0.6571428571428571,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6571428571428571,0.06695957251887064,0.5133333333333332,0.5478306503729546,0.3169523809523809,0.7470276357469449,0.18971428571428572,0.8467073011936345,0.13619047619047617,0.9010846211520122,0.10561904761904761,0.9256595392715059,0.6571428571428571,0.8111111111111111,0.8111111111111111,0.8111111111111111,0.8111111111111111,0.8111111111111111,0.6571428571428571,0.6923506957704934,0.7170311913169547,0.7690946845916871,0.7923061459636489,0.8023952171736648,0.6571428571428571,0.5516314386214587,0.5474217433291914,0.5799091076338031,0.5895042547793764,0.5930550248640567,0.5967311945998978
+ 3.3045267489711936,400,0.6666666666666666,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6666666666666666,0.06854687410617222,0.5147619047619048,0.5491240579458434,0.31999999999999995,0.7553654907661455,0.19047619047619047,0.8503209224897438,0.1361904761904762,0.8994749092946579,0.10542857142857143,0.9207884118691805,0.6666666666666666,0.8182539682539683,0.8182539682539683,0.8182539682539683,0.8182539682539683,0.8182539682539683,0.6666666666666666,0.6952098522285352,0.7229572913271685,0.7732532874348539,0.7947334799125039,0.8038564389556094,0.6666666666666666,0.5566401101002375,0.55344017265156,0.5852249415484134,0.5943042662925763,0.5975837437975446,0.6015742986218369
+ 4.954732510288066,600,0.6666666666666666,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.9904761904761905,0.6666666666666666,0.06854687410617222,0.5147619047619048,0.5470067005308932,0.3182857142857143,0.7505056727432228,0.18971428571428572,0.8478262642391493,0.1354920634920635,0.8922827077083945,0.10542857142857143,0.9231956217242554,0.6666666666666666,0.8193650793650795,0.8193650793650795,0.8193650793650795,0.8193650793650795,0.8193650793650795,0.6666666666666666,0.6953331094276901,0.7205492972737558,0.7716799081442828,0.7920452317168161,0.8038291002924913,0.6666666666666666,0.5564682301725326,0.5511565436923715,0.5830965448117339,0.5920235672028571,0.5958093906842231,0.5997006503367213
eval/Information-Retrieval_evaluation_full_es_results.csv ADDED
@@ -0,0 +1,4 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 1.6502057613168724,200,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0035436931012884127,0.5767567567567567,0.3862419782331355,0.3907027027027027,0.5625768407738393,0.2541621621621622,0.6836436316189977,0.19225225225225226,0.7496865406970199,0.15264864864864866,0.7852629043380305,0.11891891891891893,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.11891891891891893,0.6158554243812342,0.5886857089260162,0.6196114606257926,0.6530674955405338,0.670287400819268,0.11891891891891893,0.4839539531842883,0.4288206349412292,0.43522297182400527,0.4511056582755023,0.45802493743471273,0.47075604946048677
+ 3.3045267489711936,400,0.12432432432432433,1.0,1.0,1.0,1.0,1.0,0.12432432432432433,0.0036138931714884822,0.575945945945946,0.3852888120551914,0.3923243243243244,0.5659574514538841,0.2565945945945946,0.6898678629281393,0.19282882882882882,0.7540209165372845,0.1527837837837838,0.7858170054407897,0.12432432432432433,0.5581081081081081,0.5581081081081081,0.5581081081081081,0.5581081081081081,0.5581081081081081,0.12432432432432433,0.6168674053047035,0.5913690595071309,0.62350509928888,0.6556716735369459,0.6716557949894583,0.12432432432432433,0.48407152706202555,0.43043374125481026,0.43735327570764515,0.45269435912524697,0.45930097680668164,0.47204219228541466
+ 4.954732510288066,600,0.11891891891891893,1.0,1.0,1.0,1.0,1.0,0.11891891891891893,0.0035747235671014874,0.5716216216216217,0.38238488508523605,0.3897297297297297,0.5589774933986379,0.25486486486486487,0.6809046903877798,0.19196396396396398,0.7492085330846164,0.15216216216216216,0.7841796161358201,0.11891891891891893,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.5554054054054054,0.11891891891891893,0.6129390001663928,0.5871887749995743,0.6186776604605951,0.6521363774137173,0.6690500870831251,0.11891891891891893,0.48004412760250637,0.42709096639640703,0.43383015161725486,0.4492558751565027,0.4560059854501064,0.4687334978259114
eval/Information-Retrieval_evaluation_full_zh_results.csv ADDED
@@ -0,0 +1,4 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 1.6502057613168724,200,0.6601941747572816,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6601941747572816,0.06669332811942774,0.4868932038834952,0.52040897323663,0.2959223300970874,0.7067236634036261,0.17902912621359218,0.813864097315397,0.12912621359223303,0.8683619147921042,0.10063106796116503,0.8964210248615742,0.6601941747572816,0.8068423485899215,0.8068423485899215,0.8068423485899215,0.8068423485899215,0.8068423485899215,0.6601941747572816,0.6629898844211244,0.682216395408567,0.7344118850318737,0.7580048379992059,0.769464510105362,0.6601941747572816,0.5176817014415404,0.5050961591489588,0.5346277197767966,0.5441006347287816,0.547804939644668,0.5524877228701637
+ 3.3045267489711936,400,0.6796116504854369,0.9805825242718447,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6796116504854369,0.06931865009287731,0.488349514563107,0.5250914458143515,0.29631067961165053,0.7082715439925011,0.17883495145631062,0.8169166539243944,0.12776699029126212,0.8613232254521018,0.09990291262135924,0.8898175710074696,0.6796116504854369,0.8158576051779936,0.816279724215562,0.816279724215562,0.816279724215562,0.816279724215562,0.6796116504854369,0.6680745295820606,0.6856578240865067,0.7378907298421352,0.7576651805692517,0.7696718049970358,0.6796116504854369,0.522177160195635,0.5082601209392789,0.5371705298206915,0.5454012672534121,0.5494570875591636,0.5542116087189223
+ 4.954732510288066,600,0.6699029126213593,0.9805825242718447,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.9902912621359223,0.6699029126213593,0.0668914656268579,0.48446601941747586,0.5216552728717982,0.29300970873786414,0.6977673292298668,0.17815533980582518,0.8121070659747489,0.12724919093851134,0.8591188085882469,0.09990291262135924,0.8883403241892006,0.6699029126213593,0.8110032362459547,0.811465557096625,0.811465557096625,0.811465557096625,0.811465557096625,0.6699029126213593,0.6644952763622808,0.6798029503509618,0.7344694576412091,0.7548761299269686,0.7675810647559893,0.6699029126213593,0.5195778690550232,0.5053363422108585,0.5348207175322741,0.5430669760209089,0.5474770834562246,0.5521257951989472
eval/Information-Retrieval_evaluation_mix_de_results.csv ADDED
@@ -0,0 +1,4 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 1.6502057613168724,200,0.642225689027561,0.9240769630785232,0.9635985439417577,0.9771190847633905,0.984919396775871,0.9901196047841914,0.642225689027561,0.2405616224648986,0.11911076443057722,0.8650459351707401,0.05086843473738951,0.9226295718495406,0.0261622464898596,0.9480412549835328,0.01770844167100017,0.9618651412723176,0.013424336973478942,0.9720922170220142,0.642225689027561,0.7246816496840639,0.7260235454700952,0.7262168772880452,0.7262822017289415,0.7263128860080087,0.642225689027561,0.7332013199323174,0.7490333180034867,0.7547612967303503,0.7575184392863841,0.7593986816807992,0.642225689027561,0.6521189972338849,0.6561813596290409,0.6570111325791598,0.6572712744212402,0.6574012324541948,0.6575399010277455
+ 3.3045267489711936,400,0.6484659386375455,0.9323972958918356,0.968278731149246,0.984919396775871,0.9885595423816953,0.9937597503900156,0.6484659386375455,0.2435517420696828,0.12093083723348932,0.87873114924597,0.05140925637025482,0.9319899462645173,0.02647425897035882,0.9596117178020455,0.017892182353960822,0.9718322066215982,0.013530941237649509,0.9799791991679667,0.6484659386375455,0.7323691045739125,0.733538875120878,0.733776247038599,0.7338087409764548,0.7338398642058079,0.6484659386375455,0.7448150588358,0.7595232400510039,0.7656851368194345,0.7681576326024331,0.7696474672652458,0.6484659386375455,0.6646138211839377,0.6683657128313888,0.6692634410264182,0.669518875077899,0.6696171599377958,0.6697127210085475
+ 4.954732510288066,600,0.6526261050442018,0.9381175247009881,0.968278731149246,0.984399375975039,0.9890795631825273,0.9937597503900156,0.6526261050442018,0.24476512393829086,0.12171086843473738,0.8842780377881783,0.051513260530421226,0.9338100190674293,0.026557462298491943,0.9620384815392615,0.017919916796671865,0.9734789391575663,0.01352834113364535,0.9799791991679667,0.6526261050442018,0.7361257905522858,0.7371463016806762,0.7373955457270671,0.7374354870723953,0.7374620339191081,0.6526261050442018,0.7490629318893474,0.7628601222905265,0.769213497844745,0.7714755020563181,0.7726559015412698,0.6526261050442018,0.6683439802461194,0.672011814394583,0.6729253835478308,0.6731548906218254,0.6732301986902165,0.6733271347787819
eval/Information-Retrieval_evaluation_mix_es_results.csv ADDED
@@ -0,0 +1,4 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 1.6502057613168724,200,0.7009880395215808,0.9474778991159646,0.9776391055642226,0.9885595423816953,0.9921996879875195,0.9932397295891836,0.7009880395215808,0.27067577941212884,0.11968278731149247,0.8850840700294678,0.05085803432137287,0.9390968972092216,0.02598543941757671,0.9599497313225862,0.017493499739989597,0.9695527821112844,0.013198127925117008,0.9758970358814353,0.7009880395215808,0.7712491671812917,0.7722842539435679,0.7724347923967887,0.7724644404043258,0.7724705526191206,0.7009880395215808,0.7690336236998598,0.7838732562697655,0.7884317468596705,0.7902844804245556,0.7913994944724545,0.7009880395215808,0.6938173897965141,0.6978248868009254,0.6984889579958145,0.6986621032108891,0.6987465392575996,0.6988876342368443
+ 3.3045267489711936,400,0.7087883515340614,0.9552782111284451,0.9802392095683827,0.9901196047841914,0.9937597503900156,0.9958398335933437,0.7087883515340614,0.2737959042171211,0.12158086323452937,0.8990032934650719,0.05122204888195529,0.9459438377535101,0.026125845033801356,0.9650979372508233,0.017548968625411682,0.9731582596637198,0.013239729589183572,0.979086496793205,0.7087883515340614,0.7804158804359833,0.7812547046826683,0.7813961782842836,0.7814280971923943,0.7814392363829243,0.7087883515340614,0.7814741332820433,0.7944033394497885,0.7986024294603647,0.8001222520801115,0.801183843730514,0.7087883515340614,0.7070596364024803,0.7106867578203881,0.7112928928384499,0.7114314004578745,0.711504950521157,0.7116431478000537
+ 4.954732510288066,600,0.7113884555382215,0.9578783151326054,0.9802392095683827,0.9911596463858554,0.9942797711908476,0.9963598543941757,0.7113884555382215,0.2743592600846891,0.12225689027561103,0.9034494713121858,0.05128445137805513,0.9468365401282718,0.026136245449817998,0.9653579476512393,0.017562835846767204,0.973678280464552,0.013244929797191891,0.979259837060149,0.7113884555382215,0.7831032164843245,0.7838272775216667,0.7839849724965707,0.7840120488977712,0.7840241592348755,0.7113884555382215,0.7855908925287227,0.7974527919158213,0.8015109259174463,0.8030947148671981,0.8040944464945255,0.7113884555382215,0.711471856133734,0.7147402910660883,0.7153107216129952,0.715461327633746,0.7155310997514029,0.7156626841865296
eval/Information-Retrieval_evaluation_mix_zh_results.csv ADDED
@@ -0,0 +1,4 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@20,cosine-Accuracy@50,cosine-Accuracy@100,cosine-Accuracy@150,cosine-Accuracy@200,cosine-Precision@1,cosine-Recall@1,cosine-Precision@20,cosine-Recall@20,cosine-Precision@50,cosine-Recall@50,cosine-Precision@100,cosine-Recall@100,cosine-Precision@150,cosine-Recall@150,cosine-Precision@200,cosine-Recall@200,cosine-MRR@1,cosine-MRR@20,cosine-MRR@50,cosine-MRR@100,cosine-MRR@150,cosine-MRR@200,cosine-NDCG@1,cosine-NDCG@20,cosine-NDCG@50,cosine-NDCG@100,cosine-NDCG@150,cosine-NDCG@200,cosine-MAP@1,cosine-MAP@20,cosine-MAP@50,cosine-MAP@100,cosine-MAP@150,cosine-MAP@200,cosine-MAP@500
+ 1.6502057613168724,200,0.7713987473903967,0.9806889352818372,0.9916492693110647,0.9947807933194155,0.9963465553235908,0.9973903966597077,0.7713987473903967,0.2585731683069888,0.13656054279749477,0.9014352818371607,0.05762004175365346,0.950347947112039,0.02944676409185805,0.9715031315240084,0.0197633959638135,0.9781576200417537,0.014877348643006268,0.9818110647181629,0.7713987473903967,0.8432785828350186,0.8436385628906108,0.8436803457907981,0.8436922193976949,0.8436986631082636,0.7713987473903967,0.7926986810043013,0.8066848794942646,0.8115576206060865,0.8129087269558002,0.8135973837485255,0.7713987473903967,0.6929147114308428,0.6972607407491801,0.6981100717727863,0.6982601227257159,0.6983171494463136,0.6984116893552017
+ 3.3045267489711936,400,0.7667014613778705,0.9843423799582464,0.9932150313152401,0.9958246346555324,0.9973903966597077,0.9979123173277662,0.7667014613778705,0.25692041952480366,0.13870041753653445,0.9156576200417536,0.05810020876826725,0.9582637439109255,0.029598121085595,0.9765483646485734,0.01986778009742519,0.9833768267223383,0.014945198329853866,0.986464857341684,0.7667014613778705,0.8421752732824312,0.8424954415974232,0.8425358910333786,0.8425483391786986,0.8425515411459873,0.7667014613778705,0.8002168358295473,0.8125113081884888,0.8167350090334409,0.8181122471507385,0.8186874070081017,0.7667014613778705,0.7007206423896271,0.7046277360194696,0.7053668771050886,0.7055166914145262,0.7055658329670217,0.7056512281794008
+ 4.954732510288066,600,0.7713987473903967,0.9853862212943633,0.994258872651357,0.9968684759916493,0.9973903966597077,0.9984342379958246,0.7713987473903967,0.25851227756238193,0.1394832985386221,0.9209203201113431,0.0581837160751566,0.9596555323590814,0.0296659707724426,0.9788100208768268,0.019888656924147527,0.9843771746694502,0.014968684759916497,0.988204592901879,0.7713987473903967,0.8448394509501435,0.8451478929977786,0.845185992264558,0.845190415321067,0.8451967261265305,0.7713987473903967,0.8040681543125745,0.8152493764658302,0.8196701648084599,0.8208022806585027,0.8214985185492638,0.7713987473903967,0.7044865019071646,0.7080890700816939,0.7088637949529534,0.7089946233102989,0.7090485452746494,0.709122690469693