Dataset schema (35 columns; value types with observed ranges or distinct-class counts):

| Column | Dtype | Values |
|---|---|---|
| eval_name | string | lengths 12 to 78 |
| Precision | string | 3 classes |
| Type | string | 5 classes |
| T | string | 5 classes |
| Weight type | string | 2 classes |
| Architecture | string | 35 classes |
| Model | string | lengths 355 to 551 |
| fullname | string | lengths 4 to 69 |
| Model sha | string | lengths 0 to 40 |
| Average ⬆️ | float64 | 1.41 to 44.8 |
| Hub License | string | 22 classes |
| Hub ❤️ | int64 | 0 to 5.59k |
| #Params (B) | int64 | -1 to 140 |
| Available on the hub | bool | 2 classes |
| Not_Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| Chat Template | bool | 2 classes |
| IFEval Raw | float64 | 0 to 0.87 |
| IFEval | float64 | 0 to 86.7 |
| BBH Raw | float64 | 0.29 to 0.75 |
| BBH | float64 | 1.46 to 62.8 |
| MATH Lvl 5 Raw | float64 | 0 to 0.41 |
| MATH Lvl 5 | float64 | 0 to 41.2 |
| GPQA Raw | float64 | 0.22 to 0.41 |
| GPQA | float64 | 0 to 20.9 |
| MUSR Raw | float64 | 0.3 to 0.52 |
| MUSR | float64 | 0.29 to 25.9 |
| MMLU-PRO Raw | float64 | 0.1 to 0.57 |
| MMLU-PRO | float64 | 0 to 52.6 |
| Maintainer's Highlight | bool | 2 classes |
| Upload To Hub Date | string | lengths 0 to 10 |
| Submission Date | string | 74 classes |
| Generation | int64 | 0 to 6 |
| Base Model | string | lengths 4 to 77 |
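One relationship between the schema's columns is worth noting: in the records below, the Average ⬆️ value equals the arithmetic mean of the six normalized benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal check in plain Python, using the values copied from the first record (0-hero/Matter-0.2-7B-DPO):

```python
# Normalized benchmark scores copied from the 0-hero/Matter-0.2-7B-DPO record.
scores = {
    "IFEval": 33.027921,
    "BBH": 10.055525,
    "MATH Lvl 5": 0.830816,
    "GPQA": 1.230425,
    "MUSR": 5.871875,
    "MMLU-PRO": 1.817376,
}

# The leaderboard's "Average ⬆️" is the mean of the six normalized scores.
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 8.805656, matching the record's Average ⬆️
```

The same holds for the complete records that follow (e.g. 01-ai/Yi-1.5-34B: mean of its six scores is 25.432496).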
Records. In the original viewer, each Model cell is a pair of links derived from *fullname*: the model page at `https://huggingface.co/{fullname}` and the per-model evaluation details at `https://huggingface.co/datasets/open-llm-leaderboard/{org}__{model}-details` (📑); since both are determined by *fullname*, the column is folded into it below. The final record is truncated in the preview; its missing cells are left empty.

| eval_name | Precision | Type | T | Weight type | Architecture | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Not_Merged | MoE | Flagged | Chat Template | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Maintainer's Highlight | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0-hero_Matter-0.2-7B-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | 0-hero/Matter-0.2-7B-DPO | 26a66f0d862e2024ce4ad0a09c37052ac36e8af6 | 8.805656 | apache-2.0 | 3 | 7 | true | true | true | false | true | 0.330279 | 33.027921 | 0.359625 | 10.055525 | 0.008308 | 0.830816 | 0.259228 | 1.230425 | 0.381375 | 5.871875 | 0.116356 | 1.817376 | false | 2024-04-13 | 2024-08-05 | 0 | 0-hero/Matter-0.2-7B-DPO |
| 01-ai_Yi-1.5-34B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-34B | 4b486f81c935a2dadde84c6baa1e1370d40a098f | 25.432496 | apache-2.0 | 46 | 34 | true | true | true | false | false | 0.284117 | 28.411725 | 0.597639 | 42.749363 | 0.140483 | 14.048338 | 0.365772 | 15.436242 | 0.423604 | 11.217188 | 0.466589 | 40.732122 | true | 2024-05-11 | 2024-06-12 | 0 | 01-ai/Yi-1.5-34B |
| 01-ai_Yi-1.5-34B-32K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-34B-32K | 2c03a29761e4174f20347a60fbe229be4383d48b | 26.400622 | apache-2.0 | 35 | 34 | true | true | true | false | false | 0.311869 | 31.186917 | 0.601569 | 43.381847 | 0.134441 | 13.444109 | 0.363255 | 15.100671 | 0.439823 | 14.077865 | 0.470911 | 41.212323 | true | 2024-05-15 | 2024-06-12 | 0 | 01-ai/Yi-1.5-34B-32K |
| 01-ai_Yi-1.5-34B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-34B-Chat | f3128b2d02d82989daae566c0a7eadc621ca3254 | 32.627883 | apache-2.0 | 228 | 34 | true | true | true | false | true | 0.606676 | 60.667584 | 0.608375 | 44.262826 | 0.233384 | 23.338369 | 0.364933 | 15.324385 | 0.428198 | 13.058073 | 0.452045 | 39.116061 | true | 2024-05-10 | 2024-06-12 | 0 | 01-ai/Yi-1.5-34B-Chat |
| 01-ai_Yi-1.5-34B-Chat-16K_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-34B-Chat-16K | ff74452e11f0f749ab872dc19b1dd3813c25c4d8 | 28.975559 | apache-2.0 | 26 | 34 | true | true | true | false | true | 0.45645 | 45.645 | 0.610022 | 44.536157 | 0.188066 | 18.806647 | 0.338087 | 11.744966 | 0.43976 | 13.736719 | 0.454455 | 39.383865 | true | 2024-05-15 | 2024-07-15 | 0 | 01-ai/Yi-1.5-34B-Chat-16K |
| 01-ai_Yi-1.5-6B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-6B | cab51fce425b4c1fb19fccfdd96bd5d0908c1657 | 16.5317 | apache-2.0 | 27 | 6 | true | true | true | false | false | 0.26166 | 26.166017 | 0.449258 | 22.027905 | 0.053625 | 5.362538 | 0.313758 | 8.501119 | 0.437406 | 13.309115 | 0.314412 | 23.823508 | true | 2024-05-11 | 2024-08-10 | 0 | 01-ai/Yi-1.5-6B |
| 01-ai_Yi-1.5-6B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-6B-Chat | 3f64d3f159c6ad8494227bb77e2a7baef8cd808b | 22.048529 | apache-2.0 | 39 | 6 | true | true | true | false | true | 0.48023 | 48.023023 | 0.455486 | 23.550511 | 0.125378 | 12.537764 | 0.317953 | 9.060403 | 0.44324 | 14.704948 | 0.319731 | 24.414524 | true | 2024-05-11 | 2024-06-12 | 0 | 01-ai/Yi-1.5-6B-Chat |
| 01-ai_Yi-1.5-9B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-9B | 8cfde9604384c50137bee480b8cef8a08e5ae81d | 21.952492 | apache-2.0 | 43 | 8 | true | true | true | false | false | 0.293584 | 29.358436 | 0.514294 | 30.500717 | 0.101964 | 10.196375 | 0.379195 | 17.225951 | 0.432781 | 12.03099 | 0.391622 | 32.402482 | true | 2024-05-11 | 2024-06-12 | 0 | 01-ai/Yi-1.5-9B |
| 01-ai_Yi-1.5-9B-32K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-9B-32K | 116561dfae63af90f9d163b43077629e0e916bb1 | 19.608376 | apache-2.0 | 18 | 8 | true | true | true | false | false | 0.230311 | 23.031113 | 0.496332 | 28.937012 | 0.095921 | 9.592145 | 0.35906 | 14.541387 | 0.418615 | 10.826823 | 0.376496 | 30.721779 | true | 2024-05-15 | 2024-06-12 | 0 | 01-ai/Yi-1.5-9B-32K |
| 01-ai_Yi-1.5-9B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-9B-Chat | bc87d8557c98dc1e5fdef6ec23ed31088c4d3f35 | 27.705595 | apache-2.0 | 127 | 8 | true | true | true | false | true | 0.604553 | 60.455259 | 0.555906 | 36.952931 | 0.116314 | 11.63142 | 0.334732 | 11.297539 | 0.425906 | 12.838281 | 0.397523 | 33.058141 | true | 2024-05-10 | 2024-06-12 | 0 | 01-ai/Yi-1.5-9B-Chat |
| 01-ai_Yi-1.5-9B-Chat-16K_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | 01-ai/Yi-1.5-9B-Chat-16K | 2b397e5f0fab87984efa66856c5c4ed4bbe68b50 | 22.896812 | apache-2.0 | 31 | 8 | true | true | true | false | true | 0.421404 | 42.14041 | 0.515338 | 31.497609 | 0.126133 | 12.613293 | 0.308725 | 7.829978 | 0.409906 | 10.038281 | 0.399352 | 33.261303 | true | 2024-05-15 | 2024-06-12 | 0 | 01-ai/Yi-1.5-9B-Chat-16K |
| 01-ai_Yi-34B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-34B | e1e7da8c75cfd5c44522228599fd4d2990cedd1c | 22.259834 | apache-2.0 | 1,278 | 34 | true | true | true | false | false | 0.304575 | 30.457519 | 0.54571 | 35.542431 | 0.044562 | 4.456193 | 0.366611 | 15.548098 | 0.411854 | 9.648438 | 0.441157 | 37.906324 | true | 2023-11-01 | 2024-06-12 | 0 | 01-ai/Yi-34B |
| 01-ai_Yi-34B-200K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-34B-200K | 8ac1a1ebe011df28b78ccd08012aeb2222443c77 | 19.799477 | apache-2.0 | 313 | 34 | true | true | true | false | false | 0.154249 | 15.424851 | 0.544182 | 36.02211 | 0.044562 | 4.456193 | 0.356544 | 14.205817 | 0.381719 | 9.414844 | 0.453457 | 39.27305 | true | 2023-11-06 | 2024-06-12 | 0 | 01-ai/Yi-34B-200K |
| 01-ai_Yi-34B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | 01-ai/Yi-34B-Chat | 2e528b6a80fb064a0a746c5ca43114b135e30464 | 23.899372 | apache-2.0 | 342 | 34 | true | true | true | false | true | 0.469889 | 46.988878 | 0.556087 | 37.623988 | 0.043051 | 4.305136 | 0.338087 | 11.744966 | 0.397844 | 8.363802 | 0.409325 | 34.369459 | true | 2023-11-22 | 2024-06-12 | 0 | 01-ai/Yi-34B-Chat |
| 01-ai_Yi-6B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-6B | 7f7fb7662fd8ec09029364f408053c954986c8e5 | 13.599029 | apache-2.0 | 367 | 6 | true | true | true | false | false | 0.289338 | 28.933785 | 0.430923 | 19.408505 | 0.015106 | 1.510574 | 0.269295 | 2.572707 | 0.393687 | 7.044271 | 0.299119 | 22.124335 | true | 2023-11-01 | 2024-06-12 | 0 | 01-ai/Yi-6B |
| 01-ai_Yi-6B-200K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-6B-200K | 4a74338e778a599f313e9fa8f5bc08c717604420 | 11.895393 | apache-2.0 | 173 | 6 | true | true | true | false | false | 0.084331 | 8.433069 | 0.428929 | 20.14802 | 0.012085 | 1.208459 | 0.281879 | 4.250559 | 0.45874 | 16.842448 | 0.284408 | 20.489805 | true | 2023-11-06 | 2024-06-12 | 0 | 01-ai/Yi-6B-200K |
| 01-ai_Yi-6B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | 01-ai/Yi-6B-Chat | 01f7fabb6cfb26efeb764da4a0a19cad2c754232 | 14.004357 | apache-2.0 | 62 | 6 | true | true | true | false | true | 0.339521 | 33.952136 | 0.41326 | 17.000167 | 0.006798 | 0.679758 | 0.294463 | 5.928412 | 0.368792 | 3.565625 | 0.3061 | 22.900044 | true | 2023-11-22 | 2024-06-12 | 0 | 01-ai/Yi-6B-Chat |
| 01-ai_Yi-9B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-9B | b4a466d95091696285409f1dcca3028543cb39da | 17.610457 | apache-2.0 | 182 | 8 | true | true | true | false | false | 0.270878 | 27.087794 | 0.493961 | 27.626956 | 0.043807 | 4.380665 | 0.317953 | 9.060403 | 0.405406 | 8.909115 | 0.35738 | 28.597813 | true | 2024-03-01 | 2024-06-12 | 0 | 01-ai/Yi-9B |
| 01-ai_Yi-9B-200K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | 01-ai/Yi-9B-200K | 8c93accd5589dbb74ee938e103613508c4a9b88d | 17.591083 | apache-2.0 | 75 | 8 | true | true | true | false | false | 0.232709 | 23.270921 | 0.47933 | 26.492495 | 0.058157 | 5.81571 | 0.315436 | 8.724832 | 0.429406 | 12.109115 | 0.362201 | 29.133422 | true | 2024-03-15 | 2024-06-12 | 0 | 01-ai/Yi-9B-200K |
| 152334H_miqu-1-70b-sf_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | 152334H/miqu-1-70b-sf | 1dca4cce36f01f2104ee2e6b97bac6ff7bb300c1 | 28.820469 | | 218 | 68 | false | true | true | false | false | 0.518174 | 51.8174 | 0.610236 | 43.807147 | 0.108006 | 10.800604 | 0.350671 | 13.422819 | 0.458208 | 17.209375 | 0.422789 | 35.86547 | false | 2024-01-30 | 2024-06-26 | 0 | 152334H/miqu-1-70b-sf |
| 4season_final_model_test_v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | 4season/final_model_test_v2 | cf690c35d9cf0b0b6bf034fa16dbf88c56fe861c | 21.91554 | apache-2.0 | 0 | 21 | true | true | true | false | false | 0.319113 | 31.911329 | 0.634205 | 47.41067 | 0.013595 | 1.359517 | 0.327181 | 10.290828 | 0.431448 | 12.43099 | 0.352809 | 28.089908 | false | 2024-05-20 | 2024-06-27 | 0 | 4season/final_model_test_v2 |
| AALF_gemma-2-27b-it-SimPO-37K_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | AALF/gemma-2-27b-it-SimPO-37K | 27f15219df2000a16955c9403c3f38b5f3413b3d | 9.298079 | gemma | 12 | 27 | true | true | true | false | true | 0.240653 | 24.065258 | 0.391134 | 15.307881 | 0 | 0 | 0.280201 | 4.026846 | 0.34876 | 1.595052 | 0.197141 | 10.79344 | false | 2024-08-13 | 2024-09-05 | 2 | google/gemma-2-27b |
| AI-MO_NuminaMath-7B-TIR_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | AI-MO/NuminaMath-7B-TIR | c6e394cc0579423c9cde6df6cc192c07dae73388 | 11.790547 | apache-2.0 | 300 | 6 | true | true | true | false | false | 0.275624 | 27.562423 | 0.414369 | 16.873547 | 0.017372 | 1.73716 | 0.258389 | 1.118568 | 0.350927 | 4.199219 | 0.273271 | 19.252364 | false | 2024-07-04 | 2024-07-11 | 1 | deepseek-ai/deepseek-math-7b-base |
| AI-Sweden-Models_Llama-3-8B-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | AI-Sweden-Models/Llama-3-8B-instruct | 4e1c955228bdb4d69c1c4560e8d5872312a8f033 | 13.777204 | llama3 | 9 | 8 | true | true | true | false | true | 0.240128 | 24.012841 | 0.417346 | 18.388096 | 0.004532 | 0.453172 | 0.26594 | 2.12528 | 0.477094 | 19.936719 | 0.259724 | 17.747119 | false | 2024-06-01 | 2024-06-27 | 2 | meta-llama/Meta-Llama-3-8B |
| AI-Sweden-Models_gpt-sw3-40b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPT2LMHeadModel | AI-Sweden-Models/gpt-sw3-40b | 1af27994df1287a7fac1b10d60e40ca43a22a385 | 4.684081 | other | 8 | 39 | true | true | true | false | false | 0.14703 | 14.702988 | 0.326774 | 6.894934 | 0.006042 | 0.60423 | 0.234899 | 0 | 0.36324 | 2.838281 | 0.127576 | 3.064051 | false | 2023-02-22 | 2024-06-26 | 0 | AI-Sweden-Models/gpt-sw3-40b |
| AbacusResearch_Jallabi-34B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | AbacusResearch/Jallabi-34B | f65696da4ed82c9a20e94b200d9dccffa07af682 | 25.972084 | apache-2.0 | 2 | 34 | true | true | true | false | false | 0.35286 | 35.286041 | 0.602338 | 43.615765 | 0.039275 | 3.927492 | 0.338926 | 11.856823 | 0.482177 | 20.238802 | 0.468168 | 40.90758 | false | 2024-03-01 | 2024-06-27 | 0 | AbacusResearch/Jallabi-34B |
| Alibaba-NLP_gte-Qwen2-7B-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | Alibaba-NLP/gte-Qwen2-7B-instruct | e26182b2122f4435e8b3ebecbf363990f409b45b | 13.34324 | apache-2.0 | 151 | 7 | true | true | true | false | true | 0.22554 | 22.554045 | 0.449514 | 21.925482 | 0.034743 | 3.47432 | 0.244966 | 0 | 0.355854 | 6.315104 | 0.332114 | 25.790485 | false | 2024-06-15 | 2024-08-05 | 0 | Alibaba-NLP/gte-Qwen2-7B-instruct |
| Artples_L-MChat-7b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | Artples/L-MChat-7b | e10137f5cbfc1b73068d6473e4a87241cca0b3f4 | 21.024495 | apache-2.0 | 1 | 7 | true | false | true | false | true | 0.529665 | 52.966462 | 0.460033 | 24.201557 | 0.079305 | 7.930514 | 0.305369 | 7.38255 | 0.402865 | 8.12474 | 0.32987 | 25.54115 | false | 2024-04-02 | 2024-07-07 | 1 | Artples/L-MChat-7b (Merge) |
| Artples_L-MChat-Small_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | PhiForCausalLM | Artples/L-MChat-Small | 52484c277f6062c12dc6d6b6397ee0d0c21b0126 | 14.866273 | mit | 1 | 2 | true | false | true | false | true | 0.328706 | 32.870561 | 0.482256 | 26.856516 | 0.015861 | 1.586103 | 0.267617 | 2.348993 | 0.369594 | 9.265885 | 0.246426 | 16.269577 | false | 2024-04-11 | 2024-07-07 | 1 | Artples/L-MChat-Small (Merge) |
| Azure99_blossom-v5.1-34b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | Azure99/blossom-v5.1-34b | 2c803204f5dbf4ce37e2df98eb0205cdc53de10d | 28.385288 | apache-2.0 | 4 | 34 | true | true | true | false | true | 0.569656 | 56.965629 | 0.610911 | 44.147705 | 0.14426 | 14.425982 | 0.309564 | 7.941834 | 0.392792 | 7.298958 | 0.455785 | 39.531619 | false | 2024-05-19 | 2024-07-27 | 0 | Azure99/blossom-v5.1-34b |
| Azure99_blossom-v5.1-9b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | Azure99/blossom-v5.1-9b | 6044a3dc1e04529fe883aa513d37f266a320d793 | 24.682682 | apache-2.0 | 1 | 8 | true | true | true | false | true | 0.508582 | 50.858167 | 0.534329 | 34.201244 | 0.104985 | 10.498489 | 0.33557 | 11.409396 | 0.399396 | 8.024479 | 0.397939 | 33.104314 | false | 2024-05-15 | 2024-07-24 | 0 | Azure99/blossom-v5.1-9b |
| BAAI_Gemma2-9B-IT-Simpo-Infinity-Preference_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference | 028a91b1a4f14d365c6db08093b03348455c7bad | 20.974834 | | 7 | 9 | false | true | true | false | true | 0.317638 | 31.763831 | 0.597946 | 42.190844 | 0 | 0 | 0.339765 | 11.96868 | 0.396573 | 8.104948 | 0.386386 | 31.8207 | false | 2024-08-28 | 2024-09-05 | 2 | google/gemma-2-9b |
| BAAI_Infinity-Instruct-3M-0613-Llama3-70B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | BAAI/Infinity-Instruct-3M-0613-Llama3-70B | 9fc53668064bdda22975ca72c5a287f8241c95b3 | 34.470489 | apache-2.0 | 4 | 70 | true | true | true | false | true | 0.682113 | 68.211346 | 0.664161 | 51.327161 | 0.148792 | 14.879154 | 0.358221 | 14.42953 | 0.45226 | 16.532552 | 0.472989 | 41.443189 | false | 2024-06-27 | 2024-06-28 | 0 | BAAI/Infinity-Instruct-3M-0613-Llama3-70B |
| BAAI_Infinity-Instruct-3M-0613-Mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | BAAI/Infinity-Instruct-3M-0613-Mistral-7B | c7a742e539ec264b9eaeefe2aed29e92e8a7ebd6 | 22.041768 | apache-2.0 | 10 | 7 | true | true | true | false | true | 0.531987 | 53.198735 | 0.495823 | 28.992936 | 0.066465 | 6.646526 | 0.296141 | 6.152125 | 0.435083 | 13.252083 | 0.316074 | 24.0082 | false | 2024-06-21 | 2024-06-27 | 0 | BAAI/Infinity-Instruct-3M-0613-Mistral-7B |
| BAAI_Infinity-Instruct-3M-0625-Llama3-70B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | BAAI/Infinity-Instruct-3M-0625-Llama3-70B | 6d8ceada57e55cff3503191adc4d6379ff321fe2 | 35.877866 | apache-2.0 | 3 | 70 | true | true | true | false | true | 0.744212 | 74.421202 | 0.667034 | 52.028162 | 0.163142 | 16.314199 | 0.357383 | 14.317673 | 0.461656 | 18.340365 | 0.45861 | 39.845597 | false | 2024-07-09 | 2024-08-30 | 0 | BAAI/Infinity-Instruct-3M-0625-Llama3-70B |
| BAAI_Infinity-Instruct-3M-0625-Llama3-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | BAAI/Infinity-Instruct-3M-0625-Llama3-8B | 7be7c0ff1e35c3bb781c47222da99a1724f5f1da | 21.47089 | apache-2.0 | 3 | 8 | true | true | true | false | true | 0.605027 | 60.502688 | 0.495499 | 28.988222 | 0.05287 | 5.287009 | 0.275168 | 3.355705 | 0.371208 | 5.667708 | 0.325216 | 25.02401 | false | 2024-07-09 | 2024-07-13 | 0 | BAAI/Infinity-Instruct-3M-0625-Llama3-8B |
| BAAI_Infinity-Instruct-3M-0625-Mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | BAAI/Infinity-Instruct-3M-0625-Mistral-7B | 302e3ae0bcc50dae3fb69fc1b08b518398e8c407 | 22.692368 | apache-2.0 | 2 | 7 | true | true | true | false | true | 0.586742 | 58.674207 | 0.493967 | 28.823289 | 0.067221 | 6.722054 | 0.286913 | 4.9217 | 0.42724 | 12.238281 | 0.322972 | 24.774675 | false | 2024-07-09 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-3M-0625-Mistral-7B |
| BAAI_Infinity-Instruct-3M-0625-Qwen2-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | BAAI/Infinity-Instruct-3M-0625-Qwen2-7B | 503c24156d7682458686a7b5324f7f886e63470d | 24.009476 | apache-2.0 | 7 | 7 | true | true | true | false | true | 0.555393 | 55.539302 | 0.534591 | 34.656829 | 0.061178 | 6.117825 | 0.312919 | 8.389262 | 0.38876 | 6.461719 | 0.396027 | 32.891918 | false | 2024-07-09 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-3M-0625-Qwen2-7B |
| BAAI_Infinity-Instruct-3M-0625-Yi-1.5-9B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B | a42c86c61b98ca4fdf238d688fe6ea11cf414d29 | 27.742141 | apache-2.0 | 2 | 8 | true | true | true | false | true | 0.518598 | 51.859843 | 0.550912 | 35.378707 | 0.139728 | 13.97281 | 0.354027 | 13.870246 | 0.457531 | 16.72474 | 0.411818 | 34.646498 | false | 2024-07-09 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B |
| BAAI_Infinity-Instruct-7M-0729-Llama3_1-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B | 0aca33fd7500a781d041e8bf7e5e3789b03f54f4 | 22.943899 | llama3.1 | 4 | 8 | true | true | true | false | true | 0.613195 | 61.319521 | 0.507734 | 30.888805 | 0.097432 | 9.743202 | 0.292785 | 5.704698 | 0.357844 | 5.297135 | 0.32239 | 24.710033 | false | 2024-08-02 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B |
| BAAI_Infinity-Instruct-7M-0729-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | BAAI/Infinity-Instruct-7M-0729-mistral-7B | 36651591cb13346ecbde23832013e024029700fa | 22.763277 | apache-2.0 | 1 | 7 | true | true | true | false | true | 0.616193 | 61.619281 | 0.496381 | 28.697915 | 0.055891 | 5.589124 | 0.290268 | 5.369128 | 0.406188 | | | | | | | | |
10.040104
0.327377
25.264111
false
2024-07-25
2024-08-05
0
BAAI/Infinity-Instruct-7M-0729-mistral-7B
BAAI_Infinity-Instruct-7M-Gen-Llama3_1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-Llama3_1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B
56f9c2845ae024eb8b1dd9ea0d8891cbaf33c596
22.943899
llama3.1
4
8
true
true
true
false
true
0.613195
61.319521
0.507734
30.888805
0.097432
9.743202
0.292785
5.704698
0.357844
5.297135
0.32239
24.710033
false
2024-08-02
2024-08-29
0
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B
BAAI_Infinity-Instruct-7M-Gen-mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-Gen-mistral-7B
82c83d670a8954f4250547b53a057dea1fbd460d
22.737882
apache-2.0
1
7
true
true
true
false
true
0.614669
61.466908
0.496381
28.697915
0.055891
5.589124
0.290268
5.369128
0.406188
10.040104
0.327377
25.264111
false
2024-07-25
2024-08-29
0
BAAI/Infinity-Instruct-7M-Gen-mistral-7B
BEE-spoke-data_Meta-Llama-3-8Bee_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/Meta-Llama-3-8Bee" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/Meta-Llama-3-8Bee</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__Meta-Llama-3-8Bee-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/Meta-Llama-3-8Bee
8143e34e77a49a30ec2617c5c9cc22cb3cda2287
14.494166
llama3
0
8
true
true
true
false
false
0.195066
19.506576
0.462636
24.199033
0.03852
3.851964
0.313758
8.501119
0.365406
6.242448
0.321975
24.663859
false
2024-04-28
2024-07-04
1
meta-llama/Meta-Llama-3-8B
BEE-spoke-data_smol_llama-101M-GQA_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-101M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-101M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-101M-GQA
bb26643db413bada7e0c3c50752bf9da82403dba
3.918895
apache-2.0
25
0
true
true
true
false
false
0.138437
13.843712
0.301756
3.198004
0
0
0.25755
1.006711
0.371271
4.275521
0.110705
1.189421
false
2023-10-26
2024-07-06
0
BEE-spoke-data/smol_llama-101M-GQA
BEE-spoke-data_smol_llama-220M-GQA_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-GQA
8845b1d3c0bc73522ef2700aab467183cbdca9f7
6.401567
apache-2.0
11
0
true
true
true
false
false
0.238605
23.860468
0.303167
3.037843
0
0
0.255872
0.782998
0.405875
9.067708
0.114943
1.660387
false
2023-12-22
2024-06-26
0
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_smol_llama-220M-GQA-fineweb_edu_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-fineweb_edu-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu
dec16b41d5e94070dbc1f8449a554373fd4cc1d1
6.516558
apache-2.0
1
0
true
true
true
false
false
0.198812
19.881248
0.292905
2.314902
0
0
0.259228
1.230425
0.43676
14.261719
0.112699
1.411052
false
2024-06-08
2024-06-26
1
BEE-spoke-data/smol_llama-220M-GQA
Ba2han_Llama-Phi-3_DoRA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Ba2han/Llama-Phi-3_DoRA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ba2han/Llama-Phi-3_DoRA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ba2han__Llama-Phi-3_DoRA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Ba2han/Llama-Phi-3_DoRA
36f99064a7be8ba475c2ee5c5424e95c263ccb87
25.142604
mit
5
3
true
true
true
false
true
0.513053
51.305314
0.551456
37.249164
0.101964
10.196375
0.326342
10.178971
0.406927
9.532552
0.391539
32.393248
false
2024-05-15
2024-06-26
0
Ba2han/Llama-Phi-3_DoRA
Casual-Autopsy_L3-Umbral-Mind-RP-v2.0-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Casual-Autopsy__L3-Umbral-Mind-RP-v2.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B
b46c066ea8387264858dc3461f382e7b42fd9c48
25.76087
llama3
11
8
true
false
true
false
true
0.712263
71.226346
0.526241
32.486278
0.101208
10.120846
0.286913
4.9217
0.368667
5.55
0.37234
30.260047
false
2024-06-26
2024-07-02
1
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B (Merge)
CausalLM_14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/14B
cc054cf5953252d0709cb3267d1a85246e489e95
16.530646
wtfpl
299
14
true
true
true
false
false
0.278821
27.882131
0.470046
24.780943
0.033233
3.323263
0.302852
7.04698
0.415479
11.468229
0.322141
24.682329
true
2023-10-22
2024-06-12
0
CausalLM/14B
CausalLM_34b-beta_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/34b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/34b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__34b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/34b-beta
0429951eb30ccdfff3515e711aaa7649a8a7364c
23.18454
gpl-3.0
61
34
true
true
true
false
false
0.304325
30.432475
0.5591
36.677226
0.041541
4.154079
0.346477
12.863535
0.374865
6.92474
0.532497
48.055186
true
2024-02-06
2024-06-26
0
CausalLM/34b-beta
Changgil_K2S3-14b-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Changgil/K2S3-14b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Changgil/K2S3-14b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-14b-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Changgil/K2S3-14b-v0.2
b4f0e1eed2640df2b75847ff37e6ebb1be217b6c
15.074375
cc-by-nc-4.0
0
14
true
true
true
false
false
0.324284
32.428401
0.461331
24.283947
0.045317
4.531722
0.28104
4.138702
0.39226
6.799219
0.264378
18.264258
false
2024-06-17
2024-06-27
0
Changgil/K2S3-14b-v0.2
Changgil_K2S3-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Changgil/K2S3-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Changgil/K2S3-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Changgil/K2S3-v0.1
d544e389f091983bb4f11314edb526d81753c919
14.751167
cc-by-nc-4.0
0
14
true
true
true
false
false
0.327656
32.765617
0.465549
24.559558
0.040785
4.07855
0.264262
1.901566
0.401406
7.842448
0.256233
17.359264
false
2024-04-29
2024-06-27
0
Changgil/K2S3-v0.1
ClaudioItaly_Evolutionstory-7B-v2.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Evolutionstory-7B-v2.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Evolutionstory-7B-v2.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Evolutionstory-7B-v2.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Evolutionstory-7B-v2.2
9f838721d24a5195bed59a5ed8d9af536f7f2459
20.697542
mit
1
7
true
false
true
false
false
0.481379
48.137941
0.510804
31.623865
0.064199
6.41994
0.275168
3.355705
0.413531
10.658073
0.315908
23.989731
false
2024-08-30
2024-09-01
1
ClaudioItaly/Evolutionstory-7B-v2.2 (Merge)
CohereForAI_aya-23-35B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-23-35B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-23-35B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-35B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-23-35B
31d6fd858f20539a55401c7ad913086f54d9ca2c
24.616939
cc-by-nc-4.0
235
34
true
true
true
false
true
0.646193
64.619321
0.539955
34.85836
0.026435
2.643505
0.294463
5.928412
0.43099
13.473698
0.335605
26.178339
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-35B
CohereForAI_aya-23-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-23-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-23-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-23-8B
ec151d218a24031eb039d92fb83d10445427efc9
15.973219
cc-by-nc-4.0
354
8
true
true
true
false
true
0.469889
46.988878
0.429616
20.203761
0.01435
1.435045
0.284396
4.58613
0.394063
8.424479
0.227809
14.20102
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-8B
CohereForAI_c4ai-command-r-plus_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-plus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-plus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-plus
fa1bd7fb1572ceb861bbbbecfa8af83b29fa8cca
30.860542
cc-by-nc-4.0
1645
103
true
true
true
false
true
0.766419
76.641866
0.581542
39.919954
0.075529
7.55287
0.305369
7.38255
0.480719
20.423177
0.399186
33.242834
true
2024-04-03
2024-06-13
0
CohereForAI/c4ai-command-r-plus
CohereForAI_c4ai-command-r-v01_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-v01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-v01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-v01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-v01
16881ccde1c68bbc7041280e6a66637bc46bfe88
25.349978
cc-by-nc-4.0
1041
34
true
true
true
false
true
0.674819
67.481948
0.540642
34.556659
0
0
0.307047
7.606264
0.451698
16.128906
0.336935
26.326093
true
2024-03-11
2024-06-13
0
CohereForAI/c4ai-command-r-v01
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.483995
0
2
false
true
true
false
true
0.327831
32.783127
0.391996
14.585976
0.043051
4.305136
0.249161
0
0.41201
9.834635
0.166556
7.395095
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.085859
0
2
false
true
true
false
true
0.310246
31.02457
0.388103
14.243046
0.043051
4.305136
0.253356
0.447427
0.408073
9.109115
0.166473
7.38586
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-odpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-odpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-odpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
090d9f59c3b47ab8dd099ddd278c058aa6d2d529
11.35609
4
2
false
true
true
false
true
0.306649
30.664858
0.389584
14.023922
0.037009
3.700906
0.24245
0
0.427917
12.05625
0.169215
7.690603
false
2024-06-28
2024-07-13
0
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
Columbia-NLP_LION-Gemma-2b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-sft-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-sft-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-sft-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-sft-v1.0
44d6f26fa7e3b0d238064d844569bf8a07b7515e
12.326312
0
2
false
true
true
false
true
0.369247
36.924693
0.387878
14.117171
0.05136
5.135952
0.255872
0.782998
0.40274
8.309115
0.178191
8.687943
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-sft-v1.0
Columbia-NLP_LION-LLaMA-3-8b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
3cddd4a6f5939a0a4db1092a0275342b7b9912f3
21.34482
2
8
false
true
true
false
true
0.495742
49.574241
0.502848
30.356399
0.090634
9.063444
0.28104
4.138702
0.409719
10.28151
0.321892
24.654625
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-odpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
e2cec0d68a67092951e9205dfe634a59f2f4a2dd
19.286743
2
8
false
true
true
false
true
0.396799
39.679938
0.502393
30.457173
0.072508
7.250755
0.285235
4.697987
0.40575
9.71875
0.315243
23.915854
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-sft-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
822eddb2fd127178d9fb7bb9f4fca0e93ada2836
20.257926
0
8
false
true
true
false
true
0.381712
38.171164
0.508777
30.88426
0.084592
8.459215
0.277685
3.691275
0.450271
15.483854
0.32372
24.857787
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
CombinHorizon_YiSM-blossom5.1-34B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/YiSM-blossom5.1-34B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/YiSM-blossom5.1-34B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__YiSM-blossom5.1-34B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/YiSM-blossom5.1-34B-SLERP
ebd8d6507623008567a0548cd0ff9e28cbd6a656
31.090404
apache-2.0
0
34
true
false
true
false
true
0.503311
50.331121
0.620755
46.397613
0.197885
19.78852
0.355705
14.09396
0.441344
14.367969
0.474069
41.563239
false
2024-08-27
2024-08-27
1
CombinHorizon/YiSM-blossom5.1-34B-SLERP (Merge)
CoolSpring_Qwen2-0.5B-Abyme_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme
a48b7c04b854e5c60fe3464f96904bfc53c8640c
4.76082
apache-2.0
0
0
true
true
true
false
true
0.191519
19.15185
0.286183
2.276484
0.015106
1.510574
0.253356
0.447427
0.354219
1.477344
0.133311
3.701241
false
2024-07-18
2024-09-04
1
Qwen/Qwen2-0.5B
CoolSpring_Qwen2-0.5B-Abyme-merge2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge2
02c4c601453f7ecbfab5c95bf5afa889350026ba
6.068496
apache-2.0
0
0
true
false
true
false
true
0.202185
20.218465
0.299427
3.709041
0.018127
1.812689
0.260067
1.342282
0.368729
3.891146
0.148936
5.437352
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge2 (Merge)
CoolSpring_Qwen2-0.5B-Abyme-merge3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge3
86fed893893cc2a6240f0ea09ce2eeda1a5178cc
6.643962
apache-2.0
0
0
true
false
true
false
true
0.238605
23.860468
0.300314
4.301149
0.021148
2.114804
0.264262
1.901566
0.350094
2.128385
0.150017
5.557402
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge3 (Merge)
Corianas_llama-3-reactor_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/llama-3-reactor" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/llama-3-reactor</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__llama-3-reactor-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/llama-3-reactor
bef2eac42fd89baa0064badbc9c7958ad9ccbed3
13.945117
apache-2.0
0
-1
true
true
true
false
false
0.230012
23.001192
0.445715
21.88856
0.043807
4.380665
0.297819
6.375839
0.397719
8.014844
0.280086
20.009604
false
2024-07-20
2024-07-23
0
Corianas/llama-3-reactor
CortexLM_btlm-7b-base-v0.2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CortexLM/btlm-7b-base-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CortexLM/btlm-7b-base-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CortexLM__btlm-7b-base-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CortexLM/btlm-7b-base-v0.2
eda8b4298365a26c8981316e09427c237b11217f
8.844726
mit
0
6
true
true
true
false
false
0.148329
14.832866
0.400641
16.193277
0.010574
1.057402
0.253356
0.447427
0.384604
5.542188
0.234957
14.995198
false
2024-06-13
2024-06-26
0
CortexLM/btlm-7b-base-v0.2
Dampfinchen_Llama-3.1-8B-Ultra-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Dampfinchen/Llama-3.1-8B-Ultra-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dampfinchen/Llama-3.1-8B-Ultra-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dampfinchen__Llama-3.1-8B-Ultra-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dampfinchen/Llama-3.1-8B-Ultra-Instruct
46662d14130cfd34f7d90816540794f24a301f86
28.975994
llama3
4
8
true
false
true
false
true
0.808109
80.810915
0.525753
32.494587
0.149547
14.954683
0.291946
5.592841
0.400323
8.607031
0.382563
31.395907
false
2024-08-26
2024-08-26
1
Dampfinchen/Llama-3.1-8B-Ultra-Instruct (Merge)
Danielbrdz_Barcenas-14b-Phi-3-medium-ORPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-14b-Phi-3-medium-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO
b749dbcb19901b8fd0e9f38c923a24533569f895
31.423745
mit
3
13
true
true
true
false
true
0.479906
47.990554
0.653618
51.029418
0.174471
17.44713
0.326342
10.178971
0.48075
20.527083
0.472324
41.369311
false
2024-06-15
2024-08-13
0
Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO
Danielbrdz_Barcenas-Llama3-8b-ORPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-Llama3-8b-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-Llama3-8b-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-Llama3-8b-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Danielbrdz/Barcenas-Llama3-8b-ORPO
66c848c4526d3db1ec41468c0f73ac4448c6abe9
26.380536
other
6
8
true
true
true
false
true
0.737243
73.724274
0.498656
28.600623
0.057402
5.740181
0.307047
7.606264
0.418958
11.169792
0.382979
31.44208
false
2024-04-29
2024-06-29
0
Danielbrdz/Barcenas-Llama3-8b-ORPO
Dans-DiscountModels_Dans-Instruct-CoreCurriculum-12b-ChatML_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-CoreCurriculum-12b-ChatML-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML
56925fafe6a543e224db36864dd0927171542776
14.785708
apache-2.0
0
12
true
true
true
false
false
0.047811
4.781092
0.524414
32.024613
0.037764
3.776435
0.305369
7.38255
0.41849
12.077865
0.358045
28.67169
false
2024-09-04
2024-09-04
1
mistralai/Mistral-Nemo-Base-2407
Darkknight535_OpenCrystal-12B-L3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Darkknight535/OpenCrystal-12B-L3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Darkknight535/OpenCrystal-12B-L3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Darkknight535__OpenCrystal-12B-L3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Darkknight535/OpenCrystal-12B-L3
974d2d453afdde40f6a993601bbbbf9d97b43606
20.509243
10
11
false
true
true
false
false
0.407091
40.709096
0.52226
31.844491
0.079305
7.930514
0.306208
7.494407
0.365656
5.740365
0.364029
29.336584
false
2024-08-25
2024-08-26
0
Darkknight535/OpenCrystal-12B-L3
Deci_DeciLM-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
DeciLMForCausalLM
<a target="_blank" href="https://huggingface.co/Deci/DeciLM-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Deci/DeciLM-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Deci/DeciLM-7B
c3c9f4226801dc0433f32aebffe0aac68ee2f051
14.947949
apache-2.0
223
7
true
true
true
false
false
0.281295
28.129474
0.442286
21.25273
0.024169
2.416918
0.295302
6.040268
0.435854
13.048438
0.269199
18.799867
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B
Deci_DeciLM-7B-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
DeciLMForCausalLM
<a target="_blank" href="https://huggingface.co/Deci/DeciLM-7B-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Deci/DeciLM-7B-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Deci/DeciLM-7B-instruct
4adc7aa9efe61b47b0a98b2cc94527d9c45c3b4f
17.432328
apache-2.0
96
7
true
true
true
false
true
0.488024
48.8024
0.458975
23.887149
0.027946
2.794562
0.28943
5.257271
0.388417
5.985417
0.260805
17.867169
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B-instruct
DeepMount00_Llama-3-8b-Ita_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3-8b-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Llama-3-8b-Ita</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3-8b-Ita-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Llama-3-8b-Ita
d40847d2981b588690c1dc21d5157d3f4afb2978
26.582818
llama3
23
8
true
true
true
false
true
0.75303
75.302974
0.493577
28.077746
0.053625
5.362538
0.305369
7.38255
0.426771
11.679688
0.385223
31.691415
false
2024-05-01
2024-06-27
1
meta-llama/Meta-Llama-3-8B
DreadPoor_Irina-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Irina-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Irina-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Irina-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Irina-8B-model_stock
b282e3ab449d71a31f48b8c13eb43a4435968728
25.161035
apache-2.0
1
8
true
false
true
false
true
0.67994
67.994034
0.523664
32.08833
0.090634
9.063444
0.284396
4.58613
0.400292
8.636458
0.35738
28.597813
false
2024-08-30
2024-08-30
1
DreadPoor/Irina-8B-model_stock (Merge)
DreadPoor_ONeil-model_stock-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/ONeil-model_stock-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/ONeil-model_stock-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__ONeil-model_stock-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/ONeil-model_stock-8B
d4b84956211fd57b85122fe0c6f88b2cb9a9c86a
26.784851
apache-2.0
2
8
true
false
true
false
true
0.678566
67.85662
0.554834
36.412613
0.092145
9.214502
0.305369
7.38255
0.417344
10.967969
0.359874
28.874852
false
2024-07-06
2024-07-15
1
DreadPoor/ONeil-model_stock-8B (Merge)
DreadPoor_Sellen-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Sellen-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Sellen-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Sellen-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Sellen-8B-model_stock
accde7145d81a428c782695ea61eebc608efd980
26.173645
apache-2.0
1
8
true
false
true
false
true
0.711289
71.128938
0.523168
31.360979
0.120846
12.084592
0.274329
3.243848
0.396042
10.671875
0.356965
28.55164
false
2024-08-21
2024-08-27
1
DreadPoor/Sellen-8B-model_stock (Merge)
DreadPoor_Trinas_Nectar-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Trinas_Nectar-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Trinas_Nectar-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Trinas_Nectar-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Trinas_Nectar-8B-model_stock
cb46b8431872557904d83fc5aa1b90dabeb74acc
27.270692
apache-2.0
2
8
true
false
true
false
true
0.725927
72.592721
0.525612
31.975094
0.137462
13.746224
0.286074
4.809843
0.406771
11.413021
0.361785
29.087249
false
2024-08-16
2024-08-27
1
DreadPoor/Trinas_Nectar-8B-model_stock (Merge)
EleutherAI_gpt-j-6b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-j-6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-j-6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-j-6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-j-6b
47e169305d2e8376be1d31e765533382721b2cc1
6.545236
apache-2.0
1407
6
true
true
true
false
false
0.252219
25.221856
0.319104
4.912818
0.012085
1.208459
0.245805
0
0.36575
5.252083
0.124086
2.676197
true
2022-03-02
2024-08-19
0
EleutherAI/gpt-j-6b
EleutherAI_gpt-neo-1.3B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-1.3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neo-1.3B
dbe59a7f4a88d01d1ba9798d78dbe3fe038792c8
5.32815
mit
254
1
true
true
true
false
false
0.207905
20.790503
0.303923
3.024569
0.006798
0.679758
0.255872
0.782998
0.381656
4.873698
0.116356
1.817376
true
2022-03-02
2024-06-12
0
EleutherAI/gpt-neo-1.3B
EleutherAI_gpt-neo-125m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-125m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-125m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-125m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neo-125m
21def0189f5705e2521767faed922f1f15e7d7db
4.382146
mit
177
0
true
true
true
false
false
0.190544
19.054442
0.311516
3.436739
0.004532
0.453172
0.253356
0.447427
0.359333
2.616667
0.10256
0.284427
true
2022-03-02
2024-08-10
0
EleutherAI/gpt-neo-125m
EleutherAI_gpt-neo-2.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-2.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neo-2.7B
e24fa291132763e59f4a5422741b424fb5d59056
6.342931
mit
414
2
true
true
true
false
false
0.258963
25.896289
0.313952
4.178603
0.005287
0.528701
0.26594
2.12528
0.355365
3.520573
0.116273
1.808141
true
2022-03-02
2024-06-12
0
EleutherAI/gpt-neo-2.7B
EleutherAI_gpt-neox-20b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neox-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neox-20b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neox-20b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neox-20b
c292233c833e336628618a88a648727eb3dff0a7
5.990641
apache-2.0
514
20
true
true
true
false
false
0.258688
25.868806
0.316504
4.929114
0.006042
0.60423
0.243289
0
0.364667
2.816667
0.115525
1.72503
true
2022-04-07
2024-06-09
0
EleutherAI/gpt-neox-20b
EleutherAI_pythia-12b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-12b
35c9d7f32fbb108fb8b5bdd574eb03369d1eed49
5.93396
apache-2.0
130
12
true
true
true
false
false
0.247148
24.714757
0.317965
4.987531
0.009063
0.906344
0.246644
0
0.364698
3.78724
0.110871
1.20789
true
2023-02-28
2024-06-12
0
EleutherAI/pythia-12b
EleutherAI_pythia-160m_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-160m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-160m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-160m
50f5173d932e8e61f858120bcb800b97af589f46
5.617102
apache-2.0
23
0
true
true
true
false
false
0.181552
18.155162
0.297044
2.198832
0.002266
0.226586
0.258389
1.118568
0.417938
10.675521
0.111951
1.32794
true
2023-02-08
2024-06-09
0
EleutherAI/pythia-160m
EleutherAI_pythia-2.8b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-2.8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-2.8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-2.8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-2.8b
2a259cdd96a4beb1cdf467512e3904197345f6a9
5.441653
apache-2.0
28
2
true
true
true
false
false
0.217322
21.732226
0.322409
5.077786
0.006798
0.679758
0.25
0
0.348573
3.638281
0.113697
1.521868
true
2023-02-13
2024-06-12
0
EleutherAI/pythia-2.8b
EleutherAI_pythia-410m_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-410m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-410m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-410m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-410m
9879c9b5f8bea9051dcb0e68dff21493d67e9d4f
5.113779
apache-2.0
21
0
true
true
true
false
false
0.219545
21.954525
0.302813
2.715428
0.003021
0.302115
0.259228
1.230425
0.357813
3.059896
0.112783
1.420287
true
2023-02-13
2024-06-09
0
EleutherAI/pythia-410m
EleutherAI_pythia-6.9b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-6.9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-6.9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-6.9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-6.9b
f271943e880e60c0c715fd10e4dc74ec4e31eb44
5.853254
apache-2.0
43
6
true
true
true
false
false
0.228114
22.811363
0.323229
5.881632
0.007553
0.755287
0.251678
0.223714
0.359052
3.814844
0.114694
1.632683
true
2023-02-14
2024-06-12
0
EleutherAI/pythia-6.9b
Enno-Ai_EnnoAi-Pro-French-Llama-3-8B-v0.4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-French-Llama-3-8B-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4
328722ae96e3a112ec900dbe77d410788a526c5c
15.180945
creativeml-openrail-m
0
8
true
true
true
false
true
0.418881
41.888079
0.407495
16.875928
0.006042
0.60423
0.270973
2.796421
0.417
10.758333
0.263464
18.162677
false
2024-06-27
2024-06-30
0
Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4
Enno-Ai_EnnoAi-Pro-Llama-3-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Enno-Ai/EnnoAi-Pro-Llama-3-8B
6a5d745bdd304753244fe601e2a958d37d13cd71
12.174667
creativeml-openrail-m
0
8
true
true
true
false
true
0.319538
31.953772
0.415158
17.507545
0.001511
0.151057
0.261745
1.565996
0.407052
9.08151
0.215093
12.788121
false
2024-07-01
2024-07-08
0
Enno-Ai/EnnoAi-Pro-Llama-3-8B
Enno-Ai_EnnoAi-Pro-Llama-3-8B-v0.3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
cf29b8b484a909132e3a1f85ce891d28347c0d13
17.498882
creativeml-openrail-m
0
8
true
true
true
false
true
0.508257
50.825698
0.410058
16.668386
0.010574
1.057402
0.265101
2.013423
0.423573
12.313281
0.299036
22.1151
false
2024-06-26
2024-06-26
0
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
EpistemeAI_Athena-gemma-2-2b-it_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Athena-gemma-2-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI/Athena-gemma-2-2b-it
661c1dc6a1a096222e33416e099bd02b7b970405
14.051787
apache-2.0
3
2
true
true
true
false
false
0.292185
29.218474
0.424781
19.067417
0.033233
3.323263
0.267617
2.348993
0.441781
14.489323
0.242769
15.863254
false
2024-08-29
2024-09-05
2
unsloth/gemma-2-9b-it-bnb-4bit
EpistemeAI_FineLlama3.1-8B-Instruct_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/EpistemeAI/FineLlama3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/FineLlama3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__FineLlama3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI/FineLlama3.1-8B-Instruct
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
11.050434
llama3.1
1
8
true
true
true
false
false
0.08001
8.000993
0.455736
23.506619
0.023414
2.34139
0.280201
4.026846
0.348167
4.954167
0.311253
23.472592
false
2024-08-10
2024-08-10
0
EpistemeAI/FineLlama3.1-8B-Instruct
EpistemeAI_Fireball-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI/Fireball-12B
e2ed12c3244f2502321fb20e76dfc72ad7817d6e
15.446415
apache-2.0
1
12
true
true
true
false
false
0.18335
18.335018
0.511089
30.666712
0.035498
3.549849
0.261745
1.565996
0.423635
12.521094
0.334358
26.03982
false
2024-08-20
2024-08-21
2
EpistemeAI/Fireball-Mistral-Nemo-Base-2407-sft-v2.1
EpistemeAI_Fireball-Mistral-Nemo-Base-2407-v1-DPO2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Mistral-Nemo-Base-2407-v1-DPO2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2
2cf732fbffefdf37341b946edd7995f14d3f9487
15.251223
apache-2.0
0
12
true
true
true
false
false
0.186073
18.607295
0.496777
28.567825
0.030967
3.096677
0.291946
5.592841
0.40401
9.501302
0.335273
26.141401
false
2024-08-19
2024-08-19
1
EpistemeAI/Fireball-Nemo-Base-2407-sft-v1