Commit · fd0a16f
1 Parent(s): d13d572
Auto Daily Leaderboard update Fri Nov 8 12:00:28 PM EST 2024
Files changed:
- arena_elo/results/20241107/clean_battle_image_editing.json +0 -0
- arena_elo/results/20241107/elo_results_image_editing.pkl +3 -0
- arena_elo/results/20241107/image_editing_leaderboard.csv +10 -0
- arena_elo/results/20241108/clean_battle_t2i_generation.json +0 -0
- arena_elo/results/20241108/clean_battle_video_generation.json +0 -0
- arena_elo/results/20241108/elo_results_t2i_generation.pkl +3 -0
- arena_elo/results/20241108/elo_results_video_generation.pkl +3 -0
- arena_elo/results/20241108/t2i_generation_leaderboard.csv +18 -0
- arena_elo/results/20241108/video_generation_leaderboard.csv +14 -0
- arena_elo/results/latest/clean_battle_image_editing.json +32 -0
- arena_elo/results/latest/clean_battle_t2i_generation.json +378 -0
- arena_elo/results/latest/clean_battle_video_generation.json +616 -0
- arena_elo/results/latest/elo_results_image_editing.pkl +2 -2
- arena_elo/results/latest/elo_results_t2i_generation.pkl +1 -1
- arena_elo/results/latest/elo_results_video_generation.pkl +2 -2
- arena_elo/results/latest/image_editing_leaderboard.csv +9 -9
- arena_elo/results/latest/t2i_generation_leaderboard.csv +17 -17
- arena_elo/results/latest/video_generation_leaderboard.csv +13 -12
arena_elo/results/20241107/clean_battle_image_editing.json
ADDED
The diff for this file is too large to render.
arena_elo/results/20241107/elo_results_image_editing.pkl
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:be417506363c4ddba63510d435a00c86aa4400c1d427b7febd62035fecf64844
+size 63250
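The .pkl entries in this commit are Git LFS pointer stubs (version, oid, size) rather than the pickled Elo results themselves. A minimal sketch of reading such a pointer in Python, using a path from this commit for illustration (it only works while the file on disk is still an un-smudged pointer):

```python
from pathlib import Path

def read_lfs_pointer(path: str) -> dict:
    """Parse a Git LFS pointer file (three "key value" lines) into a dict."""
    fields = {}
    for line in Path(path).read_text().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = read_lfs_pointer("arena_elo/results/20241107/elo_results_image_editing.pkl")
print(pointer["oid"], pointer["size"])  # e.g. sha256:be41... 63250
```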
arena_elo/results/20241107/image_editing_leaderboard.csv
ADDED
@@ -0,0 +1,10 @@
+key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
+MagicBrush,MagicBrush,1106.5447717908323,1110.7186073195908,CC-BY-4.0,"The Ohio State University, University of Waterloo",https://osu-nlp-group.github.io/MagicBrush/
+InfEdit,InfEdit,1074.6862089326528,1074.3713838133683,CC BY-NC-ND 4.0,"University of Michigan, University of California, Berkeley",https://sled-group.github.io/InfEdit/
+CosXLEdit,CosXLEdit,1066.2411914733814,1067.2477407063507,cosxl-nc-community,Stability AI,https://huggingface.co/stabilityai/cosxl
+InstructPix2Pix,InstructPix2Pix,1039.4256126871376,1037.1386428878702,"Copyright 2023 Timothy Brooks, Aleksander Holynski, Alexei A. Efros","University of California, Berkeley",https://www.timothybrooks.com/instruct-pix2pix
+PNP,PNP,997.0961094254252,1001.6671711750197,-,Weizmann Institute of Science,https://github.com/MichalGeyer/plug-and-play
+Prompt2prompt,Prompt2prompt,989.1778674835481,990.2728540135581,Apache-2.0,"Google, Tel Aviv University",https://prompt-to-prompt.github.io/
+CycleDiffusion,CycleDiffusion,944.154660983573,937.6776782292361,X11,Carnegie Mellon University,https://github.com/ChenWu98/cycle-diffusion?tab=readme-ov-file
+SDEdit,SDEdit,924.5780071575175,923.0459636105026,MIT License,Stanford University,https://sde-image-editing.github.io
+Pix2PixZero,Pix2PixZero,858.0955700659324,857.8599582445036,MIT License,"Carnegie Mellon University, Adobe Research",https://pix2pixzero.github.io/
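All of the leaderboard CSVs added here share the same header row. A minimal sketch of loading this file and ranking models by their anonymous-battle Elo; pandas being available is an assumption about the environment:

```python
import pandas as pd

# Load the newly added image-editing leaderboard and sort by the anony Elo column.
df = pd.read_csv("arena_elo/results/20241107/image_editing_leaderboard.csv")
ranked = df.sort_values("Arena Elo rating (anony)", ascending=False)
print(ranked[["Model", "Arena Elo rating (anony)", "Organization"]].to_string(index=False))
```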
arena_elo/results/20241108/clean_battle_t2i_generation.json
ADDED
The diff for this file is too large to render.
arena_elo/results/20241108/clean_battle_video_generation.json
ADDED
The diff for this file is too large to render.
arena_elo/results/20241108/elo_results_t2i_generation.pkl
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a6564c97a003ace32211956b729d1f20fb5487564a9f81882bd3bcd85affc4f4
+size 88251
arena_elo/results/20241108/elo_results_video_generation.pkl
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a1eda11d7421d84721e43a24bd205f1f8fc3b12204f967b8080b4abb1bb26236
+size 75102
arena_elo/results/20241108/t2i_generation_leaderboard.csv
ADDED
@@ -0,0 +1,18 @@
+key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
+PlayGround V2.5,PlayGround V2.5,1121.2448738490257,1120.8169702253008,Playground v2.5 Community License,Playground,https://huggingface.co/playgroundai/playground-v2.5-1024px-aesthetic
+FLUX.1-dev,FLUX.1-dev,1116.7026685210212,1141.1151822428205,flux-1-dev-non-commercial-license (other),Black Forest Labs,https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux
+FLUX.1-schnell,FLUX.1-schnell,1086.512731859874,1075.3547247702184,Apache-2.0,Black Forest Labs,https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux
+PlayGround V2,PlayGround V2,1071.9998075391602,1070.301874711683,Playground v2 Community License,Playground,https://huggingface.co/playgroundai/playground-v2-1024px-aesthetic
+Kolors,Kolors,1053.9296445023099,1052.2116999769546,Apache-2.0,Kwai Kolors,https://huggingface.co/Kwai-Kolors/Kolors
+StableCascade,StableCascade,1038.9061641121523,1040.9157152652126,stable-cascade-nc-community (other),Stability AI,https://fal.ai/models/stable-cascade/api
+HunyuanDiT,HunyuanDiT,1029.076379309246,1021.5378651760857,tencent-hunyuan-community,Tencent,https://github.com/Tencent/HunyuanDiT
+PixArtAlpha,PixArtAlpha,1019.694068041725,1010.4226020005569,openrail++,PixArt-alpha,https://huggingface.co/PixArt-alpha/PixArt-XL-2-1024-MS
+PixArtSigma,PixArtSigma,1018.785193454226,1018.1812651767434,openrail++,PixArt-alpha,https://github.com/PixArt-alpha/PixArt-sigma
+SDXL-Lightning,SDXL-Lightning,1018.3482253799551,1021.076910864609,openrail++,ByteDance,https://huggingface.co/ByteDance/SDXL-Lightning
+AuraFlow,AuraFlow,1012.1391160750907,1005.7338671616257,Apache-2.0,Fal.AI,https://huggingface.co/fal/AuraFlow
+SD3,SD3,1009.5997566318815,1012.142846877162,stabilityai-nc-research-community,Stability AI,https://huggingface.co/blog/sd3
+SDXL,SDXL,965.1116643161047,965.1708053088519,openrail++,Stability AI,https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0
+SDXLTurbo,SDXLTurbo,914.5040248319536,911.2606190195628,sai-nc-community (other),Stability AI,https://huggingface.co/stabilityai/sdxl-turbo
+LCM(v1.5/XL),LCM(v1.5/XL),904.6695030873209,898.0322197461196,openrail++,Latent Consistency,https://fal.ai/models/fast-lcm-diffusion-turbo
+OpenJourney,OpenJourney,829.220699325143,823.5215471555109,creativeml-openrail-m,PromptHero,https://huggingface.co/prompthero/openjourney
+LCM,LCM,789.5554791638133,802.894361004465,MIT License,Tsinghua University,https://huggingface.co/SimianLuo/LCM_Dreamshaper_v7
arena_elo/results/20241108/video_generation_leaderboard.csv
ADDED
@@ -0,0 +1,14 @@
+key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
+Pyramid Flow,Pyramid Flow,1155.0515986410344,1154.8387278864816,MIT LICENSE,Peking University,https://pyramid-flow.github.io/
+CogVideoX-5B,CogVideoX-5B,1141.1909760883307,1141.065783136741,CogVideoX LICENSE,THUDM,https://github.com/THUDM/CogVideo
+StableVideoDiffusion,StableVideoDiffusion,1122.5352223205691,1125.370981071734,SVD-nc-community,Stability AI,https://fal.ai/models/fal-ai/fast-svd/text-to-video/api
+CogVideoX-2B,CogVideoX-2B,1059.816984062651,1055.8349374966426,CogVideoX LICENSE,THUDM,https://github.com/THUDM/CogVideo
+T2V-Turbo,T2V-Turbo,1054.2466944734051,1053.7591835284088,cc-by-nc-4.0,"University of California, Santa Barbara",https://github.com/Ji4chenLi/t2v-turbo
+AnimateDiff,AnimateDiff,1044.3154266675936,1044.28949489859,creativeml-openrail-m,"The Chinese University of Hong Kong, Shanghai AI Lab, Stanford University",https://fal.ai/models/fast-animatediff-t2v
+VideoCrafter2,VideoCrafter2,1042.2563883318585,1043.2956243436313,Apache 2.0,Tencent AI Lab,https://ailab-cvc.github.io/videocrafter2/
+Allegro,Allegro,992.8493528200153,992.0385568838952,Apache 2.0,rhymes-ai,https://github.com/rhymes-ai/Allegro
+LaVie,LaVie,971.1904311170582,971.9786970887593,Apache 2.0,Shanghai AI Lab,https://github.com/Vchitect/LaVie
+OpenSora,OpenSora,887.8493947773078,888.1758928863985,Apache 2.0,HPC-AI Tech,https://github.com/hpcaitech/Open-Sora
+OpenSora v1.2,OpenSora v1.2,849.8115139036329,848.6702656373551,Apache 2.0,HPC-AI Tech,https://github.com/hpcaitech/Open-Sora
+AnimateDiff Turbo,AnimateDiff Turbo,841.2349314485369,841.035380180319,creativeml-openrail-m,"The Chinese University of Hong Kong, Shanghai AI Lab, Stanford University",https://fal.ai/models/fast-animatediff-t2v-turbo
+ModelScope,ModelScope,837.6510853480027,839.6464749610427,cc-by-nc-4.0,Alibaba Group,https://arxiv.org/abs/2308.06571
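Because latest/ is overwritten by each daily run, rating movement between updates can be read by diffing a dated snapshot against the latest/ file from the parent commit (d13d572). A hedged sketch, assuming both CSVs are checked out locally; the old-snapshot filename is hypothetical:

```python
import pandas as pd

# New daily snapshot vs. the pre-update latest/ file (e.g. checked out from d13d572).
new = pd.read_csv("arena_elo/results/20241108/video_generation_leaderboard.csv")
old = pd.read_csv("video_generation_leaderboard_previous.csv")  # hypothetical local copy

merged = new.merge(old, on="Model", suffixes=("_new", "_old"))
merged["delta"] = merged["Arena Elo rating (anony)_new"] - merged["Arena Elo rating (anony)_old"]
print(merged[["Model", "delta"]].sort_values("delta", ascending=False).to_string(index=False))
```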
arena_elo/results/latest/clean_battle_image_editing.json
CHANGED
@@ -19592,5 +19592,37 @@
 "judge": "arena_user_10.16.13.73",
 "anony": true,
 "tstamp": 1730836358.5089
+},
+{
+"model_a_conv_id": "bd4fbee78e7b4756a7097e8a76b84769",
+"model_b_conv_id": "e04eb12ccde944a5afaf987defc93452",
+"inputs": {
+"source_prompt": "The couch and table were in the living room.",
+"target_prompt": "The couch and aquarium were in the living room.",
+"instruct_prompt": "remove the table and add an aquarium"
+},
+"model_a": "Prompt2prompt",
+"model_b": "CycleDiffusion",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.38.136",
+"anony": true,
+"tstamp": 1731015255.1367
+},
+{
+"model_a_conv_id": "fd58ee184e174efabe70f0b61c6d2ff5",
+"model_b_conv_id": "f24f388cc1364e0e9922d414d86bd6d4",
+"inputs": {
+"source_prompt": "A picture of a beautifully decorated wedding cake with fruits instead of donuts.",
+"target_prompt": "A bride admiring a beautifully decorated wedding cake with fruits instead of donuts.",
+"instruct_prompt": "let a woman in a bridal gown stand near the cake"
+},
+"model_a": "CycleDiffusion",
+"model_b": "Pix2PixZero",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.38.136",
+"anony": false,
+"tstamp": 1731040620.9362
 }
 ]
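The clean_battle_*.json files updated in this commit append one record per pairwise vote (model_a, model_b, vote_type, winner, judge, anony, tstamp). Purely as an illustration of how such records can drive ratings, here is a simple online Elo update over one battle log; the actual arena_elo pipeline may fit ratings differently, and K = 32 with a 1000-point starting rating are assumptions:

```python
import json
from collections import defaultdict

K = 32  # assumed update factor
ratings = defaultdict(lambda: 1000.0)  # assumed starting rating

def expected(r_a: float, r_b: float) -> float:
    # Standard Elo expected score for player A against player B.
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

with open("arena_elo/results/latest/clean_battle_image_editing.json") as f:
    battles = json.load(f)

for b in battles:
    m_a, m_b = b["model_a"], b["model_b"]
    if b["winner"] == "model_a":
        s_a = 1.0
    elif b["winner"] == "model_b":
        s_a = 0.0
    else:  # "tie" or "tie (bothbad)"
        s_a = 0.5
    e_a = expected(ratings[m_a], ratings[m_b])
    ratings[m_a] += K * (s_a - e_a)
    ratings[m_b] += K * ((1.0 - s_a) - (1.0 - e_a))

for model, r in sorted(ratings.items(), key=lambda kv: -kv[1]):
    print(f"{model}: {r:.1f}")
```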
arena_elo/results/latest/clean_battle_t2i_generation.json
CHANGED
@@ -106268,5 +106268,383 @@
 "judge": "arena_user_10.16.13.73",
 "anony": true,
 "tstamp": 1730996143.3508
+},
+{
+"model_a_conv_id": "0ccfb3d96e614e69b5c92204c646cb5f",
+"model_b_conv_id": "af3512f1b5714ebbb6fb19dc95c8d4ad",
+"inputs": {
+"prompt": "big strong muscle legs of a strong woman standing before him she is massive busts lady Large breasts disproportionate to body size,\n very very tall , huge and large ass, huge thighs, , ,very fat arms ,150 kg ,large shoulders, big relaxed tits, very big muscle calves , Lady is holding with hand a bat that accentuates her powerful physique soc \n, short haircut ,all wear Knitted dress , toe pointy high heel implying a strong sense of control and authority, , cruel , Full body side view biside, possibly with tattoos or scars, hinting at a complex personality, living room , looking down , angry face , victory pose\nslave man kissing her shoes\n low view dowa strong woman standing before him she is massive busty lady \n \n\nstrong woman standing before him she is massive busty lady \n with wide spreaded legs over a exhausted badly bleeding slave . Ladys Face is Very angry and has a evil-triumphant Face-expression. "
+},
+"model_a": "HunyuanDiT",
+"model_b": "FLUX.1-schnell",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1730999807.5458
+},
+{
+"model_a_conv_id": "beb498d7ff83465c9633ef4a8948db91",
+"model_b_conv_id": "e6e1242aafbf404883a500f311acb84f",
+"inputs": {
+"prompt": "image of a hyper realistic alien hand puppet on a white background"
+},
+"model_a": "FLUX.1-schnell",
+"model_b": "FLUX.1-dev",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.19.51",
+"anony": false,
+"tstamp": 1731043513.109
+},
+{
+"model_a_conv_id": "9139316e25514306a879d45725b048db",
+"model_b_conv_id": "c2c18b5a3cbb4cb789be074eea5acd2e",
+"inputs": {
+"prompt": "New York Skyline with 'Diffusion' written with fireworks on the sky."
+},
+"model_a": "FLUX.1-schnell",
+"model_b": "FLUX.1-dev",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.13.73",
+"anony": false,
+"tstamp": 1731043567.9344
+},
+{
+"model_a_conv_id": "b1d00dc986074a3c95cf6ee4662952b3",
+"model_b_conv_id": "a94c4b5b3aeb4c00a1e41026b0a1cf80",
+"inputs": {
+"prompt": "hyperrealism woman wearing a black robe holding a can of beer"
+},
+"model_a": "FLUX.1-schnell",
+"model_b": "FLUX.1-dev",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.12.226",
+"anony": false,
+"tstamp": 1731043645.1557
+},
+{
+"model_a_conv_id": "8ce485b9cd70408dbd7fb7bb9616a8c5",
+"model_b_conv_id": "3757342ce2a145d99684db8c782302af",
+"inputs": {
+"prompt": "A futuristic hopeful busy city, purple and green color scheme"
+},
+"model_a": "FLUX.1-schnell",
+"model_b": "FLUX.1-dev",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.19.51",
+"anony": false,
+"tstamp": 1731043750.8816
+},
+{
+"model_a_conv_id": "c32f3637ff944eabab6adb18117c3bf2",
+"model_b_conv_id": "d717c89d79fc4adc9d412db8a626043b",
+"inputs": {
+"prompt": "A futuristic hopeful busy city, purple and green color scheme"
+},
+"model_a": "AuraFlow",
+"model_b": "FLUX.1-dev",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.13.73",
+"anony": false,
+"tstamp": 1731043834.3755
+},
+{
+"model_a_conv_id": "18bef5f9f6d145ca9ab3be922df80a53",
+"model_b_conv_id": "d6e6de8631de473f8ec750b8faac50d8",
+"inputs": {
+"prompt": "image of a fashion model standing in an audience, wearing black balenciaga hooded jacket, looking away from camera, covered in plastic, 3D, 4K, fashion shoot"
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.19.51",
+"anony": false,
+"tstamp": 1731045571.1676
+},
+{
+"model_a_conv_id": "aba6778c0ee8452d90a607349bc93eea",
+"model_b_conv_id": "2eede64b963a42c18ecc711acfdca52b",
+"inputs": {
+"prompt": "A blue cup and a green cell phone."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.38.136",
+"anony": false,
+"tstamp": 1731045590.3649
+},
+{
+"model_a_conv_id": "d96f1b8f478a41f69e19917cce79595c",
+"model_b_conv_id": "e1e369c8e73a43588d6583e58770bd35",
+"inputs": {
+"prompt": "A baseball player in a blue and white uniform is next to a player in black and white ."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.31.254",
+"anony": false,
+"tstamp": 1731045615.3441
+},
+{
+"model_a_conv_id": "9ac051b0679b41d4bab762ed04250456",
+"model_b_conv_id": "880ee61da46949179ce0866dc67d3996",
+"inputs": {
+"prompt": "stylish cyberpunk woman full body, with tattoo and bald head, detailed cafe background ,fine detail, anime, realistic shaded lighting, directional lighting 2 d poster by ilya kuvshinov, magali villeneuve, artgerm, jeremy lipkin and michael garmash and rob rey "
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.12.226",
+"anony": false,
+"tstamp": 1731045855.0911
+},
+{
+"model_a_conv_id": "f401ddcaa9df4d13a673541b3eb4c3c1",
+"model_b_conv_id": "1f8a1a40b0f5427bbb6fdd12ba966cee",
+"inputs": {
+"prompt": "A wine glass on top of a dog."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.19.51",
+"anony": false,
+"tstamp": 1731045866.0886
+},
+{
+"model_a_conv_id": "61ff6ab6552348a59907c1c9ac6d1561",
+"model_b_conv_id": "c5afc2f34ffc4cfb85d8e9e21bd3234b",
+"inputs": {
+"prompt": "photograph of a woman wearing a track belt"
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.38.196",
+"anony": false,
+"tstamp": 1731045878.1092
+},
+{
+"model_a_conv_id": "18f6e571d0294b19a004b89182aea7ab",
+"model_b_conv_id": "53199499a87e4a4aaace27bbf463dd7a",
+"inputs": {
+"prompt": "Two cars on the street."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.38.196",
+"anony": false,
+"tstamp": 1731045935.7392
+},
+{
+"model_a_conv_id": "30383e6e351f4b84b47e3a59da4706d9",
+"model_b_conv_id": "d201ba6d76264d82be9153b6895e487d",
+"inputs": {
+"prompt": "Two brown horses grazing on green grass next to a lighthouse."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.19.51",
+"anony": false,
+"tstamp": 1731045976.1654
+},
+{
+"model_a_conv_id": "9fd7afe000364b16b0cf673f407ac03a",
+"model_b_conv_id": "6c6c215e9bbc4504b316f3739eb017ca",
+"inputs": {
+"prompt": "A fish eating a pelican."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.12.226",
+"anony": false,
+"tstamp": 1731046005.5739
+},
+{
+"model_a_conv_id": "5661118c1fc0495e840c6e4b038c8053",
+"model_b_conv_id": "b7d9aceb178241799e6cb88af2d53b6a",
+"inputs": {
+"prompt": "A pink and a white frisbee are on the ground ."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.12.226",
+"anony": false,
+"tstamp": 1731046021.7111
+},
+{
+"model_a_conv_id": "a919d21dd13e4e7b96281f3bbb504c90",
+"model_b_conv_id": "0928ad16331a4e0c84be9a609ecfeabd",
+"inputs": {
+"prompt": "Octothorpe."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.31.254",
+"anony": false,
+"tstamp": 1731046074.7711
+},
+{
+"model_a_conv_id": "96186ffda0bd402abfff83e01fc05327",
+"model_b_conv_id": "3057480af1bc44ba9b5c70ab0ccbaff2",
+"inputs": {
+"prompt": "a heavy downpour ,An oil painting of a couple in formal evening wear run home all wet"
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.19.51",
+"anony": false,
+"tstamp": 1731046252.6639
+},
+{
+"model_a_conv_id": "7a146e98dfb54546980045277229abeb",
+"model_b_conv_id": "b442a2d8eb814b2093f9cb496dd45460",
+"inputs": {
+"prompt": "A sign that says 'Deep Learning'."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.31.254",
+"anony": false,
+"tstamp": 1731046302.9771
+},
+{
+"model_a_conv_id": "4a21e1476bfd40799d304462567cebcc",
+"model_b_conv_id": "804320748e684a539821f8aa6121a3b7",
+"inputs": {
+"prompt": "photorealistic, detailed mechanical motorcycle, detailed"
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.38.136",
+"anony": false,
+"tstamp": 1731046329.826
+},
+{
+"model_a_conv_id": "abf45109440345de97a3e53f107656d4",
+"model_b_conv_id": "addab30eee1241ea88ed15fa5a77d20e",
+"inputs": {
+"prompt": "A young girl in a yellow dress with blue flowers on a bench"
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.13.73",
+"anony": false,
+"tstamp": 1731046373.336
+},
+{
+"model_a_conv_id": "0d5788716a0e431ab34c3c4f52775e42",
+"model_b_conv_id": "13689909455a475d89d96424cf194f20",
+"inputs": {
+"prompt": "Triangle orange photo frame on photo in Orange photo frame in the shape of a triangle."
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.31.254",
+"anony": false,
+"tstamp": 1731046538.6082
+},
+{
+"model_a_conv_id": "49de4e93fda3451491df34c211ac084e",
+"model_b_conv_id": "98749a6c77024e8d85e854e97e3c8c8c",
+"inputs": {
+"prompt": "beautiful gorgeous elegant porcelain ivory fair skin mechanoid woman, close - up, sharp focus, studio light, iris van herpen haute couture headdress made of rhizomorphs, daisies, arches, brackets, herbs, colorful corals, fractal mushrooms, puffballs, octane render, ultra sharp, 8 k "
+},
+"model_a": "FLUX.1-dev",
+"model_b": "PlayGround V2.5",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.12.226",
+"anony": false,
+"tstamp": 1731046741.6196
+},
+{
+"model_a_conv_id": "b619bc9055264ffc888ae3aa88001d44",
+"model_b_conv_id": "921d735df86f4e8ba48dd08c1a27d9dc",
+"inputs": {
+"prompt": "Three girls and one boy in living room, afternoon daylight, warm color tone"
+},
+"model_a": "PlayGround V2.5",
+"model_b": "AuraFlow",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.16.234",
+"anony": true,
+"tstamp": 1731059061.2847
+},
+{
+"model_a_conv_id": "561fef4d2b7b41f99132f86a0d559c10",
+"model_b_conv_id": "94883e951741454baee2204401e52286",
+"inputs": {
+"prompt": "photorealistic photo of a cactus flower in a field of sunflowershyperrealism, massive clouds, hyperrealistic, epic"
+},
+"model_a": "AuraFlow",
+"model_b": "PixArtAlpha",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731059135.7291
+},
+{
+"model_a_conv_id": "e0ff646dc3024738855cfa319405329c",
+"model_b_conv_id": "e4680714219d4ebc82ff00ec9a4d3fa5",
+"inputs": {
+"prompt": "a cute cat is playing a ball"
+},
+"model_a": "SDXLTurbo",
+"model_b": "FLUX.1-dev",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731066863.7962
+},
+{
+"model_a_conv_id": "f9163db1a4a144258ac1cd8dcc52309a",
+"model_b_conv_id": "19df95d3e3154d468be69a31c1e6a39b",
+"inputs": {
+"prompt": "a cute dog is playing a ball"
+},
+"model_a": "AuraFlow",
+"model_b": "FLUX.1-dev",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.38.136",
+"anony": true,
+"tstamp": 1731075636.2991
 }
 ]
arena_elo/results/latest/clean_battle_video_generation.json
CHANGED
@@ -30952,5 +30952,621 @@
 "judge": "arena_user_10.16.12.226",
 "anony": true,
 "tstamp": 1730936788.6334
+},
+{
+"model_a_conv_id": "0caa993a55c94a3bb09c2b4bafc52af7",
+"model_b_conv_id": "c310062cc42b4f3391c0744f6a12d680",
+"inputs": {
+"prompt": "a sheep running to join a herd of its kind"
+},
+"model_a": "T2V-Turbo",
+"model_b": "AnimateDiff Turbo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.38.136",
+"anony": true,
+"tstamp": 1731043963.5469
+},
+{
+"model_a_conv_id": "6f48e352de49435cb1fa2039a496d1ee",
+"model_b_conv_id": "589cc7543f594f22bb0468e77fa1f241",
+"inputs": {
+"prompt": "a bird building a nest from twigs and leaves"
+},
+"model_a": "T2V-Turbo",
+"model_b": "Pyramid Flow",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731044016.9479
+},
+{
+"model_a_conv_id": "92def5f703a74467baeaf10908d7ed72",
+"model_b_conv_id": "0316dc7c10ef4643a6d04ec853597417",
+"inputs": {
+"prompt": "Vampire makeup face of beautiful girl, red contact lenses."
+},
+"model_a": "CogVideoX-5B",
+"model_b": "AnimateDiff",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044051.8893
+},
+{
+"model_a_conv_id": "a4a3a62b58c3486ea32ad8e4daf40c3a",
+"model_b_conv_id": "7e063767fb784a55bc2c4d54abbf0523",
+"inputs": {
+"prompt": "a handbag and a tie"
+},
+"model_a": "StableVideoDiffusion",
+"model_b": "Pyramid Flow",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044074.9281
+},
+{
+"model_a_conv_id": "475693a5842e47a0a5291bc73f4c9f7c",
+"model_b_conv_id": "f539ffd1963045d788e7e2bf415ef447",
+"inputs": {
+"prompt": "a zebra taking a peaceful walk"
+},
+"model_a": "VideoCrafter2",
+"model_b": "StableVideoDiffusion",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044096.101
+},
+{
+"model_a_conv_id": "f02dabdba61a47e0995182138bcfc937",
+"model_b_conv_id": "40556efbf2324f949dca64aa069a413c",
+"inputs": {
+"prompt": "a person eating a burger"
+},
+"model_a": "AnimateDiff Turbo",
+"model_b": "StableVideoDiffusion",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731044116.9322
+},
+{
+"model_a_conv_id": "173ac1f1344a4d9c84b6b4768e3f5895",
+"model_b_conv_id": "cad0a719664942ef9075100d79188689",
+"inputs": {
+"prompt": "a bird soaring gracefully in the sky"
+},
+"model_a": "VideoCrafter2",
+"model_b": "CogVideoX-2B",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044136.9802
+},
+{
+"model_a_conv_id": "40c11f43b4fc40a0865e2a6379a00a02",
+"model_b_conv_id": "0c58c915bf8841228f8ebd409d0507e5",
+"inputs": {
+"prompt": "A tranquil tableau of tower"
+},
+"model_a": "CogVideoX-2B",
+"model_b": "StableVideoDiffusion",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1731044181.357
+},
+{
+"model_a_conv_id": "64da3fb5c1484c819e20dee0ddba308e",
+"model_b_conv_id": "55b22135123149479b39a814f0794a7c",
+"inputs": {
+"prompt": "a dog drinking water"
+},
+"model_a": "CogVideoX-5B",
+"model_b": "AnimateDiff",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731044203.7884
+},
+{
+"model_a_conv_id": "51759e8d40d44d0a89fa3147709e4672",
+"model_b_conv_id": "f7c09ed9acfe4c0cadf7f64505fcdf2b",
+"inputs": {
+"prompt": "Busy freeway at night."
+},
+"model_a": "Pyramid Flow",
+"model_b": "Allegro",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731044231.9585
+},
+{
+"model_a_conv_id": "341b03eaf5204aa2bf182fe3416aecc8",
+"model_b_conv_id": "b37e0bb3740f41f3bbe861e33dac1637",
+"inputs": {
+"prompt": "a vase"
+},
+"model_a": "AnimateDiff",
+"model_b": "CogVideoX-2B",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044244.5004
+},
+{
+"model_a_conv_id": "aed540ccac9440dfa38f1bd72791b431",
+"model_b_conv_id": "046e473b184f49b8a2483dfd6112b41c",
+"inputs": {
+"prompt": "A cute happy Corgi playing in park, sunset, Van Gogh style"
+},
+"model_a": "CogVideoX-2B",
+"model_b": "StableVideoDiffusion",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.13.73",
+"anony": true,
+"tstamp": 1731044285.1852
+},
+{
+"model_a_conv_id": "0a95246319da49a0b3e6a0072343d72d",
+"model_b_conv_id": "b300eec124bd40cb82508813f7970da1",
+"inputs": {
+"prompt": "a sheep and a cow"
+},
+"model_a": "Allegro",
+"model_b": "Pyramid Flow",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.13.73",
+"anony": true,
+"tstamp": 1731044300.9669
+},
+{
+"model_a_conv_id": "fc6c2beb8e894893a071dab9685720d6",
+"model_b_conv_id": "dcdc61d142c44273a42cc0a0820c0519",
+"inputs": {
+"prompt": "an elephant"
+},
+"model_a": "AnimateDiff Turbo",
+"model_b": "Allegro",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.13.73",
+"anony": true,
+"tstamp": 1731044325.8943
+},
+{
+"model_a_conv_id": "b00d85de796641bbb006c80c125f6f1b",
+"model_b_conv_id": "c58d8543c79841c5b66bae4d4f437698",
+"inputs": {
+"prompt": "a boat sailing smoothly on a calm lake"
+},
+"model_a": "Allegro",
+"model_b": "Pyramid Flow",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731044374.3651
+},
+{
+"model_a_conv_id": "00a1f6d921e740cb9be02dcb178db267",
+"model_b_conv_id": "db71c9118ec445489096b62aed364228",
+"inputs": {
+"prompt": "Gwen Stacy reading a book, tilt down"
+},
+"model_a": "Pyramid Flow",
+"model_b": "AnimateDiff Turbo",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731044409.7427
+},
+{
+"model_a_conv_id": "5d26098943e7436aa57375d56d346b29",
+"model_b_conv_id": "794f7374c304472d91fad1f9c76c3827",
+"inputs": {
+"prompt": "a chair and a couch"
+},
+"model_a": "StableVideoDiffusion",
+"model_b": "T2V-Turbo",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044454.5022
+},
+{
+"model_a_conv_id": "e362e048dba74b168ae82d00cd790888",
+"model_b_conv_id": "6dbb2e10244a454895771d0524e3439f",
+"inputs": {
+"prompt": "A tranquil tableau of a wooden bench in the park"
+},
+"model_a": "T2V-Turbo",
+"model_b": "AnimateDiff",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731044483.8245
+},
+{
+"model_a_conv_id": "cb8d247653dd427c9db8558039c663d3",
+"model_b_conv_id": "9302b9468d1248e78375e82890f4e5e0",
+"inputs": {
+"prompt": "A person is cutting watermelon"
+},
+"model_a": "T2V-Turbo",
+"model_b": "AnimateDiff",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1731044502.9162
+},
+{
+"model_a_conv_id": "8689e62975a14f7b8f3eda999ff6fd43",
+"model_b_conv_id": "3c872907b4c84b11aeff70bdc2674243",
+"inputs": {
+"prompt": "a yellow suitcase"
+},
+"model_a": "Pyramid Flow",
+"model_b": "AnimateDiff",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731044535.5325
+},
+{
+"model_a_conv_id": "140a74161eeb4e32882184926c4c3a9a",
+"model_b_conv_id": "e6748d728a5c4881b841abec4d699a5a",
+"inputs": {
+"prompt": "cliff"
+},
+"model_a": "StableVideoDiffusion",
+"model_b": "AnimateDiff",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.38.136",
+"anony": true,
+"tstamp": 1731044559.8592
+},
+{
+"model_a_conv_id": "5bc29481bc5e40048067d728d639b2fe",
+"model_b_conv_id": "0248cc075f9540ada20629054d142473",
+"inputs": {
+"prompt": "Flying through fantasy landscapes."
+},
+"model_a": "Pyramid Flow",
+"model_b": "Allegro",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.13.73",
+"anony": true,
+"tstamp": 1731044591.9788
+},
+{
+"model_a_conv_id": "b299852e5f6c4bca9342e99a8cd9f507",
+"model_b_conv_id": "4e428de1e53345518195ced9c89200de",
+"inputs": {
+"prompt": "Snow rocky mountains peaks canyon. snow blanketed rocky mountains surround and shadow deep canyons. the canyons twist and bend through the high elevated mountain peaks, featuring a steady and smooth perspective"
+},
+"model_a": "OpenSora v1.2",
+"model_b": "Allegro",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044672.7851
+},
+{
+"model_a_conv_id": "df93443bdf43485fb0644404ef21b78b",
+"model_b_conv_id": "c711e6bb59f74be7abeb95f66a166f65",
+"inputs": {
+"prompt": "mansion"
+},
+"model_a": "Pyramid Flow",
+"model_b": "Allegro",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044721.3964
+},
+{
+"model_a_conv_id": "9f0c7de3b2194ab2a7e2c65d7b6f9e71",
+"model_b_conv_id": "310b1c7de1204c968cedf82bfc2ca845",
+"inputs": {
+"prompt": "a tennis racket on the left of a frisbee, front view"
+},
+"model_a": "CogVideoX-5B",
+"model_b": "VideoCrafter2",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1731044751.6659
+},
+{
+"model_a_conv_id": "134ec4f1342b4396bb43c298b92909e6",
+"model_b_conv_id": "bff140d50e2a4ea1bde13d25723496ad",
+"inputs": {
+"prompt": "a vase and scissors"
+},
+"model_a": "VideoCrafter2",
+"model_b": "StableVideoDiffusion",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.38.136",
+"anony": true,
+"tstamp": 1731044783.1967
+},
+{
+"model_a_conv_id": "e0bbf25bfb8f4423bbba0d0cbc681a40",
+"model_b_conv_id": "8f04ec004c9941229b883241d6ee1966",
+"inputs": {
+"prompt": "ballroom"
+},
+"model_a": "Pyramid Flow",
+"model_b": "StableVideoDiffusion",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731044799.1203
+},
+{
+"model_a_conv_id": "7f9d825df42b4978a9cf681c4dda3287",
+"model_b_conv_id": "eaeaca38937e43ffae7430bc187e15d6",
+"inputs": {
+"prompt": "A person is arranging flowers"
+},
+"model_a": "Pyramid Flow",
+"model_b": "AnimateDiff",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731044839.543
+},
+{
+"model_a_conv_id": "91935066effc443da28394e6098d3943",
+"model_b_conv_id": "91ddf31126ce42daa64a480cce65d161",
+"inputs": {
+"prompt": "a giraffe and a bird"
+},
+"model_a": "OpenSora v1.2",
+"model_b": "CogVideoX-5B",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044858.9279
+},
+{
+"model_a_conv_id": "69ec9fb4f6f94a08abf13cade0dc7e2f",
+"model_b_conv_id": "7eb7aea719dc4fa6841374f1a3561c0a",
+"inputs": {
+"prompt": "a blue bird"
+},
+"model_a": "T2V-Turbo",
+"model_b": "OpenSora v1.2",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1731044880.359
+},
+{
+"model_a_conv_id": "aae2cdf942e04ec2bc6181378cfef1d9",
+"model_b_conv_id": "68fd7965716c489dbfb479bf99f9e0d7",
+"inputs": {
+"prompt": "a bicycle leaning against a tree"
+},
+"model_a": "AnimateDiff",
+"model_b": "StableVideoDiffusion",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731044907.6548
+},
+{
+"model_a_conv_id": "2a3952506192437db109f594ce5ee991",
+"model_b_conv_id": "e2877e82928440f485024f21d409c0ec",
+"inputs": {
+"prompt": "a bear hunting for prey"
+},
+"model_a": "AnimateDiff Turbo",
+"model_b": "Pyramid Flow",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731044931.7829
+},
+{
+"model_a_conv_id": "75b6a6fa71b740bcb35d5506387797ad",
+"model_b_conv_id": "981c814b0cd646e5a47f7d7fff1ed609",
+"inputs": {
+"prompt": "a blue clock"
+},
+"model_a": "CogVideoX-5B",
+"model_b": "Allegro",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.38.196",
+"anony": true,
+"tstamp": 1731044957.5241
+},
+{
+"model_a_conv_id": "ff0a8e3ff75c46d59bd7eaab46f82ec6",
+"model_b_conv_id": "650c85a1727f46be9a2b4864c1a89a8a",
+"inputs": {
+"prompt": "a dog"
+},
+"model_a": "Allegro",
+"model_b": "T2V-Turbo",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.38.136",
+"anony": true,
+"tstamp": 1731044980.462
+},
+{
+"model_a_conv_id": "f9d696efb0a145b19b8f99e38376491c",
+"model_b_conv_id": "5f6040ae28414a30b65e0a329ee5a9b3",
+"inputs": {
+"prompt": "highway"
+},
+"model_a": "T2V-Turbo",
+"model_b": "AnimateDiff",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731045000.9011
+},
+{
+"model_a_conv_id": "608dc64abd6a4f48bc6eaa8f6ae8f2ca",
+"model_b_conv_id": "546d7858995b4f1b910e0795108a4ef9",
+"inputs": {
+"prompt": "a bottle on the left of a wine glass, front view"
+},
+"model_a": "StableVideoDiffusion",
+"model_b": "AnimateDiff",
+"vote_type": "bothbad_vote",
+"winner": "tie (bothbad)",
+"judge": "arena_user_10.16.13.73",
+"anony": true,
+"tstamp": 1731045024.1826
+},
+{
+"model_a_conv_id": "966abd121e7d41c1a3b6a4bd6bd194b1",
+"model_b_conv_id": "8e7b43a5922c41c2b4cf2359554199d8",
+"inputs": {
+"prompt": "Balloon full of water exploding in extreme slow motion."
+},
+"model_a": "Pyramid Flow",
+"model_b": "CogVideoX-2B",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1731045059.3249
+},
+{
+"model_a_conv_id": "b68659b647174196ba31140d71d93689",
+"model_b_conv_id": "e61b4bef00b34172aa9cf8c5bbd08d47",
+"inputs": {
+"prompt": "In a still frame, the ornate Victorian streetlamp stands solemnly, adorned with intricate ironwork and stained glass panels"
+},
+"model_a": "Allegro",
+"model_b": "StableVideoDiffusion",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.31.254",
+"anony": true,
+"tstamp": 1731045119.5593
+},
+{
+"model_a_conv_id": "064b90e960fd4edf8454fd01dc10890b",
+"model_b_conv_id": "d74d522005d546b294af46cb12fbc2ce",
+"inputs": {
+"prompt": "A shark swimming in clear Caribbean ocean"
+},
+"model_a": "Allegro",
+"model_b": "AnimateDiff Turbo",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731045142.1111
+},
+{
+"model_a_conv_id": "7213f5304b424136b373439425698dd5",
+"model_b_conv_id": "493c4d0058b348638141b729e7cd6efa",
+"inputs": {
+"prompt": "a pizza"
+},
+"model_a": "T2V-Turbo",
+"model_b": "AnimateDiff",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.38.136",
+"anony": true,
+"tstamp": 1731045160.5606
+},
+{
+"model_a_conv_id": "ea98721b54444ed781730507a1aad9d9",
+"model_b_conv_id": "76528118b7664a64babb07dc3c8fbf37",
+"inputs": {
+"prompt": "a train speeding down the tracks"
+},
+"model_a": "VideoCrafter2",
+"model_b": "Pyramid Flow",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731045190.2784
+},
+{
+"model_a_conv_id": "8aaa143b89bd4c1a8eb72c722170e656",
+"model_b_conv_id": "2a356dab70ce499198fcd42fb266c167",
+"inputs": {
+"prompt": "A tranquil tableau of a bed"
+},
+"model_a": "StableVideoDiffusion",
+"model_b": "T2V-Turbo",
+"vote_type": "tievote",
+"winner": "tie",
+"judge": "arena_user_10.16.12.226",
+"anony": true,
+"tstamp": 1731045223.6538
+},
+{
+"model_a_conv_id": "fc51453f7b034e8bba212c0b5054e446",
+"model_b_conv_id": "c24d1300fa45476282c6a52289e1d746",
+"inputs": {
+"prompt": "an orange bird"
+},
+"model_a": "Pyramid Flow",
+"model_b": "StableVideoDiffusion",
+"vote_type": "rightvote",
+"winner": "model_b",
+"judge": "arena_user_10.16.19.51",
+"anony": true,
+"tstamp": 1731059395.4004
+},
+{
+"model_a_conv_id": "0b5a74d46a57423389c915f060da1a61",
+"model_b_conv_id": "ba151ea4183246bbb90d68f4e0013ef1",
+"inputs": {
+"prompt": "The bund Shanghai, vibrant color"
+},
+"model_a": "StableVideoDiffusion",
+"model_b": "OpenSora v1.2",
+"vote_type": "leftvote",
+"winner": "model_a",
+"judge": "arena_user_10.16.38.136",
+"anony": true,
+"tstamp": 1731059459.3946
 }
 ]
arena_elo/results/latest/elo_results_image_editing.pkl
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:be417506363c4ddba63510d435a00c86aa4400c1d427b7febd62035fecf64844
+size 63250
arena_elo/results/latest/elo_results_t2i_generation.pkl
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:a6564c97a003ace32211956b729d1f20fb5487564a9f81882bd3bcd85affc4f4
 size 88251
arena_elo/results/latest/elo_results_video_generation.pkl
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:a1eda11d7421d84721e43a24bd205f1f8fc3b12204f967b8080b4abb1bb26236
+size 75102
arena_elo/results/latest/image_editing_leaderboard.csv
CHANGED
@@ -1,10 +1,10 @@
 key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
-MagicBrush,MagicBrush,1106.
-InfEdit,InfEdit,1074.
-CosXLEdit,CosXLEdit,1066.
-InstructPix2Pix,InstructPix2Pix,1039.
-PNP,PNP,997.
-Prompt2prompt,Prompt2prompt,989.
-CycleDiffusion,CycleDiffusion,
-SDEdit,SDEdit,924.
-Pix2PixZero,Pix2PixZero,858.
+MagicBrush,MagicBrush,1106.5447717908323,1110.7186073195908,CC-BY-4.0,"The Ohio State University, University of Waterloo",https://osu-nlp-group.github.io/MagicBrush/
+InfEdit,InfEdit,1074.6862089326528,1074.3713838133683,CC BY-NC-ND 4.0,"University of Michigan, University of California, Berkeley",https://sled-group.github.io/InfEdit/
+CosXLEdit,CosXLEdit,1066.2411914733814,1067.2477407063507,cosxl-nc-community,Stability AI,https://huggingface.co/stabilityai/cosxl
+InstructPix2Pix,InstructPix2Pix,1039.4256126871376,1037.1386428878702,"Copyright 2023 Timothy Brooks, Aleksander Holynski, Alexei A. Efros","University of California, Berkeley",https://www.timothybrooks.com/instruct-pix2pix
+PNP,PNP,997.0961094254252,1001.6671711750197,-,Weizmann Institute of Science,https://github.com/MichalGeyer/plug-and-play
+Prompt2prompt,Prompt2prompt,989.1778674835481,990.2728540135581,Apache-2.0,"Google, Tel Aviv University",https://prompt-to-prompt.github.io/
+CycleDiffusion,CycleDiffusion,944.154660983573,937.6776782292361,X11,Carnegie Mellon University,https://github.com/ChenWu98/cycle-diffusion?tab=readme-ov-file
+SDEdit,SDEdit,924.5780071575175,923.0459636105026,MIT License,Stanford University,https://sde-image-editing.github.io
+Pix2PixZero,Pix2PixZero,858.0955700659324,857.8599582445036,MIT License,"Carnegie Mellon University, Adobe Research",https://pix2pixzero.github.io/
arena_elo/results/latest/t2i_generation_leaderboard.csv
CHANGED
@@ -1,18 +1,18 @@
 key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
-PlayGround V2.5,PlayGround V2.5,1121.
-FLUX.1-dev,FLUX.1-dev,
-FLUX.1-schnell,FLUX.1-schnell,
-PlayGround V2,PlayGround V2,
-Kolors,Kolors,1053.
-StableCascade,StableCascade,1038.
-HunyuanDiT,HunyuanDiT,
-PixArtAlpha,PixArtAlpha,1019.
-PixArtSigma,PixArtSigma,1018.
-SDXL-Lightning,SDXL-Lightning,1018.
-AuraFlow,AuraFlow,
-SD3,SD3,1009.
-SDXL,SDXL,965.
-SDXLTurbo,SDXLTurbo,914.
-LCM(v1.5/XL),LCM(v1.5/XL),904.
-OpenJourney,OpenJourney,829.
-LCM,LCM,789.
+PlayGround V2.5,PlayGround V2.5,1121.2448738490257,1120.8169702253008,Playground v2.5 Community License,Playground,https://huggingface.co/playgroundai/playground-v2.5-1024px-aesthetic
+FLUX.1-dev,FLUX.1-dev,1116.7026685210212,1141.1151822428205,flux-1-dev-non-commercial-license (other),Black Forest Labs,https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux
+FLUX.1-schnell,FLUX.1-schnell,1086.512731859874,1075.3547247702184,Apache-2.0,Black Forest Labs,https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux
+PlayGround V2,PlayGround V2,1071.9998075391602,1070.301874711683,Playground v2 Community License,Playground,https://huggingface.co/playgroundai/playground-v2-1024px-aesthetic
+Kolors,Kolors,1053.9296445023099,1052.2116999769546,Apache-2.0,Kwai Kolors,https://huggingface.co/Kwai-Kolors/Kolors
+StableCascade,StableCascade,1038.9061641121523,1040.9157152652126,stable-cascade-nc-community (other),Stability AI,https://fal.ai/models/stable-cascade/api
+HunyuanDiT,HunyuanDiT,1029.076379309246,1021.5378651760857,tencent-hunyuan-community,Tencent,https://github.com/Tencent/HunyuanDiT
+PixArtAlpha,PixArtAlpha,1019.694068041725,1010.4226020005569,openrail++,PixArt-alpha,https://huggingface.co/PixArt-alpha/PixArt-XL-2-1024-MS
+PixArtSigma,PixArtSigma,1018.785193454226,1018.1812651767434,openrail++,PixArt-alpha,https://github.com/PixArt-alpha/PixArt-sigma
+SDXL-Lightning,SDXL-Lightning,1018.3482253799551,1021.076910864609,openrail++,ByteDance,https://huggingface.co/ByteDance/SDXL-Lightning
+AuraFlow,AuraFlow,1012.1391160750907,1005.7338671616257,Apache-2.0,Fal.AI,https://huggingface.co/fal/AuraFlow
+SD3,SD3,1009.5997566318815,1012.142846877162,stabilityai-nc-research-community,Stability AI,https://huggingface.co/blog/sd3
+SDXL,SDXL,965.1116643161047,965.1708053088519,openrail++,Stability AI,https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0
+SDXLTurbo,SDXLTurbo,914.5040248319536,911.2606190195628,sai-nc-community (other),Stability AI,https://huggingface.co/stabilityai/sdxl-turbo
+LCM(v1.5/XL),LCM(v1.5/XL),904.6695030873209,898.0322197461196,openrail++,Latent Consistency,https://fal.ai/models/fast-lcm-diffusion-turbo
+OpenJourney,OpenJourney,829.220699325143,823.5215471555109,creativeml-openrail-m,PromptHero,https://huggingface.co/prompthero/openjourney
+LCM,LCM,789.5554791638133,802.894361004465,MIT License,Tsinghua University,https://huggingface.co/SimianLuo/LCM_Dreamshaper_v7
arena_elo/results/latest/video_generation_leaderboard.csv
CHANGED
@@ -1,13 +1,14 @@
 key,Model,Arena Elo rating (anony),Arena Elo rating (full),License,Organization,Link
-Pyramid Flow,Pyramid Flow,
-CogVideoX-5B,CogVideoX-5B,
-StableVideoDiffusion,StableVideoDiffusion,
-CogVideoX-2B,CogVideoX-2B,
-T2V-Turbo,T2V-Turbo,
-AnimateDiff,AnimateDiff,
-VideoCrafter2,VideoCrafter2,
-
-
-OpenSora
-
-
+Pyramid Flow,Pyramid Flow,1155.0515986410344,1154.8387278864816,MIT LICENSE,Peking University,https://pyramid-flow.github.io/
+CogVideoX-5B,CogVideoX-5B,1141.1909760883307,1141.065783136741,CogVideoX LICENSE,THUDM,https://github.com/THUDM/CogVideo
+StableVideoDiffusion,StableVideoDiffusion,1122.5352223205691,1125.370981071734,SVD-nc-community,Stability AI,https://fal.ai/models/fal-ai/fast-svd/text-to-video/api
+CogVideoX-2B,CogVideoX-2B,1059.816984062651,1055.8349374966426,CogVideoX LICENSE,THUDM,https://github.com/THUDM/CogVideo
+T2V-Turbo,T2V-Turbo,1054.2466944734051,1053.7591835284088,cc-by-nc-4.0,"University of California, Santa Barbara",https://github.com/Ji4chenLi/t2v-turbo
+AnimateDiff,AnimateDiff,1044.3154266675936,1044.28949489859,creativeml-openrail-m,"The Chinese University of Hong Kong, Shanghai AI Lab, Stanford University",https://fal.ai/models/fast-animatediff-t2v
+VideoCrafter2,VideoCrafter2,1042.2563883318585,1043.2956243436313,Apache 2.0,Tencent AI Lab,https://ailab-cvc.github.io/videocrafter2/
+Allegro,Allegro,992.8493528200153,992.0385568838952,Apache 2.0,rhymes-ai,https://github.com/rhymes-ai/Allegro
+LaVie,LaVie,971.1904311170582,971.9786970887593,Apache 2.0,Shanghai AI Lab,https://github.com/Vchitect/LaVie
+OpenSora,OpenSora,887.8493947773078,888.1758928863985,Apache 2.0,HPC-AI Tech,https://github.com/hpcaitech/Open-Sora
+OpenSora v1.2,OpenSora v1.2,849.8115139036329,848.6702656373551,Apache 2.0,HPC-AI Tech,https://github.com/hpcaitech/Open-Sora
+AnimateDiff Turbo,AnimateDiff Turbo,841.2349314485369,841.035380180319,creativeml-openrail-m,"The Chinese University of Hong Kong, Shanghai AI Lab, Stanford University",https://fal.ai/models/fast-animatediff-t2v-turbo
+ModelScope,ModelScope,837.6510853480027,839.6464749610427,cc-by-nc-4.0,Alibaba Group,https://arxiv.org/abs/2308.06571