Polish Open LLM Leaderboard

| T | Model / Adapter | Precision | Average | PPC (A) | PolEmo2 In (A) | 8TAGS (A) | Belebele PL (A) | Belebele EN | LEK | PSC | DYK |
|---|---|---|---|---|---|---|---|---|---|---|---|
| N/A (unknown) | Random Baseline | N/A | 21.88 | 25.0 | 25.0 | 12.5 | 25.0 | 25.0 | 20.0 | 25.0 | 25.0 |
| 🔶 fine-tuned | Voicelab/trurl-2-7b | float16 | 50.39 | 42.7 | 47.51 | 67.02 | 44.33 | 71.89 | 27.83 | 44.25 | 82.8 |
| 🟦 RL-tuned | meta-llama/Llama-2-7b-chat-hf | float16 | 43.67 | 24.6 | 56.51 | 54.8 | 38.78 | 57.89 | 27.85 | 69.48 | 56.75 |
| 🟢 pretrained | mistralai/Mistral-7B-v0.1 | bfloat16 | 41.65 | 38.9 | 18.42 | 53.96 | 55.33 | 77.78 | 27.85 | 32.0 | 6.41 |
| instruction-tuned | Azurro/llama-2-7b-qlora-polish-instruct | float16 | 35.47 | 3.5 | 60.53 | 45.84 | 32.0 | 45.78 | 20.55 | 61.97 | 16.91 |
| 🟢 pretrained | Azurro/llama-2-7b-qlora-polish | float16 | 24.59 | 8.9 | 21.05 | 35.18 | 33.22 | 43.56 | 19.99 | 57.05 | 19.14 |
| 🔶 fine-tuned | Voicelab/trurl-2-13b-academic | 8bit | 54.8 | 45.2 | 54.29 | 65.48 | 54.22 | 81.33 | 28.95 | 32.65 | 16.91 |
| 🟢 pretrained | meta-llama/Llama-2-7b-hf | float16 | 21.01 | 11.4 | 20.5 | 17.57 | 34.56 | 44.0 | 21.11 | 15.03 | 27.41 |
| 🟢 pretrained | Azurro/APT-1B-Base | float32 | 6.22 | 1.1 | 3.05 | 6.27 | 14.44 | 15.0 | 16.16 | 0.09 | 0.39 |
| 🟦 RL-tuned | meta-llama/Llama-2-13b-chat-hf | 8bit | 51.39 | 55.7 | 45.15 | 54.16 | 50.56 | 75.56 | 24.75 | 69.94 | 16.91 |
| 🔶 fine-tuned | Typly/Pigeon-7B | float16 | 19.21 | 0.1 | 26.32 | 25.21 | 25.22 | 39.33 | 17.48 | 0.0 | 11.37 |
| instruction-tuned | Azurro/Mistral-7B-Instruct-v0.1-qlora-polish | bfloat16 | 43.93 | 39.6 | 39.75 | 58.37 | 38.0 | 73.44 | 24.88 | 0.37 | 16.42 |
| 🟢 pretrained | tiiuae/falcon-7b | bfloat16 | 15.08 | 6.7 | 16.2 | 12.53 | 24.89 | 23.56 | 18.53 | 3.34 | 0.0 |
| 🟢 pretrained | Azurro/APT3-500M-Base | float32 | 5.45 | 0.5 | 2.22 | 10.98 | 8.1 | 16.89 | 15.65 | 0.0 | 0.68 |
| instruction-tuned | tiiuae/falcon-7b-instruct | bfloat16 | 13.42 | 0.0 | 14.68 | 16.01 | 23.0 | 28.78 | 17.95 | 0.0 | 12.73 |
| 🟦 RL-tuned | codellama/CodeLlama-34b-hf | 4bit | 36.66 | 0.2 | 44.88 | 57.02 | 44.56 | 62.78 | 22.96 | 1.95 | 48.59 |
| instruction-tuned | HuggingFaceH4/zephyr-7b-alpha | bfloat16 | 58.69 | 39.7 | 62.6 | 67.36 | 65.11 | 82.56 | 33.38 | 22.17 | 16.91 |
| instruction-tuned | mistralai/Mistral-7B-Instruct-v0.1 | bfloat16 | 49.66 | 50.6 | 48.89 | 55.72 | 43.44 | 74.56 | 28.79 | 29.04 | 47.04 |
| 🟢 pretrained | Azurro/APT2-1B-Base | float32 | 12.67 | 0.0 | 16.2 | 12.58 | 21.89 | 19.44 | 17.84 | 0.0 | 0.0 |
(A) - column included in the 'Average' calculation.
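
The 'Average' column is the arithmetic mean of the four (A)-marked benchmark scores (PPC, PolEmo2 In, 8TAGS, Belebele PL); the remaining columns are reported but not averaged. Below is a minimal sketch of that calculation; the helper name `leaderboard_average` is ours for illustration and not part of the leaderboard code.

```python
# Minimal sketch (assumption): 'Average' is the plain arithmetic mean of the
# four (A)-marked benchmark scores, rounded to two decimals. The helper name
# is hypothetical, not taken from the leaderboard implementation.
def leaderboard_average(ppc: float, polemo2_in: float, tags_8: float, belebele_pl: float) -> float:
    return round((ppc + polemo2_in + tags_8 + belebele_pl) / 4, 2)

# Check against a row from the table: Voicelab/trurl-2-7b
print(leaderboard_average(42.7, 47.51, 67.02, 44.33))  # 50.39, matching the table
```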
© 2023 Azurro. Version 23.11.13.