diff --git "a/README.md" "b/README.md"
--- "a/README.md"
+++ "b/README.md"
@@ -23,12 +23,30 @@ tags:
 - meralion-2
 ---
-# 🎉 MERaLiON-2-10B
+

+🔥 MERaLiON-2 🔥

+

+[🚀 MERaLiON-2-10B](https://huggingface.co/MERaLiON/MERaLiON-2-10B) |
+[🚀 MERaLiON-2-10B-ASR](https://huggingface.co/MERaLiON/MERaLiON-2-10B-ASR) |
+[🚀 MERaLiON-2-3B](https://huggingface.co/MERaLiON/MERaLiON-2-3B)
+

-## 🆙 What's New in V2
+## Introduction
-- **Extended Audio Length**: Improved support for audio inputs up to 300 seconds (5 minutes), compared to the 30-second limit in V1. We suggest a maximum audio length of 30 seconds for ASR.
+We are excited to announce the release of MERaLiON-2, the latest addition to the MERaLiON family of speech-text large language models. Our flagship model, [MERaLiON-2-10B](https://huggingface.co/MERaLiON/MERaLiON-2-10B), achieves competitive results in benchmark evaluations of multilingual speech recognition (ASR), speech translation (ST), audio scene understanding, emotion recognition, general speech understanding, and other tasks
+when compared to other state-of-the-art AudioLLMs such as Qwen2.5-Omni-7B and Phi-4-multimodal-instruct. It is tailored to follow **complex instructions** with a deep understanding of **Singapore's multilingual and multicultural landscape**.
+
+model_capability
+
+Additionally, we provide an ASR-optimized model, [MERaLiON-2-10B-ASR](https://huggingface.co/MERaLiON/MERaLiON-2-10B-ASR), which demonstrates a **5-30%** performance improvement over `whisper-large-v3` on speech recognition tasks across Singapore's 4 official languages (**English**, **Mandarin**, **Malay**, and **Tamil**), 3 SEA languages (**Indonesian**, **Thai**, and **Vietnamese**), **code-switching scenarios**, and various local phrases.
+The following visualisation shows `1 - Word Error Rate` for the 7 languages across MERaLiON-2 and various models.
+
+model_capability
+
+We also provide [MERaLiON-2-3B](https://huggingface.co/MERaLiON/MERaLiON-2-3B), which balances performance with reduced computational requirements, enabling broader accessibility and lightweight deployment.
+
+
+- **Extended Audio Length**: Supports audio inputs of up to 300 seconds (5 minutes) for audio and speech question answering tasks; **keep inputs within 30 seconds for satisfactory performance on speech transcription (ASR) and speech translation (ST) tasks**.
 - **Expanded Language Coverage**: In addition to English, Chinese, and Singlish, V2 introduces support for Malay, Tamil, and other regional languages, including Indonesian, Thai, and Vietnamese.
@@ -38,8 +56,6 @@ tags:
 - **Three Model Variants**: Available in general-purpose ([MERaLiON-2-10B](https://huggingface.co/MERaLiON/MERaLiON-2-10B)), ASR-optimized ([MERaLiON-2-10B-ASR](https://huggingface.co/MERaLiON/MERaLiON-2-10B-ASR)), and lightweight ([MERaLiON-2-3B](https://huggingface.co/MERaLiON/MERaLiON-2-3B)) configurations to balance latency, compute efficiency, and task performance across different deployment needs.
----
-
 ## 📝 Model Description:
 MERaLiON stands for **M**ultimodal **E**mpathetic **R**easoning **a**nd **L**earning **i**n **O**ne **N**etwork.
@@ -58,1432 +74,431 @@ The model supports long-form audio inputs of up to 300 seconds (5 minutes) and i
 - **License:** [MERaLiON Public License](https://huggingface.co/MERaLiON/MERaLiON-AudioLLM-Whisper-SEA-LION/blob/main/MERaLiON-Public-Licence-v1.pdf)
 - **Demo:** [MERaLiON-AudioLLM Web Demo](https://meralion.org/demo/)
-
 **MERaLiON-2** is an upgraded version of [MERaLiON-AudioLLM](https://huggingface.co/MERaLiON/MERaLiON-AudioLLM-Whisper-SEA-LION).
----
-
-
-## 📈 Evaluations:
-
-model_capability
-
-model_capability
+## 📈 Performance:
 We benchmark the MERaLiON-2 series with the extended [AudioBench benchmark](https://github.com/AudioLLMs/AudioBench) | [LeaderBoard](https://huggingface.co/spaces/MERaLiON/AudioBench-Leaderboard) against several recently released open-source multimodal models (SALMONN-7B, the Qwen2.5-Omni series, and Phi-4-Multimodal) as well as two cascade models. The MERaLiON-2 series shows stronger performance on a wide range of audio/speech understanding tasks.
-**Automatic Speech Recognition (ASR) results**
-| type | dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | MERaLiON-2-10B-ASR | MERaLiON-2-Whisper | whisper_large_v3 | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
-| English | common_voice_15_en | 0.078 | 0.093 | 0.087 | 0.076 | 0.102 | 0.100 | 0.081 | 0.094 | 0.080 | 0.316 | 0.106 | 0.099 |
-| | earnings21 | 0.138 | 0.219 | 0.108 | 0.092 | 0.130 | 0.132 | 0.131 | 0.147 | 0.189 | 0.277 | 0.141 | 0.109 |
-| | earnings22 | 0.166 | 0.239 | 0.151 | 0.128 | 0.168 | 0.165 | 0.226 | 0.197 | 0.241 | 0.380 | 0.172 | 0.146 |
-| | gigaspeech | 0.145 | 0.092 | 0.090 | 0.088 | 0.089 | 0.098 | 0.099 | 0.114 | 0.140 | 0.110 | 0.100 | 0.095 |
-| | librispeech_clean | 0.024 | 0.027 | 0.025 | 0.021 | 0.020 | 0.022 | 0.017 | 0.021 | 0.044 | 0.096 | 0.033 | 0.018 |
-| | librispeech_other | 0.042 | 0.051 | 0.047 | 0.040 | 0.044 | 0.039 | 0.039 | 0.045 | 0.069 | 0.118 | 0.054 | 0.036 |
-| | peoples_speech | 0.216 | 0.206 | 0.205 | 0.196 | 0.197 | 0.150 | 0.215 | 0.262 | 0.312 | 0.242 | 0.203 | 0.145 |
-| | tedlium3 | 0.082 | 0.035 | 0.035 | 0.031 | 0.036 | 0.041 | 0.029 | 0.048 | 0.049 | 0.039 | 0.049 | 0.038 |
-| | tedlium3_long_form | 0.105 | 0.138 | 0.044 | 0.035 | 0.048 | 0.045 | 0.051 | 0.071 | 0.084 | 0.141 | 0.086 | 0.049 |
-| | average | 0.111 | 0.122 | 0.088 | 0.079 | 0.093 | 0.088 | 0.098 | 0.111 | 0.134 | 0.191 | 0.105 | 0.082 |
-| Inhouse | cna | 0.145 | 0.135 | 0.133 | 0.127 | 0.128 | 0.138 | 0.191 | 0.174 | 0.183 | 0.149 | 0.152 | 0.138 |
-| | idpc | 0.204 | 0.177 | 0.160 | 0.166 | 0.169 | 0.179 | 0.261 | 0.199 | 0.220 | 0.541 | 0.170 | 0.162 |
-| | idpc_short | 0.165 | 0.151 | 0.157 | 0.140 | 0.152 | 0.220 | 0.539 | 0.211 | 0.414 | 0.240 | 0.197 | 0.153 |
-| | mediacorp | 0.123 | 0.123 | 0.105 | 0.104 | 0.116 | 0.129 | 0.198 | 0.152 | 0.235 | 0.364 | 0.158 | 0.151 |
-| | mediacorp_short | 0.128 | 0.121 | 0.117 | 0.118 | 0.122 | 0.127 | 0.122 | 0.148 | 0.141 | 0.199 | 0.154 | 0.114 |
-| | parliament | 0.059 | 0.185 | 0.060 | 0.053 | 0.078 | 0.090 | 0.278 | 0.100 | 0.110 | 0.204 | 0.090 | 0.065 |
-| | ste | 0.159 | 0.263 | 0.147 | 0.125 | 0.151 | 0.298 | 0.297 | 0.287 | 0.288 | 0.422 | 0.132 | 0.144 |
-| | ukusnews | 0.113 | 0.174 | 0.070 | 0.056 | 0.083 | 0.123 | 0.075 | 0.091 | 0.176 | 0.192 | 0.123 | 0.089 |
-| | ytb_asr_batch1 | 0.107 | 0.099 | 0.098 | 0.092 | 0.112 | 0.133 | 0.169 | 0.162 | 0.174 | 0.221 | 0.125 | 0.108 |
-| | ytb_asr_batch2 | 0.133 | 0.160 | 0.111 | 0.099 | 0.118 | 0.129 | 0.232 | 0.245 | 0.351 | 0.350 | 0.126 | 0.084 |
-| | ytb_asr_batch3_chinese | 0.418 | 0.256 | 0.191 | 0.149 | 0.177 | 0.266 | 0.440 | 0.250 | 0.206 | 0.886 | 0.347 | 0.270 |
-| | ytb_asr_batch3_malay | 0.290 | 0.280 | 0.209 | 0.195 | 0.290 | 0.260 | 3.763 | 2.944 | 1.461 | 1.086 | 0.314 | 0.312 |
-| | ytb_asr_batch3_tamil | 0.693 | 0.750 | 0.664 | 0.547 | 0.927 | 0.841 | 2.750 | 1.461 | 1.362 | 0.985 | 0.967 | 0.898 |
-| | average | 0.210 | 0.221 | 0.171 | 0.152 | 0.202 | 0.226 | 0.717 | 0.494 | 0.409 | 0.449 | 0.235 | 0.207 |
-| Mandarin | aishell_asr_zh | 0.128 | 0.050 | 0.058 | 0.043 | 0.056 | 0.123 | 0.122 | 0.028 | 0.024 | 0.931 | 0.209 | 0.125 |
-| | commonvoice_zh | 0.327 | 0.131 | 0.147 | 0.118 | 0.141 | 0.198 | 0.154 | 0.113 | 0.076 | 1.001 | 0.319 | 0.196 |
-| | average | 0.228 | 0.091 | 0.102 | 0.081 | 0.098 | 0.161 | 0.138 | 0.071 | 0.050 | 0.966 | 0.264 | 0.160 |
-| SEA languages | commonvoice_id | 0.260 | 0.085 | 0.113 | 0.079 | 0.069 | 0.075 | 1.327 | 0.136 | 0.110 | 1.189 | 0.100 | 0.078 |
-| | commonvoice_ta | 0.528 | 0.139 | 0.156 | 0.129 | 0.195 | 0.271 | 1.178 | 0.831 | 0.847 | 1.427 | 0.238 | 0.244 |
-| | commonvoice_th | 0.847 | 0.307 | 0.466 | 0.635 | 0.051 | 0.069 | 1.054 | 0.113 | 0.104 | 1.044 | 0.093 | 0.064 |
-| | commonvoice_vi | 0.922 | 0.142 | 0.156 | 0.142 | 0.118 | 0.129 | 1.107 | 0.196 | 0.184 | 1.496 | 0.157 | 0.117 |
-| | fleurs_tamil_ta | 0.462 | 0.143 | 0.161 | 0.138 | 0.224 | 0.276 | 1.702 | 1.654 | 0.867 | 1.508 | 0.272 | 0.284 |
-| | gigaspeech2_id | 0.337 | 0.178 | 0.172 | 0.163 | 0.185 | 0.196 | 5.804 | 0.275 | 0.227 | 2.118 | 0.219 | 0.193 |
-| | gigaspeech2_th | 0.987 | 0.200 | 0.200 | 0.182 | 0.171 | 0.222 | 1.734 | 0.300 | 0.232 | 1.247 | 0.276 | 0.209 |
-| | gigaspeech2_vi | 0.982 | 0.168 | 0.113 | 0.095 | 0.127 | 0.177 | 2.504 | 0.177 | 0.227 | 1.546 | 0.171 | 0.155 |
-| | lotus_thai_th | 0.852 | 0.015 | 0.019 | 0.011 | 0.026 | 0.039 | 1.286 | 0.026 | 0.021 | 1.135 | 0.068 | 0.032 |
-| | average | 0.686 | 0.153 | 0.173 | 0.175 | 0.129 | 0.162 | 1.966 | 0.412 | 0.313 | 1.412 | 0.177 | 0.153 |
-| Singlish | imda_part1_asr | 0.043 | 0.049 | 0.052 | 0.044 | 0.052 | 0.069 | 0.058 | 0.053 | 0.053 | 0.093 | 0.071 | 0.069 |
-| | imda_part2_asr | 0.047 | 0.058 | 0.145 | 0.054 | 0.080 | 0.318 | 0.345 | 0.095 | 0.094 | 0.458 | 0.330 | 0.319 |
-| | imda_part3_30s_asr | 0.213 | 0.264 | 0.227 | 0.196 | 0.211 | 0.320 | 0.438 | 0.475 | 0.535 | 0.681 | 0.281 | 0.277 |
-| | imda_part4_30s_asr | 0.297 | 0.360 | 0.295 | 0.246 | 0.271 | 0.503 | 1.470 | 1.250 | 1.303 | 0.787 | 0.459 | 0.458 |
-| | imda_part5_30s_asr | 0.154 | 0.202 | 0.168 | 0.140 | 0.149 | 0.237 | 0.239 | 0.280 | 0.374 | 0.375 | 0.218 | 0.214 |
-| | imda_part6_30s_asr | 0.109 | 0.149 | 0.127 | 0.099 | 0.110 | 0.198 | 0.144 | 0.183 | 0.275 | 0.255 | 0.175 | 0.172 |
-| | average | 0.144 | 0.180 | 0.169 | 0.130 | 0.145 | 0.274 | 0.449 | 0.389 | 0.439 | 0.441 | 0.256 | 0.252 |
-
-**Spoken Question Answering (SQA) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|
-| cn_college_listen_mcq | 57.111 | 66.006 | 84.588 | 75.649 | 81.418 | 81.726 | 50.815 | 89.520 | 84.985 |
-| dream_tts_mcq | 51.542 | 61.160 | 83.325 | 77.522 | 69.995 | 70.779 | 56.560 | 85.154 | 86.200 |
-| imda_part3_30s_sqa | 55.200 | 52.600 | 59.400 | 55.000 | 52.400 | 54.200 | 42.000 | 51.400 | 51.600 |
-| imda_part4_30s_sqa | 50.000 | 54.600 | 63.000 | 56.400 | 54.400 | 52.000 | 35.400 | 46.400 | 55.600 |
-| imda_part5_30s_sqa | 63.000 | 61.400 | 72.000 | 64.600 | 66.000 | 62.800 | 45.800 | 54.600 | 62.000 |
-| imda_part6_30s_sqa | 67.400 | 70.200 | 71.800 | 71.800 | 69.200 | 64.600 | 49.600 | 62.600 | 68.200 |
-| mmau_mini | 53.100 | 51.000 | 56.700 | 58.800 | 60.700 | 56.100 | 50.600 | 52.600 | 55.900 |
-| muchomusic | 51.348 | 55.602 | 63.943 | 55.265 | 59.309 | 47.599 | 49.705 | 50.463 | 56.698 |
-| public_sg_speech_qa | 59.593 | 69.477 | 75.029 | 74.186 | 61.076 | 61.715 | 59.390 | 70.930 | 69.680 |
-| slue_p2_sqa5 | 86.716 | 83.186 | 89.559 | 83.725 | 73.873 | 77.304 | 80.882 | 51.520 | 86.961 |
-| spoken_squad | 74.207 | 81.461 | 89.209 | 83.196 | 59.850 | 62.867 | 65.648 | 57.163 | 87.434 |
-| average | 60.838 | 64.245 | 73.505 | 68.740 | 64.384 | 62.881 | 53.309 | 61.123 | 69.569 |
-
-**Speech Translation (ST) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | MERaLiON-2-Whisper | whisper_large_v3 | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|---|---|
-| covost2_en_id | 37.058 | 30.658 | 36.242 | - | - | 14.554 | 22.677 | 22.381 | 14.193 | 27.592 | 10.753 |
-| covost2_en_ta | 13.809 | 5.602 | 10.886 | - | - | 0.148 | 0.114 | 0.724 | 0.001 | 7.475 | 1.003 |
-| covost2_en_zh | 43.963 | 40.028 | 43.747 | - | - | 45.480 | 41.390 | 40.436 | 33.256 | 28.714 | 6.090 |
-| covost2_id_en | 43.374 | 37.773 | 47.859 | 21.269 | 44.667 | 0.377 | 44.702 | 43.845 | 27.885 | 46.805 | 46.797 |
-| covost2_ta_en | 4.758 | 1.942 | 3.479 | 0.022 | 2.494 | 0.073 | 0.212 | 0.057 | 0.406 | 2.833 | 2.418 |
-| covost2_zh_en | 19.556 | 16.778 | 22.134 | 12.225 | 14.865 | 22.330 | 21.564 | 16.686 | 5.176 | 15.210 | 14.156 |
-| average | 27.086 | 22.130 | 27.391 | 11.172 | 20.675 | 13.827 | 21.777 | 20.688 | 13.486 | 21.438 | 13.536 |
-
-**Spoken Dialogue Summarization (SDS) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|
-| imda_part3_30s_ds | 47.800 | 42.200 | 49.800 | 43.600 | 42.800 | 39.800 | 9.000 | 48.400 | 38.000 |
-| imda_part4_30s_ds | 46.400 | 40.200 | 46.600 | 42.800 | 33.200 | 31.600 | 7.400 | 45.600 | 38.200 |
-| imda_part5_30s_ds | 54.600 | 51.800 | 55.400 | 55.600 | 52.200 | 42.800 | 16.000 | 53.400 | 46.200 |
-| imda_part6_30s_ds | 65.600 | 60.000 | 60.600 | 61.000 | 58.800 | 58.400 | 25.200 | 56.600 | 61.000 |
-| average | 53.600 | 48.550 | 53.100 | 50.750 | 46.750 | 43.150 | 14.400 | 51.000 | 45.850 |
-
-**Speech Instruction (SI) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|
-| alpaca_audio | 75.200 | 25.600 | 74.200 | 33.400 | 64.000 | 59.200 | 10.400 | 67.000 | 69.400 |
-| openhermes_audio | 66.400 | 12.600 | 66.200 | 39.000 | 66.000 | 57.400 | 15.400 | 78.800 | 62.800 |
-| average | 70.800 | 19.100 | 70.200 | 36.200 | 65.000 | 58.300 | 12.900 | 72.900 | 66.100 |
-
-**Audio Captioning (AC) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|
-| audiocaps | 39.386 | 35.077 | 36.041 | 33.595 | 43.695 | 37.700 | 35.241 | 2.455 | 2.514 |
-| wavcaps | 34.566 | 31.410 | 35.168 | 28.069 | 34.705 | 26.092 | 22.520 | 3.827 | 3.318 |
-| average | 36.976 | 33.244 | 35.604 | 30.832 | 39.200 | 31.896 | 28.881 | 3.141 | 2.916 |
-
-**Accent Recognition (AR) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|
-| voxceleb_accent | 47.066 | 66.598 | 40.788 | 2.626 | 0.903 | 1.662 | 31.699 | 28.006 | 40.295 |
-
-**Audio-Scene Question Answering (ASQA) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|
-| audiocaps_qa | 48.818 | 44.792 | 50.351 | 40.319 | 48.562 | 50.415 | 50.351 | 17.444 | 17.061 |
-| clotho_aqa | 62.674 | 50.540 | 58.201 | 48.371 | 52.649 | 46.592 | 58.192 | 22.674 | 29.820 |
-| wavcaps_qa | 45.132 | 43.092 | 44.868 | 37.961 | 43.158 | 40.000 | 46.908 | 14.013 | 18.750 |
-| average | 52.208 | 46.141 | 51.140 | 42.217 | 48.123 | 45.669 | 51.817 | 18.044 | 21.877 |
-
-**Emotion Recognition (ER) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|
-| iemocap_emotion | 49.104 | 51.394 | 62.550 | 32.072 | 34.363 | 36.554 | 26.195 | 41.982 | 46.912 |
-| meld_emotion | 44.176 | 52.146 | 59.808 | 40.843 | 34.330 | 30.077 | 32.299 | 44.272 | 49.425 |
-| meld_sentiment | 52.452 | 58.582 | 68.851 | 49.119 | 30.421 | 27.778 | 42.261 | 58.391 | 56.475 |
-| average | 48.577 | 54.041 | 63.736 | 40.678 | 33.038 | 31.469 | 33.585 | 48.215 | 50.938 |
-
-**Gender Recognition (GR) results**
-
-| dataset | MERaLiON-1 | MERaLiON-2-3B | MERaLiON-2-10B | Phi-4-multimodal-instruct | Qwen2.5-Omni-3B | Qwen2.5-Omni-7B | SALMONN-7B | cascade-whisper_v2+sealion | cascade-whisper_v3+llama |
-|---|---|---|---|---|---|---|---|---|---|
-| iemocap_gender | 94.622 | 87.928 | 92.968 | 46.853 | 62.948 | 43.367 | 80.199 | 12.211 | 44.382 |
-| voxceleb_gender | 99.733 | 99.692 | 97.251 | 94.584 | 32.786 | 54.083 | 88.531 | 26.631 | 69.696 |
-| average | 97.177 | 93.810 | 95.109 | 70.718 | 47.867 | 48.725 | 84.365 | 19.421 | 57.039 |
+**Better Automatic Speech Recognition (ASR) Accuracy**
+
+MERaLiON-2-10B-ASR and MERaLiON-2-10B demonstrate leading performance in Singlish, Mandarin, Malay, Tamil, and other Southeast Asian languages, while maintaining competitive results in English compared to `whisper-large-v3`.
+
+| language | MERaLiON-2-10B-ASR | MERaLiON-2-10B | MERaLiON-2-3B | whisper_large_v3 | cascade_whisper_large_v3_llama_3_8b_instruct | cascade_whisper_large_v2_gemma2_9b_cpt_sea_lionv3_instruct | MERaLiON-AudioLLM-Whisper-SEA-LION | Qwen2.5-Omni-7B | SeaLLMs-Audio-7B | Qwen2.5-Omni-3B | SALMONN_7B | phi_4_multimodal_instruct |
+|---|---|---|---|---|---|---|---|---|---|---|---|---|
+| thai | 0.096526 | 0.109365 | 0.107279 | 0.121073 | 0.120257 | 0.172105 | 0.919330 | 0.126497 | 0.117152 | 0.163150 | 1.191099 | 1.510068 |
+| tamil | 0.271279 | 0.327081 | 0.344081 | 0.441483 | 0.475225 | 0.492336 | 0.561315 | 1.024916 | 2.325402 | 1.315143 | 1.306694 | 1.876722 |
+| singlish | 0.129830 | 0.168813 | 0.180395 | 0.248945 | 0.251608 | 0.255717 | 0.143800 | 0.439071 | 0.795990 | 0.389393 | 0.441490 | 0.448863 |
+| malay | 0.194638 | 0.209074 | 0.279891 | 0.219692 | 0.311921 | 0.314378 | 0.289895 | 1.460664 | 0.765565 | 2.943750 | 1.085867 | 3.762933 |
+| english | 0.078544 | 0.088259 | 0.122295 | 0.080841 | 0.081568 | 0.104830 | 0.110567 | 0.134216 | 0.197824 | 0.110353 | 0.191492 | 0.098225 |
+| indonesian | 0.121020 | 0.142813 | 0.131950 | 0.137102 | 0.135390 | 0.159476 | 0.298365 | 0.168659 | 0.220227 | 0.205216 | 1.653502 | 3.565510 |
+| mandarin | 0.103694 | 0.132025 | 0.145878 | 0.170980 | 0.196867 | 0.291733 | 0.291183 | 0.102419 | 0.309782 | 0.130429 | 0.939545 | 0.238879 |
+| vietnamese | 0.118693 | 0.134808 | 0.155110 | 0.148474 | 0.136075 | 0.164078 | 0.952040 | 0.205491 | 0.222001 | 0.186786 | 1.521174 | 1.805643 |
+| private | 0.106150 | 0.112360 | 0.147258 | 0.116630 | 0.118434 | 0.143812 | 0.130667 | 0.222770 | 0.496540 | 0.164556 | 0.273304 | 0.229450 |
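The ASR numbers in this card are word error rates (lower is better), and the visualisation mentioned in the introduction plots `1 - Word Error Rate`. For reference, a minimal WER sketch using word-level edit distance (no text normalisation, which production evaluations such as AudioBench typically apply before scoring):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # prev[j] holds the edit distance between ref[:i-1] and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, 1):
            curr[j] = min(prev[j] + 1,            # deletion
                          curr[j - 1] + 1,        # insertion
                          prev[j - 1] + (r != h)) # substitution (0 if words match)
        prev = curr
    return prev[-1] / len(ref)
```

For example, `wer("a b c", "a x c")` gives one substitution over three reference words, i.e. 1/3.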
+
+
+**Better Instruction Following and Audio Understanding**
+
+MERaLiON-2-10B demonstrates significant improvements across speech understanding, audio understanding, and paralinguistic tasks. In particular, it can handle more complicated instructions and answer with more flexibility, minimizing the loss of Gemma's pre-trained knowledge during the audio fine-tuning process. This allows MERaLiON-2-10B to provide more detailed explanations of the speech content or the speaker's emotional state. With further adjustment of the text prompt, it can play different roles, such as a voice assistant or virtual caregiver, or become part of a sophisticated multi-agent system or software solution.
+
+| task | MERaLiON-2-10B | MERaLiON-AudioLLM-Whisper-SEA-LION | MERaLiON-2-10B-ASR | MERaLiON-2-3B | SeaLLMs-Audio-7B | Qwen2-Audio-7B-Instruct | Qwen2.5-Omni-3B | phi_4_multimodal_instruct | cascade_whisper_large_v3_llama_3_8b_instruct | Qwen2.5-Omni-7B | cascade_whisper_large_v2_gemma2_9b_cpt_sea_lionv3_instruct | Qwen-Audio-Chat | SALMONN_7B | WavLLM_fairseq |
+|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
+| speech_instruction | 70.200000 | 70.800000 | 13.400000 | 19.100000 | 66.900000 | 48.700000 | 65.000000 | 36.200000 | 66.100000 | 58.300000 | 72.900000 | 10.200000 | 12.900000 | 20.400000 |
+| emotion_recognition | 63.736268 | 48.577313 | 53.693298 | 54.040797 | 52.007576 | 49.846540 | 33.037836 | 40.677800 | 50.937578 | 31.469397 | 48.214969 | 41.671551 | 33.584869 | 50.801545 |
+| audio_scene_question_answering | 51.140374 | 52.207756 | 49.511886 | 46.141353 | 50.193739 | 47.048025 | 48.123228 | 42.217143 | 21.876943 | 45.669153 | 18.043681 | 51.618622 | 51.816958 | 33.034083 |
+| gender_recognition | 95.109423 | 97.177396 | 97.220335 | 93.810266 | 75.449392 | 95.963266 | 47.867210 | 70.718047 | 57.039409 | 48.724711 | 19.421130 | 60.349349 | 84.365092 | 60.773275 |
+| sqa_singlish | 66.550000 | 58.900000 | 61.850000 | 59.700000 | 51.350000 | 46.700000 | 60.500000 | 61.950000 | 59.350000 | 58.400000 | 53.750000 | 42.300000 | 43.200000 | 51.200000 |
+| audio_captioning | 35.604270 | 36.976419 | 34.466710 | 33.243839 | 45.089372 | 37.278810 | 39.200328 | 30.832409 | 2.915778 | 31.896243 | 3.140568 | 39.988663 | 28.880570 | 6.200867 |
+| sds_singlish | 53.100000 | 53.600000 | 55.800000 | 48.550000 | 45.450000 | 36.300000 | 46.750000 | 50.750000 | 45.850000 | 43.150000 | 51.000000 | 25.250000 | 14.400000 | 39.450000 |
+| sqa_english | 79.735049 | 63.711481 | 73.975834 | 68.715179 | 70.920519 | 68.888565 | 67.818546 | 75.513152 | 78.526569 | 68.415131 | 67.814538 | 66.069047 | 60.649071 | 70.595242 |
+| music_understanding | 63.942713 | 51.347936 | 60.657119 | 55.602359 | 63.689975 | 71.609099 | 59.309183 | 55.265375 | 56.697557 | 47.598989 | 50.463353 | 59.056445 | 49.705139 | 44.313395 |
+| accent_recognition | 41.815396 | 43.799799 | 47.788864 | 60.054981 | 10.143836 | 10.901397 | 0.478694 | 3.097615 | 21.398482 | 0.587293 | 25.929693 | 17.550294 | 11.577381 | 14.294613 |
+| st | 27.391115 | 27.086366 | 28.540359 | 22.130258 | 21.143215 | 10.826666 | 21.776628 | 13.827110 | 13.536272 | 20.688241 | 21.437997 | 4.973184 | 13.486003 | 9.046791 |
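Since the model accepts up to 300 seconds of audio but ASR and ST perform best within 30-second windows, longer recordings are typically split before transcription. A minimal sketch of such chunking (the helper name and the 16 kHz mono assumption are illustrative, not part of the model's API):

```python
def chunk_audio(samples, sample_rate=16000, max_seconds=30):
    """Split a 1-D sequence of audio samples into consecutive chunks
    of at most max_seconds each; the last chunk may be shorter."""
    step = sample_rate * max_seconds
    return [samples[i:i + step] for i in range(0, len(samples), step)]
```

Each chunk can then be transcribed independently and the texts concatenated; for 70 seconds of 16 kHz audio this yields chunks of 30 s, 30 s, and 10 s.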
## 🔧 How to Use