Great work on MoE!

#1
by killawhale2 - opened

Your work on MoE is absolutely wonderful!
It appears that you have leveraged the Solar 10.7B model, which is fantastic!
If this is true, we should definitely promote your amazing work! :D

Hi, thank you for your interest in my work!
Unfortunately, Solar 10.7B was not used in this model,
but I am planning to use Solar 10.7B in my next work.
Since Solar currently exists only as a Llama-architecture model, it was hard to merge with other Mistral-based models.
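As a rough illustration of why that matters (using the public upstage/SOLAR-10.7B-v1.0 and mistralai/Mistral-7B-v0.1 checkpoints purely as examples, not the exact models from this thread), a weight-level merge first requires the two checkpoints to share an architecture family and tensor shapes, which a SOLAR/Mistral pair does not:

```python
# Minimal sketch: check whether two checkpoints are trivially mergeable.
# Repo names below are public examples chosen for illustration only.
from transformers import AutoConfig

def mergeable(repo_a: str, repo_b: str) -> bool:
    a = AutoConfig.from_pretrained(repo_a)
    b = AutoConfig.from_pretrained(repo_b)
    # Same architecture family and matching tensor shapes are the
    # minimum requirements for weight-level merging.
    return (a.model_type == b.model_type
            and a.hidden_size == b.hidden_size
            and a.num_hidden_layers == b.num_hidden_layers)

# SOLAR ships with model_type "llama", while 10.7B Mistral merges report
# "mistral", so a check like this fails and the merge is rejected.
print(mergeable("upstage/SOLAR-10.7B-v1.0", "mistralai/Mistral-7B-v0.1"))
```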

λͺ¨λΈμ— 관심 κ°€μ Έμ£Όμ…”μ„œ κ°μ‚¬ν•©λ‹ˆλ‹€ 😊

Oh I see, congratulations on your great work!
May I ask which weights were used to initialize the 10.7B experts? I was not aware of other pretrained or fine-tuned models with 10.7B parameters.

It is based on my PiVoT-10.7B-Mistral-v0.2-RP, together with other 10.7B RP-finetuned models from Hugging Face.
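For readers wondering what "used" means here: a common way to build such an MoE is to copy each dense donor model's MLP weights into one expert slot of a Mixtral-style model. The sketch below shows that general technique, not necessarily this model's exact recipe; the second donor repo is a hypothetical placeholder.

```python
# Sketch of the common "dense finetunes as MoE experts" initialization.
# General technique only; the second donor repo is a hypothetical placeholder.
import torch
from transformers import AutoModelForCausalLM, MixtralConfig, MixtralForCausalLM

donor_repos = [
    "maywell/PiVoT-10.7B-Mistral-v0.2-RP",  # named in this thread
    "example-org/some-10.7B-rp-finetune",   # hypothetical placeholder
]
donors = [AutoModelForCausalLM.from_pretrained(r, torch_dtype=torch.bfloat16)
          for r in donor_repos]
cfg = donors[0].config

# Empty Mixtral-style MoE with one expert slot per donor model.
moe = MixtralForCausalLM(MixtralConfig(
    vocab_size=cfg.vocab_size,
    hidden_size=cfg.hidden_size,
    intermediate_size=cfg.intermediate_size,
    num_hidden_layers=cfg.num_hidden_layers,
    num_attention_heads=cfg.num_attention_heads,
    num_key_value_heads=cfg.num_key_value_heads,
    max_position_embeddings=cfg.max_position_embeddings,
    rms_norm_eps=cfg.rms_norm_eps,
    num_local_experts=len(donors),
    num_experts_per_tok=2,
))

with torch.no_grad():
    # Shared (non-expert) weights are taken from the first donor.
    moe.model.embed_tokens.weight.copy_(donors[0].model.embed_tokens.weight)
    moe.model.norm.weight.copy_(donors[0].model.norm.weight)
    moe.lm_head.weight.copy_(donors[0].lm_head.weight)
    for i, layer in enumerate(moe.model.layers):
        src = donors[0].model.layers[i]
        layer.self_attn.load_state_dict(src.self_attn.state_dict())
        layer.input_layernorm.load_state_dict(src.input_layernorm.state_dict())
        layer.post_attention_layernorm.load_state_dict(
            src.post_attention_layernorm.state_dict())
        # Each donor's dense MLP becomes one expert:
        # gate_proj -> w1, down_proj -> w2, up_proj -> w3.
        for j, donor in enumerate(donors):
            mlp = donor.model.layers[i].mlp
            expert = layer.block_sparse_moe.experts[j]
            expert.w1.weight.copy_(mlp.gate_proj.weight)
            expert.w2.weight.copy_(mlp.down_proj.weight)
            expert.w3.weight.copy_(mlp.up_proj.weight)
        # The router (layer.block_sparse_moe.gate) is left at random init
        # here; merge tools can also seed it, e.g. from prompt hints.
```

After an initialization like this, the router is usually calibrated or the whole model briefly finetuned so the gate learns to dispatch tokens between the experts.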

killawhale2 changed discussion status to closed

but I am planning to use Solar 10.7B in my next work.

@maywell We would love to see the Solar model along with your MoE. Thank you in advance.
