Will we get 65B?

#5
by appvoid - opened

First of all, I really do appreciate what you did. I'm getting remarkable results from this one. I'm looking forward to knowing whether you're going to upload the 65B one. That would be really great!

Most likely not, for two reasons:

  1. I can't make the LoRA weights myself because my GPU isn't powerful enough, and no one else has published 65B Alpaca LoRA weights yet.
  2. Merging the weights and quantizing the model uses a lot of RAM (see the sketch below). The 30B model needed 85GB, so I had to use swap; 65B would probably need 170GB or more, and I don't have enough disk space for a swap file that large.
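For anyone curious, the merge step looks roughly like the sketch below. It assumes the Hugging Face `transformers` and `peft` libraries, and the model/adapter paths are hypothetical placeholders; quantizing (e.g. with llama.cpp) would follow as a separate step.

```python
# Minimal sketch of merging LoRA weights into a base model with peft.
# Paths are hypothetical; adjust them to the actual base model and adapter.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

BASE_MODEL = "path/to/llama-30b-hf"       # hypothetical base model path
LORA_ADAPTER = "path/to/alpaca-lora-30b"  # hypothetical adapter path

# Even loading in fp16 needs about 2 bytes per parameter in RAM
# (60GB for 30B), plus working copies, hence the ~85GB figure above.
base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL, torch_dtype=torch.float16, low_cpu_mem_usage=True
)
model = PeftModel.from_pretrained(base, LORA_ADAPTER)
merged = model.merge_and_unload()  # folds the LoRA deltas into the base weights
merged.save_pretrained("llama-30b-alpaca-merged")
```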

Holy weight!

That thing is humongous. Hopefully someone does it soon. Anyway, thanks for this one, though.

appvoid changed discussion status to closed
