I successfully installed oobabooga on a PC, and using only the CPU I can run a Vicuna and a WizardLM model. I can’t run any LLaMA, OPT, or GPT-J model, which look to be required when trying to train the model through LoRA. Do you have any suggestions?
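For reference, this is roughly the kind of LoRA run I have in mind: a minimal sketch using Hugging Face transformers + peft directly rather than oobabooga's training tab, with a placeholder model name, dataset file, and hyperparameters.

```python
# Minimal CPU-only LoRA fine-tune sketch with transformers + peft.
# NOT oobabooga's actual training code; "facebook/opt-350m" and
# "train.txt" are stand-ins for whatever model/data you really use.
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model

model_name = "facebook/opt-350m"   # small model so CPU training stays feasible
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name,
                                             torch_dtype=torch.float32)

# Attach LoRA adapters; only these small low-rank matrices get trained.
lora_cfg = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                      task_type="CAUSAL_LM")
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()

# Plain-text dataset, one sample per line.
ds = load_dataset("text", data_files={"train": "train.txt"})["train"]
ds = ds.map(lambda batch: tokenizer(batch["text"], truncation=True,
                                    max_length=256),
            batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="lora-out",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    logging_steps=10,
    use_cpu=True,                  # force CPU (older transformers: no_cuda=True)
)

Trainer(model=model, args=args, train_dataset=ds,
        data_collator=DataCollatorForLanguageModeling(tokenizer,
                                                      mlm=False)).train()
model.save_pretrained("lora-out")  # saves only the LoRA adapter weights
```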

  • suokoOP
    11 months ago

    I know how it works, but my GPU is weak (4 GB) while my CPU and RAM are great. Theoretically I should just have somewhat (maybe much) longer waiting times, but in the end I should get the same results.

    • @Kerfuffle@sh.itjust.works
      11 months ago

      Well, I hope you succeed in finding a way to do it, but again, it’s going to be hard to find tools and information to accomplish this. Since training on the CPU is so inefficient, most existing software is oriented toward GPU-based training.

      You’re not wrong that if you could find the software, and if you were willing to wait long enough, you’d get the same results, but getting to that point isn’t so easy. Also, it may not be an effective use of time or money: after all, if training your model on the CPU takes… I don’t know, several weeks, plus a lot of effort to develop or find the resources, while you could spend $4 to rent an A100 for two hours, why take the former approach? You might spend as much just in electricity running all threads on the CPU at 100% for an extended time.
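      As a rough back-of-the-envelope illustration (every number below is an assumption: power draw, electricity price, training time, rental rate), the comparison might look like this:

      ```python
      # Hypothetical cost comparison: weeks of CPU training vs. a short A100 rental.
      cpu_watts = 200        # assumed full-load draw of CPU + RAM + board
      price_kwh = 0.30       # assumed electricity price in $/kWh
      cpu_days = 21          # assumed "several weeks" of CPU training

      cpu_kwh = cpu_watts / 1000 * 24 * cpu_days
      cpu_electricity_cost = cpu_kwh * price_kwh      # about $30 with these numbers

      a100_rate = 2.0        # assumed cloud rate in $/hour
      a100_hours = 2
      a100_rental_cost = a100_rate * a100_hours       # = $4

      print(f"CPU electricity, {cpu_days} days: ${cpu_electricity_cost:.2f}")
      print(f"A100 rental, {a100_hours} hours:  ${a100_rental_cost:.2f}")
      ```

      With these made-up numbers the CPU run costs several times the GPU rental in electricity alone, before counting the weeks of waiting.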