From: carlf@panix.com   
      
   On 2024-12-01, Tristan Miller wrote:   
      
   > Yes, it's absolutely possible to train generative AI on a CPU rather   
   > than a GPU, but whether this is feasible depends entirely on what sort   
   > of model you are expecting to end up with, how much training time you're   
   > willing to tolerate, and how much RAM you have available. Without   
   > further details on your requirements it's difficult to make any specific   
   > recommendations. I've seen a few reports comparing CPU and GPU training   
   > of LoRA models for Stable Diffusion, for example, that indicate that CPU   
   > training can require four times more memory and/or forty times more   
   > time. If you don't already have the requisite GPU hardware and don't   
   > want to buy it yourself, it might be economical to rent the processing   
   > power from a cloud service.   
      
   Can you link to those reports, or tell me where to search (e.g., a
   journal name)?
      
   This box has 48 GB of RAM. As I wrote originally, I'm fine with long   
   processing times, although 40x seems very long.   
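   For anyone else trying this: assuming a PyTorch-based toolchain (which
   the common Stable Diffusion LoRA trainers are), forcing a CPU-only run
   is usually just a matter of hiding the GPUs from the framework via an
   environment variable -- a sketch, not specific to any one trainer:

   ```shell
   # Hide all CUDA devices so GPU-capable frameworks (PyTorch,
   # TensorFlow, and trainers built on them) fall back to the CPU.
   export CUDA_VISIBLE_DEVICES=""

   # Cap the OpenMP thread count so a long training run doesn't
   # starve everything else on the box.
   export OMP_NUM_THREADS="$(nproc)"

   echo "CUDA_VISIBLE_DEVICES='${CUDA_VISIBLE_DEVICES}'"
   ```

   With the GPUs hidden, calls like torch.cuda.is_available() return
   false and the training script runs on the CPU without code changes.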
      
   Thank you.   
   --   
   Carl Fink carl@finknetwork.com   
   https://reasonablyliterate.com https://nitpicking.com   
   If you want to make a point, somebody will take the point and stab you with it.   
    -Kenne Estes   
      
   --- SoupGate-DOS v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   