
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai      Awaiting the gospel from Sarah Connor      1,954 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 1,948 of 1,954   
   Tristan Miller to Carl Fink   
   Re: Is it possible to train generative A   
   02 Dec 24 18:46:53   
   
   From: psychonaut@nothingisreal.com   
      
   Dear Carl,   
      
   On 2024-12-01 19:23, Carl Fink wrote:   
   > Can you link to those reports, or tell me where to search (e. g. a journal   
   > name)?   
   >   
   > This box has 48 GB of RAM. As I wrote originally, I'm fine with long   
   > processing times, although 40x seems very long.   
      
   I'm afraid I was reporting only half-remembered results from the last
   time I looked into the question, which would have been months ago.  I
   did a quick web search just now and turned up a couple of threads from
   the kohya_ss (Kohya's GUI) GitHub project that roughly accord with my
   recollection:
      
   https://github.com/bmaltais/kohya_ss/discussions/679   
   https://github.com/bmaltais/kohya_ss/issues/2632   
      
   The first of these claims CPU training takes about 40× as long as on a
   GPU, and the second reports roughly 4× for both time and memory.  Both
   refer to Stable Diffusion.
      
   There's also this LoRA finetuning guide for LLaMA that provides detailed   
   CPU time and memory metrics for various models: https://rentry.org/cpu-lora   
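
   To put those multipliers in perspective, here is a quick
   back-of-the-envelope projection.  The one-hour GPU baseline below is a
   made-up illustrative figure, not a measurement from either thread; only
   the 4× and 40× factors come from the reports above:

   ```python
   # Project CPU training time from the slowdown factors quoted above.
   # gpu_baseline_hours is a hypothetical figure chosen for illustration.
   gpu_baseline_hours = 1.0

   reports = [
       ("kohya_ss discussion #679", 40),  # ~40x slower on CPU
       ("kohya_ss issue #2632", 4),       # ~4x time (and ~4x memory)
   ]

   for source, slowdown in reports:
       cpu_hours = gpu_baseline_hours * slowdown
       print(f"{source}: a {gpu_baseline_hours:g} h GPU run "
             f"-> roughly {cpu_hours:g} h on CPU")
   ```

   So even under the milder 4× claim, a run you would happily leave
   overnight on a GPU becomes a multi-day job on CPU at the 40× end.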
      
   Regards,   
   Tristan   
      
   --   
   =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-   
                      Tristan Miller   
   Free Software developer, ferret herder, logologist   
                 https://logological.org/   
   =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-   
      
   --- SoupGate-DOS v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca