home bbs files messages ]

Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.lang.c      Meh, in C you gotta define EVERYTHING      243,242 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 241,707 of 243,242   
   Keith Thompson to bart   
   Re: New and improved version of cdecl   
   30 Oct 25 16:44:38   
   
   From: Keith.S.Thompson+u@gmail.com   
      
   bart  writes:   
   > On 30/10/2025 18:59, Kaz Kylheku wrote:   
   >> On 2025-10-30, bart  wrote:   
   >>> On 30/10/2025 15:04, David Brown wrote:   
   >>>> On 30/10/2025 13:07, bart wrote:   
   >>>   
   >>>> You moan that compiles are too slow.  Yet doing them in parallel is a   
   >>>> "workaround".  Avoiding compiling unnecessarily is a "workaround".   
   >>>> Caching compilation work is a "workaround".  Using a computer from this   
   >>>> century is a "workaround".  Using a decent OS is a "workaround".  Is /   
   >>>> everything/ that would reduce your scope for complaining loudly to the   
   >>>> wrong people a workaround?   
   >>>   
   >>> Yes, they are all workarounds to cope with unreasonably slow compilers.   
   >> The idea of incremental rebuilding goes back to a time when   
   >> compilers were fast, but machines were slow.   
   >   
   > What do you mean by incremental rebuilding? I usually talk about   
   > /independent/ compilation.   
   >   
   > Then incremental builds might be about deciding which modules to   
   > recompile, except that that is so obvious, you didn't give it a name.   
   >   
   > Compile the one file you've just edited. If it might impact on any   
   > others (you work on a project for months, you will know it   
   > intimately), then you just compile the lot.   
      
   I'll assume that was a serious question.  Even if you don't care,   
   others might.   
      
   Let's say I'm working on a project that has a bunch of *.c and   
   *.h files.   
      
   If I modify just foo.c, then type "make", it will (if everything   
   is set up correctly) recompile "foo.c" generating "foo.o", and   
   then run a link step to recreate any executable that depends on   
   "foo.o".  It knows it doesn't have to recompile "bar.c" because   
   "bar.o" sill exists and is newer than "bar.c".   
      
   Perhaps the project provides several executable programs, and   
   only two of them rely on foo.o.  Then it can relink just those   
   two executables.   
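   
   A minimal Makefile along those lines might look like this (the   
   program and file names here are invented for illustration):   

```make
# foo.o is shared by two programs; bar.o belongs to only one.
# make recompiles a .o only when its .c (or a listed header) is
# newer than it, and relinks only the executables that depend on it.
CC     = gcc
CFLAGS = -Wall

prog1: main1.o foo.o
	$(CC) -o $@ $^

prog2: main2.o foo.o bar.o
	$(CC) -o $@ $^

# Header dependencies, listed by hand:
foo.o: foo.c foo.h
bar.o: bar.c foo.h bar.h
```

   Touch foo.c and "make prog1 prog2" recompiles foo.o and relinks   
   both programs; touch bar.c and only prog2 is relinked.   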
      
   This is likely to give you working executables substantially   
   faster than if you did a full rebuild.  It's more useful while   
   you're developing and updating a project than when you download   
   the source and build it once.   
      
   (I often tend to do full rebuilds anyway, for vague reasons I won't   
   get into.)   
      
   This depends on all relevant dependencies being reflected in the   
   Makefile, and on file timestamps being updated correctly when files   
   are edited.  (In the distant past, I've run into problems with the   
   latter when the files are on an NFS server and the server and client   
   have their clocks set differently.)   
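   
   Keeping those header dependencies in the Makefile by hand is   
   where this usually goes wrong. One common approach (a sketch,   
   assuming GCC or Clang, whose -MMD/-MP options emit make-style   
   dependency files) is to let the compiler generate them:   

```make
CFLAGS += -MMD -MP          # write foo.d alongside foo.o
OBJS   := foo.o bar.o

prog: $(OBJS)
	$(CC) -o $@ $^

# Pull in the generated dependency files if they exist; the
# leading "-" keeps make quiet on a freshly cleaned tree.
-include $(OBJS:.o=.d)
```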
      
   (I'll just go ahead and acknowledge, so you don't have to, that   
   this might not be necessary if the build tools are infinitely fast.)   
      
   If I've done a "make clean" or "git clean", or started from scratch   
   by cloning a git repo or unpacking a .tar.gz file, then any generated   
   files will not be present, and typing "make" will have to rebuild   
   everything.   
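   
   The conventional way to get back to that from-scratch state is a   
   phony "clean" target (the names prog, *.o and *.d here are   
   placeholders for whatever the project actually generates):   

```make
.PHONY: clean
clean:
	rm -f prog *.o *.d
```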
      
   [...]   
      
   --   
   Keith Thompson (The_Other_Keith) Keith.S.Thompson+u@gmail.com   
   void Void(void) { Void(); } /* The recursive call of the void */   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca