home bbs files messages ]

Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.lang.c      Meh, in C you gotta define EVERYTHING      243,242 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 242,101 of 243,242   
   bart to David Brown   
   Re: _BitInt(N)   
   24 Nov 25 18:35:01   
   
   From: bc@freeuk.com   
      
   On 24/11/2025 14:41, David Brown wrote:   
   > On 24/11/2025 13:31, bart wrote:   
      
   > That's all up to the implementation.   
   >   
   > You are worrying about completely negligible things here.   
      
   Is it that negligible? That's easy to say when you're not doing the
   implementing! However, it may impact the size and performance of code.
      
      
   >> And allowing random sizes such as int817838_t. (See, it seems much   
   >> sillier using this syntax!)   
   >   
   > I had taken your "ridiculous" comment to be part of your complaint that   
   > "multiplying even two one-million-bit types could overflow".  But those   
   > statements are independent, then only the first is silly - of course   
   > arithmetic on any finite sized type can overflow unless specifically   
   > limited (such as by wrapping behaviour for unsigned types).  I agree   
   > that huge fixed-size integer types are not useful, though I am not sure   
   > where the ideal limit lies.   
      
   You don't think it strange that C still doesn't even have a 128-bit
   type (it only barely has width-specific 64-bit ones)?
      
   There is just the poor GNU extension, where 128-bit integers have no
   literal form and no way to be printed.
      
   But now there is this huge leap, not only to 128/256/512/1024 bits, but
   conceivably to millions, plus the ability to specify any weird width
   you like, such as 182 bits (e.g. somebody makes a typo for
   _BitInt(128), but silently gets a viable type that happens to be a
   little less efficient!).
      
   So, 20 years of having 64-bit processors with little or no support for   
   even double-word types, and now there is this explosion in capabilities.   
      
   Or, are literals and print facilities for these new types still missing?   
      
   Personally I think they should have got the basics right first: a
   decent 128-bit type, proper literals, and a way to print them.
      
   This looks like VLAs all over again (e.g. is '_BitInt(1000000) A'
   allocated on the stack?). A poorly suited, hard-to-implement feature.
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca