
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.lang.c      Meh, in C you gotta define EVERYTHING      243,242 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 242,144 of 243,242   
   David Brown to bart   
   Re: _BitInt(N)   
   25 Nov 25 21:25:01   
   
   From: david.brown@hesbynett.no   
      
   On 24/11/2025 23:27, bart wrote:   
   > On 24/11/2025 20:26, David Brown wrote:   
   >> On 24/11/2025 19:35, bart wrote:   
   >   
   >>> There is just the poor gnu extension where 128-bit integers didn't   
   >>> have a literal form, and there was no way to print such values.   
   >>>   
   >>   
   >> How many times have you felt the need to write a 128-bit literal?  And   
   >> how many times has that literal been in decimal   
   >   
   > I don't think there were hex literals either.   
   >   
   >   
   >> (it's not difficult to put together a 128-bit value from two 64-bit   
   >> values)?  You really are making a mountain out of a molehill here.   
   >   
   > Well, it seems that such literals now exist (with 'wb' suffix). So I   
   > guess somebody other than you decided that feature WAS worth adding!   
   >   
   > But you can't as yet print out such values; I guess you can't 'scanf'   
   > them either. These are necessary to perform I/O on such data from/to   
   > text files.   
   >   
   > I must say you have a very laid-back attitude to language design:
   >   
   > "Let's add this 128-bit type, but let's not bother providing a way to   
   > enter such values, or add any facilities to print them out. How often   
   > would somebody need to do that anyway? But if they really /have/ to,   
   > then there are plenty of hoops they can jump through to achieve it!"   
   >   
   > (In my implementation of 128-bit types, from 2021, I allowed full 128-   
   > bit decimal, hex and binary literals, and they could be printed in any   
   > base.   
   >   
   > But they weren't used enough and were dropped, in favour of an unlimited   
   > precision type in my other language.   
   >   
   > One interesting use-case for literals was short strings; 128 bits allowed
   > character literals up to 16 characters: 'ABCDEFGHIJKLMNOP'. I think C is   
   > still stuck at one, or 4 if you're lucky.)   
   >   
      
   I have no idea or opinion on why /you/ might want 128-bit or larger   
   integer types.  I believe there is very little use for "normal" numbers   
   - things you might want to write as literals, calculate with, and read   
   or write - that won't fit perfectly well within 64-bit types, and would   
   not be better served by arbitrary sized integers.  Arbitrary sized   
   integers are a very different kettle of fish from large fixed-size   
   integers, and are not something that would fit in the C language - they   
   need a library.   
      
   I can tell you why /I/ might find larger integer types useful.  They   
   include :   
      
   * 128-bit for IPv6 address.  These use a variety of styles for input and   
   display, and thus would use specialised routines, not simple literals or   
   printf-style IO.   
      
   * Big units for passing data around with larger memory transfers, using   
   SIMD registers.  IO is irrelevant here.   
      
   * Cryptography.  IO is irrelevant here.  But a variety of sizes are   
   useful including 56, 80, 112, 128, 168, 192, 384, 512, 521, 2048, 3072,   
   4096, 7680, 8096 bits.  There may be more common sizes - I'm just   
   thinking of DES, 3DES, AES, SHA, ECC and RSA.   
      
      
      
   Smaller sizes can be useful for holding RGB pixel values, audio data, etc.   
      
      
   In none of these cases are bit-precise integer types essential.  People   
   have been doing cryptography for a long time without them.  But they can   
   be convenient, and help people write code that is simpler, clearer, or   
   more directly expresses their intent.  The only specific additional   
   power you get from these is that you can do arithmetic on bigger types   
   without having to write the code manually.  I don't know if compilers   
   currently do a good enough job for that to be suitable for   
   multiplication and modulo of larger integers (addition is easy, but for   
   big sizes, smarter multiplication techniques can be a significant   
   performance gain).   
      
      
   But those are just the uses /I/ see for them, in things /I/ work with.   
   (I might also use them for FPGA programming in the future, but I'm not   
   doing that at the moment.)  However, unlike some people, I don't think   
   the C language should pick features based purely on what I personally   
   want to use, or what would be even sillier, what I personally think is   
   easy to implement in a compiler.  Other people will have other uses for   
   different sizes.   
      
      
   >   
   >>> But now there is this huge leap, not only to 128/256/512/1024 bits,   
   >>> but to conceivably millions, plus the ability to specify any weird   
   >>> type you like, like 182 bits (eg. somebody makes a typo for   
   >>> _BitInt(128), but they silently get a viable type that happens to be   
   >>> a little less efficient!).   
   >>>   
   >>   
   >> And this huge leap also lets you have 128-bit, 256-bit, 512-bit, etc.,   
   >   
   > And 821 bits. This is what I don't get. Why is THAT so important?   
   >   
   > Why couldn't 128/256/etc have been added first, and then those funny   
   > ones if the demand was still there?   
      
   The folks behind the proposal provided both.  The fact that you can   
   write _BitInt(821) does not in any way hinder use of _BitInt(256).  I   
   really don't get your problem here.   
      
   >   
   > If the proposal had instead been simply to extend the 'u8 u16 u32 u64'   
   > set of types by a few more entries on the right, say 'u128 u256 u512',   
   > would anyone have been clamouring for types like 'u1187'? I doubt it.   
      
   /You/ might not have wanted them, but other people would.   
      
   >   
   > For sub-64-bit types on conventional hardware, I simply can't see the   
   > point, not if they are rounded up anyway. Either have full range-based   
   > types like Ada, or not at all.   
   >   
      
   Fortunately for the C world, you are not on the C committee - it doesn't   
   matter if you can't see beyond the end of your nose.   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca