Forums before death by AOL, social media and spammers... "We can't have nice things"
|    comp.lang.c    |    Meh, in C you gotta define EVERYTHING    |    243,242 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 242,110 of 243,242    |
|    bart to David Brown    |
|    Re: _BitInt(N)    |
|    24 Nov 25 22:27:10    |
      From: bc@freeuk.com

      On 24/11/2025 20:26, David Brown wrote:
      > On 24/11/2025 19:35, bart wrote:

      >> There is just the poor gnu extension where 128-bit integers didn't
      >> have a literal form, and there was no way to print such values.
      >>
      >
      > How many times have you felt the need to write a 128-bit literal? And
      > how many times has that literal been in decimal

      I don't think there were hex literals either.

      > (it's not difficult to
      > put together a 128-bit value from two 64-bit values)? You really are
      > making a mountain out of a molehill here.

      Well, it seems that such literals now exist (with the 'wb' suffix). So I
      guess somebody other than you decided that feature WAS worth adding!

      But you can't as yet print out such values; I guess you can't 'scanf'
      them either. These are necessary to perform I/O on such data from/to
      text files.

      I must say you have a very laid-back attitude to language design:

      "Let's add this 128-bit type, but let's not bother providing a way to
      enter such values, or add any facilities to print them out. How often
      would somebody need to do that anyway? But if they really /have/ to,
      then there are plenty of hoops they can jump through to achieve it!"

      (In my implementation of 128-bit types, from 2021, I allowed full
      128-bit decimal, hex and binary literals, and they could be printed in
      any base.

      But they weren't used enough and were dropped, in favour of an unlimited
      precision type in my other language.

      One interesting use-case for literals was short strings; 128 bits allowed
      character literals of up to 16 characters: 'ABCDEFGHIJKLMNOP'. I think C is
      still stuck at one, or 4 if you're lucky.)
      >> But now there is this huge leap, not only to 128/256/512/1024 bits,
      >> but to conceivably millions, plus the ability to specify any weird
      >> type you like, like 182 bits (eg. somebody makes a typo for
      >> _BitInt(128), but they silently get a viable type that happens to be a
      >> little less efficient!).
      >>
      >
      > And this huge leap also lets you have 128-bit, 256-bit, 512-bit, etc.,

      And 821 bits. This is what I don't get. Why is THAT so important?

      Why couldn't 128/256/etc. have been added first, and then those funny
      ones if the demand was still there?

      If the proposal had instead been simply to extend the 'u8 u16 u32 u64'
      set of types by a few more entries on the right, say 'u128 u256 u512',
      would anyone have been clamouring for types like 'u1187'? I doubt it.

      For sub-64-bit types on conventional hardware, I simply can't see the
      point, not if they are rounded up anyway. Either have full range-based
      types like Ada, or none at all.

      --- SoupGate-Win32 v1.05
       * Origin: you cannot sedate... all the things you hate (1:229/2)
(c) 1994, bbs@darkrealms.ca