From: cross@spitfire.i.gajendra.net   
      
   In article <108g8kk$33isk$1@dont-email.me>,   
   Arne Vajhøj wrote:   
   >On 8/24/2025 7:27 PM, Dan Cross wrote:   
   >> In article <108dlq4$2fi6h$4@dont-email.me>,   
   >> Arne Vajhøj wrote:   
   >>> On 8/19/2025 1:26 PM, Dan Cross wrote:   
   >>>> In article <10823ei$3pb8v$3@dont-email.me>,   
   >>>> Arne Vajhøj wrote:   
   >>>>> Whether we follow tradition and call them integer and cardinal   
   >>>>> or more modern style and call them int and uint is less important.   
   >>>>   
   >>>> I would argue that, at this point, there's little need for a   
   >>>> generic "int" type anymore, and that types representing integers   
   >>>> as understood by the machine should explicitly include both   
   >>>> signedness and width. An exception may be something like,   
   >>>> `size_t`, which is platform-dependent, but when transferred   
   >>>> externally should be given an explicit size. A lot of the   
   >>>> guesswork and folklore that goes into understanding the   
   >>>> semantics of those things just disappears when you're explicit.   
   >>>   
   >>> The integer types should have well defined width.   
   >>>   
   >>> And they could also be called int32 and uint32.   
   >>>   
   >>> That seems to be in fashion in low level languages   
   >>> competing with C.   
   >>>   
   >>> Many higher level languages just define that int is 32 bit,   
   >>> but don't show it in the name.   
   >>   
   >> If by "many higher level languages" you mean languages in the   
   >> JVM and CLR ecosystem, then sure, I guess so. But it's not   
   >> universal, and I don't see how it's an improvement.   
   >   
   >Those are two huge groups of languages with a pretty big
   >market share in business applications.
      
   Market share is not the same as influence, and while the JVM/CLR
   languages _do_ have a lot of users, that does not imply that they
   are all good languages. In fact, only a handful of languages in
   each family have any significant adoption, and I don't think PL
   designers are mining them for much inspiration these days.
      
   Again, not universal, nor really an improvement over just using   
   explicitly sized types.   
      
   >Delphi provides both flavors: shortint/smallint/integer
   >and int8/int16/int32, byte/word/cardinal and
   >uint8/uint16/uint32. I believe the former are the most
   >widely used.
      
   The older names feel like they're very much looking backwards in   
   time.   
      
   >(64 bit is just int64 and uint64, because somehow they   
   >fucked up longint and made it 32 bit on 32 bit and 64 bit   
   >Windows but 64 bit on 64 bit *nix)   
      
   I'd blame C for that. I've heard some folks suggest that the   
   real mistake was not making `long` 64 bits on the first VAX C   
   compiler, which admittedly may have already been too late (the   
   Interdata compilers for the 7/32 and 8/32 Unix ports targeted a   
   32-bit machine very early on).   
      
   John Mashey et al got a lot of this mess fixed up with
   `<inttypes.h>` in the 1990s as they were pushing 64-bit
   adoption. But had types been annotated with widths very early
   on, most of these problems wouldn't have existed in the first
   place.
      
    - Dan C.   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   