From: arne@vajhoej.dk   
      
   On 8/19/2025 1:26 PM, Dan Cross wrote:   
   > In article <10823ei$3pb8v$3@dont-email.me>,   
   > Arne Vajhøj wrote:   
   >> On 8/19/2025 9:01 AM, Simon Clubley wrote:   
   >>> On 2025-08-18, Dan Cross wrote:   
   >>>> I happen to disagree with Simon's notion of what makes for   
   >>>> robust programming, but to go to such an extreme as to suggest   
   >>>> that writing code as if logical operators don't short-circuit   
   >>>> is the same as not knowing the semantics of division is   
   >>>> specious.   
   >>>   
   >>> That last one is an interesting example. I may not care about   
   >>> short circuiting, but I am _very_ _very_ aware of the combined   
   >>> unsigned integers and signed integers issues in C expressions. :-(   
   >>>   
   >>> It also affects how I look at the same issues in other languages.   
   >>>   
   >>> I've mentioned this before, but I think languages should give you   
   >>> unsigned integers by default, and you should have to ask for   
   >>> a signed integer if you really want one.   
   >>   
   >> "by default" sort of implies that signedness is an attribute
   >> of the same type.
   >>   
   >> Why not just make it two different types with different names?   
      
   >> Whether we follow tradition and call them integer and cardinal   
   >> or more modern style and call them int and uint is less important.   
   >   
   > I would argue that, at this point, there's little need for a   
   > generic "int" type anymore, and that types representing integers   
   > as understood by the machine should explicitly include both   
   > signedness and width. An exception may be something like,   
   > `size_t`, which is platform-dependent, but when transferred   
   > externally should be given an explicit size. A lot of the   
   > guesswork and folklore that goes into understanding the   
   > semantics of those things just disappears when you're explicit.   
      
   The integer types should have well-defined widths.
      
   And they could also be called int32 and uint32.   
      
   That seems to be in fashion in low-level languages
   competing with C.
      
   Many higher-level languages just define that int is 32-bit,
   but don't show it in the name.
      
   Arne   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   