
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.lang.forth      Forth programmers eat a lot of Bratwurst      117,927 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 116,297 of 117,927   
   Paul Rubin to Anton Ertl   
   Re: push for memory safe languages -- im   
   11 Mar 24 20:07:01   
   
   From: no.email@nospam.invalid   
      
   anton@mips.complang.tuwien.ac.at (Anton Ertl) writes:   
   > Not at all.  Modular arithmetic is not arithmetic in Z, but it's a   
   > commutative ring and has the nice properties of this algebraic   
   > structure.   
      
   Right, those modular values aren't integers, they are equivalence   
   classes of integers.  The ring Z/NZ might have some nice properties   
   but they aren't the properties of integers.   
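   A minimal Python sketch of that point (wrap32 is a hypothetical helper,
   not any Java API): Java-style 32-bit ints are representatives of
   equivalence classes in Z/2**32Z, drawn from the signed range
   [-2**31, 2**31), and the ring laws survive the reduction.

```python
def wrap32(x: int) -> int:
    """Reduce x to its 32-bit two's-complement representative."""
    return (x + 2**31) % 2**32 - 2**31

# Integers that differ by a multiple of 2**32 land in the same class:
assert wrap32(5) == wrap32(5 + 7 * 2**32)

# The commutative-ring laws hold for the wrapped operation, e.g.
# associativity, even when intermediate results leave the signed range:
a, b, c = 2**30, 2**31 - 1, 12345
assert wrap32(wrap32(a + b) + c) == wrap32(a + wrap32(b + c))
```

   The catch the post is making: those are properties of the classes, not
   of the integers you probably meant.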
      
   > but even that works surprisingly well, so well that the RISC-V   
   > designers have not seen a need to include an efficient way to detect   
   > those cases where the result deviates from that in Z.   
      
   Sure, C worked pretty well in the 1980s but we've seen how well that   
   worked out.  RISC-V perpetuates the bugs of the 1980s instead of taking   
   the opportunity to fix them.   
      
   > Still, the nice algebraic properties of modular arithmetic can be of   
   > benefit even in such cases.... 64 bit machine   
      
   Another thing: if I run the same integer calculation on two machines, at
   least programmed in a HLL, I should expect the same result on both.  But   
   if the word sizes are different then the results will be different.  (If   
   one or both crash due to implementation restrictions such as machine   
   overflow, that's annoying, but it's better than getting wrong answers).   
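   A sketch of the word-size problem (wrap is a hypothetical helper): the
   same source-level computation gives different answers depending on how
   wide the machine's integers happen to be.

```python
def wrap(x: int, bits: int) -> int:
    """Reduce x to a signed two's-complement value of the given width."""
    return (x + 2**(bits - 1)) % 2**bits - 2**(bits - 1)

total = 30_000 + 30_000          # 60_000 in Z
print(wrap(total, 16))           # -5536 on a 16-bit int
print(wrap(total, 32))           # 60000 on a 32-bit int
```

   Neither machine crashes; one of them just quietly gives a different
   answer, which is the "wrong answers" case the post prefers to avoid.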
      
   >>In what world can it be right for n to be a positive integer and n+1 to   
   >>be a negative integer?  That's not how integers work.   
   > It's how Java's int and long types work.   
      
   Yes, that's a mistake.  I just don't see how it can be anything else.   
   2+2=5 would be obviously wrong, but it's hypothetical, or as you say, a   
   straw man.  20+20=50 or 2000+2000=5000 or 200000+200000=500000 would   
   also be straw men, since they don't happen either.  What about   
   2000000000+2000000000=-294967296?  Java actually does that, it can't be   
   called a straw man, so instead I'm supposed to believe that it's a valid   
   result.  I just can't.   
      
   > And if you want something closer to Z, Java also has BigInteger.   
      
   Those are boxed and expensive for the usual case where the results are   
   expected to fit into the machine word.  Of course that expectation may   
   be wrong (say due to a program bug), but in that case I want the program   
   to crash, like it would for an out-of-range subscript.   
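   A sketch of "crash instead of wrap" (checked_add32 is hypothetical,
   loosely modeled on Java's Math.addExact, which throws ArithmeticException
   on int overflow):

```python
def checked_add32(a: int, b: int) -> int:
    """Add two 32-bit ints; raise rather than silently wrapping."""
    result = a + b
    if not (-2**31 <= result < 2**31):
        raise OverflowError(f"{a} + {b} overflows a 32-bit int")
    return result

checked_add32(2, 2)                            # fine: 4
# checked_add32(2_000_000_000, 2_000_000_000)  # raises OverflowError
```

   The common in-range case costs one comparison, not a heap-allocated
   BigInteger; the out-of-range case fails loudly, like a bad subscript.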
      
   Maybe it is a mistake for Java to have an int type like that at all,   
   i.e. BigInteger should be the default, like in Python.  It was a design   
   choice to make machine arithmetic more accessible to gain acceptance by   
   some potential users.  Guy Steele famously said "We were after the C++   
   programmers. We managed to drag a lot of them about halfway to Lisp."   
   Java today seems awfully old-fashioned of course.   
      
   >>Tony Hoare in 2009 said about null pointers:   
   > And the relevance is?   
      
   Both are instances where adding a "feature" for implementation   
   convenience turned out to attract bugs and vulnerabilities.   
      
   >>Java-style wraparound arithmetic is more of the same.  A bug magnet,   
   > Unsupported claim.   
      
   It's supported by that page linked a few days ago, about overflow bugs   
   in real programs.   
      
   > I think I saw the unintended result on a 32-bit machine   
      
   I agree that it's less likely to be a problem if the ints are 64 bits.   
   And of course it was a frequent occurrence in the 16-bit era.   
      
   Note that at least in gcc on x64 (the LP64 model), int is by default   
   still 32 bits, though long is 64.  These days when I write C code I tend   
   to use stdint.h and specify int sizes explicitly, e.g. int64_t or   
   int32_t rather than int or long or whatever.   
      
   > I don't know much about C++, but I would be surprised if they had   
   > given up on uninitialized data.  And an uninitialized reference is   
   > certainly not better than a null reference.   
      
   I don't know a way to make an uninitialized reference in C++ but maybe   
   it's possible.  If you just say "int &y;" you get a compile time error.   
      
   > The fact that Java idiomatics is to implement trees and linked lists   
   > not in the object-oriented way I outlined above   
      
   The OO description is similar to using a sum type, and it's reasonable   
   for the implementation under the covers to use a zero pointer to   
   represent an empty list.  Some Lisp implementations went even further and   
   used "cdr coding", which means using a single bit to indicate that the   
   next list node is at the next word in memory, so the "next" pointer   
   (cdr) can be eliminated.  You might allocate the list nodes   
   non-consecutively when the list is created, but a compacting GC can   
   later make the elements consecutive in memory and get rid of the pointer   
   overhead.   
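   A toy sketch of cdr coding (all names here are hypothetical): cells live
   in a flat heap, and a cell's "next" field is either an index, NIL, or
   ADJACENT, meaning the next cell sits in the very next slot, so the cdr
   pointer is elided, as a compacting GC could arrange.

```python
NIL, ADJACENT = -1, -2

def compact(xs):
    """Lay a list out consecutively, so every cdr is ADJACENT."""
    heap = [(x, ADJACENT if i + 1 < len(xs) else NIL)
            for i, x in enumerate(xs)]
    return heap, (0 if xs else NIL)

def walk(heap, head):
    """Traverse a cdr-coded list back into a Python list."""
    out, i = [], head
    while i != NIL:
        value, nxt = heap[i]
        out.append(value)
        i = i + 1 if nxt == ADJACENT else nxt
    return out
```

   After compaction, a three-element list needs three (value, tag) cells
   and no stored next-pointers at all:
   heap, head = compact([1, 2, 3]); walk(heap, head) gives [1, 2, 3].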
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca