From: sfuld@alumni.cmu.edu.invalid   
      
   On 2/14/2026 1:22 PM, MitchAlsup wrote:   
   >   
   > David Brown posted:   
   >   
   >> On 14/02/2026 19:48, MitchAlsup wrote:   
   >>>   
   >>> Thomas Koenig posted:   
   >>>   
   >>>> Some people are throwing large amounts of computing power at trying   
   >>>> to disprove the Collatz conjecture. Graphics cards have sped this   
   >>>> up enormously.   
   >>>>   
   >>>> (A reminder, if any is needed: From a starting number n, the   
   >>>> transformation, recursively applied,   
   >>>>   
   >>>>   
   >>>> f(n) = 3*n+1, if n is odd; f(n) = n/2, if n is even
   >>>>
   >>>> is conjectured to always reach 1 for any positive integer n).
   >>>   
   >>> While n remains even, the iteration just halves it, and a power
   >>> of two falls straight to 1. So, only the odd cases are interesting.
   >>>   
   >>> 3*n+1 is always even (since n was odd, and odd*odd is odd and   
   >>> odd+1 is even). So, a double iteration is (3*n+1)/2.   
   >>   
   >> Yes, that is a micro-optimisation that is often used.
   >>   
   >>>   
   >>>> All the work is done on general-purpose hardware, which has many   
   >>>> capabilities that are not needed, and consume area and power that   
   >>>> special-purpose hardware would not need. Also, the hardware   
   >>>> is limited to 64-bit integers, and the range of tested numbers   
   >>>> is now up to 2^71, so   
   >>>   
   >>> Induction would say it has been proven by then.   
   >>   
   >> Would you like to re-think that? "Induction" says nothing of the sort.   
   >>   
   >> The conjecture has been proven true for all numbers up to 2^71
   >> (assuming that is the current figure) - it has most definitely /not/
   >> been proven to be true for all numbers. There are other things proven   
   >> about it, such as the asymptotic rarity of numbers that are exceptions   
   >> to the pattern, but no one has any notion of how to prove it in general.   
   >   
   > By proving it only up to 2^71, you simultaneously show that it cannot
   > be proven by computer-like arithmetic processes. What if the bound were
   > 2^71,000,000? You still would not have proven it. Thereby, it is not
   > provable by simple computer arithmetic processes.
      
   While it is certainly true that you can't prove it, no matter how large   
   a number you get up to, you may be able to disprove it, which would be   
   important in and of itself.   
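
   For anyone who wants to play along at home, the double-step shortcut
   mentioned upthread ((3*n+1)/2 when n is odd, since 3*n+1 is then even)
   is easy to sketch. This is just an illustration of the idea, not
   anything from the search projects being discussed; the function name
   and the step-counting convention are mine, and Python's
   arbitrary-precision ints sidestep the 64-bit limit Thomas mentioned:

   ```python
   def collatz_steps(n):
       """Count applications of f until n reaches 1.

       Uses the shortcut n -> (3*n + 1) // 2 for odd n, which is valid
       because 3*n + 1 is always even when n is odd; each shortcut
       counts as two applications of f.
       """
       if n < 1:
           raise ValueError("n must be a positive integer")
       steps = 0
       while n != 1:
           if n & 1:                 # n odd: fold 3n+1 and the halving together
               n = (3 * n + 1) // 2
               steps += 2
           else:                     # n even: just halve
               n //= 2
               steps += 1
       return steps
   ```

   The classic example is n = 27, which wanders up past 9000 before
   collapsing to 1 after 111 steps; `collatz_steps(27)` returns 111.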
      
      
      
   --   
    - Stephen Fuld   
   (e-mail address disguised to prevent spam)   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   