Forums before death by AOL, social media and spammers... "We can't have nice things"
|    comp.ai.philosophy    |    Perhaps we should ask SkyNet about this    |    59,235 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 58,829 of 59,235    |
|    Richard Damon to olcott    |
|    Re: Proof that the halting problem is in    |
|    26 Dec 25 22:37:12    |
[continued from previous message]

This is because, as you have proven, you are just a pathological liar.

Now, part of the problem is you just don't understand what you are
calling the "input", and thus your whole argument is based on being
duplicitous.

Actually, by what you have said and what you claim to be the input, it
can be proven that you started with a lie, and just never knew what a
program actually was.

Sorry, but everything you say is just another nail in the coffin of your
reputation that is now at the bottom of that lake of fire.

>
>> Your problem is you forget about that part of the meaning of the word,
>> because you just don't think about requirements, as being "correct"
>> isn't a thing to you, just like Truth, or Proof don't mean anything to
>> you, as meaning doesn't actually have meaning to you.
>>>
>>> Four LLM systems have now fully agreed with all of
>>> my reasoning about the general subject of undecidability.
>>> ChatGPT and Claude AI have agreed in fresh brand new
>>> conversations a dozen times each.
>>
>> Which just shows you are too stupid to know they lie.
>>
>>>
>>> It initially took them fifty pages of dialogue to get it.
>>> I am now down to 15 pages on each system.
>>>
>>> It is not that these LLM systems are terribly faulty.
>>> It is that conventional wisdom about undecidability
>>> across computer science, math and logic is a foundational
>>> error.
>>
>> Shows how hard you had to work for them to remember your lies.
>>
>> All you are doing is proving you are just a liar.
>>
>>>
>>>> The problem is you forget to define what it means to be a Halt
>>>> Decider, or any form of XXXX Decider. Your problem is "Halting" is
>>>> defined as a property of the actual machine being talked about,
>>>> which can be expressed in terms of a UTM processing the string
>>>> representation of it.
>>>>
>>>> You then get this crazy idea (which is just a lie) that you can just
>>>> ignore the behavior of the CORRECT simulation of that input, as
>>>> shown by what the UTM does, and try to define its incorrect
>>>> simulation (since it just stops short based on its own error) as
>>>> being correct.
>>>>
>>>> And then, you show your problem by just refusing to even try to
>>>> answer with a justification of why your idea is correct.
>>>>
>>>> How can your H have "Correctly Simulated" an input that "Correctly
>>>> Specifies" the behavior of the machine P, and get a different
>>>> result from that machine, or from the machine defined to do the
>>>> correct simulation, that is, the UTM?
>>>>
>>>> Remember, if UTM([X]) doesn't match the behavior of machine X, then
>>>> it just isn't a UTM.
>>>>
>>>> If your problem is that your encoding method can't produce a string
>>>> that allows for a UTM to exist, then your encoding method is just
>>>> insufficient, and you doomed yourself from the start, as the
>>>> criteria for semantic properties ALWAYS go back to the original
>>>> machine.
>>>>
>>>> All you are doing is proving you don't understand how "requirements"
>>>> work, as you just try to sweep them under the carpet with your lies.
>>>>
>>>> Sorry, all you are doing is proving your stupidity.
>>>
>>>
>>
>
>

--- SoupGate-Win32 v1.05
 * Origin: you cannot sedate... all the things you hate (1:229/2)
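[Editor's note: the contradiction the message keeps circling, that a decider H is fed a program built to do the opposite of whatever H answers about it, is the standard diagonal construction from the halting-problem proof. A minimal Python sketch of that construction; all names here (`diagonal`, `check_decider`, the toy deciders) are hypothetical illustrations, not anyone's actual code from the thread:]

```python
def diagonal(halts):
    """Given a claimed halt decider halts(prog) -> bool,
    build a program P that contradicts halts's verdict on P itself."""
    def P():
        if halts(P):
            while True:   # predicted to halt, so loop forever
                pass
        # predicted to loop, so halt immediately
    return P

def check_decider(halts):
    """Return False once halts is shown wrong about its diagonal program."""
    P = diagonal(halts)
    if halts(P):
        # halts says P halts, but P would then enter the infinite loop:
        # wrong (we don't run P here, since it would never return).
        return False
    # halts says P loops; run P and watch it return immediately: wrong.
    P()
    return False

# Any pure (always-answering) decider fails; the two constant ones
# already demonstrate the shape of the failure:
print(check_decider(lambda p: True))   # decider wrong: P would loop
print(check_decider(lambda p: False))  # decider wrong: P halts
```

The point matching the post: P's halting status is a property of P as a whole (what a UTM running P would do), and no fixed H can answer it correctly for the P built from H itself.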
(c) 1994, bbs@darkrealms.ca