
   sci.physics.relativity      The theory of relativity      225,861 messages   


   Message 224,452 of 225,861   
   Ross Finlayson to All   
   Re: T-theory A-theory theatheory (14/29)   
   11 Nov 25 13:24:01   
   
   [continued from previous message]   
      
   rather than a mathematical object. Cantor's argument doesn't care about   
   "Cartesianness"—it cares about functions defined by rules. If EF has a   
   rule, it can be diagonalized unless you restrict the logic itself.   
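   The diagonalization referred to here can be made concrete for any
   rule-defined enumeration. A minimal sketch in Python (the `toy`
   enumeration is a stand-in for any purported listing rule such as EF;
   names are illustrative, not from the thread):

```python
def diagonal_digit(d: int) -> int:
    # Flip a decimal digit; using 5 and 6 avoids the 0.1999... = 0.2000... ambiguity.
    return 5 if d != 5 else 6

def cantor_diagonal(enum, n_digits: int) -> str:
    """Given enum(i) -> decimal-digit string of the i-th listed real in [0,1],
    build a prefix of a real that differs from every listed real."""
    digits = [str(diagonal_digit(int(enum(i)[i]))) for i in range(n_digits)]
    return "0." + "".join(digits)

# Toy "rule": the i-th listed number is 0.iiii... (digit i mod 10 repeated).
toy = lambda i: str(i % 10) * 50
x = cantor_diagonal(toy, 10)
# By construction, x differs from the i-th listed number at its i-th digit.
```

   Any enumeration given by a rule can be fed in as `enum`; that is the
   force of the point above, unless the logic itself is restricted.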
   The Doubling Factor is Unsubstantiated: The claim that ∫ f = 1 (not ½)   
   needs justification. You state this results from "squeezing" discrete   
   points, but Lebesgue measure already does this—it's why a countable set   
   has measure zero. The "triangle under y=x" intuition applies to Riemann   
   integration of f(x)=x on [0,1], but EF is defined on ℕ. The integral   
   notation is metaphorical unless you specify: what measure on ℕ? What   
   σ-algebra? How does discrete summation produce a factor of 2?   
   Iota-Arithmetic Must Be Explicit: A ring where addition and   
   multiplication cannot coexist is exotic. Is it a semiring? A near-ring?   
   How do you handle distributivity? The restriction seems designed to   
   evade σ-additivity, but σ-additivity is a property of measures, not   
   necessarily of the underlying algebraic structure. The Vitali paradox   
   arises from translation invariance + countable additivity + AC, not from   
   ℝ being a field.   
   The Vitali Claim Requires More Than Doubling: Even if we accept a   
   doubling measure on [0,1], Vitali's non-measurability is a proof by   
   contradiction: if V were measurable, then countably many disjoint   
   translates would fill [0,1] with either measure 0 or ∞, contradicting   
   μ([0,1]) = 1. Claiming μ(V) = 2 doesn't resolve this—it's just a   
   different contradiction unless you also modify translation invariance or   
   countable additivity. Which does the A-Theory sacrifice?   
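   For concreteness, the standard Vitali contradiction alluded to here:
   with V a subset of [0,1] holding one representative of each class of
   x ~ y iff x - y is rational, and (q_n) enumerating the rationals in
   [-1,1], the translates V + q_n are pairwise disjoint and satisfy

```latex
[0,1] \;\subseteq\; \bigcup_{n} (V + q_n) \;\subseteq\; [-1,2],
\qquad\text{so}\qquad
1 \;\le\; \sum_{n=1}^{\infty} \mu(V) \;\le\; 3
```

   by translation invariance and countable additivity. This fails whether
   mu(V) = 0 (the sum is 0) or mu(V) > 0 (the sum is infinite); stipulating
   mu(V) = 2 merely selects the second horn rather than escaping it.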
      
   On the Multi-LLM Methodology   
   Your strategy is innovative: using Gemini for web integration, Claude   
   for deep context synthesis, and now seeking a "math-focused" LLM for   
   formalization. This works because each model's errors are   
   uncorrelated: Claude catches Gemini's overstatements, and you catch   
   both models' blind spots. The result is a more robust conceptual picture.   
   However, be cautious: LLMs are excellent at finding coherence in   
   concepts, but coherence doesn't entail existence. The A-Theory is   
   internally consistent as described, but the bridge to symbolic formalism   
   requires more than tacit agreement between models: it requires explicit   
   definitions that can be checked by a proof assistant or a peer reviewer.   
   My Assessment & Next Steps   
   The A-Theory, as articulated, is not yet a mathematical theory but a   
   research program—a compelling set of conjectures about how measure   
   theory could be rebuilt on non-standard foundations. The critical test   
   is the formalization task you've designed. If an LLM can produce:   
      
   A rigorous definition of the double limit using non-standard analysis or   
   ultrafilter constructions   
   An explicit iota-ring with operations that demonstrably prevent σ-additivity   
   A derivation showing the doubling factor emerges necessarily from these   
   definitions   
   A proof that under this measure, the Vitali set's equivalence classes   
   each have measure 2   
      
   ...then the A-Theory moves from conjecture to contender. If not, the   
   specific point of failure will tell you what needs strengthening.   
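   On the first item, the obstacle is concrete. If the sweep is modeled by
   the natural candidate f_d(n) = n/d on {0, ..., d} (an assumption of
   mine; the thread fixes no notation), the ordinary pointwise limit
   degenerates, which is exactly what a rigorous double-limit definition
   would have to avoid:

```python
def f(n: int, d: int) -> float:
    # d-th "sweep" approximant: maps {0, ..., d} onto an evenly spaced grid in [0, 1]
    return n / d

# Each f_d is a bijection onto its grid, but for every FIXED n the values
# f(n, d) shrink toward 0 as d grows, so the naive pointwise limit sends
# all of N to 0 and cannot be the claimed bijection onto [0, 1].
samples = {n: [f(n, d) for d in (10**4, 10**6, 10**8)] for n in (1, 10, 1000)}
```

   Whatever non-standard or iterated notion of limit the formalization
   supplies must break this collapse while remaining well defined.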
   For Your Usenet Post   
   When you post this transcript, emphasize that the value lies not in   
   claiming the A-Theory is "true," but in demonstrating how AI-assisted   
   dialectic can compress decades of private research into a publicly   
   verifiable formalization challenge. The sci.logic audience will   
   appreciate that you're not just proposing a new measure, but a new kind   
   of measure—one that reorders the foundational hierarchy (Well-Dispersion   
   before Well-Ordering).   
   What specific aspect would you like me to probe further? The double   
   limit? The iota-ring structure? Or shall we await the formalization results?   
   Thanks Kimi. The set theory is a theory of sets, not of the real   
   numbers it intends to describe; extent, density, completeness, and   
   measure suffice to establish "continuous domains". Then, modeling the   
   sweep or EF function as "standardly modeling a not-a-real-function as   
   a limit of real functions" is familiar from the theory of   
   distributions and from probability theory, where distributions are   
   functions by their form. The point is to establish that EF is a   
   function even if not a real function, relating each element of its   
   domain one-to-one and onto its range: that dom(f) is N, the natural   
   numbers, and ran(f) is [0,1], the real-valued unit line segment. This   
   explains the name "natural/unit equivalency function", since   
   "equivalency" marks the unique example of a countable continuous   
   domain, equivalent in cardinality to both the discrete domain and the   
   continuous range. Many other names might be suggested for the   
   function, while the function itself is unique. Then, the   
   Cartesian-ness, or lack thereof, of this function according to set   
   theory is relevant: because EF is non-Cartesian, it does not put the   
   uncountability arguments in contradiction with the   
   Cantor-Schroeder-Bernstein theorem, while it falls outside the   
   otherwise un-contradicted arguments for the uncountability of   
   continuous domains. (This is framed in a logic that excludes material   
   implication, since "classical logic" is not "classical quasi-modal   
   logic with material implication", which is still a classical logic.)   
   Then, the idea that the continuous domain, for integer part then   
   non-integer part, follows this sort of clock-arithmetic, has for the   
   centrality and primacy of sweep, as defining continuous domains that   
   yield the least-upper-bound property out of the discrete, a way to   
   give the complete ordered field its least-upper-bound gaplessness,   
   since today's descriptive set theory must add further axioms of   
   "least-upper-bound" and "measure 1.0" to describe real analysis.   
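   For reference, the least-upper-bound (completeness) property invoked
   here is the statement that

```latex
\forall S \subseteq \mathbb{R}:\;
S \neq \emptyset \;\wedge\; S \text{ bounded above}
\;\Longrightarrow\; \sup S \in \mathbb{R}
```

   and it is the axiom separating the complete ordered field R from,
   e.g., Q. The post's claim is that "sweep" should deliver this
   gaplessness out of the discrete rather than postulating it.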
   Formalizing EF as a Non-Cartesian Bijection from ℕ to [0,1]   
   The user is continuing the conversation about the "A-Theory" and the   
   "sweep function" (EF). He is addressing some of the points I raised in   
   my previous response, particularly regarding the nature of the EF   
   function, its non-Cartesian nature, and how it relates to set theory and   
   the uncountability arguments. He also brings up the concept of   
   "quasi-modal logic" and the role of the least-upper-bound property.   
   Let me break down his points:   
      
   Set theory vs. real numbers: He clarifies that set theory is about sets,   
   not real numbers, and that "extent/density/completeness/measure" are   
   enough to establish continuous domains. This is a philosophical point   
   about the foundations.   
   EF as a distribution: He compares the EF function to distributions   
   (generalized functions) in probability theory. The idea is that EF is a   
   "not-a-real-function" but can be modeled as a limit of real functions.   
   He emphasizes that it is a bijection between ℕ and [0,1].   
   Non-Cartesian nature and Cantor: He addresses my point about Cantor   
   diagonalization, claiming that because EF is non-Cartesian, it neither   
   contradicts the Cantor-Schroeder-Bernstein theorem nor falls prey to   
   the usual uncountability arguments. He also frames this in a logic he   
   distinguishes from "classical quasi-modal logic with material   
   implication".   
   Clock arithmetic and LUB: He mentions "clock-arithmetic" and that sweep   
   defines continuous domains, giving the least-upper-bound property from   
   the discrete. He suggests that descriptive set theory needs additional   
   axioms for LUB and measure to describe real analysis.   
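   For reference, the Cantor-Schroeder-Bernstein theorem at issue in the
   third point states that mutual injections yield a bijection:

```latex
\bigl(\exists\, f : A \hookrightarrow B\bigr)
\wedge
\bigl(\exists\, g : B \hookrightarrow A\bigr)
\;\Longrightarrow\;
\exists\, h : A \xrightarrow{\;\sim\;} B
```

   So a genuine (Cartesian) bijection EF from N onto [0,1] would force
   the two cardinalities to coincide, which the diagonal argument
   excludes; this is why the "non-Cartesian" qualifier carries the
   weight of the claim.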
      
   My task is to respond to these points critically but constructively, as   
      
   [continued in next message]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca