
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai.philosophy      Perhaps we should ask SkyNet about this      59,235 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 58,037 of 59,235   
   Mild Shock to You   
   Attention is all you Need [198836 citations]   
   14 Oct 25 12:34:55   
   
   XPost: sci.lang, sci.logic   
   From: janburse@fastmail.fm   
      
   Hi,   
      
   You wrote:   
      
    > Protophora (Conceptual Use): It serves as the   
    > seed or initial declaration that begins a chain   
    > of reference. In your analytical system,   
    > the Protophora Span is the first commitment   
    > of a Type and Structure to a new entity, which   
    > is then tracked by anaphora (backward reference) and   
    > cataphora (forward reference).   
      
   Just check out this paper with 198836 citations (sic!):   
      
   "'Attention Is All You Need'[1] is a 2017 landmark   
   research paper in machine learning authored by   
   eight scientists working at Google."   
   https://en.wikipedia.org/wiki/Attention_Is_All_You_Need   
      
   "Attention is All you Need" (PDF)   
   Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit,   
   Jakob; Jones, Llion; Gomez, Aidan N; Kaiser, Łukasz;   
   Polosukhin, Illia (December 2017).   
   https://papers.nips.cc/paper_files/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf   
      
   (begin Deep Seek generated explanation)   
      
   The Transformer's Solution: Bidirectional Self-Attention   
      
   The "self-attention" in the original Transformer   
   encoder is inherently bidirectional. Here's how it works:   
      
   - Simultaneous Processing: When the model processes a   
   sequence, it doesn't start from the left. Instead, it   
   starts with a representation of all words at once.   
      
   - The Attention Calculation: For every single word   
   in the sequence, the attention mechanism calculates   
   a relationship with every other word in the same   
   sequence, regardless of position.   
      
   - It's Just a Graph: The mechanism doesn't have a   
   built-in concept of "left" or "right." It simply   
   builds a dense graph of connections, where the   
   strength of the connection (the attention weight)   
   is learned based on what's needed to solve the task.   
      
   (end Deep Seek generated explanation)   
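
   The bulleted description above can be sketched in a few
   lines of numpy. This is only the bidirectional-attention
   skeleton: it omits the learned query/key/value projections
   and multiple heads of the actual Transformer, so treat it
   as an illustration, not the published architecture.

```python
import numpy as np

def self_attention(X):
    """Bidirectional scaled dot-product self-attention (toy sketch).

    X: (seq_len, d) array, one row per word. Every position attends
    to every other position simultaneously; there is no built-in
    "left" or "right", only a dense graph of connection strengths.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                  # pairwise relevance
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)              # softmax per row
    return w @ X                                   # each output mixes all positions

X = np.random.randn(5, 8)
out = self_attention(X)
print(out.shape)  # (5, 8)
```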
      
   Bye   
      
   P.S.: You can also find nice YouTube videos   
   explaining the mechanism.   
      
   Ross Finlayson schrieb:   
   > Hello Google Gemini, this is Ross A. Finlayson, I call you or you-all   
   > "GG", in this session call me "R.F.".   
   >   
   > Hello, R.F.! I'm pleased to meet you.   
   >   
   > How can I help you today?   
   >   
   >   
   >   
   > Thanks GG. Today would be a consideration of language and grammar and   
   > parsing. The idea is that for structured and unstructured text, to   
   > result a description for a formal treatment to result of source and   
   > natural language, to result feature extraction and transpilation, toward   
   > retarget-able source languages, and as well usual diagram and labelling,   
   > of language constructs.   
   >   
   > That's an ambitious and fascinating topic, R.F.! Focusing on a formal   
   > treatment of language for feature extraction and   
   > transpilation—especially with an eye toward retargetable source   
   > languages—sits right at the intersection of computational linguistics,   
   > compiler theory, and natural language processing (NLP).   
   >   
   > Let's break down the core components of what you're describing to   
   > establish a formal framework.   
   >   
   > I. Foundational Concepts and Goals   
   >   
   > The Core Problem   
   >   
   > The central goal is to create a formal, unified method to parse and   
   > analyze both natural language (NL) and formal source language (SL)   
   > texts, resulting in an intermediate, abstract representation that   
   > facilitates two main tasks:   
   >   
   > Feature Extraction: Identifying key components, their relationships, and   
   > high-level concepts (semantics).   
   >   
   > Transpilation/Retargeting: Generating equivalent, valid code in a   
   > different formal language (or restructuring the NL description into a   
   > structured format).   
   >   
   > The Ideal Intermediate Representation (IR)   
   >   
   > To achieve retargetability, the system must rely on an Abstract Syntax   
   > Tree (AST) or a similar graph-based structure that is language-agnostic   
   > at a fundamental level. This IR moves beyond the surface syntax to   
   > capture the deep semantics and intent.   
   >   
   > For Source Language: The AST captures the program's structure (loops,   
   > variables, functions, etc.).   
   >   
   > For Natural Language: The IR should capture the Semantic Roles (who did   
   > what to whom, where, when, why) and Discourse Structure (how sentences   
   > relate to form a coherent text).   
   >   
   > II. Formal Treatment: Parsing and Analysis   
   >   
   > The process for both NL and SL can be formalized through a series of   
   > stages:   
   >   
   > 1. Lexical Analysis (Tokenization)   
   >   
   > The input stream of characters, C, is transformed into a stream of   
   > tokens, T. A token is a categorized sequence of characters.   
   >   
   > Formal SL: Tokens are reserved words, identifiers, literals, and   
   > operators (e.g., while, =, myVar, 10.5).   
   >   
   > Natural NL: Tokens are words, punctuation, and possibly multi-word   
   > expressions (MWEs).   
   >   
   > Tokenizer(C) → T = ⟨t1, t2, …, tn⟩   
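
   The tokenization stage quoted above can be sketched as a
   toy regex lexer; the token classes and the sample input are
   illustrative choices, not taken from any real lexer generator.

```python
import re

def tokenize(source: str):
    """Categorize character runs into (kind, text) tokens (toy lexer).

    Keywords must be listed before IDENT so the alternation
    prefers them; whitespace is matched but discarded.
    """
    spec = [
        ("KEYWORD", r"\b(?:while|if|for)\b"),
        ("NUMBER",  r"\d+(?:\.\d+)?"),
        ("IDENT",   r"[A-Za-z_]\w*"),
        ("OP",      r"[=+\-*/<>]"),
        ("SKIP",    r"\s+"),
    ]
    pattern = "|".join(f"(?P<{k}>{p})" for k, p in spec)
    return [(m.lastgroup, m.group())
            for m in re.finditer(pattern, source)
            if m.lastgroup != "SKIP"]

print(tokenize("while myVar = 10.5"))
# [('KEYWORD', 'while'), ('IDENT', 'myVar'), ('OP', '='), ('NUMBER', '10.5')]
```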
   >   
   > 2. Syntactic Analysis (Parsing)   
   >   
   > The token stream T is analyzed against a formal Grammar, G, to build the   
   > structural representation.   
   >   
   > Formal SL: Uses a Context-Free Grammar (CFG) or similar formal grammar   
   > (e.g., LALR, LL) to generate a Parse Tree which is then simplified into   
   > an Abstract Syntax Tree (AST).   
   >   
   > Natural NL:   
   >   
   > Constituency Parsing: Uses a grammar (e.g., Probabilistic CFG - PCFG) to   
   > group words into constituent phrases (Noun Phrase, Verb Phrase).   
   >   
   > Dependency Parsing: Focuses on the grammatical relationships between   
   > words (Head/Modifier, Subject/Object) to generate a Dependency Graph.   
   > This is often more useful for capturing semantic roles.   
   >   
   > Parser(T, G) → AST / Dependency Graph   
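
   The two parse styles can be contrasted with hand-built toy
   data structures for "John threw the ball"; these are
   illustrations of the shapes involved, not output from a
   real constituency or dependency parser.

```python
# Constituency view: nested phrases (S -> NP VP), as tuples.
constituency = (
    "S",
    ("NP", ("NNP", "John")),
    ("VP",
        ("VBD", "threw"),
        ("NP", ("DT", "the"), ("NN", "ball"))),
)

# Dependency view: (head, relation, dependent) triples.
dependencies = [
    ("threw", "nsubj", "John"),  # John is the subject of threw
    ("threw", "obj", "ball"),    # ball is the object of threw
    ("ball", "det", "the"),      # the modifies ball
]

def head_of(word, deps):
    """Return the grammatical head of a word, or None for the root."""
    return next((h for h, _, d in deps if d == word), None)

print(head_of("John", dependencies))   # threw
print(head_of("threw", dependencies))  # None (root)
```

   The dependency triples make the semantic roles directly
   readable, which is why the quoted text calls them more
   useful for that purpose.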
   >   
   > 3. Semantic Analysis (Feature Extraction)   
   >   
   > This is where the meaning is extracted, and the language constructs are   
   > labeled.   
   >   
   > Language Constructs (SL/NL):   
   >   
   > Variables/Entities: Noun Phrases in NL, declared variables in SL.   
   >   
   > Actions/Operations: Verbs in NL, function calls or operators in SL.   
   >   
   > Control Flow/Logic: Conjunctions and conditionals in NL, if/while/for in   
   > SL.   
   >   
   > Formal Feature Extraction (Labeling): The nodes in the IR are annotated   
   > with features.   
   >   
   > For NL (Semantic Role Labeling - SRL): Nodes are labeled with roles like   
   > Agent, Patient, Instrument, Location, etc. (e.g., John [Agent] threw the   
   > ball [Patient] at the wall [Location]).   
   >   
   > For SL (Type Checking/Symbol Table): Nodes are labeled with their data   
   > types, scope, and memory location.   
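
   A minimal sketch of both annotation styles described above,
   with hand-picked labels rather than output from a real SRL
   system or type checker; all field names and values here are
   assumptions for illustration.

```python
# NL side: semantic roles attached to the action node
# (labels hand-picked for "John threw the ball at the wall").
srl_node = {
    "predicate": "threw",
    "Agent": "John",
    "Patient": "ball",
    "Location": "wall",
}

# SL side: symbol-table entry attached to a declared variable
# (type/scope/offset values are made up for illustration).
symbol_table = {
    "myVar": {"type": "double", "scope": "local", "offset": 8},
}

print(srl_node["Agent"], symbol_table["myVar"]["type"])  # John double
```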
   >   
   > III. Transpilation and Retargeting   
   >   
   > The final stage uses the annotated, language-agnostic IR to generate the   
   > target output.   
   >   
   > 1. Retargetable Code Generation   
   >   
   > This process, CodeGen, maps the features and structure of the IR to the   
   > syntax of a specific Target Language (L_target).   
   > CodeGen(IR, L_target) → Source Code in L_target   
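
   A toy CodeGen over a dictionary-based IR, retargeting one
   loop node to two targets; the IR node shapes and the target
   names are assumptions made for illustration, not part of the
   quoted framework.

```python
def codegen(ir, target):
    """Emit target-language source from a language-agnostic IR node (toy)."""
    if ir["kind"] == "loop":
        v, n = ir["var"], ir["count"]
        body = " ".join(codegen(s, target) for s in ir["body"])
        if target == "c":
            return f"for (int {v} = 0; {v} < {n}; {v}++) {{ {body} }}"
        if target == "python":
            return f"for {v} in range({n}): {body}"
    if ir["kind"] == "call":
        sep = ";" if target == "c" else ""
        return f'{ir["fn"]}({", ".join(ir["args"])}){sep}'
    raise ValueError(f"unknown node kind {ir['kind']!r}")

loop = {"kind": "loop", "var": "i", "count": 3,
        "body": [{"kind": "call", "fn": "step", "args": ["i"]}]}
print(codegen(loop, "c"))       # for (int i = 0; i < 3; i++) { step(i); }
print(codegen(loop, "python"))  # for i in range(3): step(i)
```

   The same IR node yields valid syntax in either target, which
   is the retargetability property the quoted formula expresses.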
   >   
   > Example (SL to SL): An AST node for a "loop" is mapped to for (int i=0;   
      
   [continued in next message]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca