[   home   |   bbs   |   files   |   messages   ]

Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.programming      Programming issues that transcend languages and OSs      57,431 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 56,886 of 57,431   
   Augǝl to Amine Moulay Ramdane   
   Re: More of my philosophy about transformers and about the next GPT-4 and about ChatGPT and more of my thoughts..   
   02 Jan 23 17:47:35   
   
   From: angel00000100000@mail.ee   
      
   Hello, idiOt...............................................   
      
      
      
      
   On Sunday, January 1, 2023 at 7:25:33 PM UTC+2, Amine Moulay Ramdane wrote:   
   > Hello,    
   >    
   >    
   >    
   >    
   > More of my philosophy about transformers and about the next GPT-4 and   
   > about ChatGPT, and more of my thoughts..   
   >    
   > I am a white Arab from Morocco, and I think I am smart, since I have also   
   > invented many scalable algorithms and other algorithms.   
   >    
   >    
   >    
   > The capabilities of transformer architectures, as in the GPT of ChatGPT   
   > (GPT stands for Generative Pre-trained Transformer), are truly remarkable,   
   > as they allow machine learning models to surpass human reading   
   > comprehension and cognitive abilities in many ways. These models are   
   > trained on massive amounts of text data, including entire corpora such as   
   > the English Wikipedia or large portions of the internet, which enables   
   > them to become highly advanced language models (LMs) with a deep   
   > understanding of language and the ability to perform complex predictive   
   > analytics based on text analysis. The result is a model that approximates   
   > human-level text cognition, or reading, to an exceptional degree - not   
   > just simple comprehension, but also the ability to make sophisticated   
   > connections and interpretations about the text, because the Transformer   
   > network pays "attention" to multiple sentences, enabling it to grasp   
   > "context" and "antecedents". These transformer models represent a   
   > significant advancement in the field of natural language processing and   
   > have the potential to revolutionize how we interact with and understand   
   > language.   
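
   The quoted paragraph leans on the Transformer's "attention" mechanism, so
   here is a minimal sketch of scaled dot-product self-attention in Python with
   NumPy. This is a toy illustration, not GPT's actual implementation; the
   function name, array shapes, and random weights are my own assumptions:

   ```python
   import numpy as np

   def self_attention(X, Wq, Wk, Wv):
       """Scaled dot-product self-attention over a token sequence X
       of shape (seq_len, d_model). Toy single-head sketch."""
       # Project each token into query, key, and value vectors.
       Q, K, V = X @ Wq, X @ Wk, X @ Wv
       d_k = K.shape[-1]
       # Score how strongly every token attends to every other token.
       scores = Q @ K.T / np.sqrt(d_k)
       # Softmax over each row turns scores into attention weights.
       weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
       weights /= weights.sum(axis=-1, keepdims=True)
       # Each output token is a context-weighted mix of all value vectors.
       return weights @ V

   rng = np.random.default_rng(0)
   seq_len, d_model, d_k = 4, 8, 8
   X = rng.normal(size=(seq_len, d_model))
   Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
   out = self_attention(X, Wq, Wk, Wv)
   print(out.shape)  # (4, 8)
   ```

   Because every row of attention weights spans the whole sequence, each output
   token carries information from all the others - the "context" and
   "antecedents" the post describes.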
   >    
   > GPT-4 is significantly larger and more powerful than GPT-3, with 170   
   > trillion parameters compared to GPT-3's 175 billion parameters (and even   
   > GPT-3.5 of the new ChatGPT has 175 billion parameters). This allows GPT-4   
   > to process and generate text with greater accuracy and fluency, so with   
   > feedback from users, a more powerful GPT-4 model coming up, and training   
   > on a substantially larger amount of data, the ChatGPT that will use GPT-4   
   > may "significantly" improve in the future. So I think ChatGPT will still   
   > become much more powerful. And I invite you to read my previous thoughts   
   > about my experience with the new ChatGPT:   
   >    
   >    
   > More of my philosophy about my experience with ChatGPT and about   
   > artificial intelligence, and more of my thoughts..   
   >    
   >    
   > I think I am highly smart, since I have passed two certified IQ tests and   
   > I have scored "above" 115 IQ, and I mean that it is "above" 115 IQ.   
   > So in these last two days I have tested ChatGPT to see whether   
   > this new artificial intelligence launched by OpenAI in November 2022 is   
   > efficient, and I think that it is really useful: from my testing, I think   
   > it can score well against average human smartness. If you want it to be   
   > highly smart by inventing highly smart new things, it will not be able to   
   > do it, but if you want ChatGPT to be highly smart on what it has learned   
   > from the existing smartness of the human knowledge it has been trained   
   > on, I think it can often score high as well. ChatGPT can also often make   
   > far fewer errors than humans, so I think that ChatGPT is really useful,   
   > and I think that   
   > ChatGPT will be improved much more by increasing the size of   
   > its transformer (a transformer is a deep learning model that adopts the   
   > mechanism of self-attention), and I also think that ChatGPT will be   
   > improved much more when it is trained on a substantially larger amount of   
   > data, considering an article that DeepMind published recently   
   > demonstrating that the performance of these models can be drastically   
   > improved by scaling data more aggressively than parameters (read it here:   
   > https://arxiv.org/pdf/2203.15556.pdf ), and that is   
   > why I am optimistic about the performance of ChatGPT and I think that it   
   > will be much more improved.   
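
   The DeepMind article cited in the quoted post is the "Chinchilla"
   scaling-laws paper, whose headline finding is often summarized as a rule of
   thumb: compute-optimal training uses roughly 20 training tokens per model
   parameter. A rough back-of-the-envelope check in Python (the 20:1 ratio is
   that rule of thumb, and the 175-billion figure comes from the quoted post;
   both are approximations, not exact values from the paper):

   ```python
   # Chinchilla-style rule of thumb: compute-optimal training uses
   # roughly 20 training tokens per model parameter.
   TOKENS_PER_PARAM = 20

   def optimal_tokens(n_params):
       """Approximate compute-optimal token budget for a model of n_params."""
       return TOKENS_PER_PARAM * n_params

   gpt3_params = 175e9  # 175 billion parameters, as stated in the quoted post
   budget = optimal_tokens(gpt3_params)
   print(f"~{budget:.1e} tokens")  # ~3.5e+12 tokens, i.e. ~3.5 trillion
   ```

   A model of GPT-3's size would, under this rule of thumb, want trillions of
   training tokens - which is the sense in which scaling data "more
   aggressively than parameters" can drastically improve these models.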
   >    
   >    
   > Thank you,    
   > Amine Moulay Ramdane.   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]


(c) 1994,  bbs@darkrealms.ca