
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai.philosophy      Perhaps we should ask SkyNet about this      59,235 messages   


   Message 57,962 of 59,235   
   J D to All   
   Re: Nikola Tesla immigrant   
   02 Oct 25 19:18:25   
   
   XPost: alt.politics.immigration, alt.politics.republicans, talk.politics.guns   
   XPost: or.politics   
   From: j_d@invalid.org   
      
   On 30 Sep 2025, Baxter  posted some   
   news:10bgs7f$3otvb$2@dont-email.me:   
      
   > Klaus  Schadenfreude    
   > wrote in   
    > news:cihndktbkfpbs6hc80r824udj2o66nuofk@Rudy.Canoza.is.a.forging.cocksucking.dwarf.com:   
   >   
   > Until the late 19th century, there wasn't any such thing as "illegal"   
   > or "legal" immigration to the United States. That's because before you   
   > can immigrate somewhere illegally, there has to be a law for you to   
   > break.   
   >   
   > Being denied a visa or denied entry is not immigration.   
   >   
   >============   
   > AI Overview   
   >   
   > An illegal immigrant is a term for an individual who is present in a   
   > country without legal authorization, which can be due to entering   
   > without official inspection or overstaying a temporary visa.   
   >   
   > - Note:  "present in a country"   
      
   ============   
   AI Overview   
      
   How often AI is wrong depends on the model, the task, and the data it was   
   trained on, but studies consistently show significant error rates,   
   especially for generative AI. Many AI models are trained to prioritize   
   answering a query over admitting a lack of knowledge, which causes them to   
   "hallucinate" incorrect but plausible-sounding information.   
      
   Study findings on AI inaccuracies   
      
    AI search tools: Popular AI search tools, including ChatGPT and Gemini,   
    gave incorrect or misleading information over 60% of the time in a March   
    2025 study. Another study from March 2025 reported that AI search engines   
    invented sources for about 60% of queries.   
      
   AI assistants: When generative AI tools were asked about the world's   
   elections, even the most accurate models got 1 in 5 responses wrong,   
   according to a 2025 study.   
      
    News-related queries: A February 2025 study found that 90% of AI   
    chatbot responses about news contained some inaccuracies, with 51%   
    containing "significant" inaccuracies.   
      
   Bias: A 2022 study by USC researchers found that biased "facts" made up   
   3.4% to 38.6% of the data used by some AI systems. Algorithms sometimes   
   exaggerate these biases.   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca