Forums before death by AOL, social media and spammers... "We can't have nice things"
|    alt.cyberpunk.tech    |    Cyberpunks LOVE making shit complicated    |    1,115 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 710 of 1,115    |
|    thule to All    |
|    Re: Giving AGI developers a target to h    |
|    27 Oct 25 15:56:36    |
From: thule@thule.invalid

On 10/27/25 12:56 AM, an0n wrote:
> Interesting article shared on Y-Combinator
> https://arxiv.org/abs/2510.18212 trying to quantify how close we are
> to AGI by starting with an empirical analysis of basic human
> intelligence.
>
> Tl;Dr
> The paper attributes an overall AGI score of 27% for GPT-4 and 57%
> for GPT-5, showing both the rapid progress and the substantial gaps
> remaining to reach human-level cognition. The most significant
> bottleneck identified is Long-Term Memory Storage (MS); this
> fundamental "amnesia" means models must rely on "capability
> contortions", such as using massive context windows (Working Memory)
> to compensate for their inability to learn and store new information
> persistently, often the root cause of hallucinations.
>
> Soooooo .... not long now boys !! "just" need to solve for these
> issues and we are off to the races.

i'm not one of those ai-hating redditors; i host a few small models on
my home network and use them regularly. i'm unqualified, but i have read
a few computational neuroscience textbooks and i'm convinced llms cannot
become agi. a fundamental limitation is their lack of world models; some
models now work around this with chain of thought, but it's not a true
solution. while general intelligence is defined as whatever humans can
do, i doubt there's such a thing as truly "general" - i.e. the opposite
of narrow - intelligence; indeed, the brain is a collection of
specialised regions bootstrapped together, with the cortex and its
canonical circuit being the only sort of general-purpose structure i'm
aware of.

--- SoupGate-Win32 v1.05
 * Origin: you cannot sedate... all the things you hate (1:229/2)
(c) 1994, bbs@darkrealms.ca