Forums before death by AOL, social media and spammers... "We can't have nice things"
|    sci.electronics.repair    |    Fixing electronic equipment    |    124,925 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 123,814 of 124,925    |
|    rbowman to Stan Brown    |
|    Re: Is it AI or not    |
|    11 Aug 23 17:51:26    |
XPost: alt.comp.os.windows-10, alt.home.repair
From: bowman@montana.com

On Fri, 11 Aug 2023 10:09:23 -0700, Stan Brown wrote:

> Or, at least, that's how I understand it. A NN product was offered by
> the company I used to work for, and the programmer explained it to me
> that way. Nothing I've seen has told me it's different in principle now,
> though I believe much bigger computers are being used, and with most of
> the Internet as a training set.

GPUs were the breakthrough. The "Graphics" part turned out to be a
misnomer: the original intent was to speed up the calculations involved
in CG, but the same parallel hardware made them ideal for crypto mining,
to the point where miners buying the high-end boards caused a GPU
shortage. They are also good at the vector manipulations needed for
training a neural net.

Using something like PyTorch you can experiment on a PC. Even then it's
much faster if you have a GPU that supports CUDA. CUDA is Nvidia's
proprietary platform, so afaik only Nvidia chips support it; OpenCL is
the open alternative.

When you get to something like ChatGPT you're talking many very expensive
GPUs, a lot of power, and millions of dollars. ChatGPT isn't aware of
recent events since it was frozen with the data available when the
training occurred, and retraining is very expensive.

Once the model is trained, inference (using the model) is much less
intensive. That's the 'Pre-trained' in GPT.

For me the interesting part is pruning a model developed on a huge system
to run locally with limited resources. Cell phones are getting to be
powerful enough to do so. Many people didn't realize that for something
like speech recognition the audio was sent off to Google, processed, and
the text returned. The light dawned when they realized Alexa has very big
ears.
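The training/inference asymmetry described above can be sketched in a few lines of plain Python. This is a made-up toy (a single linear neuron fitting y = 2x + 1, no PyTorch, no GPU), but it shows the shape of the cost: training loops over the data thousands of times, while using the frozen model is a single multiply-add per input.

```python
import random

random.seed(0)

# Toy dataset for the target function y = 2x + 1 (invented for illustration).
data = [(x, 2.0 * x + 1.0) for x in range(-5, 6)]

# Toy "model": one neuron, y = w*x + b, starting from zero.
w, b = 0.0, 0.0
lr = 0.01

# Training: thousands of gradient-descent passes over the data. This is
# the expensive, done-once-up-front part (the "Pre-trained" in GPT).
for _ in range(5000):
    for x, target in data:
        err = (w * x + b) - target
        w -= lr * err * x      # gradient of squared error w.r.t. w
        b -= lr * err          # gradient of squared error w.r.t. b

# Inference: one multiply-add per input -- cheap enough for a phone.
def infer(x):
    return w * x + b

print(round(infer(10.0), 2))   # should land close to 2*10 + 1 = 21
```

Real experiments would use PyTorch tensors and move the work to a CUDA device when one is available; the asymmetry stays the same, only the scale changes.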
The desired outcome is to have more functionality at a local level,
including very small processors like the Arduino family that don't
require huge amounts of power to run.

It's a fascinating development but like all disruptors the potential for
bad is just as high as good.

--- SoupGate-Win32 v1.05
 * Origin: you cannot sedate... all the things you hate (1:229/2)    |
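The pruning idea mentioned in the post, shrinking a big trained model so it fits on local hardware, can be sketched as magnitude pruning: drop the weights whose absolute value is near zero and keep only the rest. The weight values below are invented for illustration; real frameworks (e.g. PyTorch's `torch.nn.utils.prune`) do this on full tensors with masks.

```python
# Hypothetical weights from some trained layer (made-up numbers).
weights = [0.91, -0.02, 0.45, 0.003, -0.76, 0.01, 0.33, -0.005]

def prune(ws, threshold=0.05):
    """Zero out weights whose magnitude falls below the threshold."""
    return [w if abs(w) >= threshold else 0.0 for w in ws]

pruned = prune(weights)
kept = sum(1 for w in pruned if w != 0.0)
print(f"kept {kept}/{len(weights)} weights")
```

A sparse model like this stores and multiplies fewer numbers, which is what makes running it on a cell phone, or eventually an Arduino-class device, plausible.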
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
(c) 1994, bbs@darkrealms.ca