
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai.philosophy      Perhaps we should ask SkyNet about this      59,235 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 57,375 of 59,235   
   D. Ray to All   
   Neural Networks (MNIST i…   
   21 Oct 24 20:06:28   
   
   XPost: comp.misc, alt.microcontrollers, comp.arch.embedded   
   XPost: alt.microcontrollers.8bit   
   From: d@ray   
      
   Buoyed by the surprisingly good performance of neural networks with   
   quantization-aware training on the CH32V003, I wondered how far this can   
   be pushed. How much can we compress a neural network while still   
   achieving good test accuracy on the MNIST dataset? When it comes to   
   absolutely low-end microcontrollers, there is hardly a more compelling   
   target than the Padauk 8-bit microcontrollers. These are   
   microcontrollers optimized for the simplest and lowest-cost applications   
   there are. The smallest device in the portfolio, the PMS150C, sports   
   1024 13-bit words of one-time-programmable memory and 64 bytes of RAM,   
   more than an order of magnitude smaller than the CH32V003. In addition,   
   it has a proprietary accumulator-based 8-bit architecture, as opposed to   
   the much more powerful RISC-V instruction set of the CH32V003.   
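   A device this small has no hardware multiplier, which is one reason
   aggressive quantization helps: with very few weight levels, a
   multiply-accumulate needs no multiplication at all. A minimal sketch,
   assuming 2-bit signed weights in the range -2..1 (the function name and
   encoding are illustrative, and this is C rather than the Padauk
   assembly the part would actually run):

```c
#include <stdint.h>

/* Multiply-free MAC for a 2-bit signed weight w in {-2, -1, 0, 1}.
   Each product reduces to an add, a subtract, or a shift-and-subtract,
   which maps well onto an accumulator-based 8-bit core. */
int16_t mac_2bit(int16_t acc, uint8_t activation, int8_t w)
{
    switch (w) {
    case  1: return acc + activation;
    case -1: return acc - activation;
    case -2: return acc - (int16_t)(activation << 1); /* 2*a via shift */
    default: return acc;                              /* w == 0 */
    }
}
```

   Summing such terms over a layer gives the pre-activation value without
   ever touching a multiply instruction.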
      
   Is it possible to implement an MNIST inference engine that can classify   
   handwritten digits on a PMS150C as well?   
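   A back-of-the-envelope check frames the question. The figures below
   take the 1024 words of 13-bit OTP memory stated above; the 2-bit
   quantization is an assumption for illustration, and the helper is
   hypothetical:

```c
/* The PMS150C offers 1024 words x 13 bits = 13312 bits of OTP program
   memory. If weights are quantized to b bits each and packed densely
   (ignoring the inference code, which must share the same memory),
   at most 13312 / b parameters fit. */
enum { OTP_WORDS = 1024, OTP_BITS_PER_WORD = 13 };

int max_packed_weights(int bits_per_weight)
{
    return (OTP_WORDS * OTP_BITS_PER_WORD) / bits_per_weight;
}
```

   With 2-bit weights that is an upper bound of 6656 parameters, before
   the classifier code itself claims its share of the OTP.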
      
   …   
      
   …   
      
      
      
      
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca