
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai      Awaiting the gospel from Sarah Connor      1,954 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 861 of 1,954   
   MajorSetback@excite.com to Greg Heath   
   Re: Two Class Multidimensional Decision    
   09 Dec 05 00:02:39   
   
   XPost: comp.ai.neural-nets, sci.image.processing, sci.math.num-analysis   
      
   Greg Heath wrote:   
   > MajorSetback@excite.com wrote:   
   > > I would like to separate two classes based upon 8 metrics.  I am   
   > > thinking of using supervised classification based upon defining a   
   > > decision hypersurface in 8-dimensional space.  I would be most grateful   
   > > if someone could suggest the best algorithm for this purpose.   
   > >   
   > > Many thanks in advance,   
   > > Peter.   
   >   
   > The best algorithm depends on the data. MLPs (Multilayer Perceptrons)
   > and RBFs (Radial Basis Functions) are universal approximators that
   > can provide estimates of conditional class posterior probabilities.
      
   Many thanks for your reply.  I would like to separate two classes, pass
   and fail.  I don't want any failures to pass, and I am prepared to fail
   several objects that should pass.  However, the threshold for passing
   is fairly flexible.  Whether an object should pass is based upon the
   magnitude of an (unknown) error.  The lower the error the better, but
   there is no clear-cut threshold.  I would like to make the decision
   based upon a number of (known) reliability metrics that are related to
   the error but not completely orthogonal to one another.
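   The zero-false-pass requirement can be sketched as a threshold rule: given
   any one-dimensional score derived from the metrics, place the cutoff at the
   best-scoring known failure, so no failure in the design data can pass.  The
   scores below are hypothetical stand-ins, not real data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical scores: lower values mean lower (better) estimated error.
fail_scores = rng.normal(2.0, 0.5, 200)   # objects known to fail
pass_scores = rng.normal(0.5, 0.5, 200)   # objects known to pass

# No failure may pass: put the cutoff strictly below every failure's score.
threshold = fail_scores.min()

def decide(score):
    return "pass" if score < threshold else "fail"

false_passes = sum(decide(s) == "pass" for s in fail_scores)
rejected_good = sum(decide(s) == "fail" for s in pass_scores)
```

   By construction no known failure passes; the price is that some genuinely
   good objects near the boundary are rejected, which matches the stated
   willingness to fail several objects that should pass.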
      
   >   
   > However, sometimes the more elementary classifiers (e.g., linear,
   > logistic, quadratic, or k-Nearest Neighbor) yield the best results.
      
   Elementary is good.  I was thinking along the lines of a quadratic.   
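   For what it's worth, a quadratic classifier (quadratic discriminant
   analysis: per-class Gaussian fits compared by log-likelihood) takes only a
   few lines.  The two clouds below are hypothetical stand-ins for the 8
   metrics:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two hypothetical classes in 8-dimensional metric space.
X0 = rng.normal(0.0, 1.0, (300, 8))   # e.g., "pass"
X1 = rng.normal(2.0, 1.0, (300, 8))   # e.g., "fail"

def fit_gaussian(X):
    """Class mean, inverse covariance, and log-determinant of covariance."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    return mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1]

params = [fit_gaussian(X) for X in (X0, X1)]

def qda_score(x, mu, icov, logdet):
    # Gaussian log-likelihood up to a shared constant.
    d = x - mu
    return -0.5 * (d @ icov @ d + logdet)

def classify(x):
    return int(qda_score(x, *params[1]) > qda_score(x, *params[0]))
```

   The decision boundary implied by comparing the two scores is a quadratic
   hypersurface in the 8-dimensional space, which is exactly the kind of
   surface mentioned in the original question.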
      
   >   
   > Before jumping in and trying to obtain quick classification results, I
   > often recommend exploratory data analysis such as scatter plots,
   > clustering, and PCA in order to get a better feel for the data.
      
   I agree, particularly since the metrics are not all orthogonal.  I have
   8 metrics, but I doubt they have a rank of 8.
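   That suspicion can be checked numerically: the singular values of the
   centered metric matrix give the effective rank, and their squares give the
   PCA variance ratios.  The data below are fabricated so that two of the 8
   columns are exact linear combinations of the others:

```python
import numpy as np

rng = np.random.default_rng(2)
# 8 metrics, but only 6 independent ones: two columns are linear combos.
base = rng.normal(size=(500, 6))
X = np.hstack([base, base[:, :1] + base[:, 1:2], 2.0 * base[:, 2:3]])

Xc = X - X.mean(axis=0)                      # center before PCA
s = np.linalg.svd(Xc, compute_uv=False)      # singular values, descending
effective_rank = int((s > 1e-8 * s[0]).sum())
var_ratio = s**2 / (s**2).sum()              # PCA explained-variance ratios
```

   In practice real metrics are noisy rather than exactly dependent, so the
   trailing singular values are small rather than zero; the variance ratios
   then suggest how many principal components are worth keeping.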
      
   >   
   > Hope this helps.   
      
   Indeed it does.  Thanks again,   
   Peter.   
      
   [ comp.ai is moderated.  To submit, just post and be patient, or if ]   
   [ that fails mail your article to , and ]   
   [ ask your news administrator to fix the problems with your system. ]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca