
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai      Awaiting the gospel from Sarah Connor      1,954 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 1,201 of 1,954   
   Michael to Xiao Xiong   
   Re: A general pattern classification que   
   05 Oct 06 09:39:44   
   
   From: mlh496@gmail.com   
      
   On Oct 4, 1:41 am, "Xiao Xiong"  wrote:   
   > I have a general question to ask:   
   >   
   > There is a set of supervised training data belonging to two classes,   
   > A and B. Each training sample is a multi-dimensional vector, and the   
   > covariance matrix of the features is not diagonal, that is, the feature   
   > elements are correlated. Assume the underlying pdf is completely known.   
   > We can fit two Gaussian mixture models (GMMs), one per class,   
   > and build a Bayesian classifier.   
   >   
   > My question is:   
   >   
   > Is it possible to use some kind of feature transformation method, such   
   > as Linear Discriminant Analysis (LDA), to project the features into a   
   > new space for better discriminative power?   
   >   
      
   One way to classify samples drawn from multivariate normal   
   distributions is to use the Mahalanobis distance.  It can be thought   
   of as a generalization of the z-score to multivariate normal   
   distributions.  If you let m1 and m2 be the Mahalanobis distances of a   
   sample 'x' from classes A & B respectively, then your decision rule is:   
      - classify 'x' as A if m1 < m2,   
      - otherwise classify 'x' as B.   
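   
   A minimal sketch of that rule in Python/NumPy.  The class means and   
   covariances below are illustrative (in practice you would estimate   
   them from the training data), and note the min-distance rule matches   
   the Bayes decision only when the covariances and priors are equal:   

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of x from a Gaussian with the given mean/covariance."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

# Illustrative class parameters (assumed known, per the question).
mean_a, cov_a = np.array([0.0, 0.0]), np.array([[1.0, 0.5], [0.5, 1.0]])
mean_b, cov_b = np.array([3.0, 3.0]), np.array([[1.0, -0.3], [-0.3, 1.0]])

def classify(x):
    # Classify x as A if its Mahalanobis distance to A is smaller,
    # otherwise as B.
    m1 = mahalanobis(x, mean_a, cov_a)
    m2 = mahalanobis(x, mean_b, cov_b)
    return "A" if m1 < m2 else "B"

print(classify(np.array([0.2, -0.1])))  # near the mean of A -> "A"
print(classify(np.array([2.8, 3.1])))   # near the mean of B -> "B"
```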
      
   -Michael   
      
   PS: http://en.wikipedia.org/wiki/Mahalanobis_distance   
      
      
   [ comp.ai is moderated ... your article may take a while to appear. ]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca