XPost: comp.ai.neural-nets   
   From: rif@mit.edu   
      
   "Marina Sapir" writes:   
      
   > I do not know about generalizations of SMO. Instead, I would like to   
   > ask you some questions about the existing algorithm.   
   >   
   > We did some experiments with SMO, which bring some peculiar results.   
   >   
   > 1. We build SVR with linear kernel. We dropped one variable at a time,   
   > and looked how the criterion SVR minimizes changes. We were very   
   > surprised to find that the criterion value got smaller and smaller as   
   > more variables were dropped. If SMO finds the global minimum, it could   
   > not have happened.   
   >   
   > Is this a known fact that SMO does not find the global minimum?   
      
   SMO really describes a collection of algorithms. In general, the   
   problem SMO solves is a convex quadratic optimization problem, and
   therefore has no local minima. *Proving* that the SMO algorithm as   
   originally stated by Platt does converge turns out to be challenging   
   (there are several proofs for minor variants), but in practice, it   
   always converges on reasonable problems if implemented correctly. If   
   you were using a linear kernel and keeping the regularization   
   parameters fixed, I would look for a bug.   
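
   To make the convexity argument concrete: since the problem is a convex
   QP, dropping a variable only restricts the feasible set, so the optimal
   objective value can never get *smaller*. Here is a hedged sketch of the
   sanity check Marina describes, using scikit-learn's SVR (an SMO-style
   solver) and an illustrative primal-objective helper of my own; the data
   and parameter choices are made up for the example:

   ```python
   import numpy as np
   from sklearn.svm import SVR

   rng = np.random.default_rng(0)
   X = rng.normal(size=(200, 5))
   # Feature 4 carries a large weight, so dropping it should clearly
   # increase the minimized objective.
   y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=200)

   C, eps = 1.0, 0.1

   def primal_objective(model, X, y, C, eps):
       # Standard SVR primal: 0.5*||w||^2 + C * sum of eps-insensitive losses
       # (the bias term is unregularized).
       w = model.coef_.ravel()
       slack = np.maximum(0.0, np.abs(y - model.predict(X)) - eps)
       return 0.5 * w @ w + C * slack.sum()

   full = SVR(kernel="linear", C=C, epsilon=eps).fit(X, y)
   obj_full = primal_objective(full, X, y, C, eps)

   # Drop the last (strongly predictive) feature and refit with the
   # regularization parameters held fixed.
   drop = SVR(kernel="linear", C=C, epsilon=eps).fit(X[:, :4], y)
   obj_drop = primal_objective(drop, X[:, :4], y, C, eps)

   # With a correct solver, the restricted problem's optimum cannot
   # be lower (up to solver tolerance).
   print(obj_full, obj_drop)
   ```

   If the objective computed this way *decreases* after dropping a
   variable, the likely culprits are a bug in how the objective is
   evaluated or parameters that change between fits, not a local minimum.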
      
   rif   
      
      