
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai      Awaiting the gospel from Sarah Connor      1,954 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 964 of 1,954   
   Ted Dunning to All   
   Re: optimization   
   15 Mar 06 09:18:31   
   
   From: ted.dunning@gmail.com   
      
   > Imagine a general heuristic problem.   
      
   It sounds to me like your definition of heuristic is different from the   
   standard definition.   
      
   > In general, as one increases the number of variables that need to be
   > optimized, the number of local optima in this optimization surface
   > increases.
      
   This definitely *can* happen, but it definitely does not happen in
   general.  Consider the quadratic bowl, \sum x_i^2.  The number of local
   minima is one (i.e. the global minimum) regardless of the number of
   x_i's.
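A minimal numerical sketch of this point (Python with numpy; the dimensions and the gradient-descent settings are arbitrary choices for illustration): plain gradient descent on \sum x_i^2 lands on the single minimum at the origin from any starting point, in any dimension.

```python
import numpy as np

def minimize_quadratic_bowl(x0, lr=0.1, steps=200):
    """Plain gradient descent on f(x) = sum(x_i^2); the gradient is 2x."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x -= lr * 2.0 * x
    return x

rng = np.random.default_rng(0)
for dim in (2, 10, 1000):
    x_final = minimize_quadratic_bowl(rng.standard_normal(dim))
    # In every dimension tried, descent converges to the one global
    # minimum at the origin -- there are no other local minima to hit.
    assert np.allclose(x_final, 0.0)
```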
      
   I think perhaps you are confusing the issue of multiple local minima
   with the so-called curse of dimensionality, which has more to do with
   over-fitting than with the number of local minima.  Over-fitting is due
   to trying to fit noisy data with a model that has too many free
   parameters.  The critical value of "too many" is the key problem, and
   it has to do with things like generalization and empirical risk, not
   local minima.
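A minimal sketch of over-fitting in this sense (Python with numpy; the straight line, noise level, sample count, and polynomial degrees are all invented for illustration): fitting noisy samples of a line with a degree-9 polynomial can only drive the training error down, since the lower-degree model is nested inside the higher-degree one, while the error against the noise-free function typically grows.

```python
import numpy as np

# Hypothetical data: 20 noisy samples of the line y = 2x + 1.
rng = np.random.default_rng(1)
x_train = np.linspace(0.0, 1.0, 20)
x_test = np.linspace(0.0, 1.0, 200)

def f_true(x):
    return 2.0 * x + 1.0

y_train = f_true(x_train) + 0.1 * rng.standard_normal(x_train.size)

def errors(degree):
    """Least-squares polynomial fit; returns (train MSE, MSE vs. the
    noise-free true function on a dense test grid)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - f_true(x_test)) ** 2)
    return train_mse, test_mse

train_lo, test_lo = errors(1)  # right number of free parameters
train_hi, test_hi = errors(9)  # far too many free parameters
# Training error never increases with degree (nested least squares)...
assert train_hi <= train_lo + 1e-12
# ...but the extra parameters are spent fitting the noise, so the fit
# to the true function typically degrades.
```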
      
   Can you say more about what you are really asking about?   
      
   [ comp.ai is moderated.  To submit, just post and be patient, or if ]   
   [ that fails mail your article to , and ]   
   [ ask your news administrator to fix the problems with your system. ]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca