
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai      Awaiting the gospel from Sarah Connor      1,954 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 702 of 1,954   
   Ted Dunning to All   
   Re: Decision Trees   
   14 Apr 05 18:32:20   
   
   From: ted.dunning@gmail.com   
      
   >>>In practice, I have found that an interesting option is to use a
   >>>decision tree on a problem and then let something like a logistic
   >>>regression classifier cheat by getting to see some outputs of the
   >>>rules used by the decision tree system.  You can then turn the
   >>>tables and let the decision tree system see the output of the
   >>>logistic regression.  You know you are done with this process when
   >>>the decision tree system ignores everything except your logistic
   >>>regression output.
      
   >> That is really interesting. Have you published something on that?   
      
   > Indeed, kudos for an interesting idea.   
      
   Thanks.   
      
   > It works because of the problem of "fragmentation", something
   > discussed in "Global data analysis and the fragmentation problem
   > in decision tree induction",
      
   Actually, I don't think so.   
      
   I think it works because the decision tree is looking for things that   
   look complex to the linear classifier but simple to the decision tree.   
   Then when we reformulate the input space to include the observation of   
   the decision tree, we have made the problem simpler for the linear   
   classifier.   
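   To make this concrete, here is a toy sketch (my own illustration, not
   code from any system mentioned in this thread): on XOR-style data a
   linear classifier is stuck, but once the tree's output is appended as
   an extra input, logistic regression separates the classes trivially.
   The stand-in "tree" is a majority-vote lookup over the leaf cells,
   which is exactly what a depth-2 tree would learn on this data.

```python
import math

# Toy XOR data: label = x1 XOR x2.  No linear boundary separates it.
data = [((x1, x2), x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)] * 25

# Stand-in "decision tree": a depth-2 tree on this data just memorizes
# the majority label in each (x1, x2) cell, so we model it as a lookup.
leaf_vote = {}
for (x1, x2), y in data:
    leaf_vote.setdefault((x1, x2), []).append(y)
tree = {cell: round(sum(ys) / len(ys)) for cell, ys in leaf_vote.items()}

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logreg(rows, epochs=2000, lr=0.5):
    """Plain gradient-descent logistic regression; rows = (features, label)."""
    n = len(rows[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for feats, y in rows:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, feats)) + b)
            g = p - y
            w = [wi - lr * g * xi for wi, xi in zip(w, feats)]
            b -= lr * g
    return w, b

def accuracy(rows, w, b):
    hits = sum(
        (sigmoid(sum(wi * xi for wi, xi in zip(w, feats)) + b) >= 0.5) == (y == 1)
        for feats, y in rows
    )
    return hits / len(rows)

# Raw inputs only: logistic regression cannot do better than 3/4 on XOR.
raw = [((float(x1), float(x2)), y) for (x1, x2), y in data]
w_raw, b_raw = fit_logreg(raw)

# Let the linear model "cheat" by seeing the tree's output as a feature.
aug = [((float(x1), float(x2), float(tree[(x1, x2)])), y)
       for (x1, x2), y in data]
w_aug, b_aug = fit_logreg(aug)

print("raw accuracy:", accuracy(raw, w_raw, b_raw))
print("augmented accuracy:", accuracy(aug, w_aug, b_aug))
```

   On the raw inputs the linear model stays near chance; with the tree's
   observation added it simply learns a large weight on that one feature,
   which is the "ignore everything except the other model's output"
   endpoint described above.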
      
   Essentially, we are using a mixture of the prior distribution of   
   non-linear classifiers represented by the decision tree simplicity   
   heuristics and the prior distribution represented by the variable   
   selection or other regularization of the linear classifier.  I have to   
   presume that this mixed distribution is a better fit with the set of   
   problems that I have seen than either component of the mixture by   
   itself.   
      
      
      
   > There are some alternative solutions. One is to average lots of small   
   > trees (Breiman's random forests).   
      
   This doesn't bring the same power to the decision tree approach unless   
   you average truly stupendous numbers of trees.   
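   For contrast, the averaging idea looks roughly like this (a toy
   sketch of bagged voting, not Breiman's actual algorithm, which also
   samples features at each split): fit many small trees, here stumps,
   on bootstrap resamples and take a majority vote.

```python
import random

random.seed(0)

# Noisy 1-D problem: true label is 1 when x > 0.5; 20% of training
# labels are flipped, so any single stump's threshold is jittery.
def make_data(n):
    rows = []
    for _ in range(n):
        x = random.random()
        y = 1 if x > 0.5 else 0
        if random.random() < 0.2:  # label noise
            y = 1 - y
        rows.append((x, y))
    return rows

train = make_data(200)

def fit_stump(rows):
    """Pick the threshold minimizing training error for 'predict 1 if x > t'."""
    best_t, best_err = 0.0, len(rows) + 1
    for t, _ in rows:
        err = sum((x > t) != (y == 1) for x, y in rows)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_stumps(rows, n_trees):
    """Fit each stump on a bootstrap resample; predict by majority vote."""
    thresholds = [
        fit_stump([random.choice(rows) for _ in rows]) for _ in range(n_trees)
    ]
    return lambda x: sum(x > t for t in thresholds) * 2 > n_trees

forest = bagged_stumps(train, n_trees=50)

# Evaluate on a noise-free grid.
test = [(i / 100.0, 1 if i / 100.0 > 0.5 else 0) for i in range(101)]
acc = sum(forest(x) == (y == 1) for x, y in test) / len(test)
print("ensemble accuracy on clean grid:", acc)
```

   Averaging smooths out the per-stump jitter, but each component model
   is still a stump; that is the sense in which you need very many trees
   before the ensemble matches the power of a richer single model.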
      
   > Second is to use naive Bayes in the leaves of the tree,   
      
   This is essentially the same as putting linear classifiers in the   
   leaves.  This turns my suggestion on its head (instead of using the   
   decision tree to help the linear system, we use the linear system to   
   help the decision tree).  This would be better done (I think, in   
   customary grandiose fashion) by tying the prior distribution of the   
   classifiers in the leaves together.  In the end, though, I still find   
   that it is easier to deploy a system that looks mostly like a linear   
   classifier with a few non-linear inputs than a full decision tree with   
   fancy stuff at every leaf.  That has lots to do with the fact that I   
   already have code to emit one kind of model in SQL and similar   
   formalisms, but not the other.   
      
   > Alternatively, one can put logistic regression in the leaves.   
      
   Again, this really needs a more complex look at the regularization   
   procedures.  If you need regularization on the entire problem, then you   
   definitely need it when you subset the data.   
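   A toy sketch of this fancy-leaves direction (again my own
   illustration): split on one variable with a stump, then fit a
   separately L2-regularized logistic regression inside each leaf.
   Each leaf sees only a subset of the data, which is why regularization
   becomes more important there, not less.

```python
import math
import random

random.seed(1)

# Piecewise-linear problem: the sign of x2's effect flips with x1,
# so one global linear model fails but a linear model per leaf works.
def make_data(n):
    rows = []
    for _ in range(n):
        x1 = random.uniform(-1, 1)
        x2 = random.uniform(-1, 1)
        y = 1 if (x2 > 0) == (x1 < 0) else 0
        rows.append(((x1, x2), y))
    return rows

train = make_data(400)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logreg(rows, l2=0.01, epochs=300, lr=0.3):
    """Gradient-descent logistic regression with an L2 penalty on the
    weights.  The penalty matters more in the leaves, where each model
    is fit on a subset of the data."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in rows:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            g = p - y
            w[0] -= lr * (g * x1 + l2 * w[0])
            w[1] -= lr * (g * x2 + l2 * w[1])
            b -= lr * g
    return w, b

def predict(model, x1, x2):
    w, b = model
    return sigmoid(w[0] * x1 + w[1] * x2 + b) >= 0.5

# One global linear model: near chance on this problem.
global_model = fit_logreg(train)

# "Fancy leaves": a stump splits on x1 at 0, then a separately
# regularized logistic regression is fit inside each leaf.
left = [r for r in train if r[0][0] < 0]
right = [r for r in train if r[0][0] >= 0]
leaf_models = {"left": fit_logreg(left), "right": fit_logreg(right)}

def tree_predict(x1, x2):
    return predict(leaf_models["left" if x1 < 0 else "right"], x1, x2)

test = make_data(400)
acc_global = sum(predict(global_model, x1, x2) == (y == 1)
                 for (x1, x2), y in test) / len(test)
acc_leaves = sum(tree_predict(x1, x2) == (y == 1)
                 for (x1, x2), y in test) / len(test)
print("global linear:", acc_global)
print("per-leaf linear:", acc_leaves)
```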
      
   Again, for the problems I have seen where a few non-linear inputs gave   
   dramatic improvement over other approaches, I would tend to say that   
   the fancy-inputs-to-linear-classifier approach is preferred over the   
   fancy-leaves-in-a-decision-tree class of approaches.  Your mileage   
   will vary, of course.   
      
   Thanks for the interesting references!   
      
   [ comp.ai is moderated.  To submit, just post and be patient, or if ]   
   [ that fails mail your article to , and ]   
   [ ask your news administrator to fix the problems with your system. ]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca