
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai.fuzzy      Fuzzy logic... all warm and fuzzy-like      1,275 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 544 of 1,275   
   Dmitry A. Kazakov to Ernst Murnleitner   
   Re: FCL   
   07 Jan 06 12:35:53   
   
   From: mailbox@dmitry-kazakov.de   
      
   On Fri, 06 Jan 2006 21:12:37 +0100, Ernst Murnleitner wrote:   
      
   >> BUT the only feasible accumulation method is OR, so accumulation   
   >> specifications are superfluous anyway. In my compiler all accumulation   
   >> methods except OR are flagged as errors.   
   >   
   > The standard should define that the same method as for OR is used by
   > default. I think this is not mentioned? For example, ACT is an optional
   > parameter in FCL. But what is the default if it is not defined? MAX or
   > PROD?
      
   It should be MAX and only MAX. I know of no system other than max, min,
   1-x in which inference can be performed. More precisely:
      
   For the system +,*,1-x we know that the result is a probability (= an
   additive measure) and that no inference can be done unless the variables
   are independent or joint distributions are defined.
      
   For the system max,min,1-x we know that the result under certain conditions   
   is an interval containing the possibility and necessity.   
      
   For all other systems the result is just a number with no meaning. So let
   it be 42 and don't bother with compilers... (:-))
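A minimal sketch (mine, not from the post) contrasting the two systems named above, (max, min, 1-x) versus (+, *, 1-x), on the same pair of membership values; the function names are my own:

```python
def fuzzy_and(a, b):
    # min t-norm of the (max, min, 1-x) system
    return min(a, b)

def fuzzy_or(a, b):
    # max t-conorm of the (max, min, 1-x) system
    return max(a, b)

def prob_and(a, b):
    # product t-norm of the (+, *, 1-x) system;
    # only meaningful when the variables are independent
    return a * b

def prob_or(a, b):
    # probabilistic sum: P(A or B) = P(A) + P(B) - P(A)P(B)
    return a + b - a * b

def negate(x):
    # the negation 1-x shared by both systems
    return 1.0 - x

a, b = 0.7, 0.4
print(fuzzy_and(a, b), fuzzy_or(a, b))   # 0.4 0.7
print(prob_and(a, b), prob_or(a, b))     # about 0.28 and 0.82
```

Both systems agree on negation, but the probabilistic pair silently assumes independence, which is exactly the caveat raised above.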
      
   > I also dislike that the syntax is not designed to allow a simple
   > parser; according to FCL one can write:
   >   
   > A) IF NOT Temp IS cold THEN ...   
   > B) IF Temp IS NOT cold THEN ...   
   >   
   > A) and B) are the same, but it makes the parsing unnecessarily complex.
      
   Egh, why? It does not make parsing more complex. What indeed is more
   complex is the semantic analysis, which follows parsing. To deal with the
   above we should 1) get the type system right, 2) convert all expressions
   to some canonical form.
      
   1. As for the type system, I have identified, so far:
      
   Numeric types:   
      Integer   
      Real   
   String types: (not in the standard, but...)   
      String   
      Character   
   Lattice types (types with the operations and, or, xor, not):   
      Fuzzy lattice types:   
         Logical:   
            Truth value from [0,1]
         Sets:   
            Singletons:   
               Integer singleton: (35,0.4)   
               Real singleton: (35.0,0.4)   
            Ranges (rectangular membership functions):   
               Integer range: 2..3   
               Real range: 0.1..89.0   
            Subsets:   
               Integer subset: (0,0)(1,1)(2,0)   
               Real subset: (0.0,0)(1.0,1)(2.0,0)   
               Domain subset: (red,1)(green,0.3)   
      Intuitionistic lattice types:   
         Logical:   
            Intuitionistic truth value from [0,1]×[0,1]   
         Sets:
            Intuitionistic domain subset (interval set)   
      Predicates: var is value   
   Predicate variables types (when a variable is involved):   
      Nominal predicate variables        
         Enumeration variable: red, blue, green   
      Numeric predicate variables   
         Discrete numeric predicate variables       
            Integer variable: Counter   
         Continuous numeric predicate variables   
            Real variable: Temperature   
            Linguistic set variable: hot, cold, tepid over Temperature   
      
   As you see, it is not simple. It just cannot be simple. But from a
   compiler-construction point of view it is quite trivial stuff - early
   70's. Modern technology allows one to build such a compiler in two weeks.
   At least, that was the time I spent on the front-end.
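The lattice types above all carry the operations and, or, not. A minimal sketch (my own representation, not the compiler's) of a fuzzy "Domain subset" such as (red,1)(green,0.3), with the operations realized pointwise as min, max, 1-x:

```python
def subset_and(s, t):
    # pointwise min over the union of supports
    return {k: min(s.get(k, 0.0), t.get(k, 0.0)) for k in s.keys() | t.keys()}

def subset_or(s, t):
    # pointwise max over the union of supports
    return {k: max(s.get(k, 0.0), t.get(k, 0.0)) for k in s.keys() | t.keys()}

def subset_not(s, domain):
    # pointwise 1-x over an explicit domain of labels
    return {k: 1.0 - s.get(k, 0.0) for k in domain}

warm = {"red": 1.0, "green": 0.3}
cool = {"green": 0.8, "blue": 1.0}
both = subset_and(warm, cool)   # green: min(0.3, 0.8) = 0.3; red, blue: 0.0
```

The same three functions work unchanged for singletons, ranges, and the integer/real subsets, since all are maps from domain values to degrees in [0,1].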
      
   2. Predicates need to be converted to DNF. Then things like A) vs. B) are
   resolved automatically. That happens after parsing and after type
   analysis. Basically it is part of the back-end, i.e. the code generator.
   The variant A) is parsed to the tree:
      
   IF   
      NOT   
         IS   
            Temp   
            Cold   
      
   The predicate NOT(IS(Temp, Cold)) is not canonical. It is converted to
   DNF by applying the rule NOT(IS(X,Y)) --> IS(X,NOT(Y)). Because Cold is a
   constant, NOT(Cold) is evaluated, and the result is IS(Temp,~Cold). Note
   that ~Cold will be an intuitionistic domain subset, because precise
   inversion is in general impossible for the sets of linguistic variables.
   That is why FCL needs intuitionistic lattice types.
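A hypothetical AST sketch (class and function names are mine, not from the standard) of that canonicalization rule, so that both spellings of the condition meet in one form:

```python
from dataclasses import dataclass

@dataclass
class Is:
    # predicate node: var IS term
    var: str
    term: str

@dataclass
class Not:
    # negation node as produced by the parser
    arg: object

def canonicalize(node):
    # Apply NOT(IS(X,Y)) --> IS(X, NOT(Y)): push the negation onto the
    # constant term, where it can be evaluated at compile time.
    if isinstance(node, Not) and isinstance(node.arg, Is):
        inner = node.arg
        return Is(inner.var, "~" + inner.term)  # "~cold" stands for NOT(cold)
    return node

# Variant A) "IF NOT Temp IS cold" parses to Not(Is("Temp", "cold")):
print(canonicalize(Not(Is("Temp", "cold"))))  # Is(var='Temp', term='~cold')
```

Variant B) "Temp IS NOT cold" parses directly to the right-hand side, so after this pass the two spellings are indistinguishable to the code generator.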
      
   --   
   Regards,   
   Dmitry A. Kazakov   
   http://www.dmitry-kazakov.de   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca