
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.ai      Awaiting the gospel from Sarah Connor      1,954 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 1,295 of 1,954   
   jonesrob@emporia.edu to All   
   Biologically inspired utility decomposition   
   20 Jan 07 00:23:57   
   
   Intelligence is a lot of things so no one single definition is   
   necessarily adequate.  Still, as a working definition I accept Werbos'   
   "a system to handle all of the calculations from crude inputs through   
   to overt actions in an adaptive way so as to maximize some measure of   
   performance over time."  (P. J. Werbos, IEEE Trans. Systems, Man, and   
   Cybernetics, 1987, pg 7)  Of course one can have an artificial   
   intelligence without claiming it is in any way as smart as a human.   
      
   Evolution imposes on humans (and other animals) an economic utility   
   something like   
   U=(N-2)/L, where N is the number of offspring a pair of mammals has,   
   and L is the animal's lifespan.  For a robot simulation or computer   
   virus reproducing by file copying, U=(N-1)/L.   
   (R. Jones, Trans. Kansas Acad. Sci., 2004, vol. 107, pg 32 and 2006,   
   vol 109, pg 159 and pg 254)  It is difficult for a creature to   
   decompose this utility, U, into a judgment about any particular action.   
    In animals, evolution has hardwired in a set of heuristics (drives,   
   aversions, etc.), which perform this decomposition (i.e., pain,   
   pleasure, sex drive, hunger, thirst, discomfort, innate fears,   
   sickness, loneliness, curiosity, etc.)  The relative weighting (and   
   timing) of these heuristic human/animal values can be only slightly   
   modified by the creature during its lifetime.  Note that all of these   
   produce a much more immediate reward (feedback) compared to U.  Just   
   such "immediate" rewards are required for some proposed AI systems (W.   
   Fritz, SIGART newsletter, 1984, num. 90, pg 34)   
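   The two lifetime-utility measures above are simple enough to sketch   
   directly.  A minimal Python illustration (the formulas are from the   
   Jones references cited above; the function names are my own and purely   
   illustrative):   

```python
# Lifetime utility measures from the post (Jones, Trans. Kansas Acad.
# Sci., 2004/2006).  Function names are illustrative, not from the post.

def mammal_utility(offspring: float, lifespan: float) -> float:
    """U = (N - 2) / L: a mated pair must leave more than two
    offspring over lifespan L for the population to grow."""
    return (offspring - 2) / lifespan

def replicator_utility(copies: float, lifespan: float) -> float:
    """U = (N - 1) / L: a file-copying replicator (simulated robot,
    computer virus) needs only more than one copy of itself."""
    return (copies - 1) / lifespan

# A pair raising 3 offspring over a 10-year lifespan scores
# mammal_utility(3, 10) == 0.1, while a replicator making 5 copies
# in 2 time units scores replicator_utility(5, 2) == 2.0.
```

   Note how slow the feedback is: U can only be evaluated over a whole   
   lifespan, which is exactly why the immediate heuristic rewards below   
   are needed.   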
      
   In the case of mobile robots pain could be measured by breakage   
   detection, overheating, etc.  For a virus or simulated robot pleasure   
   or sex could be measured by the occurrence of file copying.  In mobile   
   robots hunger/thirst could be measured by battery charge levels.   
   Discomfort might be measured by bump sensors, overheating, etc.  One   
   example of a hardwired innate fear would be a cliff detector.  Sickness   
   might be measured by motor stall detection, occasional software   
   speed/performance tests, etc.  Loneliness might be measured by   
   frequency of I/O communications and curiosity would be reflected in the   
   amount of time spent on "random" exploratory efforts.   
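   The decomposition described above can be sketched as a weighted sum of   
   sensor-derived heuristic signals.  The weights and signal names below   
   are my own illustrative assumptions, not values from any of the cited   
   work; a real robot would tune (or evolve) them:   

```python
# Hypothetical sketch: each hardwired drive maps a raw sensor reading
# onto an immediate reward or penalty, and a fixed innate weighting
# combines them into a per-step reward.  All weights are illustrative.

HEURISTIC_WEIGHTS = {
    "pain": -10.0,       # breakage detection, overheating
    "hunger": -2.0,      # low battery charge level
    "discomfort": -1.0,  # bump sensors
    "sickness": -3.0,    # motor stalls, failed self-tests
    "loneliness": -0.5,  # low I/O communication frequency
    "curiosity": 0.5,    # time spent on "random" exploration
}

def immediate_reward(readings: dict) -> float:
    """Combine normalized heuristic signals (0..1 each) into one
    immediate reward, standing in for the slow lifetime utility U."""
    return sum(HEURISTIC_WEIGHTS[name] * value
               for name, value in readings.items()
               if name in HEURISTIC_WEIGHTS)

# e.g. a bumped, mildly curious robot:
#   immediate_reward({"discomfort": 1.0, "curiosity": 0.5}) == -0.75
```

   The point is that each signal is available every control step, giving   
   the "immediate" feedback Fritz's proposal requires, even though the   
   weighting itself stays essentially fixed over the creature's lifetime.   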
      
   [ comp.ai is moderated ... your article may take a while to appear. ]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca