   From: joss@nospampleasewerebritish.nekrodomos.net   
      
      
      
   > yes. yes. yes.   
   >   
   > the part that has always confused me in these conversations on AI is why   
   > people seem to be convinced that such a creature would even be aware of   
   > the macro-verse surrounding it.   
   >   
   > such a perception would likely be beyond this entity's capability.. when
   > people conceive of an AI "experiencing" the world as we know it they
   > invariably depend on cameras, microphones, and the like for its input.
   >   
   > there is a massive flaw in this premise.. a creature that exists solely
   > in computer code does not have the advantage of our hardware constraints
   > (by which i mean it does not view the world through eyes, listen through
   > ears, it does not compete with gravity, etc) therefore it would not even
   > have a conception of this sensorium's existence.
   >   
   > monkeys like us spend the first few years of our lives coming to terms
   > with this sensory input.. and we do not have to assemble it into
   > intelligible data.. it comes to us through tailored modules designed
   > specifically for accepting and translating this data (eyes, ears, but
   > more importantly the visual cortex).. a machine intelligence would have
   > no such assistance.
   >   
   > human intelligence is molded by sensory input.. this artificial   
   > intelligence will be a non-sensory entity.. and the effects this would   
   > have on its consciousness are massive.   
   >   
   > i do not believe we would even be able to recognize, much less
   > communicate with, an AI.. until it sought to communicate with us.
   >   
   >   
   i'm replying here because this ties in to my use of the "human" metaphor   
   for an AI.   
      
   the metaphor which i used was, i admit, based around humans. this doesn't,
   i feel, invalidate its use. the common terminology for intelligent agent
   systems uses "effector" for agent routines which manipulate the agent's
   environment, and "sensor" for... well, it's obvious. it's easy to extend
   the metaphor to an AI in this way. it must _have_ (at least in an abstract
   sense) a processing center which correlates to the brain. similarly it
   must have sensors of some form (see below) which correlate to our own
   five senses.
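   to make the sensor/effector vocabulary concrete, here's a toy sketch in
   python. the class names and the little one-dimensional world are entirely
   my own invention for illustration, not any standard agent framework:

   ```python
   # toy illustration of the sensor/effector terminology: the agent reads
   # its environment through a "sensor" and manipulates it through an
   # "effector". all names here are made up for the example.

   class Environment:
       """a one-dimensional world: the agent sits at a position on a line."""
       def __init__(self):
           self.position = 0

   class Agent:
       GOAL = 10

       def sense(self, env):
           # "sensor": read some aspect of the environment
           return env.position

       def act(self, env):
           # "effector": manipulate the environment based on the percept
           percept = self.sense(env)
           if percept < self.GOAL:
               env.position += 1  # step toward the goal position

   env = Environment()
   agent = Agent()
   for _ in range(20):
       agent.act(env)
   print(env.position)  # settles at the goal, 10
   ```

   the point being that "sensor" and "effector" are abstract roles, not
   commitments to cameras or limbs.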
      
   i think this links in with your point. i would argue that any system
   which is to learn _must_ have sensors. the type/style of these sensors is
   not really the issue (especially when we're being as conjectural, with as
   little data, as we are in this discussion/forum). if a system is not
   receiving input then it has no new data from which to learn, and the
   concept of intelligence is void.
      
   without getting into the wider issue of what intelligence/consciousness
   actually is (what is truth, man? you heard the weirdo.), i would argue
   that any AI system which humans had a significant hand in developing would
   very likely be equipped with some form of I/O with which humans could
   interface in some fashion.
      
   if, however, you refer to an AI perceiving only a simulated universe, then
   surely we would have been the creators of that universe and as such in a
   very good position to observe and interact with it. we are gods in the
   digital domain.
      
   as to altering its own source code.... for a start i'd like to reiterate
   my post from another thread: a lot of current work in AI is on simpler
   systems working together rather than on monolithic minds. as such,
   evolution seems a more likely approach to improving the capabilities of
   such systems. for monolithic artificial-mind style AI, i believe that
   altering its own source code would probably matter less than the
   sorting/processing of its input data. not that i think it's an invalid
   point.
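   the "evolution rather than self-modification" idea can be sketched in a
   few lines: a population of simple candidate systems is scored and the
   fitter ones are varied, rather than one program rewriting its own source.
   the fitness function and mutation scheme below are arbitrary toy choices
   of mine, purely for illustration:

   ```python
   # minimal evolutionary loop: score a population, keep the fitter half,
   # refill by slightly mutating the survivors. the objective (get close
   # to 42) is an arbitrary stand-in for "capability".
   import random

   random.seed(0)

   def fitness(x):
       # toy objective: how close the candidate is to a target value
       return -abs(x - 42)

   population = [random.uniform(0, 100) for _ in range(20)]
   for generation in range(50):
       population.sort(key=fitness, reverse=True)
       survivors = population[:10]          # selection
       offspring = [x + random.gauss(0, 1.0) for x in survivors]  # variation
       population = survivors + offspring

   best = max(population, key=fitness)
   print(best)  # converges near 42
   ```

   note that no individual in the population ever inspects or rewrites the
   loop itself; improvement comes entirely from selection and variation.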
      
   actually, the concept reminds me of a game that one of my supervisors
   recently mentioned to me. i include it here as a point of interest. the
   game is based around changing the rules of the game itself: a "move"
   consists of changing/creating a rule which players must then follow
   (unless the rule that "players obey the rules" is itself changed). the
   game is called "nomic" and some information can be found at:
      
   http://www.earlham.edu/~peters/nomic.htm   
      
   i thought some people might find it interesting.   
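   the self-amending flavour of nomic is easy to caricature in code: the
   game state includes its own rule set, and a move rewrites that set. the
   rule representation below is my own toy invention, not how the actual
   game is specified:

   ```python
   # toy caricature of nomic: the rules are data, and a "move" is an
   # operation that creates or changes a rule -- including the meta-rule
   # that players obey the rules at all.

   rules = {
       "players obey the rules": True,
       "a move changes exactly one rule": True,
   }

   def move(rules, rule, value):
       """apply a move: create or change a single rule, returning the
       new rule set (the old one is left untouched)."""
       new_rules = dict(rules)
       new_rules[rule] = value
       return new_rules

   # first move: legislate a brand-new rule into existence
   rules = move(rules, "points are awarded for proposed rules", True)
   # second move: even the meta-rule can itself be amended
   rules = move(rules, "players obey the rules", False)
   print(len(rules))  # three rules now on the books
   ```

   which is, of course, where the analogy to self-modifying code comes in:
   the thing being changed and the thing doing the changing are the same
   object.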
      
   anyway. i need to go and do insignificant things like eating and drinking.   
      
   pip pip,   
      
   joss   
      
   --   
   "A theory however elegant and economical must be rejected or revised   
   if it is untrue; likewise laws and institutions no matter how   
   efficient and well-arranged must be reformed or abolished if they   
   are unjust" - Rawls, "A Theory of Justice"   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   