Forums before death by AOL, social media and spammers... "We can't have nice things"

   alt.cyberpunk      Ohh just weirdo cyber/steampunk chat      2,235 messages   

   Message 419 of 2,235   
   Alienthe to alias   
   Re: AI (again) (1/2)   
   02 Nov 03 23:30:15   
   
   From: Alienthe@hotmail.com   
      
   alias wrote:   
      
   > On Tue, 28 Oct 2003 01:35:40 +0000, Raistlin wrote:   
   >   
   > [snip]   
   >   
   >   
   >>>>IMHO for an AI to evolve it should not only be able to learn, but also   
   >>>>be able to re-write its own code. This would, I imagine, lead to an   
   >>>>exponential growth in "intelligence"   
   >>>>   
   >>>>R.   
      
      
   It wouldn't be necessary to rewrite its own code; rather,   
   one could design and start the next, upgraded version in   
   what would be a society of AIs. Still, explosive growth   
   and development seem rather likely.   
      
   >>>i suppose it depends what you mean by "code". any system which "learns",   
   >>>as in replacing existing rules/patterns of behaviour with new ones gained   
   >>>through processing input is in a sense rewriting its own code. the actual   
   >>>source code in this sort of system is mainly concerned with laying down   
   >>>the basic structure of the learning system and the input/output   
   >>>capabilities.   
   >>>   
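   The quoted point, that a learning system in effect rewrites its own   
   code by replacing rules gained from input, can be sketched as a toy   
   rule table that overwrites its own behaviour. This is a minimal   
   Python illustration, not anything from the original discussion; all   
   names are invented:   

```python
# Toy "learning" system: behaviour lives in a rule table that the
# system itself replaces as new input arrives. The fixed source code
# only lays down the structure, as the quoted post describes.
def make_learner():
    rules = {}  # stimulus -> response, rewritten during learning

    def learn(stimulus, response):
        # Overwriting an existing rule is, loosely, the system
        # rewriting part of its own "code".
        rules[stimulus] = response

    def act(stimulus):
        return rules.get(stimulus, "no rule")

    return learn, act

learn, act = make_learner()
learn("light", "approach")
print(act("light"))      # -> approach
learn("light", "avoid")  # new experience replaces the old rule
print(act("light"))      # -> avoid
```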
   >>What I mean is something like this. Teach it C++ and allow it access to   
   >>its own source code. Allow it to re-write and recompile its own   
   >>subroutines. Ultimately it might be able to re-write parts of itself   
   >>that are running in raw assembly language. Granted you would need a   
   >>prodigious code base to start with and you would have to have achieved a   
   >>certain level of intelligence for it to be able to "take off" as it   
   >>were. That way if the intelligence encountered certain limits within   
   >>itself, it could modify them.   
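   A toy sketch of that scenario: a program that regenerates and   
   recompiles one of its own subroutines at runtime when it hits a   
   limit in itself. Python stands in for the C++/assembly case here,   
   and the whole example is purely illustrative:   

```python
# A program that rewrites and recompiles one of its own "subroutines".
# The routine's source is a template the program can edit and rebuild.
SOURCE = """
def evaluate(x):
    return x + {bias}
"""

def build(bias):
    # "Recompile" the subroutine from freshly written source code.
    namespace = {}
    exec(compile(SOURCE.format(bias=bias), "<generated>", "exec"), namespace)
    return namespace["evaluate"]

evaluate = build(bias=0)
print(evaluate(10))   # -> 10

# The program decides its current routine is too limited and
# rewrites it with different behaviour.
evaluate = build(bias=5)
print(evaluate(10))   # -> 15
```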
      
      
   I believe the idea behind the CYC project is that   
   intelligence arises out of complexity. They are filling   
   the database with ever-increasing complexity, but nothing   
   intelligent appears to have surfaced yet. Then again, to   
   me it would be a sign of intelligence to keep a low profile.   
      
   >>Of course it would need to be fully conscious to do this.   
      
      
   Would it? It has been suggested that humanity, too, was not   
   conscious until just a few thousand years ago, yet was   
   clearly intelligent. See old discussions on the bicameral   
   mind for more.   
      
   > a personality (consciousness, awareness, entity, whatever) is the result   
   > of its experience *and* the hardware it runs on.  i, for example, am not   
   > just a product of the New Jersey waste.. i am a product of the New Jersey   
   > waste as experienced by some of god's finest monkey-meat ; )  there are   
   > others here that have lived similar lives and come away with different   
   > perspectives.. a certain amount of that could be put down to experience..   
   > but certainly, and to a large degree, it must be a side effect of the   
   > hardware my consciousness runs on.   
      
      
   I am not quite buying this hardware angle; rather, I think   
   it is more about what you perceive your hardware to be,   
   which is just another part of your experience. After all, it   
   could be a monster-grade deception in the style of the   
   Matrix. It would not be necessary to tell an AI the full   
   truth about its environment.   
      
   > alterations to that hardware may possibly result in a functioning (perhaps   
   > even superior) "entity" but i do not believe that entity would be me.   
   >   
   > or to put it more concisely.. if such an AI achieved consciousness, and   
   > then realized it had the capability to alter that consciousness.. don't u   
   > think it would seek to destroy that capability?  self-preservation is the   
   > 1st priority of (most) self-aware beings.   
      
      
   The definition below seems to have conflated consciousness   
   with self-consciousness, two concepts I believe differ by   
   degrees. Consciousness, as I see it, is knowledge of   
   your own thought process (cogito ergo sum), but that would   
   not have to involve a body or a distinct self. Self-   
   consciousness, then, is to me knowledge of the self as   
   distinct from others. Certain forms of autism involve   
   not being able to distinguish the self from others, or   
   even others from others. I am not an expert; this is   
   just what I understood from an article, though it does   
   strike me that autism seems to be a great number of things.   
      
   I agree that self-preservation would seem related to being   
   self-aware, though I am not sure one leads to the other   
   without some Darwinism.   
      
   >> From Websters:   
   >>   
   >>\Con"scious*ness\, n. 1. The state of being conscious; knowledge of   
   >>one's own existence, condition, sensations, mental operations, acts, etc.   
      
      
   Hmmm. Definitions that involve "etc." seem shaky to me.   
      
   >>>i think that in a sense, the code of an AI could be seen as the body/brain   
   >>>of the AI, rather than its mind or intelligence.   
   >>>   
   >>Personally I would try to avoid human analogies with regard to AIs...   
   >>The fewer preconceptions we include in the language used to discuss   
   >>them, the more possibilities will open up.   
   >>   
   >>   
   >   
   > yes.  yes.  yes.   
   >   
   > the part that has always confused me in these conversations on AI is why   
   > people seem to be convinced that such a creature would even be aware of   
   > the macro-verse surrounding it.   
      
      
   Agreed. It does make for easy plot lines in movies. Moreover, I   
   am not even sure self-consciousness is required to make a useful   
   AI.   
      
   More practically speaking, I am unsure how you would even program   
   such a feature when it is so hard to pin down exactly what   
   (self-)consciousness is. Creativity is another tricky issue.   
      
   > such a perception would likely be beyond this entity's capability.. when   
   > people conceive of an AI "experiencing" the world as we know it they   
   > invariably depend on cameras, microphones, and the like for its input.   
   >   
   > there is a massive flaw in this premise.. a creature that exists solely in   
   > computer code does not have the advantage of our hardware constraints (by   
   > which i mean it does not view the world through eyes, listen through ears,   
   > it does not compete with gravity, etc) therefore it would not even have a   
   > conception of this sensorium's existence.   
      
      
   Assuming the AI was made, it would be reasonable to assume it was   
   designed with interfaces: features it could use without fully   
   understanding their nature, much like humans, or people in the   
   Matrix.   
      
   Extending this idea, it could very well live on the net,   
   interfacing with all that goes on here.   
      
   > monkeys like us spend the first few years of our lives coming to terms   
   > with this sensory input.. and we do not have to assemble it into   
   > intelligible data.. it comes to us through tailored modules designed   
   > specifically for accepting and translating this data (eyes, ears, but more   
   > importantly the visual cortex) a machine intelligence would have no such   
   > assistance.   
   >   
   > human intelligence is molded by sensory input.. this artificial   
   > intelligence will be a non-sensory entity.. and the effects this would   
   > have on its consciousness are massive.   
      
      
   Sensory input is an easier problem than intelligence and   
   consciousness. Sensory processing is a big field, particularly in   
   defence, where even existing missiles will see, recognise targets,   
   determine relative angles and from this home in on the optimal   
   point along the optimal route.   
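      
   Target recognition of that kind is often built on correlation or   
   template matching. A minimal sketch of the idea follows; plain   
   Python with invented data, far simpler than any real seeker:   

```python
# Minimal template matching: slide a small template over an "image"
# and report where it matches exactly -- a toy version of locating a
# known target signature in sensor data.
def find_template(image, template):
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            if all(image[r + i][c + j] == template[i][j]
                   for i in range(th) for j in range(tw)):
                return (r, c)  # top-left corner of the match
    return None  # no target found

image = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
template = [[1, 1],
            [1, 0]]
print(find_template(image, template))   # -> (1, 1)
```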
      
      
   [continued in next message]   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   


(c) 1994,  bbs@darkrealms.ca