Forums before death by AOL, social media and spammers... "We can't have nice things"
|    alt.cyberpunk    |    Ohh just weirdo cyber/steampunk chat    |    2,235 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 345 of 2,235    |
|    alias to Raistlin    |
|    Re: AI (again)    |
|    28 Oct 03 02:29:28    |
From: alias@removenetserver.org

On Tue, 28 Oct 2003 01:35:40 +0000, Raistlin wrote:

[snip]

>>>
>>> IMHO for an AI to evolve it should not only be able to learn, but also
>>> be able to re-write its own code. This would, I imagine, lead to an
>>> exponential growth in "intelligence".
>>>
>>> R.
>>
>>
>> i suppose it depends what you mean by "code". any system which "learns",
>> as in replacing existing rules/patterns of behaviour with new ones gained
>> through processing input, is in a sense rewriting its own code. the actual
>> source code in this sort of system is mainly concerned with laying down
>> the basic structure of the learning system and the input/output
>> capabilities.
>
> What I mean is something like this. Teach it C++ and allow it access to
> its own source code. Allow it to re-write and recompile its own
> subroutines. Ultimately it might be able to re-write parts of itself
> that are running in raw assembly language. Granted, you would need a
> prodigious code base to start with, and you would have to have achieved a
> certain level of intelligence for it to be able to "take off" as it
> were. That way, if the intelligence encountered certain limits within
> itself, it could modify them.
>
> Of course it would need to be fully conscious to do this.

a personality (consciousness, awareness, entity, whatever) is the result
of its experience *and* the hardware it runs on. i, for example, am not
just a product of the New Jersey waste.. i am a product of the New Jersey
waste as experienced by some of god's finest monkey-meat ; ) there are
others here that have lived similar lives and come away with different
perspectives.. a certain amount of that could be put down to experience..
but certainly, and to a large degree, it must be a side effect of the
hardware my consciousness runs on.

alterations to that hardware may possibly result in a functioning (perhaps
even superior) "entity", but i do not believe that entity would be me.

or to put it more concisely.. if such an AI achieved consciousness, and
then realized it had the capability to alter that consciousness.. don't u
think it would seek to destroy that capability? self-preservation is the
1st priority of (most) self-aware beings.

>
> From Webster's:
>
> \Con"scious*ness\, n. 1. The state of being conscious; knowledge of
> one's own existence, condition, sensations, mental operations, acts, etc.
>
>>
>> i think that in a sense, the code of an AI could be seen as the body/brain
>> of the AI, rather than its mind or intelligence.
>
> Personally I would try and avoid human analogies with regards to AIs...
> The fewer preconceptions we include in the language used to discuss
> them, the more possibilities will open up.
>

yes. yes. yes.

the part that has always confused me in these conversations on AI is why
people seem to be convinced that such a creature would even be aware of
the macro-verse surrounding it.

such a perception would likely be beyond this entity's capability.. when
people conceive of an AI "experiencing" the world as we know it, they
invariably depend on cameras, microphones, and the like for its input.

there is a massive flaw in this premise.. a creature that exists solely in
computer code does not have the advantage of ur hardware constraints (by
which i mean it does not view the world through eyes, listen through ears,
it does not compete with gravity, etc.), therefore it would not even have a
conception of this sensorium's existence.

monkeys like us spend the first few years of our lives coming to terms
with this sensory input.. and we do not have to assemble it into
intelligible data.. it comes to us through tailored modules designed
specifically for accepting and translating this data (eyes, ears, but more
importantly the visual cortex). a machine intelligence would have no such
assistance.

human intelligence is molded by sensory input.. this artificial
intelligence will be a non-sensory entity.. and the effects this would
have on its consciousness are massive.

i do not believe we would even be able to recognize, much less communicate
with, an AI.. until it sought to communicate with us.

..
alias

--- SoupGate-Win32 v1.05
 * Origin: you cannot sedate... all the things you hate (1:229/2)
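[Archivist's note: the "re-write and recompile its own subroutines" idea quoted above can be sketched as a toy. This is a minimal illustration, assuming Python as a stand-in for the C++ the poster suggests; every name in it (LIMIT, run_generation_zero) is invented for this sketch, not from the thread. Each generation reads its own source, raises one of its own constants, writes the modified self to disk, and hands control to it, until the self-imposed limit is gone.]

```python
import os
import subprocess
import sys
import tempfile

# Generation zero of a toy self-modifying program. It inspects its own
# source file, rewrites the LIMIT constant (the "limit within itself"
# from the thread), saves the new self over the old one, and runs it.
SOURCE = '''LIMIT = 1  # the self-imposed limit this program may rewrite
import subprocess, sys

if LIMIT < 3:
    # read our own source, raise the limit, save the new self, run it
    src = open(sys.argv[0]).read()
    new_src = src.replace("LIMIT = %d" % LIMIT,
                          "LIMIT = %d" % (LIMIT + 1), 1)
    open(sys.argv[0], "w").write(new_src)
    out = subprocess.run([sys.executable, sys.argv[0]],
                         capture_output=True, text=True)
    print(out.stdout, end="")
else:
    print("settled at LIMIT=%d" % LIMIT)
'''

def run_generation_zero() -> str:
    """Write generation zero to a temp file, run it, collect the output."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(SOURCE)
        path = f.name
    try:
        out = subprocess.run([sys.executable, path],
                             capture_output=True, text=True)
    finally:
        os.unlink(path)
    return out.stdout.strip()

print(run_generation_zero())  # each generation raises its own LIMIT until 3
```

Note the design echoes the poster's caveat: the program can only modify the limits its original source lets it reach for, which is exactly the bootstrapping problem ("a prodigious code base to start with") the thread raises.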
(c) 1994, bbs@darkrealms.ca