>In the meantime, I apologize if my own posts have actually contributed to
>the confusion :) Here is a post about the simulation of subjective
>experience, in addition to my previous posts.
You are not confusing things, Jennifer. I think it is a matter of how we
want to talk about simulation, and your interest appears to be in simulation
as having potential in computational systems, which is timely. I'm not a
religious person, so I don't see it pertaining to God, but I do see it as
pertaining to the idea that we are all part of an evolving cybernetic
system.
>Like I stated before, I'm interested in the Simulation of Empathy, well
>known in humans, but considered impossible in computers (for AI purposes).
>Empathy being the simulation of the experience of the other.
Empathy may be the most needed and also the most difficult experiential
behavior to obtain. To have empathy, an agent needs to understand the
thoughts, feelings, and state of another agent or person. To have empathy,
then, the agent would also have to have "personhood." So, what is personhood
if it is not to be alive, self-aware, and able to make decisions? How can
something make decisions if it is not alive and self-aware? Certainly AI
makes decisions and interacts with its environment, but it is not alive. So
the issue is what makes AI alive.
>Some believe that it is impossible for AI to feel, that subjective
>experience is an ambiguity that causes errors in a computer's calculations.
>Some believe that feeling comes from a heart and soul, and that computers
>do not have a soul and thus
AI is narrow. "Strong AI" is where we would have to begin, and that takes us
to the baby steps of A[G]I (i.e., artificial general intelligence,
hereinafter "AGI"), which is where AI was originally headed before its
winter (the inability to achieve its original directive of producing
human-level intelligence). AGI offers the potential for being self-aware and
able to make decisions based on "experience." Through the experience gained
in its learning, it could obtain personhood at the juncture where the idea
of life and death becomes redefined: semi-biological, non-biological, or
synthetic systems may develop self-awareness and may want rights similar to
the rights of humans.
With all this said, empathy could be obtainable by AGIs. But I have to
return to my original post on this one, if I may. A brain that is
transferred or copied onto a computational system would also transfer or
copy its mind (in the material sense), and that mind would contain the
feelings, emotions, and sensorial memory of the biological person. If the
AGI could relate to this, it would also become familiar with, and
experience, the feelings, emotions, and sensory memory of the human. So the
boundary between humans and technology becomes even more blurred, and the
AGI would learn empathy through its own experiential behavior.
All my best,
Yasmin_discussions mailing list
Yasmin URL: http://www.media.uoa.gr/yasmin