AIs

 
 
Eryemil
 
Reply Mon 21 Feb, 2005 11:58 pm
A question has come to mind. What makes a being alive? Not just alive, but sentient.

Would any of you consider an advanced artificial intelligence as alive?
Type: Discussion • Score: 1 • Views: 1,701 • Replies: 21

 
silversturm
 
  1  
Reply Tue 22 Feb, 2005 01:58 am
Hi, this topic has always interested me. If you keep in mind that 'advanced artificial intelligence' will only be software - a bunch of 1's and 0's flying around, it's hard to see how anything like that could really be "alive." I mean after all, a program is just executing a series of instructions repeatedly in a pattern that depends on its current and past states.

However, the flip side of that story comes down to: what happens if we do create this "brain" program and it reacts just like I would? Then you have to start reevaluating sentience.

Can a program really know it's alive? Or does it just read its system clock and understand it has been running for 3 years and 87 days? Does the ability to solve complex problems signify intelligence? Probably not - programs handle more and more variables to solve problems every day.

So I don't really have an answer; I've said these things to spark further discussion. But finally, I might venture a guess that I would consider such a program "alive" - when it asks, "why am I here?"
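The "series of instructions in a pattern that depends on its current and past states" view of a program is essentially a finite-state machine. A minimal sketch in Python (all names here are illustrative, not from any real system):

```python
# A program as silversturm describes it: its next output depends only
# on its current state and its history of inputs.
class TinyStateMachine:
    def __init__(self):
        self.state = "idle"
        self.history = []  # past inputs influence future behaviour

    def step(self, event):
        self.history.append(event)
        if self.state == "idle" and event == "start":
            self.state = "running"
        elif self.state == "running" and event == "stop":
            self.state = "idle"
        return self.state

m = TinyStateMachine()
print(m.step("start"))  # running
print(m.step("stop"))   # idle
```

However sophisticated the behaviour, a machine like this never does anything outside its transition rules - which is exactly the intuition the post is leaning on.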
0 Replies
 
Cyracuz
 
  1  
Reply Tue 22 Feb, 2005 08:53 am
The development of artificial intelligence is like development in everything else. There are ants that function purely on commands, in the same way a CPU program does. We don't call them artificial because we don't know where they had their origins. But we cannot argue that these beings are not alive. Many machines move about according to commands issued, but we see them as lifeless. And I agree that they are.

If we define artificial intelligence as an intelligence that is able to evolve on its own, then I would say that a being with this intelligence, artificial or not, has to be considered alive.

With today's artificial intelligence we have to upgrade it manually, so it does not have life. After all, life is change, and any being that can change on its own must be said to be alive.
0 Replies
 
silversturm
 
  1  
Reply Tue 22 Feb, 2005 10:30 am
Cyracuz wrote:
able to evolve on its own

I think this is a great characteristic too.
0 Replies
 
Ray
 
  1  
Reply Wed 23 Feb, 2005 12:18 pm
Maybe true conscious life only arises in carbon-based materials.
0 Replies
 
Eryemil
 
  1  
Reply Wed 23 Feb, 2005 03:14 pm
You seem to have misunderstood my words. It is obvious that no known program today can be considered sentient.

When I posted this topic, I was thinking of something my friend said. He's religious, and said that no 'machine' can be alive because it has no soul.

I guess I should have been more specific in my question.

If the being fits every aspect of being human, would it be considered just a program because it was man made, and therefore not alive?
0 Replies
 
silversturm
 
  1  
Reply Wed 23 Feb, 2005 03:40 pm
Eryemil wrote:
no 'machine' can be alive because it has no soul.

I think arguments should be divided into two camps: 1) you assume "souls" exist, and 2) you assume otherwise.

If you bring religion into the topic, then you've basically already answered the question, since computers aren't exactly mentioned in the Bible, Koran, etc. Because of this, your argument would probably always come back to, "well, it doesn't have a soul." That's one valid way to think about things.

The other way to think about things, which may or may not be seen as a much less religious argument, is that we are all machines of nature. We have all the essential components to perform our tasks. These components transfer messages to each other as needed by their functions. We make decisions based on current and past events that have happened to us. If you look along these lines, then recreating such a system is merely a feat of engineering.

So you can attack the problem from either angle, but I think it's important to assume one of them before talking about the subject because religion has a tendency to greatly influence viewpoints.
0 Replies
 
Cyracuz
 
  1  
Reply Thu 24 Feb, 2005 05:31 am
Make up your mind. Are we talking about being alive or being human?

If we compare the human mind to a computer, we see that the differences are few. One difference is that of organic vs. mechanical, but it is a minor one, and the principles that govern each are the same:

Input material is relayed through the senses into a "program" that can understand the material. On a computer you would have to manually install sound drivers and programs for it to "understand" the music, and you would also have to upgrade manually. Eventually the software will need stronger hardware, and you'll have to upgrade.

A living being does not require external upgrades. Instead it evolves. My understanding of music, to use the same example, is not the same as it was ten years ago. I didn't install new software or anything, the old software just evolved. This is part of that software's properties.
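The distinction here - behaviour that changes with experience rather than through a manual upgrade - is what programmers would call online learning. A purely illustrative sketch in Python (the class and its names are hypothetical, invented for this example):

```python
# Cyracuz's distinction, sketched: the object's "understanding" (here,
# a running average of ratings) evolves with every input it receives,
# without anyone installing new software.
class AdaptiveListener:
    def __init__(self):
        self.count = 0
        self.preference = 0.0  # learned from inputs, not installed

    def hear(self, rating):
        self.count += 1
        # incremental mean: internal state shifts with each experience
        self.preference += (rating - self.preference) / self.count
        return self.preference

ear = AdaptiveListener()
for r in [1.0, 3.0, 5.0]:
    ear.hear(r)
print(ear.preference)  # 3.0
```

Whether self-modifying state of this kind counts as "evolving" in the sense the post means is, of course, exactly what is in dispute.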

Suddenly I started thinking about Frankenstein...
0 Replies
 
Eryemil
 
  1  
Reply Thu 24 Feb, 2005 06:24 pm
The soul bit was just a random thought, I don't know if there is a soul, and frankly I don't care.

What I was trying to ask was whether something man-made could, with time, be considered alive. I hope this was easier to understand; my English is not as good as I would like yet. I tend to go on forever and say very little.
0 Replies
 
Merry Andrew
 
  1  
Reply Thu 24 Feb, 2005 06:30 pm
It's a fascinating subject. I think that for an AI to be considered alive, it would have to be programmed with emotions as well as intelligence. One of the things that makes humans different from machines is that the judgements and decisions we make are invariably influenced by personal, emotional considerations as well as rational thought. Machines are free of that.
0 Replies
 
silversturm
 
  1  
Reply Thu 24 Feb, 2005 10:35 pm
Emotions are definitely needed to simulate a human being. And that is a lot more difficult to program than just straight logic! But I guess as for just making life...do we need to define life?
0 Replies
 
Merry Andrew
 
  1  
Reply Fri 25 Feb, 2005 07:50 am
I remember reading, quite a few years ago, that some programmers were trying to devise a computer program which would enable (well, not enable, more like require) a computer to make a random mistake in its computation from time to time. The idea was that this intentional glitch would make the machine seem more human, somehow, and make people working with that program more comfortable. It would also keep them on their toes, knowing that nobody -- including the machine -- is perfect and that data have to be double-checked.
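The "deliberate glitch" idea could be sketched like this in Python - a hypothetical illustration of the concept, not a reconstruction of whatever those programmers actually built:

```python
import random

# With small probability, corrupt one bit of an otherwise exact result.
# Flipping any bit always changes the value, so a glitch is guaranteed
# to be a real mistake rather than a no-op.
def glitchy_add(a, b, error_rate=0.01, rng=None):
    rng = rng or random.Random()
    result = a + b
    if rng.random() < error_rate:
        bit = rng.randrange(max(result.bit_length(), 1))
        result ^= 1 << bit
    return result

print(glitchy_add(2, 3, error_rate=0.0))  # always 5
```

Even this toy makes the objection in the following replies concrete: any code downstream of `glitchy_add` would have to double-check every result.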
0 Replies
 
silversturm
 
  1  
Reply Fri 25 Feb, 2005 12:05 pm
Wow, that's quite a story to me. I've been programming for about 10 years now, and, well, that sounds like the dumbest thing I've ever heard. Hehe, I totally believe you, Merry, that someone actually tried that, but the idea itself seems so backwards. We are taught to aim for 100% data coherence, because one wrong bit can have a hideous trickle effect throughout the rest of the program, making bugs hard to find. So I've come to think that the best computer is the one that does its job perfectly; I never would have thought of something like that. Thanks, interesting post.
0 Replies
 
Eryemil
 
  1  
Reply Fri 25 Feb, 2005 06:33 pm
I agree with Silver; given the way our society relies on computers, that is a very foolish thing to do.
Most programmers rely on computers making accurate calculations. Doing such a thing not only makes them work twice as hard, it simply has no purpose. Everyone who works with computers is completely comfortable with the fact that they are machines.
0 Replies
 
Mills75
 
  1  
Reply Fri 25 Feb, 2005 11:48 pm
What is a 'soul?' Show me a 'soul.' And what of human cognition? Is it more than a collection of electro-chemical reactions? We cannot say with certainty.

I think therefore I am, but I'm not so sure about you. How do I know you are self-aware/have a soul? I don't, but I assume such is the case because you exhibit those characteristics associated with self-awareness. Thus if a machine ever did this as well, then would we not also have to assume it is self-aware (if for no other reason than to avoid the possibility of committing murder)?

As for evolution--you and I do not evolve, but humans do over many generations. Species evolve not because they want or choose to, but because external forces are at work upon them. Machines certainly evolve, and they too evolve as a result of external forces at work upon them. How long did 'natural' life exist and evolve before sentience resulted?
0 Replies
 
silversturm
 
  1  
Reply Sat 26 Feb, 2005 08:36 am
I agree with Mills. I think the brunt of the analysis would be to first define the consciousness we have. How do I know I'm alive? I could say cogito ergo sum, but what of cogito? What processes must a machine have in order for them to be considered 'thought'? To be honest, I don't really understand how I got inside this body, or how I have the consciousness, if you can call it that, that I so enjoy. So I'm not really sure how my 'thought' is any different from the 'thought' inside my PowerBook G4; mine is just unsequenced (without explicit instructions) and diverse (not confined to any given topic or set of topics). Alright, I'm off, my brain needs a reboot.
0 Replies
 
au1929
 
  1  
Reply Sat 26 Feb, 2005 08:49 am
Imagine if AI could develop to a point where it could create and propagate on its own. A machine or computer that was equal to or even superior to the human mind. A machine without a conscience. Imagine the consequences.
0 Replies
 
Mills75
 
  1  
Reply Sat 26 Feb, 2005 03:01 pm
au1929 wrote:
Imagine if AI could develop to a point where it could create and propagate on its own. A machine or computer that was equal to or even superior to the human mind. A machine without a conscience. Imagine the consequences.


Someone with more knowledge of current developments in computers correct me if I'm wrong, but I believe there are programs out there that can create new programs, and certainly there are programs that can replicate and spread themselves (i.e., viruses).
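Self-replication in the most literal sense is in fact a classic exercise: a quine is a program whose output is exactly its own source code. A standard Python example (shown only to illustrate the point; the comment line is not part of the quine itself, which is just the two code lines):

```python
# A quine: the two lines below print themselves verbatim.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Viruses do something messier but conceptually similar: they copy their own code into new hosts. Neither, of course, gets anywhere near sentience on its own.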

As for how would we know if a machine became sentient: if it looks, walks, and quacks like a duck, then.... Why would we not use the same test for sentience in machines?
0 Replies
 
Cyracuz
 
  1  
Reply Sun 27 Feb, 2005 07:17 am
The answer to all your questions, Mills, is pride and a misguided view of what humans are and what life is.
0 Replies
 
Eryemil
 
  1  
Reply Sun 27 Feb, 2005 07:28 am
F33R M3 1 4M TH3 M3<H4N1<4| 4NT1<HR1$T! U R @|| &00M3&!!!

http://www.emutalk.net/images/smilies/newsmilies/robot.gif



Sorry for my silliness, I've had a little too much coffee this morning. ^^'
0 Replies
 
 
