Just a question.

Rarity

So, let's say we start producing sophisticated robots that can carry out functions like a human. It would really be the programming carrying out those functions. But say the robot commits a crime: would the robot not be guilty, since it was programmed to do it in the first place, making the crime not the robot's fault? That follows from the robot having no free will to do anything differently. And even if someone invented something that gave a robot "free will", so to speak, would it still lack free will, since that invention would just be more programming instead of actual judgment?

So... does this mean nothing will ever be invented that has its own so-called "free will"?

This also gets at the argument that nothing is truly random (in a way).
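For the computer side of that claim: a machine's "random" numbers come from a pseudorandom number generator, which is itself just a program. Here's a minimal Python sketch (using the standard `random` module) showing that the same seed always produces the same "random" sequence:

```python
import random

# A pseudorandom generator is just a program: given the same
# starting state (the seed), it produces the same "random" sequence.
a = random.Random(42)
b = random.Random(42)

print([a.randint(1, 100) for _ in range(5)])
print([b.randint(1, 100) for _ in range(5)])
# Both lines print the identical list, because nothing about
# the process is actually random once the seed is fixed.
```

So even a robot that "flips a coin" to make a choice is, at bottom, running a deterministic procedure; the appearance of chance comes from not knowing the seed.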

TL;DR: Is there such a thing as free will for something like a robot, and is there no such thing as random?

(I'll let you guess how I thought of it.)
 
Did you think of this while watching Futurama? Because they had that episode on today about robots and free will.
 
deathcrow said:
Did you think of this while watching Futurama? Because they had that episode on today about robots and free will.

ding ding, nailed it.
 