So, let's say we start to produce sophisticated robots that can carry out functions like a human. It would be the programming that carries out those functions. Now suppose the robot commits a crime: would the robot not be guilty of it, since it was programmed to do it in the first place, making it not the robot's fault? The root of this is that the robot has no free will to change any of it. And let's say there were an invention that gave a robot "free will", so to speak. Would it still lack free will, since that would just be more programming instead of actual judgment?
So... does this mean that there will never be something invented that has its own so-called "free will"?
This is also getting at the argument that nothing is random at all (in a way).
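As a minimal sketch of that point (using Python's random module as an assumed stand-in for whatever a robot would actually run): a computer's "random" numbers are pseudo-random, so if you hand the generator the same starting seed, it spits out the exact same sequence every time.

```python
import random

# Seed the generator and draw five "random" digits.
random.seed(42)
first_run = [random.randint(0, 9) for _ in range(5)]

# Reset to the same seed and draw again -- the sequence repeats exactly,
# because the output is fully determined by the seed, not truly random.
random.seed(42)
second_run = [random.randint(0, 9) for _ in range(5)]

print(first_run)
print(second_run)
assert first_run == second_run  # identical every time
```

So at least for the software side of a robot, "random" really just means "determined by something you didn't look at."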
TL;DR: Is there such a thing as free will for something like a robot? And is there no such thing as randomness?
(I'll let you guess how I thought of it.)