Humans Creating Autonomous Machines
Humans are creating an increasing number of machines that can perform jobs for them, in other words, automated systems. In ScienceDaily's article "Emerging ethical dilemmas in science and technology," written by William G. Gilroy, several questions were raised about this issue.
This matter has sparked many debates, since these machines make decisions on their own instead of humans. But who will be responsible for the decisions they make? Undoubtedly, if a machine makes a wrong decision and causes damage either to humans or to their environment, the machine's owner will be held responsible, especially since they will have signed a contract agreeing to all the terms of the sale. However, the scientist who invented the technology would be responsible only if every autonomous system built from that idea shared the same flaw. In that case, the scientist would probably have to review the basis of the fabrication method.
Furthermore, someone might ask: would it become scientists' responsibility to design these machines if doing so could reduce the risk of human casualties? In my opinion, responsibility is too strong a term. A scientist must judge whether a proposed project is worth their time and money. Not all projects will save lives, but they may still count as major improvements in scientific technology.
In the near future, more and more autonomous systems will become part of our society, which means we will soon have to establish clear regulations concerning these devices. The ongoing debates around this issue show just how much such rules are needed; I have encountered this debate more than once, whether at school, at home, or even at work. Autonomous systems are an important matter, since they will unquestionably enable humans to improve their lives in the near future.