Sharing Moral Responsibility with Robots: A Pragmatic Approach

Authors

  • Gordana Dodig-Crnkovic
  • Daniel Persson
Abstract

Roboethics is a recently developed field of applied ethics which deals with the ethical aspects of technologies such as robots, ambient intelligence, direct neural interfaces, invasive nano-devices, and intelligent soft bots. In this article we look specifically at the issue of (moral) responsibility in artificial intelligent systems. We argue for a pragmatic approach, where responsibility is seen as a social regulatory mechanism. We claim that having a system which takes care of certain tasks intelligently, learning from experience and making autonomous decisions, gives us reason to talk about a system (an artifact) as being “responsible” for a task. No doubt, technology is morally significant for humans, so the “responsibility for a task” with moral consequences could be seen as moral responsibility. Intelligent systems can be seen as parts of socio-technological systems with distributed responsibilities, where responsible (moral) agency is a matter of degree. Knowing that all possible abnormal conditions of a system's operation can never be predicted, and that no system can ever be tested for all possible situations of its use, the responsibility of a producer is to assure proper functioning of a system under reasonably foreseeable circumstances. Additional safety measures must however be in place in order to mitigate the consequences of an accident. The socio-technological system aimed at assuring a beneficial deployment of intelligent systems has several responsibility feedback loops which must function properly: the awareness of, and procedures for handling, risks and responsibilities on the side of designers, producers, implementers, and maintenance personnel, as well as the understanding, by society at large, of the values and dangers of intelligent technology. The basic precondition for developing this socio-technological control system is the education of engineers in ethics and keeping alive the democratic debate on preferences about the future society.


Similar Resources

Towards Trustworthy Intelligent Robots - A Pragmatic Approach to Moral Responsibility

Today’s robots are used mainly as advanced tools and do not have any capability of taking moral responsibility. However, autonomous, learning intelligent systems are developing rapidly, resulting in a new division of tasks between humans and robots. The biggest worry about autonomous intelligent systems seems to be the fear of human loss of control and robots running amok. We argue that for all...

Full text

To delegate or not to delegate: Care robots, moral agency and moral responsibility

The use of robots in healthcare is on the rise, from robots to assist with lifting, bathing and feeding, to robots used for social companionship. Given that the tradition and professionalization of medicine and nursing has been grounded on the fact that care providers can assume moral responsibility for the outcome of medical interventions, we must ask whether or not a robot can assume moral re...

Full text

Predicting the Use of Masks in the COVID-19 Based on the Systems Thinking, Personal - Social Responsibility, Moral Obligations and Individualism: An Approach of Consumer Behavior Theory

Since the outbreak of the COVID-19 epidemic in late December 2019, recommendations for personal protection have been issued by the World Health Organization and national health organizations around the world, the most prominent of which is the use of masks to prevent the spread of the virus. Despite the importance of this solution, many people still resist using the mask. Therefore, this study, by emphasizin...

Full text

When Is a Robot a Moral Agent?

In this paper I argue that in certain circumstances robots can be seen as real moral agents. A distinction is made between persons and moral agents such that, it is not necessary for a robot to have personhood in order to be a moral agent. I detail three requirements for a robot to be seen as a moral agent. The first is achieved when the robot is significantly autonomous from any programmers or...

Full text

What Should We Want From a Robot Ethic?

There are at least three things we might mean by “ethics in robotics”: the ethical systems built into robots, the ethics of people who design and use robots, and the ethics of how people treat robots. This paper argues that the best approach to robot ethics is one which addresses all three of these, and to do this it ought to consider robots as socio-technical systems. By so doing, it is possib...

Full text


Journal title:

Volume   Issue 

Pages  -

Publication date: 2008