Robotics needs ethical guidelines, speakers say at Vatican conference on robotics and AI
Vatican City – As robots play an increasingly important role in today's world, their technological capabilities must not outpace ethical concerns, a number of speakers said at the Pontifical Academy for Life's workshop on "Robo-ethics: Humans, Machines and Health." Some of the most pioneering engineers, developers and scientists, Catholic and not, gathered to discuss the ever-growing field of artificial intelligence.
A conference organized by the Vatican’s Pontifical Academy for Life on Monday mixed enthusiasm with caution regarding recent developments in artificial intelligence, emphasizing the importance of placing the human person at the center.
“An appeal for a new alliance of humanism and technology” was at the heart of the introductory speech by Italian Archbishop Vincenzo Paglia, the head of the academy.
His speech opened the Feb. 25-26 workshop on “Robo-ethics: Humans, Machines and Health.”
While the workshop included presentations on current developments in robotics and advanced technology, some speakers, such as Emmanuel Agius, professor of moral theology and philosophical ethics at the University of Malta, were critical of the direction the developments are taking.
Designers, engineers and programmers tend to focus on “technical challenges and advances,” Agius said, without a “reflection on the pressing philosophical, ethical and religious questions” they raise.
The moral questions related to robotics are too important and complex to be ignored, he said.
“Moral agency is a characteristic of humans, not of machines,” Agius said. One cannot “ascribe intrinsic intentionality to the robot” no matter how “intelligent” the robot is; any choices it makes are the result of its programming, he said.
According to a UNESCO report on Robotics Ethics in 2017, “Given the increasing autonomy of robots, the question arises who exactly should bear ethical and/or legal responsibility for robot behavior.”
When asked whether someone could program ethics into a robot, Peter J. Opio, vice chancellor at the Kigali Institute of Management University in Rwanda, responded no.
“It is not a matter of putting a set of values or a set of principles within the limited intelligence of a robot,” Opio said. Because they cannot make deliberate, intentional actions, robots cannot take on an ethical role, he said.
The robot’s creators and programmers are human beings, and they must be the ones responsible for this role, several speakers said.
The workshop also introduced many examples of the growing use of robots in manufacturing, medicine, surgery, elder-care and in the service industry.
One speaker warned of the social impact humanity might face if people increasingly rely on robots, not other humans.
Marita Carballo, president of Argentina’s National Academy of Moral and Political Sciences and head of a consulting agency, underlined that “an eye-to-eye connection, a hug, cannot be replaced by a robot.”
To look for emotional and social satisfaction in a robot may only increase isolation and would negatively impact interpersonal relationships, she said.
After considering robots and ethics, the Pontifical Academy for Life plans to dedicate its 2020 assembly to ethical considerations connected to artificial intelligence.