Search results
1 – 10 of 17
Abstract
Purpose
The purpose of this paper is to consider the question of equipping fully autonomous robotic weapons with the capacity to kill. Current ideas concerning the feasibility and advisability of developing and deploying such weapons, including the proposal that they be equipped with a so-called “ethical governor”, are reviewed and critiqued. The perspective adopted for this study includes software engineering practice as well as ethical and legal aspects of the use of lethal autonomous robotic weapons.
Design/methodology/approach
In this paper, the author surveys and critiques the applicable literature.
Findings
In the current paper, the author argues that fully autonomous robotic weapons with the capacity to kill should neither be developed nor deployed; that research directed toward equipping such weapons with a so-called “ethical governor” is immoral and serves as an “ethical smoke-screen” to legitimize the research and development of these weapons; and that, as an ethical duty, engineers and scientists should condemn and refuse to participate in their development.
Originality/value
This is a new approach to the argument for banning autonomous lethal robotic weapons, based on the classical work of Joseph Weizenbaum, Helen Nissenbaum and others.
Abstract
This chapter presents reflections and considerations regarding artificial intelligence (AI) and contemporary and future warfare. As “an evolving collection of computational techniques for solving problems,” AI holds great potential for national defense endeavors (Rubin, Stafford, Mertoguno, & Lukos, 2018). Though decades old, AI is becoming an integral instrument of war for contemporary warfighters. But there are also challenges and uncertainties. Johannsen, Solka, and Rigsby (2018), scientists who work with AI and national defense, ask, “are we moving too quickly with a technology we still don't fully understand?” Their concern is not whether AI should be used, but whether its research and development, and the pursuit of its usage, are following a course that will reap the rewards desired. Although they have long-term optimism, they ask: “Until theory can catch up with practice, is a system whose outputs we can neither predict nor explain really all that desirable?” Time (speed of development) is a factor, but so too are research and development priorities, guidelines, and strong accountability mechanisms.
Abstract
Purpose
This first part of a two-part paper aims to provide an insight into the ethical and legal issues associated with certain classes of robot. This part is concerned with ethics.
Design/methodology/approach
Following an introduction, this paper first considers the ethical deliberations surrounding robots used in warfare and healthcare. It then addresses the issue of robot truth and deception and subsequently discusses some ongoing deliberations and possible ways forward. Finally, brief conclusions are drawn.
Findings
Robot ethics are the topic of wide-ranging debate and encompass such diverse applications as military drones and robotic carers. Many ethical considerations have been raised, including philosophical issues such as moral behaviour and truth and deception. Preliminary research suggests that some of these concerns may be ameliorated through the use of software that encompasses ethical principles. It is widely recognised that a multidisciplinary approach is required, and there is growing evidence of this.
Originality/value
This paper provides an insight into the highly topical and complex issue of robot ethics.
Abstract
Harnessing the power and potential of Artificial Intelligence (AI) continues a centuries-old trajectory of the application of science and knowledge for the benefit of humanity. Such an endeavor has great promise, but also the possibility of creating conflict and disorder. This chapter draws upon the strengths of the previous chapters to provide readers with a purposeful assessment of the current AI security landscape, concluding with four key considerations for a globally secure future.
Kenneth D. Lawrence, Ronald Klimberg and Sheila M. Lawrence
Abstract
This paper will detail the development of a multi-objective mathematical programming model for audit sampling of balances for accounts receivable. The nonlinear nature of the model structure will require the use of a nonlinear solution algorithm, such as the generalized reduced gradient (GRG) method or the genetic algorithm embedded in a Solver spreadsheet modeling system, to obtain appropriate results.
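The abstract names the solution approach only in outline. As a minimal sketch of the kind of method it mentions (a genetic algorithm applied to a nonlinear sampling problem), the toy below picks a sample size per accounts-receivable stratum to trade off audit cost against estimation variance. All data, weights, and the weighted-sum scalarization are invented for illustration and are not taken from the paper's actual model:

```python
# Toy multi-objective audit-sampling problem solved with a simple
# genetic algorithm. All numbers below are hypothetical.
import random

random.seed(0)

STRATA = [  # (number of balances N_i, std. dev. of balance s_i) - invented
    (500, 120.0),
    (300, 80.0),
    (200, 40.0),
]
COST_PER_ITEM = 1.0   # assumed audit cost per sampled balance
MAX_N = 100           # assumed cap on sample size per stratum

def objectives(ns):
    """Return (cost, variance) - the two competing objectives."""
    cost = COST_PER_ITEM * sum(ns)
    # Stratified-sampling variance of the estimated total: nonlinear in n_i.
    var = sum((N * N * s * s) / n for (N, s), n in zip(STRATA, ns))
    return cost, var

def fitness(ns, w=0.5):
    """Scalarize the two objectives via a weighted sum (lower is better)."""
    cost, var = objectives(ns)
    return w * cost + (1 - w) * var / 1e5  # rough rescaling of variance

def evolve(pop_size=40, gens=60):
    """Minimize fitness over integer sample sizes with a basic GA."""
    pop = [[random.randint(1, MAX_N) for _ in STRATA] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                 # selection: keep the best half
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # crossover
            if random.random() < 0.3:                            # mutation
                i = random.randrange(len(child))
                child[i] = max(1, min(MAX_N, child[i] + random.randint(-10, 10)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Sweeping the weight `w` traces out different cost/variance trade-offs, which is one common (though simplistic) way to handle a multi-objective formulation; the paper's own model may scalarize or constrain the objectives differently.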
IN NOVEMBER 1989, RONALD W. SKEDDLE, chief executive of Libbey‐Owens‐Ford Co., stood before a group of financial executives and delivered a sobering speech about business ethics…
Abstract
IN NOVEMBER 1989, RONALD W. SKEDDLE, chief executive of Libbey‐Owens‐Ford Co., stood before a group of financial executives and delivered a sobering speech about business ethics. Four years later, he was standing before his own board members trying to explain certain alleged irregularities in the running of the company. Apparently they didn't like what they heard, and he (along with two other Libbey‐Owens‐Ford executives) was asked to step down. According to court documents filed in Columbus, Ohio (the company, a division of Pilkington P.L.C., is based in Toledo), Skeddle et al. had bilked over $7.7 million from Libbey‐Owens‐Ford through various schemes. Skeddle could not be reached for comment.