Mediating morality: Using unmanned aerial systems for decision making in moral situations

Tuesday January 21, 2014


10:00 - 10:30   Registration with coffee


10:30 - 10:40   Welcome - ECIS Synergy Program

Dr. Bart van Bezooijen ♦ Philosophy & Ethics, TU/e


10:40 - 10:50   Overview of the symposium

Gen. Maj. (ret.) Kees Homan ♦ Clingendael - Netherlands Institute of International Relations


10:50 - 11:50   Keynote speech - Killing in the name: Stopping         

                        autonomous warfare

Prof. dr. Noel Sharkey ♦ University of Sheffield


11:50 - 12:25   Bounding the debate on drones: the paradox of

                        postmodern warfare

Commodore prof. dr. Frans Osinga ♦ Netherlands Defence Academy


12:25 - 13:10   Lunch 


13:10 - 13:45   Autonomous killing machines? Lessons from machine


Dr. Mark Coeckelbergh ♦ University of Twente, 3TU Centre for Ethics and Technology


13:45 - 14:20     A moral approach to armed uninhabited vehicles: 

                          Preventive arms control

Dr. Jürgen Altmann ♦ Technische Universität Dortmund


14:20 - 14:35   Coffee break


14:35 - 15:10   The ethical boundary agent and the moral implications on

                        autonomy and distance         

Dr. Tjerk de Greef ♦ Delft University of Technology, University of Oxford


15:10 - 15:45   Death at a distance: Unpacking the psychological effects

                        of drone warfare

Prof. dr. Wijnand IJsselsteijn ♦ TU/e School of Innovation Sciences


15:45 - 16:20   Forum discussion

Discussion leader: Gen. Maj. (ret.) Kees Homan


16:20 - 17:00   Reception and drinks



Eindhoven University of Technology

Dorgelo Room (Room 1.52, Traverse building, situated at the La Place square)


Contact and enrollment


Topic and Aim of the Workshop

In the 2007 National Defense Authorization Act, the U.S. Congress declared that it had “a preference for unmanned systems (...)” because of the safety that unmanned systems offer to military personnel. From that point on, whenever the U.S. military acquired a new weapon system, it had to justify why that system would not be unmanned (as discussed in Singer, 2009). Strawser (2010) has argued that protecting soldiers from harm by using Unmanned Aerial Vehicles (UAVs), commonly known as drones, is an ethical improvement and even morally required. However, although drones safeguard the physical safety of military personnel, there are other moral considerations to take into account before embracing them. Are information technologies changing the face of war? The technological developments in warfare confront us with a fundamental question about the interplay of technology and morality: Is it easier to behave immorally in a virtual environment?

The interrelatedness of physical and psychological distance in warfare (Grossman, 2009) has recently been extended to the specific characteristics of drone warfare, in which soldiers at a remotely located military base can kill someone on the other side of the world from a safe distance. Some have highlighted the possibility that drones will make killing easier (e.g., Sharkey, 2012). This viewpoint is exemplified by the experiences of a drone pilot fighting in the Iraq war from a cubicle thousands of miles away from the battlefield, who remarked: “It’s like a video game. It can get a little bloodthirsty. But it's fucking cool.” (Singer, 2009, p. 332).

However, others have highlighted the opposite side of the use of drones (e.g., Coeckelbergh, 2013). The ability to hover above a fixed location makes it possible to observe targets for hours or days before taking action, which could actually reduce the psychological distance to the target. As another drone pilot explained: “I see mothers with children, I see fathers with children, I see fathers with mothers, I see kids playing soccer”, before the call comes to fire a missile and kill the target (Bumiller, 2012). This pilot describes how the hair on the back of his neck stands up when he has to fire a missile, just as it did when he used to line up targets in his F-16 fighter jet. Taken together, these comments indicate that the role of information technology in moral situations may be more complex than the often-heard assumption that technological mediation makes it easier to fire lethal missiles. Only a combination of normative requirements and the realities of human experience and behaviour will lead to a full understanding of the moral effects of technological mediation in modern remote warfare.

Following the increasing civil and military deployment of drones, an intense public debate on their positive and negative effects has emerged. The ECIS symposium “Mediating morality: Using unmanned aerial systems for decision making in moral situations” focuses on how the use of drones affects decision making in moral situations. Current research in moral philosophy and moral psychology provides no clear answers regarding the effects that drones have on moral decision making. Does the technological mediation offered by drones detach decision makers from the outcomes of their decisions in moral situations? Do drones afford working environments where personnel are no longer distracted by situational factors? Or both? Importantly, these issues concern both armed and unarmed drones, with the latter category comprising the large majority of drones. The aim of the workshop is to bring together stakeholders and researchers in the fields of technology, philosophy and psychology to discuss these topics, to share insights from theory and practice on technological mediation in moral situations, and to combine insights from these diverse disciplines to guide future interdisciplinary work.

Speaker biographies

Dr. Bart van Bezooijen (1979) studied Economic Psychology at Tilburg University. After earning his master’s degree in 2004 (with a minor in Social Psychology), he started as a data analyst and junior project leader at a small market research agency in Zeist. He returned to Tilburg University as a PhD candidate in 2005. From 2005 until 2010 he conducted research on coordination in virtual teams as part of a collaborative effort of Tilburg University, the Netherlands Defence Academy (NLDA), and TNO Human Factors. He was then offered a position as postdoctoral researcher at Eindhoven University of Technology on moral decision making in technology-mediated environments, again in collaboration with the NLDA and TNO Human Factors. This research is supported by a grant from the NWO Thematic Programme on Responsible Innovation (SRI grant no. 313-99-110). He received the Eindhoven Centre for Innovation Studies (ECIS) Synergy Grant 2012, together with Ron Broeders, Auke Pols, and Daniel Lakens, for their philosophical-psychological approach to studying technological mediation in moral decision making. He has published in various journals and books, ranging from the Journal of Strategic Studies to Metaphilosophy.

Major General (ret.) Kees Homan RNLMC is a senior research associate at the Clingendael Institute. His last position in the military was Director of the Netherlands Defence College. He holds a Master’s degree in Law from the University of Amsterdam and a Master’s degree in Political Science from Leiden University. His main fields of research are peace support operations, R2P, proliferation, civil-military relations, robotics and warfare, climate change and security, and Dutch, US, UK and Chinese security and defence policy.

Prof. dr. Noel Sharkey is Professor of AI and Robotics and Professor of Public Engagement at the University of Sheffield, and was an EPSRC Senior Media Fellow (2004-2010). He has held a number of research and teaching positions in the UK (Essex, Exeter, and Sheffield) and the USA (Yale and Stanford). Noel has moved freely across academic disciplines, lecturing in departments of engineering, philosophy, psychology, cognitive science, linguistics, artificial intelligence and computer science. He holds a Doctorate in Experimental Psychology and a Doctorate of Science. He is a chartered electrical engineer, a chartered information technology professional, and a member of both the Experimental Psychology Society and Equity (the actors' union). He has published well over a hundred academic articles and books, as well as writing for national newspapers and magazines. In addition to editing several journal special issues on modern robotics, Noel has been Editor-in-Chief of the journal Connection Science for 22 years and an editor of both Robotics and Autonomous Systems and Artificial Intelligence Review. His research interests include biologically inspired robotics, cognitive processes, the history of automata/robots (from ancient to modern), human-robot interaction and communication, representations of language and emotion, and neural computing/machine learning. His current research passion, however, is the ethics of robot applications.

Noel appears regularly on TV (around 300 appearances) and is interviewed regularly on radio and in magazines and newspapers. He was chief judge for every series of Robot Wars throughout the world, as well as “techspert” for four series of TechnoGames and co-presenter of Bright Sparks. He has been on lecture tours of India, China, Egypt, Australia and Singapore. He has developed large-scale museum exhibitions at the Magna Science Adventure Centre and the Think Tank galleries in Birmingham, which have brought integrated SET directly to the public. Noel has also run robot control and construction competitions for children and young adults from 26 countries, including the National Chinese Creative Robotics Competition and the National Egyptian Schools AI/Robotics Competition. He has also worked on the development of a number of mechanical art installations.

Air Commodore Prof. Dr. Frans Osinga (born 1963) is Professor in War Studies, Head of the Military Operational Art and Science Section, and Chair of the War Studies Program, one of the three BA-level programs taught at the Faculty of Military Studies of the Netherlands Defence Academy in Breda, the Netherlands. He is also director of the MA program in Military Strategic Studies.
His previous assignments include a tour at NATO Allied Command Transformation (Norfolk, Va.) from 2005 to 2007 as Liaison Officer for the newly established, Germany-based Joint Air Power Competence Centre. Prior to that he was the MoD Research Fellow at the Clingendael Institute of International Relations, the premier think tank in the Netherlands on international security. He was director of the Air Power and Strategy Department of the Netherlands Defence College from 1999 to 2000 and lecturer in Air Doctrine at the same institute from 1997 to 1998. He has held a number of staff positions at the Netherlands Air Force Air Staff. From 1987 to 1994 he served in various NF-5 and F-16 squadrons, also as an F-16 instructor. He is a graduate of the Netherlands Defence Academy and the Netherlands Defence College. From 1998 to 1999 he attended the School of Advanced Airpower Studies at Maxwell AFB, Alabama. He holds a PhD in political science from Leiden University. Topics of his presentations and lectures include NATO, ESDP, defence policy, terrorism, air power, statebuilding, irregular warfare, coercive diplomacy, contemporary military operations, strategic theory, international security, military technological developments and military innovation. He is Vice-Chairman of the KVBK, the Netherlands Royal Society for War Studies.

Mark Coeckelbergh (Ph.D., University of Birmingham) teaches philosophy at the Philosophy Department of the University of Twente, The Netherlands, and is managing director of the 3TU.Centre for Ethics and Technology. He is also co-chair of the IEEE Robotics & Automation Society Technical Committee on Robot Ethics and member of the Advisory Board of the Dutch National Economic Forum. He is the author of Liberation and Passion (2002), The Metaphysics of Autonomy (2004), Imagination and Principles (2007), Growing Moral Relations (2012), Human Being @ Risk (2013), and numerous articles in the area of ethics and technology, including ethics of information technologies and robotics, ethics of technology in medicine and health care, and environmental ethics. He also regularly writes and appears in the media. In 2007 he received the Prize of the Dutch Society for Bioethics (with J. Mesman).

Jürgen Altmann, PhD, is a physicist and peace researcher at Technische Universität Dortmund, Germany. Since 1985 he has studied scientific-technical problems of disarmament. An experimental focus is automatic sensor systems for co-operative verification of disarmament and peace agreements and for IAEA safeguards for an underground final repository. Another focus is the assessment of new military technologies and preventive arms control. Major studies have dealt with laser weapons, ballistic missile defence, microsystems technology, nanotechnology, non-lethal weapons and armed uninhabited vehicles. He is a co-founder of the German Research Association for Science, Disarmament and International Security (FONAS), a deputy speaker of the Working Group on Physics and Disarmament of the Deutsche Physikalische Gesellschaft (DPG), and a deputy chair of the International Committee for Robot Arms Control (ICRAC).

Dr. Tjerk de Greef (born on 4 August 1975 in Rheden) received his secondary education (V.W.O.) at the Rhedens Lyceum in Rozendaal, graduating in 1995. He then graduated from HAN University of Applied Sciences (Hogeschool van Arnhem en Nijmegen) in 1999 and obtained his Master’s degree in Information and Computer Sciences at Utrecht University in 2004. After receiving his master’s degree, he was offered a position at the Netherlands Organisation for Applied Scientific Research (TNO), Defence, Safety and Security, where he was mainly involved in human-factors research for the Royal Netherlands Navy. In 2008 he started as a PhD student at Delft University of Technology on the topic of designing ePartners for dynamic task allocation and coordination in high-risk professional domains such as urban search & rescue and the Navy. Since 2012, he has held a postdoc position at Delft University of Technology, in close collaboration with the Oxford Institute for Ethics, Law and Armed Conflict (University of Oxford), on the topic of ethics and automation. In his capacity as a researcher he has published in a number of international journals and conference proceedings and has managed several projects. He has also been involved in the organization of various conferences and workshops. He frequently serves as an external reviewer for international journals such as IEEE Transactions on Systems, Man, and Cybernetics and Ethics and Information Technology. Since 2010 he has been the treasurer of the European Association of Cognitive Ergonomics (EACE).

Prof. dr. Wijnand A. IJsselsteijn has a background in cognitive neuropsychology and artificial intelligence. In 2004, he received his PhD from Eindhoven University of Technology (TU/e) on the topic of telepresence. Since 2012, he has been full professor of Cognition and Affect in Human-Technology Interaction at Eindhoven University of Technology in the Netherlands. Wijnand’s focus is on conceptualizing and measuring human experiences in relation to advanced media environments (immersive 3D media, serious games, affective computing) in the service of human learning, communication, health, and wellbeing. His current projects deal with the ways in which media can transform our sense of self and other, and can influence environmental perceptions and social judgments. Wijnand has a keen interest in technological innovations (e.g., sensor-enabled mobile technologies, virtual environments) that make possible novel forms of human behaviour tracking, combining methodological rigor with ecological validity. He is programme manager of the Psychology and Technology bachelor programme at TU/e, and one of the founders of the Data Science Centre Eindhoven (DSC/e). He has published over 200 academic papers and 9 edited volumes, and holds one patent. His work has been funded through several industrial, national, and EU grants on digital games, mediated communication, and 3D technologies.


The workshop is sponsored by

Eindhoven Centre for Innovation Studies

NWO Programme ‘Socially Responsible Innovation’ (SRI grant no. 313-99-110)