OBSERVARE
Universidade Autónoma de Lisboa
e-ISSN: 1647-7251
Vol. 14, Nº. 2 (November 2023-April 2024)
ARTIFICIAL INTELLIGENCE IN DRONES AND ROBOTS FOR WAR PURPOSES:
A BIOLEGAL PROBLEM
CÉSAR OLIVEROS-AYA
cesar.oliveros@unimilitar.edu.co
Lawyer. PhD in Bioethics. Master in Administrative Law and in Teaching and University
Research. Research Professor at the Nueva Granada Military University, Bogotá,
Cundinamarca (Colombia)
Abstract
The ever-increasing use of drones as weapons of war is not a concern confined to the news; it is a problem that involves all of humanity, and one that needs to be studied rigorously from the standpoint of biolaw. Among the nuances this issue entails, the use of artificial intelligence is perhaps the one that currently receives the most attention, owing to the idea of granting these devices a significant level of autonomy in target selection, together with the mobility they can achieve. This article aims to contribute to this debate by reviewing artificial intelligence from a bioethical approach, in relation to the persistence of the responsibility that a human being bears as a drone operator in a war context.
Keywords
Artificial Intelligence, Drones, War, Legal Responsibility, Bioethics.
How to cite this article
Oliveros-Aya, César (2023). Artificial intelligence in drones and robots for war purposes: a biolegal problem. Janus.net, e-journal of international relations, Vol. 14, No. 2, November 2023-April 2024. Consulted [online] on date of last visit, https://doi.org/10.26619/1647-7251.14.2.5
Article received on November 14, 2022 and accepted on April 5, 2023
ARTIFICIAL INTELLIGENCE IN DRONES AND ROBOTS FOR WAR PURPOSES: A BIOLEGAL PROBLEM¹
CÉSAR OLIVEROS-AYA
Introduction
Over the past few years, the insertion of drones into everyday life has become more and more frequent and, with it, so has concern about the limits and restrictions these devices should face, in light of the very real possibility of one day reaching an ostensibly uncontrollable level of automation.
Although the help that can be obtained from these devices is undeniable, it is no less true that, in war scenarios, their lethality has a significant potential that could exceed expectations, especially now that artificial intelligence and robotics have made such progress that we must ask ourselves whether a clear and precise legal framework, one in which the human factor is not removed from their operation, is necessary.
1. Artificial intelligence as a bioethical category
In the book Life 3.0, Max Tegmark explains that the development of life has gone through three stages, highlighting the conjunction of factors that have allowed life to design itself. These phases are distinguished as follows:
Table 1 - The three stages of life according to Max Tegmark

STAGE                              FACTORS
Life 1.0 (biological stage)        Evolution of its hardware and software
Life 2.0 (cultural stage)          Evolution of its hardware, with the ability to design a great part of its software
Life 3.0 (technological stage)     Full ability to design its hardware and software

Source: Max Tegmark (2017: 35)
¹ This article is a product of the INV-DER-3430 project of the "Public Law" research group, research line on "Law, Education and Society", at the Center for Legal, Political and Social Research of the Faculty of Law at the Nueva Granada Military University, funded by the Vice-Rectory for Research of the Nueva Granada Military University, 2021.
At the first level, life is incapable of redesigning itself. The author takes bacteria as an example, since their existence consists of mechanical, pre-set activities in which initiative is non-existent: programming and formal configuration are given by evolution, not by design.
The second phase holds that configuration is given by evolution, but programming involves some kind of design. The author, in this regard, understands software as the set of algorithms and knowledge used to process the information provided by the senses and to make decisions, from the ability to recognize faces to activities such as walking, reading, writing, singing or telling jokes (2017: 32).
This is the result of the learning process incorporated into the brain, which creates an interconnected relationship with the environment; thus the influence of social relations contributes to the design of programming. Life 2.0 can therefore design its own software, making it superior to Life 1.0, through learning from the moment one is born. Within this categorization, the second stage, the one involving the evolution of human beings on Earth, has allowed them to be much more intelligent than other beings, more flexible, and more highly adaptive (Tegmark, 2017: 33-34).
The third level, Life 3.0, unavoidably and significantly involves artificial intelligence, along with the effects of adjusting to new ways of conceiving the interaction between human beings and their environment in different contexts. Beyond the fears this technological advance provokes, it is a new step in which it will become possible to redesign the software and achieve an unusual form of transcendence.
Ian Morris, Professor of History at Stanford University, argues that human development is linked to four components: energy capture (calories per person obtained from the environment for food, home and business, industry, agriculture and transportation), organization (the size of the largest city), war-making capacity (number of troops, power and speed of weapons, logistic capabilities) and information technology (the sophistication of the tools available for sharing and processing information, and the scope of their use) (Brynjolfsson and McAfee, 2014: 21).
In The Measure of Civilization (2013), Morris carried out research to explain why the West ended up leading the exercise of power over the rest of the world. To do so, he grounds his arguments in the aforementioned components, highlighting a correlation between energy consumption and war-making capacity, a situation that continues to this day.
This argument is interesting, since in this century the scope of these approaches seems to manifest itself with special intensity, in the face of the constant risk of seeing ourselves, as humanity, doomed to wars that exceed all past expectations.
Paul Scharre, in the book Army of None, argues that the emergence of artificial intelligence will transform military confrontations in the same way that the industrial revolution, at the beginning of the 20th century, transformed the concept of war by creating weapons of far greater lethal capacity, such as tanks, planes and machine guns, inserting unprecedented levels of devastation. Just as
mechanization once produced machines that exceeded human potential, so today, although AI has provided artifacts that improve logistics and cyber defense and robots for medical evacuation, resupply or surveillance, it is automation that opens the possibility that one day it will not be human beings who choose targets and pull triggers (2018: 11-12).
In this sense, when speaking of artificial intelligence, the Spanish Ministry of Defense notes that it spans areas of great complexity whose importance is growing alongside new trades, occupations and professions, among which machine learning, intelligent robotics, natural language processing, intelligent perception and neuromorphic computing stand out, all of which pose an undeniable challenge for science and for the ethical and legal questions that may emanate from them (2018: 41).
In the book Our Final Invention: Artificial Intelligence and the End of the Human Era (2014), the writer James Barrat exposes the main fears that society harbors, in both the present and the future, surrounding the rapid technological advance of the so-called "intelligence explosion" and the creations that may arise from it. He notes that the power and sophistication of AI increase daily; there is some of it in every computer, smartphone and car, and in powerful programs such as Watson and others derived from organizations such as Cycorp, Google, Novamente, Numenta, Self-Aware Systems, Vicarious Systems and DARPA (the Defense Advanced Research Projects Agency); it is also present in "cognitive architectures", whose creators hope they will reach human-level intelligence, with some believing this will happen in little more than a decade (Barrat, 2014: 24).
In Artificial Intelligence: 101 Things You Must Know Today About Our Future (2018), Lasse Rouhiainen expresses the need to deepen the debates and educational proposals on this topic, in order to secure effective benefits from AI and to fully understand the transformations it will bring in all fields.
Faced with this, he raises three fundamental issues that urgently need to be addressed:
1. Re-education of millions of people who will be unemployed due to AI, robots
and automation. 2. The ethical and moral use of AI and robotics technologies,
so that they promote the general well-being of human beings, and not the
other way around. 3. Work on the prevention of possible technological
addictions, and other disorders generated by the excessive use of AI and
technology, such as anxiety, loneliness, etc. (2018: 11).
For this reason, he defines AI as "the ability of machines to use algorithms, learn from data and use what they have learned in decision-making just as a human being would" (Rouhiainen, 2018: 14), and it is clear that it can be applied to many of the tasks performed by individuals.
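By way of illustration only, Rouhiainen's definition can be rendered as a minimal sketch: a program that learns from labeled examples and then applies what it has learned to a new decision, just as the definition describes. The dataset and the nearest-neighbour rule below are assumptions introduced for the example; they are not taken from his book.

```python
# A minimal illustration of Rouhiainen's definition of AI: an algorithm
# that learns from data and applies what it learned to a new decision.
# The dataset and the 1-nearest-neighbour rule are illustrative assumptions.
from math import dist

# Labeled examples: (feature vector, decision a human previously made)
training_data = [
    ((0.9, 0.1), "approve"),
    ((0.8, 0.3), "approve"),
    ((0.2, 0.9), "reject"),
    ((0.1, 0.7), "reject"),
]

def decide(features):
    """Return the label of the closest known example (1-nearest neighbour)."""
    nearest = min(training_data, key=lambda item: dist(item[0], features))
    return nearest[1]

# The machine now "decides" a new case the way the nearest human decision went.
print(decide((0.85, 0.2)))  # -> "approve"
```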
Therefore, nowadays, the main research scenarios for the development of AI are the following:
recognition of static images, classification and labeling, improvements in the
performance of commercial algorithmic strategy, efficient and scalable
processing of patient data, predictive maintenance, content distribution on
social networks, protection against cybersecurity threats (Rouhiainen, 2018:
14-16).
From this point of view, it is unavoidable to consider AI applied to military purposes, conceived as "the sum of three elements. Information processing (logical), warfare and weaponry platforms (physical), and continuous threat and situational awareness (human)" (IEEE, 2017: 84).
Faced with this scenario, law needs to join the debate in order to accurately identify aspects such as liability for damages, the role of artifacts with a certain level of autonomy, the conceptualization of fault in the actions of a robot and the different causal relationships, among others, and also to detail the situations arising from the robotics industry: for example, how to manage the immunity of manufacturers, the predictability of behavior, the details of artifact design and the possible risks that may affect the consumer (Tirado, Oliveros and Laverde, 2021: 34).
2. The assessment of drones as weapons of war
Christopher Coker, professor at the London School of Economics and Political Science, has stated that drones, as artifacts integrated into the dynamics of contemporary wars, pose new problems and challenges for ethics, politics and law. In a conversation held at Chatham House, London, in 2013, he highlighted five points concerning the insertion of this technology into the panorama of war:
a) Drones are not necessarily robots yet, but once they acquire autonomy they will achieve that condition and, consequently, will compound the complexity of managing situations of war.
b) They are the result of the shrinking of the human space of war, which is becoming increasingly cerebral:
To be a warrior in the 21st century is to essentially be somebody behind a
screen, whether it's a cyber screen, a cyber warrior or what the Americans
call cubicle warriors: drone pilots, analytical warriors, people whose job is to
process data. People who have three particular attributes which are now
required of warfare in the 21st century, compared with, say, a hundred years
ago: mental agility, communication skills and multitasking. A particular
generation: most drone pilots in the United States are between the ages
of 19 and 21, precisely the generation that is very good at these particular
things. But a generation that has difficulty coping with stress, a generation
that does get traumatized by what they see on their screens, and a generation
that may not be able to cope with stress as much as the ideal age for coping
with stress on a battlefield, which is still around 23 (Coker, 2013: 3).
c) War increasingly resembles a video game. With this, drone operators tend to dissociate their sensibility and to be insufficiently aware of the effects of their behaviour and of the damage they may be causing. This is a problem of empathy.
d) Can drone pilots be considered real warriors? Their function, impersonal and distant, differs markedly from the idea of the soldier on the battlefield. Therein lie ethical and moral issues, arising not only from the individuality of the operator but also from the institutional context and from the perception of those who view military work from traditional perspectives.
e) War is shedding its operational paraphernalia: real heroic personalities tend to fade away, and today's conflicts are remote-controlled.
In the book Warrior Geeks: How 21st Century Technology is Changing the Way We Fight and Think About War (2013), Coker analyses technological dependence and the critical scope of these points. He argues that the new profile of the soldier who faces war is not far from that of hackers who, driven by the great advance of cyber technologies, will ultimately hold in their hands the fate of devastating attacks they cannot fully understand.
Likewise, in The Warrior Ethos: Military Culture and the War on Terror (2007), he highlighted the progressive instrumentalization of war and how both the behaviours and thoughts of soldiers are subject to ever closer follow-up and monitoring, putting their agency in crisis and producing a kind of disenchantment with the military profession (Larraín, 2018).
Thus, the leading role that drones are acquiring in the new tensions of war was already anticipated in popular-culture narratives. For example, John Updike, in the novel Toward the End of Time (1997), depicts a future marked by a war between the USA and China, fought by combatants who do not understand the real world and whose role is immersed in the abstraction of computer graphics. Similarly, Don DeLillo, in the short story Human Moments in World War III (1982), narrates the task of a drone pilot who attacks anything that threatens the planet: an individual who does not even need to put on a uniform and who carries out his task without knowing the magnitude of what he is doing (Coker, 2013: 6-7).
From another perspective, the film Eye in the Sky (Gavin Hood, 2015) offers a clear example of the dilemmas that military drone operators must face in the course of a mission ordered from the higher spheres.
The plot places the viewer in the perspective of those who decide, in political, legal and military terms, to launch a lethal attack with a remotely piloted drone on foreign territory. Added to the operation is the bioethical dilemma of determining the probability of a critical hit in the area, the estimation of damage by an expert using the ISTAR (intelligence, surveillance, target acquisition and reconnaissance) procedure, and the consequent steps until the effective shot against the adversary is calculated. The events take place in Kenya, where a team of American, British and Kenyan personnel contributes to the elimination of a
terrorist cell that includes two foreign subjects who, once detected, turn out to be preparing a large-scale attack (GIASP, 2016).
The matter is not so easy to handle from the legal point of view: note that the people involved are of different nationalities, so their own States are the ones competent to judge them. There is also an issue linked to violating the principle of non-intervention, since the mission shifts from a simple detection task leading to a capture to a task of elimination.
There is a whole chain of command, from the British government leadership (Jeremy Northam) through Lieutenant General Frank Benson (Alan Rickman) and Colonel Katherine Powell (Helen Mirren), down to drone operator Steve Watts (Aaron Paul). The situation is very clear until a little girl is spotted next to the target. At that moment an ethical dilemma emerges: neutralize the enemy at the cost of imminent collateral damage, or let the terrorists go and allow them to later attack at least 80 civilians in a place of mass gathering.
The dilemma lies in acting and assuming responsibility for the consequences. Make the right military decision, or win a media war? The film deals with a theme of burning topicality, leading viewers to wonder how many times similar events have happened. If the situation is this difficult when the power to act depends entirely on human will, how willing would institutions be to leave such a decision to an artifact with programmed autonomy?
Human Rights Watch, in its 2014 report on lethal autonomous weapons systems (LAWS), indicated that the use of these weapons outside the war scene has not been addressed as it should be; their potential use in local situations affecting public order, such as the fight against crime, riots and public demonstrations, therefore still carries a significant risk for the civilian population, since, legitimate or illegitimate purposes aside, the violation of the right to life and physical integrity, the condition of the victims, and so on could offend human dignity (Del Valle, 2016: 232-233).
Thus, the implications of the use of drones become a prospective dilemma, given the accumulation of doubts produced by hypotheses about their operation in war scenarios. The same report emphasizes that the act of killing is legal only when three conditions are met: that it is essential to protect the lives of individuals, that the absence of other means or resources is evident, and that there is proportionality between the force used and the threat to be averted. These variables are therefore tied to particular situations, as well as to the corresponding and necessary qualitative evaluation of each case. Here the concern emerges that an attack system not programmed to deal with each situation could, as a consequence, carry out arbitrary killings arising from unforeseen circumstances (Del Valle, 2016: 233).
For on the battlefield situations are not always very clear; sometimes drone attacks occur in places where there is no military presence, and the estimated number of victims depends on subjective assessments provided by the press or by local leaders with a certain tendency to hyperbolize or understate the circumstances. Hence the veracity of events rests on unreliable data, as was questioned, for example, after the July 2016 publication of the United States' records on civilian victims of drone use in the Middle East (Rushby, 2017: 25).
Therefore, given the plurality of asymmetric confrontations, each context must be evaluated against variables such as the origin, number and legal nature of the parties in dispute, and the duration and intensity of hostilities, in order to analyze the relevance, legality and morality of using unmanned aerial vehicles that combine lethality and precision. It is consequently necessary to specify: the need to resort to drones, the proportionality of the use of weapons, the discrimination between combatants and non-combatants, and which government unit, civilian or military, will make the decision, so as to identify legal responsibility and establish the soundness of the process by which the respective decision is made (Haluani, 2014).
The foregoing still involves great difficulties; identifying possible terrorists, for example, is no easy thing. Personal and behavioral patterns, such as nationality, ethnicity, place of residence, family patterns, attitudes and places of travel, allow a particular profile to be built, but not a definitive degree of certainty, which leads only to "plausible suspicions" (Zenko, 2012). In addition, it is very difficult to specify the number of victims, distinguishing between defined targets and collateral damage, which generates indignation and resentment among the affected population (Haluani, 2014).
3. The persistence of the human factor as a bioethical category for the
use of drones in war
Paul Scharre notes that this subject has led many to think that granting autonomy to an armed robot means giving free rein to dystopian nightmares: weapons that search for a target, detect it and attack it are not something to toy with, and removing human intervention from this process risks too much. Yet he does not underestimate the value of applying this technology to avoid civilian casualties in war, for example through facial recognition and the detection of non-combatants, although the fact that machines cannot interpret context remains a major obstacle (2018: 12).
Many academics converge on this view: technology permits the development of automated weapons, and the problem grows because no one knows whether the armed forces will cross that line (Sánchez, 2018). Scharre, drawing on his own experience, highlights research on swarm warfare as one example. Unlike Predator drones, which are controlled individually by humans, swarm drones are controlled en masse (2018: 16) and, depending on their programming, it is already possible in limited settings for them to detect peer adversaries and eliminate them without a command order.
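To make the contrast concrete, the following toy sketch shows what "en masse" control means in programming terms: a single broadcast command steers every unit, while each unit coordinates locally with its neighbours. All names and parameters are invented for illustration; no real swarm system is depicted, and nothing weapon-related is modelled.

```python
# Toy sketch of "en masse" control: one broadcast command steers a whole
# swarm, while each unit self-coordinates locally. All parameters are
# invented for illustration; no real swarm-control system is depicted.
import random

class Drone:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)

    def step(self, goal, neighbors, speed=1.0, min_gap=5.0):
        gx, gy = goal
        # Move toward the single broadcast goal...
        dx, dy = gx - self.x, gy - self.y
        norm = max((dx**2 + dy**2) ** 0.5, 1e-9)
        self.x += speed * dx / norm
        self.y += speed * dy / norm
        # ...while keeping separation from any neighbour that is too close.
        for other in neighbors:
            ox, oy = self.x - other.x, self.y - other.y
            d = max((ox**2 + oy**2) ** 0.5, 1e-9)
            if d < min_gap:
                self.x += ox / d
                self.y += oy / d

swarm = [Drone() for _ in range(20)]
goal = (50.0, 50.0)            # ONE operator command for ALL units,
for _ in range(100):           # versus piloting each Predator individually
    for d in swarm:
        d.step(goal, [o for o in swarm if o is not d])
```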
States are interested in automating their systems; at least thirty countries use supervised autonomous armed systems, which have been adapted to ships, defense bases and the like. For example, Lockheed Martin's Aegis anti-missile system has an intelligent brain that interfaces with a ship's radars to attack targets. Sánchez claims that more than ninety countries use drones to patrol the skies, and that at least sixteen of them operate armed drones, including Egypt, Turkey, Saudi Arabia, the United Arab Emirates, Israel, the United Kingdom and China (Sánchez, 2018).
So, it is inevitable to ask ourselves what could happen if the human controller is dispensed with and the weapon is left to act on its own. There is already a drone with this tendency, manufactured by Israel Aerospace Industries (Sánchez, 2018): the Harop, an unmanned aircraft in which the platform itself acts as the ammunition, although it carries only limited quantities (less than 10 kg) of explosives in its nose. It acts as a kind of suicide plane, rocket or cruise missile, but differs in its ability to loiter over a target, which it then attacks by self-destructing (Kreps, 2016: 9). The device can stay in the air for two and a half hours, can detect radar systems within a radius of 500 kilometers and can even select the target to destroy (Sánchez, 2018).
On the other hand, the Defense Advanced Research Projects Agency (DARPA), based in Arlington, Virginia (United States), runs two relevant programs: FLA (Fast Lightweight Autonomy), which develops algorithms designed to give drones autonomous flight through rooms and corridors without any communication with an operator, and CODE (Collaborative Operations in Denied Environment), a system designed to enable collaboration between unmanned aircraft under the supervision and control of a single person. So far, DARPA's projects do not contemplate developing weapons with full autonomy, or weapons that can be reprogrammed to make decisions on their own; the agency favors keeping them permanently directed by human beings (Sánchez, 2018).
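The degrees of supervision this paragraph distinguishes are often described in the LAWS debate as keeping the human "in the loop", "on the loop", or "out of the loop". The schematic sketch below, written under those assumed labels, shows why the difference matters: what changes is the default answer in the absence of any human signal. It is illustrative only and does not reproduce any real system's control logic.

```python
# Schematic sketch of the three human-control modes discussed in the LAWS
# debate. Purely illustrative: no real weapon system's logic is shown.
from enum import Enum

class Mode(Enum):
    IN_THE_LOOP = 1    # a human must approve every engagement
    ON_THE_LOOP = 2    # the system acts unless a human vetoes in time
    OUT_OF_LOOP = 3    # full autonomy: no human check at all

def may_engage(mode: Mode, human_approved: bool, human_vetoed: bool) -> bool:
    """Return whether an engagement may proceed under the given mode."""
    if mode is Mode.IN_THE_LOOP:
        return human_approved            # silence means NO
    if mode is Mode.ON_THE_LOOP:
        return not human_vetoed          # silence means YES
    return True                          # Mode.OUT_OF_LOOP: nobody is asked

# The article's concern in one line: as the mode moves from 1 to 3,
# the default outcome stops depending on any human at all.
assert may_engage(Mode.IN_THE_LOOP, human_approved=False, human_vetoed=False) is False
assert may_engage(Mode.ON_THE_LOOP, human_approved=False, human_vetoed=False) is True
assert may_engage(Mode.OUT_OF_LOOP, human_approved=False, human_vetoed=False) is True
```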
Now there is already talk of a "mosaic war", DARPA's response to China's military build-up: "Like Lego blocks that almost universally fit together, Mosaic forces can be integrated in ways that create packages [or structures] that can effectively target an adversary's system with enough overlap to be successful," says a Mitchell Institute study published in September (The Epoch Times, 2020).
From his point of view, James Barrat explores the possibility that mankind could lose control of the future, as machines come to determine outcomes by developing unexpected behaviors while intelligence, that unpredictable and powerful force of the universe, rises to levels we cannot match, putting our very survival at stake (2014: 19). Robots, after all, are machines with the ability to perceive their environment and recognize changes in it, process this information and make decisions in response, and act without constant human direction (Grossman, 2018: 4).
Therefore, it is undeniable that the center of the debate lies in lethal autonomy, that is, in the possibility that machines may at some point be allowed to surpass the volitional sphere without human monitoring or control. Prominent figures such as Stephen Hawking, Elon Musk and Steve Wozniak have expressed their disagreement on this issue, warning that it could trigger a global AI arms race (Scharre, 2018: 13).
Following Max Tegmark again, it is necessary to create artificial intelligence that is always beneficial, aimed at keeping the human factor as the basis of activity in an attempt to improve, not worsen, the situation of individuals, because robots have a great disadvantage: by removing the human being from the vehicle, they lose the most advanced cognitive processor on the planet, the human brain (2017: 33).
Human agency should not be removed from the equation, for doing so could amount to an attempt to evade responsibility; on the contrary, advances must be managed, by both legal and technological means, to avoid real harm. New inventions must be civilized and tamed in their details, but only with deep commitment, giving precedence to first-hand experience and constant vigilance (Kelly, 2016: 5), since these are objects that need not, and should not, assume by themselves the wholesale suppression or substitution of any given subject.
The French philosopher Grégoire Chamayou, in his book A Theory of the Drone (2016), identifies the problems that distancing the human factor, by granting drones high levels of autonomy, can bring about. He points out that the most representative advantage of LAWS is that they demonstrate power while reducing vulnerability. Removing the human body from the task of piloting and keeping it out of reach fulfills the ancient desire, present since the creation of ballistic weapons, to extend a weapon's path and finish off the enemy from a considerable distance:
"However, the specificity of the drone allows it to act in another segment of
distance. Thousands of kilometers now stand between the trigger, on which
the finger is placed, and the barrel, from which the bullet is going to come
out. To the range distance -between the weapon and its target- is added that
of the telecommand -between the teleoperator and his weapon" (2016: 18).
In this sense, the strategic purpose of reducing one's own losses is also fulfilled, since the capacity for destruction becomes unidirectional: whoever uses this type of weapon no longer runs the risk of dying while killing; unilaterality prevails, and war is sustained no longer by combat but by massacre. The drone can thus be understood as an "unidentified violent object" that forces us to rethink basic notions of a geographical and ontological nature, such as area or place; notions of an ethical order, such as virtue and courage; and, likewise, the concepts of war and conflict in their strategic, legal and political senses (Chamayou, 2016: 19).
For this reason, evading the principles of responsibility and reciprocity in the context of a conflict calls into question the military function of States, centered on values such as honor, and makes the drone a coward's weapon. That has not prevented this transmutation of values from being defended, but it is one of the most studied aspects of military ethics (Chamayou, 2016: 23), which is not an ideal ethics insofar as it posits rights and duties in a context where one of the basic parties is missing (Rivera, 2017); that is, someone attacks, intimidates or threatens and thereby alters the proportions of the balance of coexistence.
Professors Kristin Bergtora Sandvik and Bruno Oliveira Martins, in the article "Revisitando el espacio aéreo latinoamericano: una exploración de los drones como sujetos de regulación", indicate that this task must be carried out on the basis of the identification of public uses, interests and concerns, the analysis of regulatory approaches, the way in which specific tasks are assigned, and the study of airspace to establish its possibilities for parameterization. In this order of ideas, knowledge of the context is indispensable for local use (2018: 77).
But the discussion advances mostly in the field of legal theory. A war without risks, waged through drones, is at the same time a risk for law itself, since it introduces a kind of legalization of selective assassination that subverts IHL, to the point of turning, to paraphrase Archbishop Silvano M. Tomasi², technological instruments into scapegoats for the sins of those who handle them, because they are neither good nor bad: the way they are used is what determines their value (Chamayou, 2016: 29-30), especially when their critical aspect shows in the impossibility of responding to moral dilemmas about life and death and of attending to everything that concerns the concept of humanity (Rossini and Gerbino, 2016: 28).
In this order of ideas, the Argentine professor Adriana Margarita Porcelli (2021) argues that the debate is open and that the law must anticipate events rather than lament their consequences, because it is human dignity that is at stake. Any experiment in this regard that does not guarantee it must be preventively prohibited, as happened with the creation of blinding laser weapons.
From this categorical affirmation several tasks follow: the participatory engagement of States, industry, AI programmers, international organizations, and academic and scientific institutions to formalize an ethical instrument that guides the use of AI and robotics according to bioethical principles, highlighting restrictions and prohibitions and giving prominence to meaningful human control (Porcelli, 2021).
It is urgent to emphasize the responsibility of high government and military officials and of representatives at the political and social level, so as not to fall into a trivialization of violence that ignores IHRL and turns people into expendable pieces in the game of war (Oliveros, 2021: 28).
Conclusions
Artificial intelligence has consolidated perspectives and achievements that have gradually
given rise to a new industrial revolution, contributing to the modification of the
environment and its different power factors.
The combination of these achievements has allowed robotics to reach levels of understanding and interpretation of the current world that were once mere ideas from science fiction.
Among these achievements, the implementation of drones stands out for fulfilling activities and tasks that once demanded great human effort and long delays. The difficulty, however, lies in their use for war purposes.
In this respect, bioethics scrutinizes in detail the issues that call into question the responsibility of those who operate drones in war situations. The film Eye in the Sky (2015) is a good example of the dilemmas involved in the use of this technology.
² Permanent Representative of the Holy See to the United Nations and other international organizations in Geneva, who presented these ideas at the annual meeting of the States Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects.
The international academic community agrees that facing up to this problem is an urgent task, on pain of undoing what has been achieved in areas such as IHRL and IHL.
To prevent the threats warned of in dystopian narratives from materializing, the human
factor should never be replaced by programming derived from the indiscriminate use of
AI and robotics.
Although concern about the autonomy of these creations remains, the principle of international morality must inspire nations and States to structure a normative manual with clear bioethical guidelines, seeking to avoid the outrages of war and the advent of disasters equal to or worse than those caused by the world wars.
References
Barrat, James (2014). Nuestra invención final. La inteligencia artificial y el fin de la era
humana. Grupo Planeta: España.
Sandvik, Kristin Bergtora and Oliveira Martins, Bruno (2018). Revisitando el espacio aéreo latinoamericano: una exploración de los drones como sujetos de regulación. Latin American Law Review, No. 01, pp. 61-81. ISSN: 2619-4880 (online). https://doi.org/10.29263/lar01.2018.03
Brynjolfsson, Erik and McAfee, Andrew (2014). The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. W. W. Norton & Company.
Chamayou, Grégoire (2016). Teoría del dron: Nuevos paradigmas de los conflictos del
siglo XXI. NED Ediciones: Barcelona.
Coker, C., Roscini, M. and Haynes, D. (2013). Drones: The Future of War? Chatham House, London. Retrieved from www.files.ethz.ch/isn/162999/080413Drones.pdf
Del Valle, María Julieta (2016). Sistemas de Armas Letales Autónomas: ¿Un riesgo que vale la pena tomar? Lecciones y Ensayos, No. 97, pp. 225-247. Available at http://www.derecho.uba.ar/publicaciones/lye/revistas/97/sistemas-de-armas-letales-autonomas.pdf
GIASP INTEL (2016). Eye in the Sky y el peso de la ética en una decisión de guerra. Available at https://intelgiasp.com/2016/11/18/eye-in-the-sky-y-el-peso-de-la-etica-en-una-decision-de-guerra-eye-in-the-sky-and-the-weight-of-the-ethics-in-a-war-decision/
Gobierno de España, Ministerio de Defensa (2012). ISTAR. Retrieved from https://www.tecnologiaeinnovacion.defensa.gob.es/es-es/Estrategia/HojasDeRuta/Paginas/ISTAR.aspx
Grossman, Nicholas (2018). Drones and Terrorism: Asymmetric Warfare and the Threat to Global Security. Bloomsbury Publishing.
Haluani, Makram (2014). La tecnología aviónica militar en los conflictos asimétricos: problemáticas implicaciones del uso de los drones letales. Cuadernos del CENDES, vol. 31, No. 85, January-April 2014, pp. 23-67. Universidad Central de Venezuela, Caracas, Venezuela.
Kelly, Kevin (2016). The Inevitable: Understanding the 12 Technological Forces That Will
Shape Our Future. Penguin.
Kreps, Sarah E. (2016). Drones: What Everyone Needs to Know. Oxford University Press.
La Gran Época (2020). Guerra mosaico: la nueva estrategia militar de DARPA, 29 February 2020. Retrieved from https://es.theepochtimes.com/guerra-mosaico-la-nueva-estrategia-militar-de-darpa_619857.html
Larraín, F. (2018). El pensamiento de Christopher Coker. Pontificia Universidad Católica de Valparaíso. Retrieved from https://www.pucv.cl/uuaa/asia-pacifico/noticias/el-pensamiento-de-christopher-coker
Morris, Ian (2013). The Measure of Civilization: How Social Development Decides the Fate
of Nations. Princeton University Press, Oxfordshire.
Oliveros-Aya, César (2021). Drones de guerra: preocupaciones jurídicas y bioéticas. Janus.net, e-journal of international relations, Vol. 12, No. 2, November-April 2021. Consulted at https://doi.org/10.26619/1647-7251.12.2.2
Porcelli, A. (2021). La inteligencia artificial aplicada a la robótica en los conflictos armados. Debates sobre los sistemas de armas letales autónomas y la (in)suficiencia de los estándares del derecho internacional humanitario. Revista Estudios Socio-Jurídicos, vol. 23, No. 1, Universidad del Rosario, Colombia. Retrieved from https://revistas.urosario.edu.co/xml/733/73365628017/html/index.html
Rivera López, Eduardo (2017). Los drones, la moralidad profunda y las convenciones de la guerra. Isonomía, No. 46, April 2017, México. Retrieved from http://www.scielo.org.mx
Rossini, Sandro and Gerbino, Lucía (2016). Per una lettura ermeneutica del drone. Convergenze e conflitti negli scenari internazionali. Youcanprint: Roma.
Rouhiainen, Lasse (2018). Inteligencia artificial: 101 cosas que debes saber hoy sobre
nuestro futuro. Alienta Editorial.
Rushby, Rachel Simon (2017). Drones armados y el uso de fuerza letal: nuevas
tecnologías y retos conocidos. Rev. CES Derecho., 8(1), 22-47.
Sánchez, Cristina (2018). Robots y drones asesinos: así serán las armas de las guerras del futuro. El Confidencial, 5 August 2018. Retrieved from https://www.elconfidencial.com/tecnologia/2018-08-05/robots-drones-asesinos-armas-futuro_1601198/
Scharre, Paul (2018). Army of None. Autonomous weapons and the future of war. W. W.
Norton & Company.
Tegmark, Max (2017). Vida 3.0. Qué significa ser humano en la era de la inteligencia
artificial. Taurus.
Tirado, M., Oliveros, C. and Laverde, C. (2021). Robótica y sexualidad. ILAE: Bogotá, Colombia.