TY - JOUR
T1 - Corporate data enablers
T2 - a missing piece in the regulatory response to the military use of artificial intelligence
AU - Stamp, Helen
N1 - Publisher Copyright:
© 2023, Edward Elgar Publishing Ltd. All rights reserved.
PY - 2024/12/20
Y1 - 2024/12/20
N2 - The military use of artificial intelligence is increasing in areas of strategic decision-making and targeting systems. These systems require large amounts of data to operate, including population surveillance and profiling information, social media inferences, and drone footage. Private technology companies – ‘data enablers’ – are now offering militaries the opportunity to supplement their databases. These companies offer their clients access to a range of services and products, including aggregated datasets, data analytics and related software systems. This powerful combination provides militaries with datasets they would not ordinarily have access to and the ability to build on these datasets using software systems which collate and analyse data in situations of armed conflict. The quality of these datasets and software systems can have profound effects on how military AI functions, yet there is minimal accountability for technology companies under international law frameworks when these AI systems cause harm. This paper argues that a starting point for addressing this accountability deficit could take the form of proactive regulation by states of data enablers, including incorporating data governance processes in national weapon reviews together with sui generis agreements between states and technology companies to ensure data quality and accuracy.
AB - The military use of artificial intelligence is increasing in areas of strategic decision-making and targeting systems. These systems require large amounts of data to operate, including population surveillance and profiling information, social media inferences, and drone footage. Private technology companies – ‘data enablers’ – are now offering militaries the opportunity to supplement their databases. These companies offer their clients access to a range of services and products, including aggregated datasets, data analytics and related software systems. This powerful combination provides militaries with datasets they would not ordinarily have access to and the ability to build on these datasets using software systems which collate and analyse data in situations of armed conflict. The quality of these datasets and software systems can have profound effects on how military AI functions, yet there is minimal accountability for technology companies under international law frameworks when these AI systems cause harm. This paper argues that a starting point for addressing this accountability deficit could take the form of proactive regulation by states of data enablers, including incorporating data governance processes in national weapon reviews together with sui generis agreements between states and technology companies to ensure data quality and accuracy.
KW - artificial intelligence
KW - data
KW - international criminal law
KW - international human rights law
KW - international humanitarian law
KW - military
UR - https://www.scopus.com/pages/publications/105000889070
U2 - 10.4337/mllwr.2024.02.03
DO - 10.4337/mllwr.2024.02.03
M3 - Article
AN - SCOPUS:105000889070
SN - 1370-6209
VL - 62
SP - 259
EP - 293
JO - Military Law and the Law of War Review
JF - Military Law and the Law of War Review
IS - 2
ER -