International Humanitarian Law and Artificial Intelligence: Legal Scope of Autonomous Weapon Systems Used in the War in Gaza

Bahria University DSpace Repository

dc.contributor.author Samiha Yousaf, 01-155212-036
dc.date.accessioned 2025-08-11T06:09:57Z
dc.date.available 2025-08-11T06:09:57Z
dc.date.issued 2025
dc.identifier.uri http://hdl.handle.net/123456789/19834
dc.description Supervised by Mr. Khaqan Ahmed en_US
dc.description.abstract This study critically examines the legal challenges posed by the deployment of Artificial Intelligence (AI)-powered Autonomous Weapon Systems (AWS) under International Humanitarian Law (IHL), with a focus on the 2023-24 war in Gaza. It addresses the "context-dependency" gap in IHL, where compliance often fails in real-world asymmetric conflicts. Using a qualitative methodology grounded in secondary data sources, the research analyses how legal norms and realities are socially constructed, continuously contested, and redefined. The Israeli military's use of AI systems such as Lavender and Gospel serves as a case study revealing these tensions. Despite Israeli claims of multi-layered human oversight, investigative reports and findings by international organisations show a growing reliance on algorithmic outputs and an expansive categorisation of what constitutes a military target, under which civilians, including civil society activists and NGO workers, have been labelled as terrorists. These biases, rooted in Israel's broad and politicised definitions of terrorism, are absorbed by machine learning models, producing automated suspicion scores that systematically endanger civilian populations. Such practices challenge the principle of meaningful human control and undermine the core IHL obligations of distinction, proportionality, and precaution. Israel's invocation of the "human shield" argument cannot legally justify the structural errors and disproportionate harm caused by these AI-driven systems. Through a constructivist lens, the study argues that future regulation of AWS depends on the strategic alignment of powerful states' interests; absent a direct threat to their security, the normalisation of AWS use is likely to prevail.
The findings suggest that the governance of AI in warfare is shaped not by humanitarian ideals but by political calculations and evolving power structures. en_US
dc.language.iso en en_US
dc.publisher Humanities and Social Sciences en_US
dc.relation.ispartofseries BSS;P-11938
dc.subject International Humanitarian Law en_US
dc.subject Artificial Intelligence en_US
dc.subject Autonomous Weapon Systems en_US
dc.title International Humanitarian Law and Artificial Intelligence: Legal Scope of Autonomous Weapon Systems Used in the War in Gaza en_US
dc.type Project Reports en_US
