Disarmament and Non-proliferation Verification
What We’ve Got, What’s Needed, and Can Open Source Research Help?
If you were designing a disarmament and non-proliferation verification system from scratch, what would it look like? What would you want it to do, and how could you make this happen?
Existing verification systems – established within international treaties – work well in demonstrating states’ compliance with international treaty obligations and in contributing to robust systems of global governance. Ideally, however, monitoring would also extend beyond negotiated treaty frameworks. Current open source investigative research suggests that this sector could provide complementary monitoring functions, supporting worldwide arms control, disarmament and non-proliferation efforts.
Existing verification systems
Traditionally, verification systems have been designed within international agreements which regulate, or outlaw entirely, particular weapons. In this mode, ‘verification’ denotes mechanisms which build confidence that disarmament and non-proliferation commitments are being met and provide reassurance that significant levels of cheating could and would be detected.
The international community has devised ingenious solutions to meet the verification needs of different international arms control, non-proliferation and disarmament treaties. While the details vary between specific treaties – reflecting the needs, understandings and political acceptability of different contexts – the core verification components typically involve collecting relevant declarations from states, together with systems for confirming that those declarations are accurate, including, in some instances, carefully managed inspections of key sites. This standard pattern can be seen in many international monitoring efforts, such as those associated with the nuclear Non-Proliferation Treaty (NPT) and the Chemical Weapons Convention (CWC).
However, while effective and useful, these treaty-based approaches are limited, confined to their negotiated, legally-binding regimes. This determines what they can monitor and the processes they can use, including how they store and share data, and whether and how they can conduct inspections. These restrictions are important and unavoidable: international treaties are delicate balancing acts between different priorities, and without clearly specified procedures, countries could not sign up to, or implement, them.
But this rigidity has implications. Traditional verification systems have little flexibility in what they monitor and how they monitor it, which limits their ability to trace overall patterns or to spot illicit activities that use unexpected techniques. Although verification needs and preferences may change, history has shown that once a treaty regime has been accepted and entered into force, it is extremely difficult to renegotiate its verification system, even when there is a widely recognised desire to do so.
The verified disarmament of Iraq in the 1990s is instructive: the verification previously in place had been unable to detect illegal proliferation, because the Iraqi regime had used innovative approaches within its weapons programmes. Following the 1991 ceasefire agreement, UNSCOM – the international body established to oversee the verified destruction of Iraq’s nuclear, biological and chemical weapons, and of its missiles with ranges over 150 km – found that it benefitted from being able to access and compare information from across the different weapons systems. This sort of cross-sector approach is much harder for other negotiated verification systems.
Open source investigations
While multilateral treaties remain vital within the global regulation of weapons, and their verification systems are indispensable to encouraging widespread treaty implementation, their verification provisions remain limited and hard to change. Can outside groups, equipped with new tracking technologies, support and complement existing systems?
In recent years, the capacity of external groups to monitor activities and artefacts has been transformed by new technologies. Central to this is the internet, which provides access to data both from other new technologies, including commercial satellites and social media, and from traditional media outlets, including the world’s print and broadcast journalism. It also enables networks of researchers to coordinate and cross-check details with each other. This has enabled substantial growth in non-governmental monitoring systems. Many groups now work on tracking and understanding overlapping issues, such as monitoring human rights abuses, sanctions violations, shipping routes, or where European-manufactured weapons end up.
This work has the potential to complement traditional negotiated verification regimes. It can provide additional eyes on the typical verification task of checking compliance with existing treaties. But it can also go further. Because non-governmental practitioners are not bound by negotiated systems, they can be agile in developing and applying innovative techniques, and they can work across different issue areas to reveal important trends; for example, tracing wildlife trafficking can reveal patterns of small arms proliferation. Non-governmental monitoring also provides a degree of redundancy in information collection, making it more likely that unexpected illicit behaviours will be spotted and providing more opportunities to confirm or refute initial findings.
While no system can ever provide universal transparency of all undesirable artefacts and activities, the multivalency and flexibility inherent to the open source domain mean that, in theory, it could be harnessed to develop global systems for tracking security risks, including weapons, diseases and greenhouse gas emissions. Given that open source investigative work is already widespread, and set to become more so, an obvious next step is to coordinate or scale up existing initiatives, and to explore whether open source research can become more than the sum of its parts, for example by building networks of verification practitioners.
However, there are challenges to realising this potential. There are numerous ethical issues to consider, including questions about who gets to choose what these systems look for, who is watching, and who is being monitored – all of which intersect with concerns about surveillance and privacy. There are also questions about what happens to findings from open source research: how can they be authenticated, and used by decision makers to reach effective policy solutions? Efforts to maximise the political traction of open source work need to address these issues, while maintaining the sector’s capacity to be innovative and flexible.
The current growth in open source research could represent a way of meeting a global governance need. In the quest for treaty-based solutions to security challenges, verification has come to be seen as a deal-breaker, with countries’ willingness to participate in international disarmament or non-proliferation often pivoting on perceptions of how verifiable agreements are – i.e. the extent to which compliance can be monitored and cheating can be detected. Traditional approaches have relied on international organisations, or countries’ own National Technical Means, to deliver verification goals. Non-governmental open source monitoring could complement this activity: while traditional approaches provide regular, routine benchmarks, open source research can respond quickly and flexibly to specific developments.

Henrietta Wilson, Dan Plesch and Olamide Samuel
Image of city lights courtesy of Wikimedia Commons