The revolutions in the Arab world showcased the catalytic role that information technology can play in political movements. As each regime toppled, leaders in other countries increasingly focused on how to prevent similar uprisings within their borders, ramping up the demand for censorship and surveillance technology as one means to maintain control. At the same time, when the archives of the fallen regimes opened, they revealed details about the sources and suppliers of these technologies. It quickly came to light that a number of companies—mostly from the West—had been selling and supplying such technologies to regimes with poor human rights track records. Several governments and civil society organizations have criticized this practice, arguing that it clashes with the foreign policies of many countries actively promoting the protection of human rights. In response, they propose to update existing export regulations to capture this new trade area. But in practice, the dual-use character of the technologies poses a significant challenge in moving from vision to implementation. These efforts can therefore only be successful if they are coordinated multilaterally and informed by technical analysis to distinguish between legitimate and nefarious purposes. Here the devil really is in the detail.
Over the last few years, several news reports as well as the recent Citizen Lab report, Planet Blue Coat, have exposed exports of surveillance technology to controversial destinations—not only to countries subject to U.S. sanctions (Iran, Syria, Sudan, North Korea, and Cuba) but also to Egypt, Bahrain, Kuwait, Saudi Arabia, and other countries with poor human rights records. In Syria, researchers discovered products sold by Blue Coat Systems, namely the Blue Coat ProxySG and PacketShaper, used in the regime’s network filtering and monitoring apparatus. And the use of Gamma International’s FinFisher surveillance software in Bahrain and its potential role in human rights abuses has been the focus of efforts by Privacy International. A common theme in these reports is the recognition that this market remains largely unregulated. As Tom Simonite from the MIT Technology Review recently pointed out with regard to the trade in vulnerabilities, “no law directly regulates the sale of zero-days in the United States or elsewhere.”
Self-regulation and corporate social responsibility are one obvious path to address this problem. Yet, signs from industry have been discouraging thus far, and attempts to mainstream human rights principles into business procedures remain nascent. Jerry Lucas, president of the company that organizes the Intelligence Support Systems conferences that have become known for showcasing surveillance and censorship technology, disclaims responsibility. “That’s just not my job to determine who’s a bad country and who’s a good country,” he has said. “That’s not our business, we’re not politicians… we’re a for-profit company. Our business is bringing governments together who want to buy this technology.”
Yet several governments have taken action. The Obama administration issued an unusual Executive Order in April 2012 to address the provision of surveillance technologies to Iran and Syria. The European Union established a similar ban on exports to Syria. The British government, on the other hand, decided not to rely on a human rights argument but rather on its cryptography controls when responding to a public outcry over the behavior of Gamma International, a company which prides itself on offering “world-class offensive techniques for information gathering”.
A number of existing controls cover some types of surveillance and censorship technologies. In the U.S., for example, “surreptitious listening” controls already apply to certain devices, software, and technology that enable communications interception. There is also a precedent when it comes to arguing for such controls based on a human rights rationale. With regard to crime controls, “Congress has recognized the usefulness and symbolic value of these controls in supporting U.S. Government policy on human rights issues, foreign availability notwithstanding.” But these controls are not sufficient to capture the full range of technology. That is why the export control regime needs to be updated and expanded for the digital age.
Ideally, the sale of surveillance and censorship technologies to countries that abuse or ignore human rights would be eliminated altogether. But even in the 21st century, this aspirational goal requires a reality check. Realistically, such efforts can achieve not a complete elimination but a reduction of these exports and of the human rights abuses committed with surveillance and censorship technologies. Moreover, it is likely that other states will step in to fill the demand once a few countries restrict their exports. The practical focus must therefore be on maximizing the effectiveness of such export controls to achieve the widest reach possible.
Export restrictions by Western countries, particularly in Europe and the U.S., are a promising way to increase the effectiveness of the control regime. These countries are leaders in technological innovation, and even if the market shifts to other countries, the result will be a reallocation of resources among the world’s leading technology industries, nudging companies to invest in other businesses. But this responsibility is not limited to the U.S. and Europe alone. Human rights are universal, and the International Bill of Human Rights has become customary international law. It is therefore the task of all governments to ensure that technology is not used for human rights violations.
Countries that pride themselves on promoting and protecting human rights in their foreign policies have a moral obligation to put their money where their mouths are and to update existing export control regimes to resolve any hypocrisy. Companies also have obvious reputational risks to manage. For example, the Electronic Frontier Foundation (EFF) has supported a case against Cisco’s business in China, and Blue Coat Systems and Gamma International have faced significant public pressure. Clarifying companies’ responsibilities will help them avoid reputational harm in the future.
Cryptography controls—like the ones the UK used in response to Gamma International—are a useful reminder of past efforts to control software and the 1990s struggle now known as the “Crypto Wars.” Back then, governments tried to restrict the export of cryptography as much as possible—in the United States under the U.S. Munitions List—because the technology had long been the purview of military and intelligence agencies alone. This turned out to be impracticable as cryptography became increasingly ubiquitous. Recent changes to the multilateral Wassenaar regime further relaxing its cryptography provisions highlight this trend. Today, the goal is not to put the genie back in the governments’ bottle and create a blanket ban. Instead, it is to make a narrow and specific group of technologies subject to a licensing regime that reviews exports before they occur, based on their destination and likely end-use, in order to address human rights concerns rather than national security.
Implementing such a reform of the existing control regime must center on two key principles. First, it must be informed by the expertise of technologists in order to develop well-crafted language and to minimize the unintended consequences that are particularly common in the arena of dual-use technology. Second, the reform must be part of a coordinated, multilateral effort to avoid a collective action problem, to mitigate the market effects on participating countries’ industries, and to maximize the controls’ effectiveness. EFF has done groundbreaking work on a Know-Your-Customer approach to controlling surveillance technologies, and the work by the Citizen Lab and Privacy International has created additional momentum to build on.
The 0day market is a particularly salient example of the challenges at hand. According to Simonite, the vulnerabilities unveiled at some of the traditional conferences “haven’t been quite so dramatic in recent years. One reason is that a freshly discovered weakness in a popular piece of software, known in the trade as a ‘zero-day’ vulnerability because the software makers have had no time to develop a fix, can be cashed in for much more than a reputation boost and some free drinks at the bar. Information about such flaws can command prices in the hundreds of thousands of dollars from defense contractors, security agencies and governments.” This matches the assessment by the Stockholm International Peace Research Institute (SIPRI), which concluded in its 2013 report that “the expansion of arms producing companies into the cybersecurity market—a clear trend in the first tier of the SIPRI Top 100—is due [to] the growing political and budgetary importance of cybersecurity as a national security issue.”
Relying on self-regulation also seems ineffective in this particular market. Vulnerability seller Donato Ferrante recently told NPR: “I don’t see bad guys or good guys. It’s just business.” And Christopher Soghoian from the ACLU has pointed out that “there are plenty of researchers who are selling these things for what they deem to be the true market value. And the true market value is whatever governments and their middlemen are willing to pay.” This is why Herb Lin, chief scientist at the National Research Council’s Computer Science and Telecommunications Board, suggests a licensing regime limiting sales to “authorized” parties, while highlighting the enforcement challenges.
Another key challenge is that different communities use different narratives and, where they overlap, they can clash. For example, in national security circles, cyber-warfare has become associated with malware—such as the Stuxnet worm—that can attack SCADA systems and have destructive effects in the physical world. Human rights activists, on the other hand, have used the term cyber-war to draw attention to the struggle between governments and protesters that takes place online as well as in the streets. 0day vulnerabilities are a rare example where these narratives converge, since they can be used for surveillance of citizens, espionage against a company, or warfare against another country.
But it will be increasingly important to clearly distinguish between those vocabularies and to exercise greater care in how terms are used. For example, describing censorship and circumvention technologies as arms or munitions, potentially to be controlled under the U.S. Munitions List, could have unintended consequences in international negotiations over cyber-security. Russia and China consider content control to be part of information security and information warfare and could use the definition of surveillance and censorship technologies as munitions in support of their position. Many challenges pertaining to cyberspace are new and require new concepts, but some of the old distinctions—between the direct fatalities a weapon can cause and the indirect fatalities censorship can enable—are worth keeping.
Updating export controls for the digital age will not be easy. It would be naïve to think that trade in censorship and surveillance technologies could be eliminated, but it is certainly possible to at least reduce their export to dubious end-users. The challenges of implementing and enforcing such controls are great, but worth tackling in order to align existing export control regulations with human rights-based foreign policies for the digital age.
About Danielle Kehl
Danielle Kehl is a policy program associate in the Open Technology Institute at the New America Foundation where she provides research and support on a number of policy issues including spectrum management and ICT for development. Before coming to New America, Danielle worked on the policy team at Access (AccessNow.org), an international NGO which advocates for digital human rights. Prior to that, she was a Fulbright Fellow in Rwanda, where she taught English and worked on community development projects. She graduated from Yale University with a B.A. in history, concentrating on political and social movements in the 20th century. She has also interned at the Council on Foreign Relations and worked as a research associate at the Center for the Study of American Politics and the MacMillan Center for International and Area Studies at Yale.
About Tim Maurer
Tim Maurer focuses on Internet policy in international affairs as a program associate at the Open Technology Institute. He conducts research on Internet governance, human rights policy, and cyber-security. Maurer is also an adjunct fellow with the Technology and Public Policy Program at the Center for Strategic and International Studies, where he was a research associate prior to joining New America. He assisted David Sanger with parts of his best-selling book Confront and Conceal, analyzing President Obama’s national security cyber policy. He has worked for the United Nations in Rwanda, Geneva, and New York and is a non-resident fellow at the Global Public Policy Institute. Maurer holds a Master in Public Policy from the Harvard Kennedy School, where he was a McCloy fellow concentrating on international and global affairs. His award-winning thesis was a research project conducted for the White House National Security Council. He received his B.A. in political science from the Freie University Berlin.