Cross-Border Data

The CLOUD Act Doesn’t Help Privacy and Human Rights: It Hurts Them

By Neema Singh Guliani, Naureen Shah
Friday, March 16, 2018, 1:08 PM

At a time when human rights activists, dissidents and journalists around the world face unprecedented attacks, we cannot afford to weaken our commitment to human rights. But the recently introduced CLOUD Act would do just that.

The bill purports to address complaints that current mechanisms for foreign governments to obtain data from U.S. technology companies are slow, requiring review by the Justice Department and a warrant issued by a U.S. judge pursuant to the mutual legal assistance (MLA) process. The solution it proposes, however, is a dangerous abdication of responsibility by the U.S. government and technology companies.

Writing on Lawfare, Peter Swire and Jennifer Daskal have penned a defense of the CLOUD Act, arguing that things don’t work well now, that they could get worse and that this is the best option on the table. But even if we accept Daskal and Swire’s dire view of the state of current affairs, their argument leaves a lot unexplained—such as why an alternative framework or improved version of the CLOUD Act is not tenable, why efforts to pass the bill without any public markups of the legislation or the opportunity for amendments are advisable, and why no major international human rights organizations support it. Two of the largest human rights organizations, Amnesty International and Human Rights Watch, oppose the bill, along with over twenty other privacy and civil liberties organizations. (Swire and Daskal do note that some of these groups participated in a working group on this issue, though they don’t describe the strenuous objections made during that process.)

Most importantly, however, Daskal and Swire do not address how this bill could fail human rights activists and people around the world.

The very premise of the current CLOUD Act—the idea that countries can effectively be safe-listed as human-rights compliant, such that their individual data requests need no further human rights vetting—is wrong. The CLOUD Act requires the executive branch to certify each of these foreign governments as having “robust substantive and procedural protections for privacy and civil liberties” written into their domestic law. But many of the factors that must be considered provide merely a formalistic and even naïve measure of a government’s behavior. Flip through Amnesty International or Human Rights Watch’s recent annual reports, and you can find a dizzying array of countries that have ratified major human rights treaties and reflect those obligations in their domestic laws but, in fact, have arrested, tortured and killed people in retaliation for their activism or due to their identity.

For countries the executive branch certifies, the CLOUD Act would not require the U.S. government to scrutinize data requests by the foreign governments—indeed, the bill would not even require notifying the U.S. government or a user regarding a request. The only line of defense would be technology companies, which hypothetically could refuse the request and refer it to the MLA process, but which may not have the resources, expertise, or even financial incentive to deny a foreign government request. Likewise, the bill requires that countries permit “periodic” reviews for compliance with civil liberties and privacy protections, but does not specify what these reviews will entail. It also doesn’t require even a cursory individual review of all orders or explain how the U.S. government can effectively ensure compliance in a timely fashion without being aware of requests in real time. For this reason, the periodic U.S. government reviews contemplated in the bill are an insufficient substitute for case-by-case consideration.

Daskal and Swire point to other safeguards: Judges or independent authorities in the foreign country would review their government’s requests for data, they argue. But what about when courts greenlight, rather than check, police and intelligence services that go after human rights activists? This is not a problem confined to a small set of countries. In 2016, Amnesty International documented cases across dozens of countries in which human rights defenders were detained or arrested based solely on their work.

Similarly, the CLOUD Act would not prevent harm to human rights activists and minorities in cases where a country experiences a rapid deterioration in human rights. Under the CLOUD Act, once a foreign government gets an international agreement, it is safe-listed for five years—with no built-in mechanism to ensure that the U.S. government acts quickly when there is a rapid change in circumstances.

For example, in early 2014, Turkey may have met the CLOUD Act’s vague human rights criteria; Freedom House even rated it a three and a four on its indexes for political rights and civil liberties. But since the attempted coup in mid-2016, the Turkish government has arrested a staggering number of people—including journalists and activists such as the chair and director of Amnesty International’s Turkey section—many on bogus terrorism charges. As one report on the crackdown put it: “Most of these accusations of terrorism are based solely on actions such as downloading data protection software, including the ByLock application, publishing opinions disagreeing with the Government’s anti-terrorism policies, organizing demonstrations, or providing legal representation for other activists.”

Under the CLOUD Act, neither Congress nor U.S. courts would be able to prompt a review or a temporary moratorium for a case like Turkey. Users, without notice, would have little practical ability to lodge complaints with the U.S. government or providers. Even if the U.S. government were to take action, the CLOUD Act fails to ensure a sufficiently quick response to protect activists and others whose safety could be threatened.

In such a situation, the only real fail-safe against a technology company inadvertently acceding to a harmful data request is the company itself. But would even a well-intentioned technology company, particularly a small one, have the expertise and resources to competently assess the risk that a foreign order may pose to a particular human rights activist? Would it know, as in the example above, when to view Turkey’s terrorism charges in a particular case as baseless? In many cases, companies would likely rely on the biased assessments of foreign courts and fulfill the requests.

Daskal and Swire argue that without the CLOUD Act, foreign governments with poor privacy standards will turn to data localization, which would pose greater human rights risks. But if the bill’s criteria are as strong as needed to protect privacy and human rights, those same foreign governments will not qualify for an international agreement—and so they may still push for data localization. The bill also does nothing to prevent a foreign government that already has an international agreement from demanding data localization. If a technology company refused a government’s requests, the government could threaten to retaliate with localization mandates and pressure the company to comply.

Finally, Swire and Daskal fail to address the CLOUD Act’s numerous ambiguities as to what human rights standards are a predicate to inclusion in the new data club the bill purports to create. Indeed, many of the criteria listed are merely factors that must be considered, not mandatory requirements. To highlight just a handful of the deficiencies in the bill:

  • The bill states that the Justice Department must consider whether a country respects free expression, without stating whether free expression is defined under U.S. law, international law, or a country’s own domestic law;
  • The bill states the Justice Department must consider whether a country respects “international universal human rights” without definition or clarity regarding how to assess this (indeed, this is not a recognized term in U.S. or international law);
  • The bill requires that requests be based on “articulable and credible facts, particularity, legality, and severity regarding the conduct under investigation”—a standard that is, at best, vague and subject to differing interpretations, and is likely lower than the probable cause standard currently applied to requests;
  • The bill fails to prohibit agreements in cases in which a country has a pattern or practice of engaging in human rights abuses, nor does it require an assessment as to whether there is effective central control of law enforcement or intelligence units;
  • The bill fails to require that countries meet any standards for metadata requests—leaving companies free to provide this data to human-rights-abusing countries without restriction;
  • For the first time, the bill allows foreign governments to wiretap and intercept communications in real-time, without even requiring governments to adhere to critical privacy protections in the Wiretap Act (such as notice, probable cause, or a set duration); and
  • The bill permits broad information sharing between governments, allowing countries (including the U.S.) to obtain information from foreign partners under standards that may be lower than their own domestic law.

These ambiguities provide the Justice Department with significant flexibility regarding the human rights standards a country must meet. What’s more, there’s no practical way for Congress or the judicial branch to act as a check in cases in which the executive branch makes the wrong decision. Country determinations are not subject to U.S. judicial review, and Congress would need to pass legislation within 90 days, likely with a veto-proof majority, to stop an agreement from going into effect—an extremely high hurdle.

In light of this, it’s far from clear that, as Daskal and Swire write, the bill “will raise privacy protections on a global scale.” If members of Congress and technology companies want to address concerns with the MLA process while protecting privacy and human rights, they should abandon the CLOUD Act and craft a rights-respecting solution.