Opinion, Berkeley Blogs

The good, not so good, and long view on Bmail

By Chris Hoofnagle

Many campuses have decided to outsource email and other services to "cloud" providers.  Berkeley has joined in by migrating students and faculty to bMail, operated by Google.  In doing so, it has raised some anxiety about privacy and autonomy in communications.  In this post, I outline some advantages of our outsourcing to Google, some disadvantages, and how we might improve upon our IT outsourcing strategy, especially for sensitive or unusually valuable materials.

Why outsourcing matters

Many of us welcome possible alternatives to CalMail, which experienced an embarrassing, protracted outage in fall 2011.  Many of us welcomed the idea of migrating to Gmail, because we use it personally, have found it user-friendly and reliable, and because it is provided by a hip company that all of our students want to work for.

But did we really look before we leaped?  Did we really consider the special context of higher education, one that requires us to protect both students and faculty from outside meddling and university-specific security risks?  Before deciding to outsource, we have to be sure that there are service providers that understand our obligations, norms, and the academic context.

In part because of the university’s particular role, our email is important and can be unusually sensitive to a variety of threats.  Researchers at Berkeley are conducting clinical trials with confidential data and patient information.  We are developing new drugs and technologies that are extremely valuable.  Some of us perform research that is classified, export-controlled, or otherwise could, if misused, cause great harm.  Some of us consult to Fortune 500 companies, serve as lawyers with duties of confidentiality, or serve as advisors to the government.  Some of us are the targets of extremist activists who try to embarrass us or harm us physically.  Some of us are critical of companies and repressive governments.  These entities are motivated to find out the identities of our correspondents and our strategic thinking, through either legal or technical means.  And not least, our email routinely contains communications with students about their progress, foibles, and other sensitive information, including information protected by specific privacy laws, such as the Family Educational Rights and Privacy Act (FERPA). We have both legal and ethical duties to protect this information.

Our CalMail operators know these things, and as I understand it, they have been very careful in protecting the privacy of campus communications. Outsourcing providers such as Google, however, may be far less likely to be familiar with our specific duties, norms, and protocols, or to have in place procedures to implement them. Outsourcing providers may be motivated to provide services that they can develop and serve “at scale” and that do not require special protocols. As described below, this seems to have been the case with Google’s contracts with academic institutions.

Finally, communications platforms are powerful.  They are the focus of government surveillance and control because those who control communications can influence how people think and how they organize.  Universities have historically experienced periodic pressures to limit research, publication, teaching, and speech. Without communications confidentiality, integrity, and availability, the quality of our freedom and the role we play in society suffer.  And thus the decision to entrust the valuable thoughts of our community to outsiders requires some careful consideration.

The Good

There are some clear benefits to outsourcing to Google.  They include:

  • An efficient, user-friendly communications system with a lot of storage.  The integration of Google Apps, such as Calendar, is particularly appealing, given the experience we have had with CalAgenda.  Google Drive is a pleasure compared to the awkward AFS.
  • Our communications may in some senses be more securely stored in the hands of Google.  Google has some of the best information security experts in the world.  They are experienced in addressing sophisticated, state-actor-level attacks against their network.  To its credit, Google has been more transparent about these attacks than other companies.
  • Although it is not implemented at Berkeley, Google offers two-factor authentication.  This is an important security benefit not offered by CalMail that could reduce the risk that our accounts are taken over by others.  Those of us using sensitive data, or who are at risk of retaliation by governments, hackers, activists, etc., should use two-factor authentication (a sketch of how such one-time codes are generated appears after this list).
  • As a provider of services to the general public, Google is subject to a key federal communications privacy law.  This law imposes basic obligations on Google when data are sought by the government or private parties.  It is not clear that this law binds the operations of colleges and universities generally.  However, this factor is not very important with respect to Berkeley's adoption of bMail, as we have adopted a strong electronic communications policy protecting emails systemwide.
  • Google recently announced that it will require government agents to obtain a probable cause warrant for user content.  This is important, because other providers release "stale" (that is, over 180 days old) data to government investigators with a mere subpoena.  A subpoena is very easy to obtain, whereas a probable cause warrant standard requires the involvement of a judge, an important check against overzealous law enforcement.  Google's position protects us from the problem that our email archives can be obtained by many government officials who need only fill out and stamp a one-page form.
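
To make the two-factor point concrete, here is a minimal sketch, in Python, of how a standard time-based one-time code (the kind an authenticator app displays) is derived from a shared secret and the clock, per RFC 6238.  It illustrates the general technique only; the secret shown is a placeholder, and nothing here describes Google's particular implementation.

    import base64, hmac, hashlib, struct, time

    def totp(secret_b32, interval=30, digits=6):
        # Time-based one-time password (RFC 6238): the server and the user's
        # device independently derive a short-lived code from a shared secret.
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval              # 30-second time step
        msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                          # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    # Placeholder secret, for illustration only.
    print(totp("JBSWY3DPEHPK3PXP"))

Because the code changes every thirty seconds and requires possession of the device holding the secret, a stolen password alone is no longer enough to take over an account.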

The Not So Good

Still, there are many reasons why outsourcing, and outsourcing to Google specifically, creates new risks.  While our IT professionals did an in-depth analysis of Google and Microsoft, it seems that the decision to outsource was made before the alternatives available to us were realistically evaluated.

  • We must consider issues around contract negotiations and whether the services provided fulfill the requirements I set forth above. In initial negotiations, Google treated Berkeley IT professionals like ordinary consumers—it presented take-it-or-leave-it contracts.  Google was resistant to, though it eventually accepted, assuming obligations under FERPA, a critical concession for colleges and universities.  Google also used a gag clause in its negotiations with schools.  This made it difficult for our IT professionals to learn from other campuses about the nuances of outsourcing to Google.  As a result, much of what we know about how other campuses protected the privacy of their students and faculty is rumor that cannot be cited, as doing so would implicitly violate the gag clause.
  • On the most basic level, we should pause to consider that both companies the campus considered for outsourcing are the subject of 20-year consent decrees for engaging in deceptive practices surrounding privacy and/or security.  Google in particular, with its maximum-transparency ideology, does not seem to have a corporate culture that appreciates the special context of professional secrecy.  The company is not only a fountainhead of privacy gaffes; it also benefits from shaping users' activities towards greater disclosure.
  • As discussed above, UC and Berkeley routinely handle very sensitive information, and many of us on campus have special obligations or particularized vulnerabilities.  Companies with valuable secrets do not place their crown jewels in clouds.  When they do outsource, they typically buy “single-tenant” clouds, computers where a single client's data resides on the machine.  Google's service is a “multi-tenant” cloud, and thus Berkeley data will be separated from other customers' data only at a logical level.  Despite the contract negotiations, Google's offering is a consumer-level service, and our contract has the features of that type of service.  There is a rumor that one state school addressed this issue by negotiating to be placed in Google's government-grade cloud service, but because of the secrecy surrounding Google's negotiations, I cannot verify this.
  • Third parties are a threat to communications privacy, but so are first parties—communications providers themselves.  While we may perceive cloud services as akin to a locker that the user secures, in reality these are services where the provider can open the door to the locker.  In some cases there is a technical justification for this; in others, companies have a business justification, such as targeting advertising or analyzing user data.  Our system contract "does not allow Google to serve ads," but it is important to note that companies can engage in data analysis whether or not advertising is present.
  • It is rumored that some campuses understood this risk, and negotiated a "no data mining clause."  This would guarantee that Google would not use techniques to infer knowledge about users' relationships with others or the content of messages.  Despite our special responsibilities to students to protect their information and our research and other requirements, we lack this guarantee.
  • Despite the good news about Google's warrant requirement, we still need to consider intelligence agency monitoring of our data.  Any time data leaves the country, our government (and probably others) captures it at the landing stations and at repeater stations deep under the ocean.  The bad news is that our systemwide contract does not keep data in the U.S.; the Berkeley amendment, however, keeps core service data in the U.S.  Even while stored in the country, there are risks.  For instance, the government could issue a national security letter to Google, demanding access to hundreds or even thousands of accounts while prohibiting notice to university counsel.  Prior to outsourcing, those demands would have had to be delivered to university officials because our IT professionals had the data.  Again, to its credit, Google is one of the most forthcoming companies on the national security letter issue, and its reporting on the topic indicates that some accounts have been subject to such requests.
  • Google represented that its service meets the SAS 70 standard in response to security concerns, but it is not clear to me that this certification is even relevant.  SAS 70 speaks to the internal controls of an organization, and specifically to data integrity in the financial services context.  The University's concerns are broader (confidentiality and availability are key elements) and apply to both external and internal controls and to the University's rights to monitor and verify.  There are notable examples of SAS 70-compliant cloud services with extreme security lapses, such as Epsilon (confidentiality) and AWS (availability).  SAS 70 allows the company, which is the client of the auditor, and the auditor itself, to agree upon what controls are to be assured.
  • Google will have few if any incentives to develop privacy-enhancing technologies for our communications platform, such as a workable encryption infrastructure.  As it stands, the contract creates no incentives or requirements for development of such technologies, and in fact, such development runs counter to Google's interests.
  • In the end, CalMail was being very effectively maintained by only a few employees. It is not clear to me that an outsourced solution—which, in order for the security and other issues to be managed properly, requires Berkeley personnel to interface with the system and with Google—is necessarily less costly. This is especially concerning in light of the fact that we appear to have lost the connection to IT personnel who understand the sensitivity of the data we handle, and moved to a much more consumer-oriented product.

The long view

Looking ahead, we should carefully consider how we could assume the best posture for outsourcing. Instead of experimenting with Google, we would be better served by an evaluation of campus needs that includes regulatory and ethical obligations and that captures the norms and values of our mission.  Provider selection should be broader than choosing between Google and Microsoft.

As a first step, we should charge our IT leadership with forming formal alliances with other institutions to jointly share information and negotiate with providers.  Google's gag provision harmed our ability both to recognize risks and to address them.

We need to be less infatuated with "the cloud," which to some extent is a marketing fad.  Many of the putative benefits of the cloud are disclaimed in these services' terms of service.  For instance, a 2009 survey of 31 contracts found that, "…In effect, a number of providers of consumer-oriented Cloud services appear to disclaim the specific fitness of their services for the purpose(s) for which many customers will have specifically signed up to use them."  The same researchers found that providers' business models were related to the generosity of their terms.  This militates in favor of providers that charge a fee for service as opposed to "free" ones that monetize user data.

We should charge our IT professionals with the duty of documenting problems with outsourced services.  To understand the cloud phenomenon more objectively, we should track the real costs associated with outsourcing, including outages, the costs of managing the relationship with Google, and the technical problems that users experience.  Outsourcing is not costless.  We could learn that employees have simply been transferred from the operation of CalMail to the management of bMail.  We should not assume that outsourced systems mean fewer people—they may appropriately require meaningful staffing to fulfill our needs. As the expiration of the systemwide Google contract approaches in June 2015, these metrics will help us make an economical decision.

Finally, there are technical approaches that, if effective, could blunt, but not completely eliminate, the privacy problems created by cloud services.  Encryption tools, such as CipherCloud, exist to mask data from Google itself.  This can help hide the content of messages, reduce data mining risks from Google, and force the government to come to Berkeley officials to gain access to content.  The emergence of these services indicates that there is a shared concern about storing even everyday emails in cloud services.  These services cost real money, but if we continue to think we can save money by handing over our communications systems to data mining companies, we are likely to end up paying in other ways.
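
To illustrate the general idea, here is a minimal sketch in Python using the widely available cryptography library; it shows the technique of encrypting message content with a key held on campus, so the provider stores only ciphertext.  It is not a description of CipherCloud's actual design, and the names and data are illustrative only.

    # Minimal sketch: encrypt message content with a campus-held key before it
    # reaches the cloud provider.  Names and data here are illustrative only.
    from cryptography.fernet import Fernet

    campus_key = Fernet.generate_key()   # in practice, managed by campus IT, never shared with the provider
    cipher = Fernet(campus_key)

    plaintext = b"Advising note about a student (FERPA-protected)"
    stored_at_provider = cipher.encrypt(plaintext)    # the provider sees only this ciphertext
    recovered_on_campus = cipher.decrypt(stored_at_provider)

    assert recovered_on_campus == plaintext

If the decryption key stays with Berkeley, a demand served on the provider yields only ciphertext, and the requester must come to campus officials for access, which is exactly the dynamic described above.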

Note: this post has been updated to amplify differences between the UC System contract with Google and the Berkeley amendment, which has terms limiting storage of most data to the US.