Cyber Risk Insurance: When Conventional Liability Coverage Might Not Be Enough

September 16, 2012

Businesses seem to be adopting cloud computing, e-commerce, and other internet technologies at an ever-accelerating rate, and these technologies continue to evolve and adapt to meet businesses’ needs. Insurance policies that protect businesses from losses and other liabilities, however, are not always so quick to change. For businesses that rely on various forms of electronic communication and data, this can create a gap in coverage and a risk of catastrophic loss. Some insurers have begun to fill the gap with “cyber risk” policies, and coverage is slowly beginning to appear in general liability policies.

Understanding Conventional Coverage

A standard business liability policy will cover ordinary losses, such as damaged facilities, broken equipment, or ruined inventory. This type of coverage is essential for the sorts of problems businesses have faced for millennia: damage in a storm or other natural disaster; theft or loss of essential business machinery or computers; injury to a customer on the business premises; or loss of inventory, such as spoilage of food during a power outage. Conventional coverage might include loss of electronic data as a result of equipment failure or force majeure, but it most likely will not cover losses from the newer threats of the internet era.


International Privacy Group Issues Recommendations on Cloud Computing Policy

August 27, 2012

The International Working Group on Data Protection in Telecommunications (“Working Group”), an organization of European data protection agencies, recently released a report on the protection of data and privacy in international cloud computing. Although the European Union (EU) and the United States take different approaches to data privacy, the Working Group’s report draws on U.S. practices in presenting a series of recommendations for data protection between countries. Whatever actions the U.S. takes regarding international data protection, the Working Group’s recommendations offer a useful guide for U.S. businesses that use cloud computing and wish to minimize their risk of data breaches and other losses.

The Working Group, founded in 1983, operates under the auspices of the European Commission, the executive body of the EU. Its headquarters is in Berlin, Germany. A directive from the European Parliament, passed in 1995 and effective in 1998, established procedures to protect individuals’ personal data, facilitate the “free movement” of data, and restrict the movement of data to non-EU countries with less-stringent privacy protections. Article 29 of the directive established a “Working Party on the Protection of Individuals with regard to the Processing of Personal Data,” whose mandate is similar to that of the Working Group. In 2000, the U.S. and the EU entered into a “safe harbor” agreement that affirmed the adequacy of U.S. data protection laws under the EU’s own framework. The Federal Trade Commission (FTC) has authority over data protection issues in the U.S. The Article 29 Working Party has called for the U.S. to make further agreements with the EU regarding data security between government agencies.


Generic Top-Level Domains and Protection of Your Trademark Rights

August 7, 2012

For most of the history of the internet, only a few generic top-level domains (gTLDs), such as .com or .net, have been available to most users. That changed in June 2011, when the Internet Corporation for Assigned Names and Numbers (ICANN) decided to accept applications for new gTLDs during a limited window. The window was open from January 12 to May 30, 2012, and in that time ICANN received more than two thousand applications. After review by ICANN, many of these gTLDs may go live by next year. Until then, ICANN has made a list of all pending applications available to the public, and has placed the burden of identifying possible trademark infringement on the trademark owners.

ICANN is a California-based nonprofit organization that manages the global registry of domain names and IP addresses, which computers use to locate particular websites. It also maintains the current list of twenty-two gTLDs. The current system began in the 1980s with only seven gTLDs, and ICANN has gradually expanded the list, which now includes .biz, .info, the recently added and controversial .xxx, and several more. Many of the new proposed gTLDs are merely descriptive, such as .school or .beer, but some contain brand or trade names that may infringe existing trademarks.
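
For a trademark owner, the first step is simply scanning ICANN’s published list of applied-for strings for matches. Below is a minimal Python sketch of such a screen; the trademark list and applied-for strings are hypothetical examples, and the real application data is published on ICANN’s website.

```python
# Minimal sketch: screen a list of applied-for gTLD strings for possible
# trademark matches. The data below is hypothetical; ICANN publishes the
# actual list of pending applications on its website.

def find_potential_conflicts(trademarks, applied_strings):
    """Return (trademark, gTLD) pairs where the mark appears in the string."""
    conflicts = []
    for mark in trademarks:
        needle = mark.lower().replace(" ", "")
        for gtld in applied_strings:
            if needle in gtld.lower():
                conflicts.append((mark, gtld))
    return conflicts

# Hypothetical example data
trademarks = ["Acme", "Acme Widgets"]
applied_strings = ["acme", "acmestore", "school", "beer"]

for mark, gtld in find_potential_conflicts(trademarks, applied_strings):
    print(f"Possible conflict: trademark '{mark}' vs applied-for gTLD '.{gtld}'")
```

A substring match like this over-reports, of course, but for a one-time review of a finite list, false positives are cheaper than a missed infringing application.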


U.S. Utilities Face Cybersecurity Risks as Hacker Attacks Mount

June 11, 2012

The prospect of a cyberattack on public utilities and other vital infrastructure has loomed in America’s imagination for years, serving as the plot for countless films, thriller novels, and television shows. Recent news from the federal government and the private sector has brought attention back to the topic. American infrastructure may remain vulnerable to certain types of cyberattacks, and the possible damage from such an attack would impact public and private resources alike. The risks faced by public and private utilities can help businesses assess their own cybersecurity exposure and serve as a model for their risk management.

The U.S. Department of Homeland Security (DHS) recently issued an alert regarding attacks by an unknown group of hackers that, over the past six months, have targeted the nation’s natural gas pipelines. DHS reportedly does not know if the attacks are an attempt to gain intelligence about the U.S. gas pipeline system or an attempt to damage it. The attacks involve a technique known as “spear-phishing,” in which attackers send e-mails that appear to come from friends or family of a targeted individual. Malware attached to the e-mails infects the target’s computer and attempts to steal passwords that would allow access to utility control systems. DHS has reportedly been working with utility companies since March to fight the attacks. Hackers, some linked to China, have targeted the natural gas sector several times in the past few years.
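
Spear-phishing defenses are chiefly organizational, but some red flags can be checked mechanically. Here is a minimal sketch, using only the Python standard library, of two illustrative heuristics: a Reply-To domain that differs from the From domain, and executable attachments. Real mail filtering is far more sophisticated, and the example message is hypothetical.

```python
# Minimal sketch of two spear-phishing red flags: a mismatched Reply-To
# domain and executable attachments. Illustrative only; real filters use
# many more signals (SPF/DKIM, reputation, content analysis, and so on).

from email import message_from_string
from email.utils import parseaddr

SUSPICIOUS_EXTENSIONS = (".exe", ".scr", ".js", ".vbs")

def flag_suspicious(raw_message):
    msg = message_from_string(raw_message)
    reasons = []

    from_domain = parseaddr(msg.get("From", ""))[1].rpartition("@")[2]
    reply_domain = parseaddr(msg.get("Reply-To", ""))[1].rpartition("@")[2]
    if reply_domain and reply_domain != from_domain:
        reasons.append(f"Reply-To domain '{reply_domain}' differs from From domain '{from_domain}'")

    for part in msg.walk():
        filename = part.get_filename()
        if filename and filename.lower().endswith(SUSPICIOUS_EXTENSIONS):
            reasons.append(f"Executable attachment: {filename}")

    return reasons

# Hypothetical message for illustration
raw = (
    "From: Alice <alice@example.com>\n"
    "Reply-To: alice@mailbox.example.org\n"
    "Subject: family photos\n\n"
    "See attached."
)
print(flag_suspicious(raw))
```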


Public Cloud Computing Has New Guidelines to Help Protect Users’ Privacy and Security

April 25, 2012

Administrators and users of “public cloud computing” services have a new set of guidelines for managing risks to the security of the systems and the privacy of the stored data. The National Institute of Standards and Technology (NIST), an agency of the U.S. Department of Commerce, has followed up on its recent document offering a definition of cloud computing with a set of guidelines for privacy and security in cloud systems. While government agencies comprise the principal audience for NIST’s guidelines, private companies and organizations can benefit as well. End-user consumers, whose personal information is often most at risk of cyberattack, will also find the guidelines beneficial. We will focus on security and privacy considerations for businesses and other organizations.

NIST published its report, “Guidelines on Security and Privacy in Public Cloud Computing,” in December 2011. It recommends a security and privacy environment based on careful planning, tailored to the particular cloud provider’s system. Planners should take the needs of the organization into account when creating a cloud computing solution, paying close attention to the computing environments of both the service provider and the user. Finally, cloud computing environments require accountability, with constant monitoring of the system’s effectiveness.

Planning for Security and Privacy

Cloud computing represents a major departure from previous models of information management. Sensitive data no longer resides on a private server, but rather “in the cloud.” Moving to the cloud therefore requires careful planning of how data is organized and stored, as well as management of security and privacy over the life of the organization. Security and privacy are particularly vulnerable during the initial transfer of data to new storage media, and in the ongoing process of retrieving data for use.
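
One common way to reduce that exposure is to encrypt data on the client before it ever leaves the organization, so it is protected both in transit and at rest. The following is a minimal sketch using the third-party Python cryptography package; the data and upload workflow are hypothetical stand-ins, and a real deployment would need careful key management.

```python
# Minimal sketch of client-side encryption before cloud upload, using the
# "cryptography" package (pip install cryptography). The record and the
# upload/download steps are hypothetical; the point is that the cloud
# provider only ever sees ciphertext.

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # must be stored securely, outside the cloud
cipher = Fernet(key)

def encrypt_for_upload(plaintext: bytes) -> bytes:
    return cipher.encrypt(plaintext)

def decrypt_after_download(ciphertext: bytes) -> bytes:
    return cipher.decrypt(ciphertext)

record = b"customer account number: 0000-0000"
stored = encrypt_for_upload(record)          # what the cloud provider sees
assert decrypt_after_download(stored) == record
```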

Understanding the Cloud Environment

Organizations have unique computing needs, and cloud providers offer multiple types of services. To effectively manage risk, organizations must have a detailed understanding of the cloud provider’s services. In particular, an organization must understand its responsibilities, as opposed to those of the cloud provider, for security and privacy of information.

Ensuring the Cloud Service Meets the Organization’s Needs

An organization is unlikely to find a cloud provider whose default service precisely meets its needs for security and privacy. The organization should clearly articulate its particular risks and vulnerabilities, and should be prepared to negotiate with a cloud provider to find the best possible fit.

Ensuring the Client-Side Service Meets the Organization’s Needs

Cloud computing is two-sided: organizations must ensure the security of their own users as well as of the cloud service itself. Users access cloud providers’ services through web browsers, smartphone apps, and other software. Hackers can easily breach many client-side applications, so careful planning and understanding are crucial for an organization.
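
One basic client-side precaution is refusing any connection to the cloud service whose TLS certificate does not verify. Below is a minimal sketch using only the Python standard library; the host name is hypothetical.

```python
# Minimal sketch: open a connection to a cloud service only if its TLS
# certificate and host name verify. The host name is hypothetical.

import socket
import ssl

def open_verified_connection(host, port=443):
    context = ssl.create_default_context()   # verifies certificate chain and host name
    sock = socket.create_connection((host, port))
    return context.wrap_socket(sock, server_hostname=host)

# conn = open_verified_connection("cloud.example.com")
# Raises ssl.SSLCertVerificationError if the certificate does not check out.
```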


Federal Government Develops Definitions and Standards for Cloud Computing

April 18, 2012

The National Institute of Standards and Technology (NIST) is a non-regulatory agency of the U.S. Department of Commerce. Its purpose is to develop standards for measurements in science and technology that “promote U.S. innovation and industrial competitiveness.” The ultimate goal is to “enhance economic security and improve our quality of life.” NIST was founded in 1901 as the National Bureau of Standards. Its standards and regulations regarding measurements of weight, mass, and other metrics influence much of the world’s commerce. It even operates a website that provides the official time for any location in the United States. NIST has recently turned its attention to cloud computing.

In a publication titled “The NIST Definition of Cloud Computing,” released in September 2011, NIST issued guidelines for standardized definitions and terminology for the field of “cloud computing.” The Federal Information Security Management Act of 2002 (FISMA) requires NIST to develop these guidelines for the purpose of facilitating information security. A set of standard terms and definitions is crucial to developing security protocols for cloud-based data, particularly when data may be spread across multiple servers or networks in multiple physical locations. Although the specific audience of NIST’s publication is the federal government, it notes that private organizations may choose to follow its recommendations.

NIST defines “cloud computing” as a model for convenient, on-demand online access to shared computing resources from anywhere a user has an internet connection. Resources may include data storage, applications, and other services, and should involve little management on the user’s end. The report defines “cloud computing” in terms of a set of “essential characteristics,” “service models,” and “deployment models.”

Five essential characteristics define cloud computing. A consumer must be able to access cloud services on-demand and on a self-service basis, with no human interaction required. Services must be available through ordinary network access, such as through laptop computers, tablets, or smartphones. Cloud services should be pooled to serve multiple consumers at once. Services should also be sufficiently elastic to allow for rapid changes in demand on system resources, giving consumers the same or similar experience no matter how many users are online. Finally, the service should be measurable, allowing both the service provider and consumer to track usage statistics like bandwidth and storage.
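
The last characteristic, measured service, is the most concrete of the five, and a toy implementation helps illustrate it. Here is a minimal Python sketch of per-consumer usage metering; the resource names and tenant identifiers are hypothetical.

```python
# Minimal sketch of NIST's "measured service" characteristic: both the
# provider and the consumer can track resource usage. Resource names and
# tenant IDs are hypothetical.

from collections import defaultdict

class UsageMeter:
    def __init__(self):
        # per-consumer running totals, e.g. bytes of bandwidth and storage
        self.totals = defaultdict(lambda: defaultdict(int))

    def record(self, consumer, resource, amount):
        self.totals[consumer][resource] += amount

    def report(self, consumer):
        return dict(self.totals[consumer])

meter = UsageMeter()
meter.record("tenant-a", "bandwidth_bytes", 1_048_576)
meter.record("tenant-a", "storage_bytes", 4_096)
print(meter.report("tenant-a"))
# {'bandwidth_bytes': 1048576, 'storage_bytes': 4096}
```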