Data Ransom No Cause For Panic

By Matt Curtin

Shock and horror were widely on display this month after a computer cracker demanded a $10 million ransom not to publish sensitive information of Virginians allegedly gleaned from a state-run prescription drug database. Let us look more closely at this business of holding data for ransom.

An Incredible Claim

Let's take a look at the ransom note. What strikes me immediately is that the message looks as though it was crafted to make the situation seem worse than it is:

"I have your shit! In *my* possession, right now, are 8,257,378 patient records and a total of 35,548,087 prescriptions. Also, I made an encrypted backup and deleted the original. Unfortunately for Virginia, their backups seem to have gone missing, too. Uhoh :("

The text makes two claims. The first is that there has been a breach of sensitive information, a loss of confidentiality. The second is that the attacker has denied legitimate users the availability of that information. How could an attacker know that the system's "backups seem to have gone missing", anyway? In truth, the only public evidence available shows that someone was able to put an unauthorized message on the website (which is in fact a third kind of breach: a lack of integrity). How much of the rest is true remains unclear. It's somewhere between zero and 100%.

Nothing New Here

Reported break-ins are not new. What is new is the dramatic rise in the number of reported confidentiality breaches since states began passing breach notification laws. The Identity Theft Resource Center (ITRC) cataloged 91 cases in 2008 in which personal information was breached through a compromise of system security, roughly 14% of all the incidents it cataloged that year.

Holding data for ransom is not new either. In 1996, security researchers Adam Young and Moti Yung introduced "cryptovirology," where cryptography could be employed by attackers against those attempting to provide security. Less than three years later, "ransomware" appeared: malicious software that encrypts an unsuspecting user's files and demands payment to release the data. In January 2000, we learned that online retailer CD Universe suffered a compromise accompanied by a $100,000 ransom. Since then, we have seen other cases of both ransomware and simple extortion.
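
To make the cryptovirology idea concrete, here is a minimal sketch, written in Python against the cryptography package, of the hybrid-encryption principle behind it. This is an illustration of the idea, not Young and Yung's exact construction: the data are encrypted under a random symmetric key, and that key is sealed to a public key only the extortionist can unlock, which is why the practical countermeasure is backups the attacker cannot reach.

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Key pair held by the extortionist. In the cryptoviral scheme, only
    # the public key ever needs to be present on the victim's system.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    records = b"sensitive records ..."              # stand-in for the real data

    data_key = Fernet.generate_key()                # random symmetric key
    ciphertext = Fernet(data_key).encrypt(records)  # bulk-encrypt the data
    wrapped_key = public_key.encrypt(               # seal the data key to the attacker
        data_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )

    # Without the private key, ciphertext and wrapped_key are useless to the
    # victim; recovery depends entirely on backups the attacker cannot reach.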

Doing Our Homework

Our industry must do a better job of assessing the real impact of the threat of extortion or other compromise of sensitive data. For too long, we have repeated whatever statistic was in vogue. We need to know not just the number of incidents or records lost, but the impact of those losses and the probability that others will suffer the same loss. As an industry, we have been either guessing or letting vendors of "solutions" tell us what the problem is.

The ITRC has cataloged over 1,500 breach incidents since 2005, enough that a colleague and I were able to study the data and find statistically significant correlations between type of breach and industry vertical. This analysis showed that sensitive personal data is reported exposed through computer break-ins at lower-than-average rates in both of the industries relevant here: healthcare and public administration. Other work to catalog more data loss incidents has been undertaken by the Open Security Foundation. At this past April's RSA Conference, Elizabeth Nichols presented her analysis of that data set. These are all important first steps in understanding what is really happening out there.
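
For the curious, the kind of analysis involved is not exotic. The sketch below, in Python with scipy and with invented counts rather than the ITRC's actual figures, runs a chi-square test of independence on a table of breach type by industry vertical:

    from scipy.stats import chi2_contingency

    # Hypothetical incident counts by breach type (columns) and industry
    # vertical (rows) -- illustrative only, NOT the ITRC's actual figures.
    #        hacking  lost media  insider
    table = [
        [12,     30,      8],   # healthcare
        [10,     25,      9],   # public administration
        [45,     20,     15],   # financial services
    ]

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")

    # A small p-value means breach type and industry are not independent:
    # some verticals report certain breach types at rates well above or
    # below the overall average.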

The Hard Part

We are at a critical time for IT, and now is the time for leadership. We must show our organizations and the public at large how to understand the relative rewards and risks of using information in electronic form. We can either rise to the occasion or be replaced by leaders who will.

Part of that leadership is choosing an achievable goal. I propose that we in IT can serve an important role in the use of electronic data only to the degree that we work realistically within our discipline, informing other people who can then work within theirs.

We need to recognize that some level of system failure is optimal: the level that balances the funding we get against the return we're expected to provide. No rational person expects a return on an investment portfolio with zero risk, and any portfolio with risk is going to have a few losers. We as a society make these tradeoffs in many ways, whether explicitly or implicitly. Two universal constants of computing remain operative today: computer systems fail, and people make mistakes. We need to build our infrastructures with these realities in mind.
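
To illustrate with invented numbers, the familiar annualized-loss-expectancy arithmetic makes the point: past some level of spending, an additional control costs more each year than the losses it prevents.

    # Hypothetical figures for illustration only.
    incident_cost = 250_000      # estimated loss from one breach (dollars)

    options = [
        # (control, annual cost, expected incidents per year with it in place)
        ("status quo",             0,       0.20),
        ("encrypted backups",      15_000,  0.08),
        ("24x7 monitoring",        120_000, 0.05),
        ("near-zero-risk program", 600_000, 0.01),
    ]

    for name, spend, rate in options:
        expected_loss = rate * incident_cost      # annualized loss expectancy
        total = spend + expected_loss
        print(f"{name:>24}: ${spend:>7,} controls + ${expected_loss:>7,.0f} "
              f"expected loss = ${total:>9,.0f}/yr")

    # The lowest total cost is rarely the option with the least residual risk:
    # past some point, each additional dollar of control buys less than a
    # dollar of avoided loss.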

Now that "cybercrime" is being perpetrated by organized crime undoubtedly attackers will get more sophisticated. In light of this, Phil Williams of the CERT Coordination Center wrote in August 2001 that the real problem is not breaking into computers, but crime generally. If we mean to succeed, I believe that we will need to do three things:

We have the tools to describe what is happening, to understand its impact, and its frequency. We need to make use of these and work with other leaders to find the right balances among risk, utility, and expense in our infrastructure. Choosing rationally, rather than responding out of fear, is the path forward.

Matt Curtin is a Columbus-based technologist, writer, and entrepreneur. Matt founded Interhack in 1997 as a research group that looked at the side-effects of using the Internet as a large-scale computing and communication platform. In 2000, he reorganized Interhack into a professional service practice focused on forensic computing and information assurance.