The Dataloss list sent the following article through yesterday afternoon:
Obamacare Employee Accidentally Sends Out 2,400 Social Security Numbers
This is concerning but, I hate to say it, not unexpected. We know that the weakest link in security is always people. Likely a worker was trying to be helpful and didn't think. As a result, an email containing an Excel spreadsheet full of names and Social Security Numbers went out.
What is more concerning is that this should have been caught by any decent Data Loss Prevention (DLP) solution. It sounds like no such solution is in place, even though we're dealing with privacy data. Perhaps one is in place but not configured correctly. That wouldn't be surprising given these quotes from the article:
“Users of the exchange will need to provide sensitive information, including Social Security numbers, that will be sent to a federal hub to verify such things as citizenship and household income….
“All states and the federal government, which also is setting up exchanges for some states, are scurrying to get the complex system running in less than three weeks.
“‘The people who believe in this are so driven that there’s a subcontext of “Just let us do our job and get as many people signed up as possible, and we’ll pick up the debris later,”’ said Steve Parente, a University of Minnesota finance professor who specializes in health IT issues.
“Parente testified on Capitol Hill earlier this week, urging caution in pushing the federal hub online before it has been thoroughly tested.
I obviously can't validate the truthfulness of these quotes, but that's not my point. Instead, I want to point out what we see too often with regard to deployments. Most IT folks, especially IT security folks, have seen implementations pushed through before they're fully vetted. Obviously, the level of risk varies depending on what the implementation does. When privacy data is involved, however, there should be a measured and thoughtful deployment process that includes properly testing the system. Too often we see data exposed, especially privacy data, because a suit somewhere wanted a system implemented and left the staff to "pick up the debris later." In other words, we see quotes like this across a multitude of systems. So long as this "full speed ahead" attitude remains the majority view among decision makers, and so long as their customers generally accept it, we will continue to see these kinds of leaks.
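To make the DLP point above concrete: at its simplest, DLP is pattern matching on outbound content. Here is a minimal sketch of the kind of rule that would have flagged this email. The function names and the simplified SSN pattern are my own illustration; real DLP products layer on validation rules, file-format parsing, OCR, and policy workflows far beyond this.

```python
import re

# Matches US Social Security Numbers in the common NNN-NN-NNNN form,
# excluding a few well-known invalid ranges (area 000, 666, 900-999).
# This is a deliberately simplified illustration, not a production rule.
SSN_PATTERN = re.compile(r"\b(?!000|666|9\d{2})\d{3}-(?!00)\d{2}-(?!0000)\d{4}\b")

def find_ssns(text):
    """Return all SSN-shaped strings found in a piece of outbound content."""
    return SSN_PATTERN.findall(text)

def should_block(message_body, attachment_text=""):
    """Flag a message for quarantine if the body or an extracted
    attachment (e.g. text pulled from a spreadsheet) contains SSNs."""
    return bool(find_ssns(message_body) or find_ssns(attachment_text))
```

A gateway running even this naive check would have quarantined a spreadsheet with 2,400 SSNs in it before it ever left the building.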
After all, it's near impossible to tighten everything down even in a properly tested system. We always have to deal with the human element. Then there's the unknown, such as a bug in the code that no one uncovered during standard user acceptance testing (which is why fuzzing has become more popular over the years). When we accelerate implementation at the cost of testing and other detail-oriented tasks, we should expect even more breaches. Given that we can't avoid sharing this sensitive data in order to get services, we've got to push back against this "implement now" attitude. The truth of it is that as IT workers, we typically have little clout, because a decision maker can always say, "The customers want this now!" Therefore, as customers, we have to push back and say, "We want this, but only once you've done your due diligence in tightening the bolts properly."
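For readers unfamiliar with the fuzzing aside above, the idea is simple: instead of feeding a system the well-formed inputs an acceptance tester would use, you throw large volumes of malformed input at it and watch for crashes. The sketch below is a toy illustration; `parse_record` is a hypothetical stand-in for whatever code is under test.

```python
import random
import string

def parse_record(line):
    """Toy parser standing in for the code under test.
    It assumes input of the form 'name:value' and, like much real
    code, blows up on anything else."""
    name, value = line.split(":", 1)
    return {"name": name.strip(), "value": value.strip()}

def fuzz(target, iterations=500, seed=1):
    """Feed random printable strings to `target` and collect the
    inputs that raise exceptions -- failures that scripted user
    acceptance testing with well-formed data would never surface."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        length = rng.randint(0, 40)
        junk = "".join(rng.choice(string.printable) for _ in range(length))
        try:
            target(junk)
        except Exception as exc:
            crashes.append((junk, type(exc).__name__))
    return crashes
```

Running `fuzz(parse_record)` immediately turns up inputs the parser can't handle (any string without a colon). That's the appeal: it finds the bugs nobody thought to write a test case for, which is exactly the kind of coverage a rushed deployment skips.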