HIPAA May Not Protect Compulsive Liars

At a recent networking event I heard a manager express frustration over an employee whose own fairy tales led to a very embarrassing termination. She had told her co-workers that she was diagnosed with cancer and needed time off for surgery and treatment. The company responded with genuine concern and care, assuring her that she would have all the support and time off she needed. However, after an attempt to send flowers to the hospital, they discovered that she was not there, and a little more probing confirmed that she never had cancer in the first place.

Once I got over the ridiculousness of this lie, I started thinking about the implications of being able to determine whether someone is at the hospital… Is letting someone know that a patient is not at a particular hospital at a specific time considered Protected Health Information (PHI)? What about calling the hospital, asking for the room where Mr. Kravchenko is located, and promptly being routed to my room? Isn't the simple act of agreeing to route the call already a disclosure of PHI? I realize that this may not be the biggest or most prominent HIPAA concern for most hospitals, since the inquiry requires some familiarity with the patient to be effective. However, it does seem to allow targeted inquiries into an individual's health status, all without consent. I can also see how interested but unauthorized parties could check whether someone is attending substance abuse or psychological treatment simply by calling at the time the patient is suspected to be there.

Obviously, HIPAA was not created to protect compulsive liars from deceiving their employers, and it is hard to feel bad for someone who would lie about having a terminal disease. However, this example does highlight the need for staff at hospitals and outpatient facilities to be trained on handling incoming inquiries, including deliveries of balloons and flowers. It also means that hospitals may need a better way of handling incoming calls to patient rooms, and may even need to start using "passwords" before routing a call. While many such incidents are anecdotal and rarely create much sympathy for the "patient," they do highlight just how easy it is for an unauthorized disclosure of PHI to happen.

A couple of months ago, I attended the Nuclear Energy Institute's Cyber Security Implementation Workshop in Baltimore. The keynote speaker was Brian Snow, a well-known security expert with substantial experience at the National Security Agency. Early in his talk, Snow highlighted the fact that security practitioners do not operate in a benign environment where threats are static, but rather must work continually to counter malice. A good analogy that Snow provided deals with transportation: when you need a vehicle for a benign environment, you use a car; when you need a vehicle for a malicious environment, you use a tank, which is purpose-built for such conditions.

A security program needs to provide the defensive capabilities of a tank. However, few security practitioners have the luxury of building the program from scratch; instead, they must attempt to retrofit tank-level security into an IT environment that, much like a car, was designed to be less complex, less expensive, and simpler to maintain. As a result, security practitioners tend to run into numerous roadblocks when adding layers of controls. While it may not be feasible to build a complete approach to information security from the ground up, it is important for IT management to recognize that a proactive strategy of incorporating defensive controls will lead to the most robust and effective information security program possible. When practitioners encounter resistance to applying a particular control, a risk-based approach is advised: will forgoing this control leave the tank substantially weakened, or is the additional protection it affords something that can truly be done without?

Ultimately, a team implementing a corporate security program likely has more obstacles to overcome than the builder of a tank, because there is far more room for differing interpretations of risk in the boardroom than on the battlefield. Even so, it is important to put each and every decision about controls in context; as reliance on information systems expands further into industries such as healthcare, energy, and defense, lives truly may depend on it.

Big Changes in PA-DSS v2.0

If you are a payment application vendor, should you be worried about the changes in the new release of the Payment Application Data Security Standard (PA-DSS)? Maybe, maybe not. For the most part, the requirements have not changed, but there are a couple of items that may require changes to the application, the documentation, or even the processes around the application.

Storing sensitive authentication data

In PA-DSS version 1.2, it was not acceptable to store sensitive authentication data (e.g., track 1 data, CVV). PA-DSS version 2.0 now allows sensitive authentication data (track 1, track 2, CVV) to be stored, but only if there is sufficient business justification and the data is stored securely. This exception applies only to card issuers and companies that support issuing services; it has never been permissible for merchants to store this information, even encrypted. During the testing portion of the audit, the auditor will search for sensitive authentication data using forensic methods and will also verify that the application is intended for card issuers and/or companies that support issuing services.
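
To give a rough sense of what such forensic testing looks for (this is an illustrative sketch of one common technique, not something prescribed by the standard), track 2 data follows a well-defined layout that can be searched for in disk images, database exports, and log files. A short Python script along these lines, with a placeholder file name, shows the idea:

    import re

    # Track 2 layout: optional start sentinel ';', PAN (13-19 digits),
    # separator '=', expiry (YYMM), service code (3 digits),
    # discretionary data, optional end sentinel '?'.
    TRACK2_PATTERN = re.compile(rb";?(\d{13,19})=(\d{4})(\d{3})\d*\??")

    def find_track2_candidates(blob):
        """Return raw byte strings that look like stored track 2 data."""
        return [match.group(0) for match in TRACK2_PATTERN.finditer(blob)]

    # "disk_image.bin" is a placeholder for whatever artifact is examined
    # (database export, debug log, memory dump, etc.).
    with open("disk_image.bin", "rb") as artifact:
        candidates = find_track2_candidates(artifact.read())

    print(len(candidates), "candidate track 2 records found")

A scan like this is prone to false positives and is only a starting point; the auditor still has to confirm whether any hits are genuine sensitive authentication data and whether the application is entitled to store them.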

Auditing

One of the changes to PA-DSS is that the application must support centralized auditing. This means the audit data must be able to be moved to a centralized log server (e.g., syslog-ng, Windows event logs). During the testing portion of the audit, the auditor will need to see that the lab has a centralized log server configured and that the application's logs are actually reaching it. The PA-DSS Implementation Guide also has to provide instructions and procedures for incorporating the logs into a centralized logging environment.
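
As an illustration of what "supporting centralized auditing" can mean in practice (a hypothetical sketch; PA-DSS does not prescribe a language or logging stack, and the host name below is a placeholder), a payment application written in Python could forward its audit trail to a central syslog collector such as syslog-ng using the standard library:

    import logging
    from logging.handlers import SysLogHandler

    # "logserver.example.com" stands in for the lab's syslog-ng collector;
    # 514/UDP is the conventional syslog port.
    handler = SysLogHandler(address=("logserver.example.com", 514))
    handler.setFormatter(logging.Formatter("payment-app: %(levelname)s %(message)s"))

    audit_log = logging.getLogger("payment-app.audit")
    audit_log.setLevel(logging.INFO)
    audit_log.addHandler(handler)

    # Example audit event: who did what to which record.
    audit_log.info("user=%s action=view transaction=%s", "clerk01", "TX-1001")

The Implementation Guide would then document where to point the log destination in the merchant's own environment.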

One less requirement

As a final note, there is one less requirement. Requirements 10 and 11 have been merged: instead of two separate requirements, one for the merchant and one for the payment application vendor, there is now a single requirement covering remote access to the payment application.

Conclusion

The PA-DSS version 2.0 requirements are, in most cases, clearer, which makes it easier for payment application vendors to understand them and to pass the audit.

Counseling the Corporate Board

There was a great quote in a recent Ponemon study sponsored by Cenzic and Barracuda: "Most organizations have been hacked, yet 88 percent still spend more on coffee than on app security." Combined with the recent revelation that oil companies and components of our national infrastructure have been compromised (see McAfee's Global Energy Cyberattacks: "Night Dragon" report for more information), this should be cause for significant alarm. Aside from funny quips like the one above, there are massive tangible costs associated with the recent breaches. One of the most shocking losses is the cost associated with US fighter jet technology: it's estimated that China "saved" over $20 billion in the development of its latest stealth fighter. Although not publicly discussed, it's commonly acknowledged that China's advances were due in large part to lapses in US information security.

What's scary is that the breaches we are hearing about are occurring at organizations that spend significantly more than average on information security. While each has its issues, the military spends massive amounts on information security, and large oil companies tend to allocate significant budget dollars to security. In addition, the breaches at the oil companies were fairly simple: break in through externally available web applications and step through to confidential information and critical processes. Most of the attacks in the McAfee report were based on existing and commonly used tools. If highly profitable companies that spend significant amounts of money on information security are being breached, it shows how massive the problem we are facing is and how difficult it will be for smaller, less profitable organizations to confront.

In the past, when I spoke to what might be considered an ordinary mid-sized business (one that didn't think it had significant security needs), such as a manufacturer or healthcare provider, the response was often "Who would want to break into our environment?" Unbelievably, these comments can still be heard within the IT groups of Fortune 500 companies. However, with breaches at organizations like Minneapolis' Valspar (a Fortune 500 paint manufacturer that had its paint formulas stolen), corporate boards are beginning to understand the risk related to information security within IT, and this is one of the keys to addressing the problem.

Corporate boards need to wake up to the massive problem, fund information security, and demand more information about their organization's posture on a regular basis. Since boards are usually not made up of IT or security experts, it is the responsibility of Information Risk, Security, Audit, and IT to provide them with tangible information about security and risk posture. While boards could ask for the coffee-versus-security budget ratio, there are better ways to measure and budget for security. Making the point to a non-IT-oriented board, however, takes tangible events and understandable facts. As the recent reports and news articles show, the events are happening. It's up to boards, executive management, IT, and information security to understand the facts and plan and fund appropriately.