We held the Secure360 conference in the Twin Cities last week. Presentation topics included PCI, cloud computing, and problems within the security industry. While it can get tiring discussing the industry’s problems, I like trying to understand the difficult nature of information security and enjoy the challenge of overcoming the obstacles to dealing rationally with risk.

On this topic, Rich Mogull gave a very good presentation, “Putting the Fun in Dysfunctional,” about the inherent problems with information security. I appreciate insights from someone with both an IT and a physical security background, and I thought he did a nice job explaining why security is such a difficult area for a business to understand. I agree with his point that, at the most basic level, security and risk are abstract, long-term concepts that require a rational approach. Rich did a good (and entertaining) job of illustrating that, as humans, we are often not rational: we generally deal in the short term and prioritize our basic needs. In the context of a corporate environment, understanding and dealing with risk is extremely difficult. I’d add to Rich’s discussion that in most organizations building mature risk management is essentially like playing a game of telephone across functional departments, most of which find risk and security to be totally foreign concepts (except, of course, at financial institutions).

Rich’s thesis created a nice framework for the other core topics at the conference. A number of presentations dealt with the dangers of cloud computing. Because we created the cloud without dealing rationally with risk and security, security is an afterthought; there are huge holes in cloud computing security and therefore significant risk. David Bryan had a great presentation on the subject.

The other core topic, PCI, is generally thought of as a compliance issue. Anton Chuvakin put some context around PCI and how it fits as a basis for a security program. I’ve seen a number of organizations take this approach, and Anton did a nice job outlining the gaps that come with using the standard as a foundation. While no standard is ideal, it’s a start, and it generally kick-starts a maturation of risk management within organizations that adopt the approach.

Overall, the Secure360 conference was very good, and the speakers, both local and national, were great. Kudos to the organizers. I look forward to next year.


PCI Compliance: Now a Finance Issue as Well

As an information security professional, my experience within the payment card security industry has taught me that credit card fraud is not just an information security or information technology issue, but increasingly also a financial one. In order to process payment cards, organizations must execute agreements with financial institutions (“acquirers”) that legally obligate them to put appropriate controls in place to protect the underlying data. In most organizations, the finance and accounting teams are the most familiar with the business processes involved in the acceptance, chargeback, and settlement of credit card payments. Therefore, it is very important that the CFO and finance teams be involved in any effort to construct a sound credit card security program, one that seeks to minimize both the risk and the cost of compliance.

The payments community has learned that stolen credit card data is a valuable commodity among criminals; just ask the folks at TJX or Heartland Payment Systems, where breaches resulted in the exposure of credit card data for millions of people.

PCI DSS

The compliance requirements (and the fines for noncompliance) are starting to be pushed down from the credit card companies to financial institutions, or acquirers, who are in turn pushing them down to their customers (“merchants” and/or “service providers”) by contractually requiring those organizations to become PCI compliant. Organizations that have a single acceptance channel for credit cards (e.g., a point-of-sale terminal or the web) and use third-party software should self-assess via the Self-Assessment Questionnaire (SAQ). Financial professionals should use the prioritized approach published by the PCI Security Standards Council (SSC) to address specific risk areas within their organizations regarding credit card data. Organizations that have multiple acceptance channels (storefront, point of sale, and/or the web) and that store credit card data should involve a Qualified Security Assessor (QSA) if assistance is needed.

Upcoming dates for the standard

There are two important PCI-related dates that are fast approaching and that finance professionals should be aware of. July 2010 marks the date after which all merchants must use certified payment applications. A payment application is any application that accepts, transmits, or processes credit card data; examples include a card-swipe machine at a grocery store or a pay-at-the-pump application at a gas station. September 2010 concerns the PCI DSS itself: updates to the standard will be released that month and will take effect in January 2011.


PCI Assessors Meeting

I am currently on my way back from Las Vegas and the PCI (Payment Card Industry) Assessors Meeting. I guess it is appropriate that the Delta flight I am on is a cashless flight; you can now buy all the $5 Pringles you can eat with a credit card. But I digress; the real update here is regarding the PCI Assessors Meeting that I attended Thursday afternoon. In all, there were approximately 30 representatives of QSA (Qualified Security Assessor) and ASV (Approved Scanning Vendor) companies in attendance. Many of the people I spoke with felt the event was not well attended, some attributing the low turnout to the session not being well publicized.

The meeting provided a summary of the last couple of months within the payment card industry. The monthly newsletter was discussed, as well as the relevance of the topics it covers. A discussion of evidence gathering, and how evidence is verified, provided some information on the top 10 areas for improvement for the DSS (Data Security Standard) and PA-DSS (Payment Application Data Security Standard). Speaking of the PA-DSS, it was reported that there has been a 17% increase in the number of ROVs (Reports on Validation) being submitted. The recently released ASV Program Guide was summarized during the meeting, including the new reporting templates.

The question-and-answer session lasted longer than the presentation and covered a broad range of topics. There were questions related to assessment methodology and the need for consistency in approach among QSA firms. The QA Program will be updated in the fall to coincide with the update to the DSS. Updates or potential updates to the standard were not discussed, as the card brands and the SSC (Security Standards Council) want to make sure they adhere to the feedback lifecycle timelines that have been established.

Overall, the meeting would have been better received if more information had been provided regarding the proposed updates to the DSS. However, the card brands and the members of the SSC were willing to engage in productive conversation that will benefit the standard in the long term.


Are You Testing Your Web Application for Vulnerabilities?

As an organization that performs a large volume of code reviews and penetration tests, NetSPI is frequently asked which type of application assessment is the best option. Your primary options are a code review or a web application penetration test. Both are recommended, and both find many of the vulnerabilities commonly found in web applications as defined by the Open Web Application Security Project (OWASP) Top 10 (https://www.owasp.org). By itself, however, neither a code review nor a web application penetration test finds all of the vulnerabilities that threaten the application.

Why perform them?

Many regulations either require them or highly recommend them. For example, the Payment Card Industry Data Security Standard (PCI-DSS) requires that either a code review or a web application vulnerability security assessment (a web application penetration test) be performed annually on any web application that stores, processes, or transmits credit card data (PCI Requirement 6.6). In addition, for payment application vendors (e.g., point-of-sale applications), the PCI Payment Application Data Security Standard (PA-DSS) requires a code review and a penetration test targeting the OWASP Top 10.

Picking one over the other

Even though code reviews and web application penetration tests can find most of the same vulnerabilities, they look at the application differently, and as a result their findings can differ. Typically, both approaches find OWASP Top 10 issues such as SQL injection and cross-site scripting (XSS). However, the efficiency and effectiveness with which each method finds these vulnerabilities can differ. For example, code reviews are better at finding most instances of input validation issues (e.g., XSS or SQL injection). All of the automated code scanning tools NetSPI uses trace data within the application from its entry point to its exit point. A web application penetration test can find these instances, but it could take days or weeks to prove they exist in the application.
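
As a hypothetical illustration (the route, parameter name, and table below are invented for this example, not taken from any particular application), the following sketch shows the kind of source-to-sink data flow a code review or static analysis tool traces when hunting for SQL injection:

    # Hypothetical Flask handler used only to illustrate tainted data flow.
    # Entry point: the "username" query parameter; exit point: the SQL query.
    import sqlite3
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/profile")
    def profile():
        username = request.args.get("username", "")   # tainted input enters here

        conn = sqlite3.connect("app.db")
        # Vulnerable: user input is concatenated directly into the SQL statement,
        # so input like ' OR '1'='1 changes the meaning of the query.
        row = conn.execute(
            "SELECT email FROM users WHERE name = '" + username + "'"
        ).fetchone()

        # Safer alternative: a parameterized query keeps the input as data.
        # row = conn.execute(
        #     "SELECT email FROM users WHERE name = ?", (username,)
        # ).fetchone()

        return row[0] if row else "not found"

A reviewer or scanner flags the concatenated query because the tainted value reaches the database call unchanged; a penetration tester has to discover and confirm the same flaw by probing the live endpoint, which is where the extra time goes.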

Web application penetration testing

A web application penetration test can uncover vulnerabilities from the outside looking in. These vulnerabilities can be related to configuration or versions. An example might be an older version of Apache with a chunked encoding overflow vulnerability (https://osvdb.org/838). Web application penetration tests can also uncover vulnerabilities that are related to the operation of the application, such as default or easily guessed credentials. Other types of vulnerabilities found by a web application penetration test include:

  • Access control (forced browsing, etc.)
  • Session hijacking
  • Vulnerabilities related to business logic

In addition, a web application penetration test can find these issues more easily than a code review can. That being said, I am talking about a third-party code review, not a review done by a person or persons familiar with the code or the company’s development processes. Automated source code analysis tools do not find these types of vulnerabilities; manual review can, but it could greatly inflate the cost of the code review. A rough sketch of how a tester probes for some of the issues listed above follows.
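
As a rough, hypothetical sketch (the target URL, paths, and credential pairs are illustrative placeholders, not a definitive checklist), an outside-in check for forced browsing and default credentials might look like this:

    # Minimal sketch of outside-in checks a penetration tester might script:
    # forced browsing to unlinked pages and a default-credential login attempt.
    import requests

    BASE_URL = "https://app.example.com"   # placeholder target

    # Forced browsing: request resources that should require authorization.
    for path in ["/admin", "/backup.zip", "/config.php.bak"]:
        resp = requests.get(BASE_URL + path, timeout=10)
        if resp.status_code == 200:
            print(f"Accessible without auth: {path}")

    # Default credentials: try a handful of well-known username/password pairs.
    for user, password in [("admin", "admin"), ("admin", "password")]:
        resp = requests.post(
            BASE_URL + "/login",
            data={"username": user, "password": password},
            timeout=10,
        )
        if resp.status_code == 200 and "Invalid" not in resp.text:
            print(f"Possible default credentials: {user}/{password}")

None of this requires access to the source code, which is exactly why issues like these surface during a penetration test rather than a code review.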

Code Reviews

A code review looks at the application from the inside out. Vulnerabilities commonly found in a code review that cannot be easily found in an application penetration test include logging of sensitive data and application backdoors, as they are not exposed to the outside. Other types of vulnerabilities found by a code review include (a brief example follows the list):

  • Denial of service caused by not releasing resources
  • Buffer overflows
  • Missing or poor error handling
  • Dangerous functions
  • Hardcoded passwords or keys in the source code
  • Code implementation problems
  • Missing or poor logging
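
As a hypothetical snippet (the function and variable names are invented for illustration), the following shows two of the issues above, a hardcoded credential and logging of sensitive data, that are obvious in source but invisible from the outside:

    # Illustrative only: flaws a code review spots that never appear in HTTP traffic.
    import logging

    DB_PASSWORD = "Sup3rS3cret!"   # hardcoded credential embedded in source

    def charge_card(card_number: str, amount: float) -> None:
        # Sensitive data written to the application log in cleartext.
        logging.info("Charging card %s for %.2f", card_number, amount)
        # ... call out to the payment gateway here ...

Neither problem changes the application’s external behavior, so a penetration tester would rarely stumble across them; a reviewer reading the source sees them immediately.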

Final Thoughts

The most comprehensive approach to finding security vulnerabilities in web applications is performing both a code review and a web application penetration test. For critical applications, performing only one of these services can result in many vulnerabilities remaining within the application and unacceptable risk to the organization.

