Seth Peter
More by Seth Peter
Manual vs. Automated Testing (January 22, 2010)

I’ve always been a firm believer in incorporating manual testing as part of any security assessment; after all, a human is the best judge of application output and best able to truly understand how an application is supposed to function. But after attending Darren Challey’s (GE) presentation at the 2009 OWASP AppSec conference, I was encouraged to see that someone had actually measured the value of manual testing – and justified my belief! According to Darren, no single application assessment or code review product could find more than about 35% of the total vulnerabilities GE could find with a manual process. That alone should encourage anyone serious about eradicating vulnerabilities within their applications to step it up a notch. I would not want to be the person certifying an application for public consumption with only about a third of its security issues even identified!

To understand why manual testing is so critical, let’s break down some of the reasons why assessment tools have limitations. For network scanners, vulnerability findings are largely based upon remote OS and application footprints; accuracy will decrease if that footprint is inaccurate or masqueraded. Application scanners must try to interpret application output; if an application uses custom messaging, what is the scanner supposed to think? Code review products are never going to be able to accurately interpret code comments, identify customized backdoors, or follow application functionality that might appear orphaned. Keep in mind, too, that an assessment product will only report on something if the vendor has written a check or signature for it; think about how many vulnerability signature authors exist compared to the number of hackers identifying new exploits.

Automated testing has a very important role in security assessments: these tools help us identify a large swath of mainstream issues efficiently. Manual testing, by contrast, can be expensive and time consuming. However, fixing vulnerabilities after an application or system is in production is also expensive and time consuming. According to the Systems Sciences Institute at IBM, a production or maintenance bug fix costs 100x more than a design-stage fix, and the cost of a breach increases every year. Adding comprehensive manual testing to your assessment criteria does have an ROI and, more importantly, could improve your detection accuracy by 60% or more!
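Those figures lend themselves to a quick back-of-the-envelope model. The short Python sketch below is purely illustrative, not part of the original study: the vulnerability count and dollar figures are hypothetical, and it simply combines the ~35% single-tool coverage and the 100x production-fix multiplier cited above to show why the ROI argument holds.

```python
# Back-of-the-envelope model of the figures cited above (illustrative only).
# Assumptions: automated tools alone catch ~35% of the vulnerabilities a
# combined manual + automated assessment would find, and a production bug
# fix costs ~100x what the same fix costs at design time.

TOTAL_VULNS = 100            # hypothetical vulnerability population
AUTOMATED_COVERAGE = 0.35    # share found by the best single tool (per the GE figure)
COMBINED_COVERAGE = 1.00     # share found once manual testing is added
DESIGN_FIX_COST = 1_000      # hypothetical cost to fix a finding pre-production ($)
PROD_FIX_MULTIPLIER = 100    # IBM Systems Sciences Institute estimate

found_by_tools = TOTAL_VULNS * AUTOMATED_COVERAGE
found_with_manual = TOTAL_VULNS * COMBINED_COVERAGE
extra_found = found_with_manual - found_by_tools

# Assume every vulnerability missed before release eventually surfaces in production.
cost_if_caught_early = extra_found * DESIGN_FIX_COST
cost_if_missed = extra_found * DESIGN_FIX_COST * PROD_FIX_MULTIPLIER

print(f"Coverage gain from manual testing: {extra_found:.0f} percentage points")
print(f"Cost to fix those findings pre-production: ${cost_if_caught_early:,.0f}")
print(f"Cost to fix the same findings in production: ${cost_if_missed:,.0f}")
```

With these assumed inputs the coverage gain is 65 percentage points, which is where the "60% or more" claim comes from; swap in your own cost figures to estimate the ROI for your environment.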
Vulnerability Scanning with Multiple Products (November 2, 2009)

Should you rely on just one solution to identify all of your vulnerabilities? Most of us rely upon just one anti-virus scanner, right? Every vulnerability scanner claims to be better than its competitors, but how can that be? Where is the Consumer Reports on this subject?

I think there is a mix of reasons why this subject has not been picked up by the likes of Gartner or Forrester: it is quite technical and hard to understand, and the audience may be too small. I recently asked two independent security test labs whether vulnerability scanning products are ever tested and compared against one another, with the results then published. The short answer is no. Products are often benchmarked against standard criteria, and results are privately reported according to whether or not they meet the minimum criteria. There have been some rogue studies on the subject, and I have conducted extensive testing myself. I can confirm that certain products are better than their competitors, but not in all areas. Because there are no well-defined standards or readily available test results, security practitioners are left using a vulnerability scanner that performs like a piano with many keys out of tune. In our own testing we have seen variations of up to 60% among leading products. In addition, a scanner’s comprehensiveness and accuracy depend on which operating systems, applications, and configuration settings you have, and on whether your scanner vendor agrees that a particular vulnerability is important enough to test for.

In a decade-old product space, we have not seen complete maturity of either the space or the products themselves. During this time there have been a number of acquisitions of product vendors, and some of those acquired products no longer exist. At the same time, new and exciting products and vendors continue to emerge. The requirements of a scanner have evolved from OS-level service checks to include web application vulnerabilities, authenticated configuration testing, and zero-day attacks. Within the typical server environment, so many vulnerabilities are identified time and time again that many organizations find it difficult to embrace the idea that there may actually be more vulnerabilities out there going undetected.

If your security team is a capable one, I encourage you to incorporate both commercial and open source tools, and even to consider introducing more than one commercial product. If you outsource this service, ask your vendor what products it tests with and whether it can consolidate all findings from all products into one comprehensive report. In lieu of product comparison benchmarks, this approach may be your best option to ensure you are not leaving large areas of vulnerabilities undiscovered. Keep in mind that if you hire a product vendor to perform your assessment, its professional services team may not be able to use a different vendor’s product within its own solution. For those of you concerned by the thought of too many vulnerabilities, check back in a couple of weeks, as I plan to discuss some techniques for vulnerability prioritization and remediation.
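Running multiple scanners only pays off if their findings are actually merged rather than reviewed as separate reports. As a rough illustration of that consolidation step, here is a minimal Python sketch; the record format, hostnames, and scanner names are hypothetical, and real scanner exports (XML, CSV, JSON) would first need to be normalized into something similar.

```python
# Minimal sketch of consolidating findings from multiple scanners into one
# de-duplicated report. The (scanner, host, port, id, title) records below
# are hypothetical stand-ins for normalized scanner exports.
from collections import defaultdict

findings = [
    ("scanner_a", "10.0.0.5", 443, "CVE-2009-3555", "TLS renegotiation flaw"),
    ("scanner_b", "10.0.0.5", 443, "CVE-2009-3555", "SSL/TLS renegotiation vulnerability"),
    ("scanner_b", "10.0.0.7", 80, "CVE-2009-1234", "Example web server flaw"),
]

# Key findings on (host, port, identifier) so the same issue reported by two
# products collapses into a single row that records which scanners saw it.
merged = defaultdict(lambda: {"titles": set(), "scanners": set()})
for scanner, host, port, vuln_id, title in findings:
    entry = merged[(host, port, vuln_id)]
    entry["titles"].add(title)
    entry["scanners"].add(scanner)

for (host, port, vuln_id), entry in sorted(merged.items()):
    flag = " (reported by only one product)" if len(entry["scanners"]) == 1 else ""
    print(f"{host}:{port} {vuln_id} - {' / '.join(sorted(entry['titles']))}"
          f" [{', '.join(sorted(entry['scanners']))}]{flag}")
```

Keying on host, port, and vulnerability identifier is a simplification; in practice you would also reconcile vendor-specific IDs and severity scales before reporting.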
Are We Ready for a Security Software Assurance Program? (October 5, 2009)

Integrating security checks and balances with your application development processes is certainly uncharted territory for many security professionals. Why is this so? With the multitude of benefits that custom-developed applications bring to an organization comes a multitude of risks, namely that sensitive, regulated, and confidential data is being stored, processed, transmitted, and exchanged inside and outside the boundaries of the organization. Why don’t we have a better handle on these risks? I think that we as security professionals have missed an opportunity to embed security in the development process. Let’s begin to understand why this is and how we can go about changing it.

Step back a few years to when your security program was just getting off the ground. Developed applications were supposed to just work right. Applications either met design specs or they didn’t; wasn’t it that simple? Why did security need to be involved? Back in the day, applications were always quarantined behind the firewall, and business partners only received data via a data exchange solution. Out of scope for security, right? Now, fast forward from the days of firewalls and anti-virus concerns and take notice: everyone who has any reason to access your data can, from anywhere, anytime. Sure, you’re protecting it with a strong password, maybe even two-factor authentication, a VPN, or rock-solid acceptance and confidentiality agreements. You’re conducting vulnerability scans on the DMZ nightly, right? Would you ever dream of scanning that application nightly? In production? Are you crazy? That might cause an outage. But hold on, and consider what else you may be missing.

Good application security practices require security checks and balances throughout the entire lifecycle, not just a security assessment at the end of the line. These checks and balances include things like secure coding standards, developer training, well-defined security requirements, security code reviews, the use of sanitized test data, separation of duties, security testing during development checkpoints, security use case testing, and penetration testing. So, if that’s where we need to be, how do we get there? Is the answer a Security Software Assurance Program? It sounds like a great concept: a prescriptive program that includes everything from A to Z for application security; all you need to do is implement it and watch the needle on the risk meter drop. Let’s get real. Budgets are tight, development timelines are tight, and security programs are underfunded. Can you realistically launch a massive and invasive security program? Once again, I think we missed an opportunity somewhere between the 1990s and now to ensure that security and application development work in tandem. So what do you do now? I suggest you think big, but start smart.
Pick two or three obvious pain points and start there, e.g., with more testing or better requirements. If there is one thing to get in place sooner rather than later, start training your development leads (literally, scare the daylights out of them). Keep in mind that you will eventually need to turn this into a full-on program, so begin working towards it, but start by re-acquainting yourself with your developers and providing them with some high-value wins. Help them find security issues in-house before someone else does.
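One concrete way to deliver an early, high-value win is to drop a handful of security "use case" tests into the existing test suite so they run at every development checkpoint. The sketch below is only an illustration of the idea, not a prescribed tool: it assumes the third-party requests library, and the staging URL, endpoints, form fields, and expected behaviors are all hypothetical and would need to match your own application.

```python
# Minimal sketch of security "use case" tests that can run at a development
# checkpoint alongside the functional test suite (e.g., via pytest).
# BASE_URL, the endpoints, and the form fields below are hypothetical.
import requests

BASE_URL = "https://staging.example.com"

def test_search_handles_sql_injection_payload():
    # A classic tautology payload should be treated as ordinary input:
    # no server error and no raw database error text in the response.
    resp = requests.get(f"{BASE_URL}/search", params={"q": "' OR '1'='1"}, timeout=10)
    assert resp.status_code < 500
    assert "sql syntax" not in resp.text.lower()
    assert "odbc" not in resp.text.lower()

def test_session_cookie_flags():
    # The session cookie should carry Secure and HttpOnly so it is never sent
    # over plain HTTP or exposed to client-side script.
    resp = requests.post(
        f"{BASE_URL}/login",
        data={"username": "qa_user", "password": "qa_password"},
        timeout=10,
    )
    set_cookie = resp.headers.get("Set-Cookie", "").lower()
    assert "secure" in set_cookie
    assert "httponly" in set_cookie
```

Even a small set of checks like this gives developers fast, repeatable feedback long before a formal assessment or penetration test.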
Social Media and Corporate Guidance (August 11, 2009)

One of the common themes I took away from the 2009 Blackhat Briefings was the inherent security risk associated with using social media and networking sites. (These concerns have also received some coverage in the trade press; see, for example, a recent Computerworld article: https://tinyurl.com/mc7yb8.) Social media applications are not just a personal computing trend; they have also become integrated into our corporate cultures. Many organizations are using these sites for corporate marketing, file sharing, communications, and recruiting. In the past, the corporate policy of most organizations was not to post resumes online, not to use your corporate email account as your username on a website, and not to post pictures from the company holiday party on a website (at least the potentially incriminating ones). Now, corporations are eager to get the word out about how great it is to work there, to connect with employees, to share what events they will be attending, and even to post opinionated blog entries such as this one. While these applications can open great new doors, they need some associated corporate guidance. I say guidance because a more explicit security policy regarding usage of Twitter, LinkedIn, or Facebook is likely to be unenforceable. Employees may refrain from using corporate accounts for these applications, but if they like them, they will find ways to use them.

Here are some basic guidance points that you may want to include in your next security-awareness email:
- When using social media sites, be sure to use different passwords for different sites, and never use your corporate password. These sites have varying password reset controls; don’t let a breach of one account impact all your accounts.
- Remember, in the case of company documents: if it isn’t meant for the company’s public website, it probably isn’t meant to be shared on someone else’s, even if they told you it is secure. Watch out for sites like Google Docs or yammer.com that create a perception of privacy and security. Let your security team determine which sites are acceptable.
- There are a couple of key items that you should never post publicly, such as your birth date, Social Security number, or employee ID. If a site requires such data, consider making something up or ensuring it is not displayed in a public profile.
- There are certain items that companies don’t technically classify as confidential, yet keeping them quiet and off the social networking sites is still a good thing. These could include rumors, planned purchases, technologies in use, and projects you’re working on. Posting your job history may be fine, but for current activities, keep it generic.

Is your Compliance Driven by More Than an Audit? (July 14, 2009)

Preparing for an audit can be one of the best ways to fund and improve your security program, but this “stimulus package” for your compliance effort typically dwindles once an organization completes or passes an audit. I see this happen frequently with recurring or annual audits, but it is particularly relevant given the recent news about Merrick Bank. Specifically, Merrick Bank is suing Savvis for certifying CardSystems Solutions as Visa CISP compliant prior to a breach that exposed some 40 million payment card records and resulted in $16 million in fines owed to the card brands. While this is not the first breach of a PCI-audited company, it is the first in which the auditor has been sued. The case raises an important question: who is ultimately responsible for ensuring that a good security program is in place? Here are some simple, yet critical, points to ensure your security program is driven by something more than the audit itself.
- Understand the role of the auditor. When preparing for or undergoing an information security audit, it is critical for organizations to consider the role the auditor plays within a security program. That role should never be filled by a member of your security team or by a designer or implementer of your security systems; it must strictly be a reviewer of your security state at a point in time. While some coaching and direction can be good, all decisions and program enhancements must be driven by the organization itself.
- An audit is not a design session. If your security program design is based heavily on the initial audit gap report, your program will not be sustainable. Although you and your auditor share the same goal of ensuring you are compliant, your auditor’s coaching will be focused on one thing: meeting a specific requirement. A program designed that way merely meets the standard and does not take into consideration sustainability, a holistic approach, or integration with existing business requirements.
- If you are not 100% ready for the audit, you should not be audited. Because an audit is intended strictly to be an independent review of the security program, if your organization does not feel it can meet all aspects of the audit, you should close those gaps first. Don’t treat your audit as a pass/fail game between you and your auditor; that is not the point of being measured, nor is it a best practice. Plus, any audit worth having requires a joint attestation by you and your auditor; if you can’t sign off on it, neither should the auditor.
- Aim higher than the compliance requirements alone. Information security is one of those areas where going above and beyond the call of duty can be a good thing. Compliance requirements are meant to be the minimum standard by which you are allowed to operate; strive to exceed them. Don’t budget based on just meeting the requirement; budget based on what you think is required for your organization to manage risk effectively. Yes, that’s risk, not compliance.

As an information security auditor and advisor, I have seen numerous organizations pour budget and resources into a compliance initiative and then literally stop everything once the audit has been conducted. By conveying the importance of building a program based on security that also meets compliance, an organization will be better prepared to defend against breaches and satisfy its auditor at the same time.