May 2012 Archives

Certifications Won't Solve Anything

Hopefully this doesn't come off as "pick on SGS Blog" day, but there was a passing comment in a new post there today that I felt warranted a quick response. Specifically:

"Maybe NBISE, in addition to creating a better breed of cybersecurity practitioner, can help define and grow a corps of energy sector security executives comfortable with working at the BoD and C-suite level."

I've half-watched, with heavy eye-rolling, the early efforts of the National Board of Information Security Examiners (NBISE). My cynicism grows out of seeing tons of certifications in this industry that are meaningless, and yet overhyped as some sort of useful panacea. Much of this can be tied to the CISSP, which was effectively mandated in U.S. DoD Directive 8570.1. The net effect of that mandate has been to see a glut of under-qualified CISSPs in the work force flogging a certificate that they think proves that they're experienced "subject-matter experts." Sadly, having sat through tons of interviews with folks like that, I can tell you that it's not true. Incidentally, completing a SANS course or a basic certification is comparably shallow and only indicative of an interest in career development, not expertise.

Anyway... I could rant on endlessly, but I won't. Instead, I just want to highlight that - as a society - we have not ever really struck on a good way to assess competency. Degrees, certificates, etc., all provide a baseline measure of a degree of information learned, but they don't generally demonstrate actual competency. Note that many skilled and high-risk professions (e.g., electricians, plumbers, carpenters, doctors, police, firefighters) actually have fairly extensive additional training programs, complete with mentorship or apprenticeship periods that must be completed prior to being allowed to go solo (and, in high-risk professions, "flying solo" is often not allowed as a standard practice). It seems to me that this should be a consideration for our field(s) in the future.

Lastly, I do agree with the overall point of the post (and the original WSJ article): we do need to be reaching out to the Board and educating them on their complete risk profile, including IT (operational) risk. This practice should be part of a standard briefing, provided in business terms, and tied to a commercially reasonable, legally defensible plan of action. But you already knew that... ;)

Published Metrics Can Be Overrated

I encountered an interesting post yesterday over on the Smart Grid Security blog. In it, the author asked the question:

"Without a lingua franca for security, how will anyone ever know which organizations are doing a comparatively better or worse job? Whether one's own organization is kicking butt or having its butt kicked?"

It's an interesting question, and something that has been discussed a lot over the years. However, I think I've reached a new conclusion in my thinking on the subject, which is this: I only think a very few Key Performance Indicators are appropriate, useful, or necessary when it comes to public sharing.

Specifically, coming from an engineering perspective, I think the standard IT KPIs of availability, mean time between failures (MTBF), mean time to repair (MTTR), and unplanned downtime matter most, even with regard to security. We may also want to add some sort of reasonable risk assessment, as it relates to the SEC Q1 filing guidance, but those numbers will tend to be stated more as dollars (which we understand) rather than as operational metrics.
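To make those four KPIs concrete, here is a minimal sketch of how they fall out of an ordinary incident log. The outage data and reporting period are entirely hypothetical; the point is just that these metrics need nothing more exotic than outage start/end times.

```python
# Toy sketch: deriving the four IT KPIs from a log of unplanned outages.
# The outage tuples (start_hour, end_hour) are hypothetical sample data.

PERIOD_HOURS = 30 * 24  # one month of scheduled service time

# Each unplanned outage recorded as (start, end), in hours from period start.
outages = [(100, 102.5), (400, 401), (650, 655)]

downtime = sum(end - start for start, end in outages)   # unplanned downtime
availability = (PERIOD_HOURS - downtime) / PERIOD_HOURS
mtbf = (PERIOD_HOURS - downtime) / len(outages)         # mean operating time between failures
mttr = downtime / len(outages)                          # mean time to repair

print(f"Unplanned downtime: {downtime:.1f} h")
print(f"Availability:       {availability:.4%}")
print(f"MTBF:               {mtbf:.1f} h")
print(f"MTTR:               {mttr:.1f} h")
```

A security incident that knocks a service over shows up in exactly the same ledger as any other failure, which is part of the argument for folding "security" into operational reliability reporting.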

Now, don't get me wrong... I think that each organization should have other measurements. However, what each org measures, and how they use those metrics, will tend to be unique/specialized. Rather, my point here is that there are really very few publicly sharable metrics that matter, and I think that they come back to what we already know as useful IT KPIs. More importantly, moving to this approach dovetails perfectly with a topic that I'll be writing about soon, which is eliminating "security" as a category altogether in favor of focusing on IT operations reliability and GRC practices.

Here's an interesting dilemma... how does one go about estimating losses in the public sector? NIST RMF side-steps this problem by advising people to assume the worst-case scenario for their estimates, but this leads to all sorts of problems (if everything is "critical," then how do you prioritize?). Given my background with FAIR, I've thought that perhaps it could show me a better way through this question... however, it's a bit of a pickle!

First, a quick primer on FAIR and loss estimates: In estimating losses, FAIR splits the estimate between direct losses to the primary stakeholder and losses triggered by secondary stakeholders. Losses are then estimated (using calibrated ranges and confidence statements) in 6 categories: Productivity, Response, Replacement, Competitive Advantage, Fines & Judgments, and Reputation. In most cases, the first 3-4 categories tend to be primary losses, while the last 2-3 tend to be secondary losses.
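The mechanics of "calibrated ranges" can be sketched with a small Monte Carlo simulation. The dollar ranges below are invented for illustration, the primary/secondary split follows the rough 3–4 vs. 2–3 grouping above, and a triangular distribution stands in for the PERT-style distributions FAIR tooling typically uses:

```python
# Minimal Monte Carlo sketch of a FAIR-style loss estimate.
# All (low, most-likely, high) dollar ranges are hypothetical, and
# random.triangular is a simple stand-in for a calibrated PERT distribution.
import random

random.seed(42)

# Primary-stakeholder loss forms: (low, mode, high) per-event dollars.
primary = {
    "Productivity": (10_000, 50_000, 200_000),
    "Response":     (5_000, 20_000, 100_000),
    "Replacement":  (1_000, 10_000, 50_000),
}
# Secondary-stakeholder loss forms.
secondary = {
    "Fines & Judgments": (0, 25_000, 500_000),
    "Reputation":        (0, 40_000, 1_000_000),
}

def simulate(categories, trials=10_000):
    """Sample total loss across all categories, once per trial."""
    return [
        sum(random.triangular(lo, hi, mode) for lo, mode, hi in categories.values())
        for _ in range(trials)
    ]

totals = sorted(p + s for p, s in zip(simulate(primary), simulate(secondary)))
print(f"Median loss:          ${totals[len(totals) // 2]:,.0f}")
print(f"90th percentile loss: ${totals[int(len(totals) * 0.9)]:,.0f}")
```

Reporting the result as a distribution (median, tail percentiles) rather than a single number is what lets the ranges stay honest about uncertainty.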

However, let's now turn this around to the public sector. Assuming that they're the primary stakeholder, and that the public and other entities are the secondary stakeholders, can we produce a reasonable loss estimate? First off, let's think about those 6 categories... we can immediately remove the last 3 (CompAdv, F&J, and Rep) as not applying. The government doesn't seem to fine itself, and there's really not much you can do if they're compromised. After all, so long as you're within the borders of the US, you're subject to the US Government. It's not like you can physically stay put and opt out to a different government. This just leaves us Productivity, Response, and Replacement. Leaving "government productivity" jokes aside, it's pretty clear that any loss estimates here should be fairly low, and thus not necessarily meaningful or compelling. So, perhaps this is a failed approach...

What then would be a better approach? One notion floated is to flip the stakeholders. What if you were to first estimate the loss to the public as the primary stakeholder, and then considered other costs (such as to the government itself) as the secondary stakeholder losses? That is perhaps a lot more interesting, since there may be some reasonable arguments that the compromise of certain datasets will have a sizable negative impact on the public (especially when viewed as a whole - so each individual loss rolled up to a large aggregate). Suffice to say, this line of thinking certainly opens the door to a more compelling analysis, and it's definitely worth exploring further...

What do you think?

This is an incomplete thought...

Using an analogy to healthcare or epidemiology is certainly not a new thing. Some circles have been talking about this idea for quite a while. In fact, one need only think about malware being referred to as "viruses" to get an immediate connection. It's also fairly similar to the ecological analogy that some have posed in the past, particularly as it relates to application security.

That said, I noticed this week at Secure360 that many risk management people were now talking about the analogy to epidemiology, not only as it relates to evidence-based medicine and evidence-based risk management, but also as an overarching concept.

I've not had adequate time to fully parse through this notion, but intuitively I rather like the concept. It seems to map fairly cleanly to many security and risk management problems, and it certainly aligns very well with the imperative for business survivability. Whether it will continue to hold up alongside other practices remains to be seen, but as a starting point we could do much worse. It also provides a very good case of where compliance regimes can be beneficial (think of all the places where checklists are relied upon to ensure patient safety and wellbeing).

Once this idea has had some time to percolate, I'll try to loop back and write more about it...

SIRAcon Wrap-up

This past Monday we held the very first Society of Information Risk Analysts Conference (SIRAcon). The event was hosted in the same venue as, and in coordination with, the Secure360 conference. For a first-time conference, this was a really remarkable event. We had about 35 attendees, all of whom were very engaged in the conversation. Overall, we could not have asked for a much better event.

Personally, my favorite part of the conference was the "risk practitioners' panel" that I helped organize (I know that sounds a bit ego-centric, but it was a lot of fun!). It was great to have such a range of talented, experienced risk management professionals talk about their experiences, challenges they've encountered, and how they see the future unfolding.

Given the success of this inaugural event, I think it's safe to say that there will be another. It's obviously way too early to say when and where it will be, but it'll definitely happen. We hope that you'll be able to join us next time!

The topic of "cybersecurity" is once again very hot in Washington, DC. Unfortunately, this means it's in the domain and purview of politicians, which should make any self-respecting professional wince. After all, it's not often that politicians get regulations "just right"... one need only look at recent failures like No Child Left Educated (er, Behind, I suppose) to see just how bad things can get when politicians cross the line from legislating toward outcomes to legislating very specific practices. The electricity sector provides another ready example, though a bit more complex, insomuch as the detailed NERC Critical Infrastructure Protection (CIP) requirements have overwhelmed organizations that have displayed an underwhelming sense of urgency or competency around the topic of cybersecurity.

The point to this mild rant is simply this: the more deeply politicians seem to get involved with cybersecurity, the worse things seem to get. And, lest we be led astray, we should not forget that, aside from the Education sector, civilian agencies in the federal government are perhaps the worst offenders when it comes to failing to implement reasonably solid cybersecurity. There are a few reasons why I think this is the case.

About this Archive

This page is an archive of entries from May 2012 listed from newest to oldest.

April 2012 is the previous archive.

June 2012 is the next archive.

