28 February 2014

Contextually relevant information security metrics


In "Business Analytics - An Introduction", Evan Stubbs describes "value architecture" in these terms: "Results need to be measurable, they need to be contextually relevant, they need to link into a strategic vision, and their successful completion needs to be demonstrable".

Breaking that down, I find that there are really only two key factors. If results are measurable, that implies to me that they can be demonstrated. Also, it's hard to see how results that are 'contextually relevant' might not 'link into a strategic vision' since that is the context, or at least a major part of it. So, in short, results need to be both relevant and measurable.

Of those two aspects, measurability is the easier. Read "How to Measure Anything" by Douglas Hubbard! Evan also talks about objectivity, and he is writing in the context of big data analytics, meaning the difficult problem of extracting useful meaning from huge and dynamic volumes of complex data. Measurability is largely a matter of mathematics, or more precisely statistics. I agree it is a major issue but, with all due respect to the statistical wizards, a fairly mechanistic and logical one.

That leaves the more awkward question of relevance. What are 'contextually relevant results'? Evan pointed out the strategic element, implying that relevant results can be extrapolated from corporate strategies. Strategies typically elaborate on how the organization intends to achieve identified long-term goals and interim objectives, often using the metaphor of a journey across a landscape passing waypoints en route to a destination. That in turn suggests the idea of measuring the actual direction and speed of travel - the trajectory - relative to the planned route as well as proximity to the eventual goal. Metaphorically speaking, it's more efficient to take a direct route than to be constantly side-tracked and diverted, and perhaps get lost.

So how does this relate to information security metrics? Evan implies that we need to define the intended results of information security in terms that are both relevant and measurable.

Again I will side-step the measurability angle in order to focus on relevance. How is information security relevant to the organization?

In strategic terms, information security can be expressed in several different ways. Usually, we talk about protecting information assets, the defensive perspective. In this frame of reference, information systems, networks and information need to be defended against all manner of threats that would harm them. Relevant metrics here tend to relate to measuring and assessing the risks (threats, vulnerabilities and impacts, including security incident and business continuity metrics) and security controls (especially control efficiency and cost-effectiveness, implying most financial security metrics). Compliance is a classic defensive objective, hence compliance-related metrics also fit into this group.

Some of us also talk in terms of information security as a business enabler - letting the organization safely do business that would otherwise be too risky. Here we're thinking more proactively: security has an offensive as well as a defensive strategic role. Relevant metrics in this domain include the assurance angle, giving management confidence in the security arrangements so that they can concentrate on taking the most direct route. Hence control reliability metrics, plus various test and audit results, are in this group. Proactively exploiting strengths in information security also implies going beyond mere compliance (which, it has to be said, is a low hurdle) towards good or even best practice. Security maturity, benchmarking and governance metrics are relevant to business enablement. Measures of the integration of information security into various business and IT processes and systems are an example.

What are we left with?  Mmm, I'm not sure I can think of any information security metric that doesn't fit into one or other of those categories. Can you?

26 February 2014

Holistic security metrics

Yet again today I find my blood pressure rising as I read yet another incredibly biased pronouncement on security metrics from a security vendor:
"Do you know what security metrics are right for your organization? For a holistic view, both network and host metrics are required, including firewalls, routers, load balancers, and hosts."
To claim that having network and host security metrics qualifies as holistic almost beggars belief, by any thinking person's definition of the term, but I'm afraid it's typical of the incredibly myopic, purely technical perspective on security metrics, continually reiterated for blatantly obvious marketing reasons by the purveyors of ... IT security products.

Being sick and tired of explaining that IT security is a dead end off the main information security highway, I'll merely suggest a few non-technical security metrics that might get us a tiny bit closer towards a truly holistic view:

For a holistic view of information security, I respectfully submit that "network and host metrics" fall woefully short of sufficient. They are needed, yes, but they are definitely not enough.

12 February 2014

PRAGMATIC Security Metric of the Quarter #7

PRAGMATIC Information Security Metric of the Seventh Quarter


According to the overall PRAGMATIC scores assigned by ACME's managers, the latest metric discussed was the top choice in the three months just past, but it was a close-run thing:

Example metric                                                          P    R    A    G    M    A    T    I    C  Score
Information security incident management maturity                    90   95   70   80   90   85   90   85   90    86%
Information security ascendancy                                      97   87   15   94   86   90   99   97   99    85%
Quality of system security                                           83   88   83   73   90   68   80   82   10    73%
Integrity of the information asset inventory                         82   66   83   78   80   43   50   66   70    69%
Proportion of systems security-certified                             72   79   73   89   68   32   22   89   88    68%
Number of different controls                                         71   75   72   75   88   30   50   65   43    63%
Controls consistency                                                 78   83   67   60   71   33   27   31   27    53%
Value of information assets owned by each Information Asset Owner    48   64   78   57   79   38   50   22   26    51%
Number of information security events and incidents                  70   60    0   50   72   35   35   70   50    49%
% of business units using proven identification & authentication     69   73   72   32   36    4   56    2   50    44%
Distance between employee and visitor parking                         1    0    6   93    2   93   66   45   66    41%
Employee turn vs account churn                                       30   30   11   36   44   36   62   57   20    36%
Non-financial impacts of information security incidents              60   65    0   20   60    6   30   20   17    31%
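
As an aside, the arithmetic behind the Score column is straightforward: each overall PRAGMATIC score is simply the mean of the nine criterion ratings, rounded to a whole percentage. Here's a minimal sketch in Python (illustrative only, not taken from the book) that reproduces the ranking; only a few of the rows above are included for brevity:

```python
# Illustrative sketch: the overall PRAGMATIC score is the plain mean of
# a metric's nine PRAGMATIC criterion ratings, rounded to a whole percentage.

ratings = {
    "Information security incident management maturity": [90, 95, 70, 80, 90, 85, 90, 85, 90],
    "Information security ascendancy":                   [97, 87, 15, 94, 86, 90, 99, 97, 99],
    "Quality of system security":                        [83, 88, 83, 73, 90, 68, 80, 82, 10],
    # ... remaining rows elided for brevity ...
    "Non-financial impacts of information security incidents": [60, 65, 0, 20, 60, 6, 30, 20, 17],
}

def pragmatic_score(criterion_ratings):
    """Overall PRAGMATIC score: the unweighted mean of the nine ratings."""
    return sum(criterion_ratings) / len(criterion_ratings)

# Rank the metrics, highest-scoring first, as in the league table.
for name, scores in sorted(ratings.items(), key=lambda kv: pragmatic_score(kv[1]), reverse=True):
    print(f"{pragmatic_score(scores):3.0f}%  {name}")
```

Swapping in the full set of rows from the table reproduces the league table exactly, give or take rounding.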



"Maturity of the organization's information security incident management activities" seems to us to be an excellent proxy or indicator for the organization's overall approach to information security. The maturity scoring process we have described makes this a valuable metric, not just in terms of the final maturity rating but also the additional information that emerges when comparing current practices against accepted good practices.

Just as interesting are the metrics languishing at the bottom of the league table. For example, "Non-financial impacts of incidents" may appear, at first glance, to hold considerable promise as a security metric, but the PRAGMATIC score clearly indicates ACME management's severe misgivings once they explored the metric in more detail.

Instead of simply selecting metrics on the basis of their overall PRAGMATIC scores, management could select high-rating metrics for any one of the individual PRAGMATIC criteria, or any combination thereof - for example, 'information security ascendancy' is rated the most predictive and cost-effective security metric of this little lot.

In researching and developing the PRAGMATIC method for the book, we explored the possibility of weighting the PRAGMATIC ratings in order to place more or less emphasis on the criteria. There may be situations where that is a sensible approach but, in the end, we decided that the overall PRAGMATIC score was the most valuable and straightforward metametric.
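
For what it's worth, here is a hypothetical sketch of how such weighting might work; the weights are invented purely for illustration, since (as noted above) the book itself settles on the plain unweighted mean:

```python
# Hypothetical weighted variant of the PRAGMATIC score. The weights below
# are made up for demonstration only.

def weighted_pragmatic_score(criterion_ratings, weights):
    """Weighted mean of the nine criterion ratings."""
    assert len(criterion_ratings) == len(weights) == 9
    return sum(r * w for r, w in zip(criterion_ratings, weights)) / sum(weights)

# Example: double the emphasis on the final (Cost-effectiveness) rating.
weights = [1, 1, 1, 1, 1, 1, 1, 1, 2]
maturity = [90, 95, 70, 80, 90, 85, 90, 85, 90]   # incident management maturity
print(f"{weighted_pragmatic_score(maturity, weights):.1f}%")  # 86.5% vs 86% unweighted
```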

06 February 2014

SMotW #91: incident management maturity

Security Metric of the Week #91: information security incident management maturity


Notwithstanding the photo, we're using 'maturity' here in the sense of wisdom, stability and advanced development, rather than sheer age! The idea behind maturity metrics is to assess the organization against the current state of the art, also known as good practice or best practice.

This particular metric measures the organization's processes for managing (identifying, reporting, assessing, responding to, resolving and learning from) information security incidents. 

That's all very well in theory, but how do we actually identify good/best practices, and then how do we measure against them?

The maturity metrics described in PRAGMATIC Security Metrics employ a method that I developed and have used very successfully over three decades in information security and IT audit roles. The scoring process breaks the area under review down into a series of activities, and offers guidance notes or criteria for bad, mediocre, good and best practice in each of those activities, based on an appreciation of the related risks and control practices gained from experience and research. The scoring tables contain a distillation of knowledge in a form that gives reasonably objective guidance for the assessment, without being overly restrictive.

The approach is flexible, since a table is readily updated as new practices and issues emerge, either by amending the wording of existing rows or by adding new ones. (Sources include good and not-so-good practices discovered in the course of my audits, assessments and consultancy work across hundreds of organizations and business units, plus advice gleaned from standards, advisories, textbooks, vendors, blogs and so forth.) Furthermore, the assessor has some latitude at run-time (during the assessment) to read between the lines, applying his or her expertise and knowledge in determining how well the organization is really doing against each of the criteria. The metric deliberately and consciously blends objectivity with subjectivity, through a measurement process that turns out to be surprisingly useful, informative and repeatable in practice.
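
To make the shape of such a scoring table a little more concrete, here is a minimal sketch; the activities, criteria and score points are invented placeholders for illustration, not the book's actual tables:

```python
# Minimal sketch of a maturity scoring table of the general kind described
# above. Each activity offers criteria for bad, mediocre, good and best
# practice; the assessor picks the best-fitting score for each activity,
# and the overall maturity rating is the average across activities.

SCORING_TABLE = {
    # activity: {score: criterion} -- placeholder wording, not the book's
    "Incident reporting": {
        0:   "No defined way to report information security incidents",
        33:  "Ad hoc reporting to the IT help desk only",
        67:  "Documented reporting process, most staff aware of it",
        100: "Well-practised process, routinely tested and improved",
    },
    "Incident learning": {
        0:   "Incidents closed with no follow-up",
        33:  "Occasional informal post-incident discussion",
        67:  "Routine post-incident reviews with tracked actions",
        100: "Lessons systematically fed back into controls and awareness",
    },
}

def maturity_rating(assessed_scores):
    """Average the assessor's per-activity scores into an overall rating."""
    return sum(assessed_scores.values()) / len(assessed_scores)

# The assessor may interpolate between rows where reality falls between criteria.
print(maturity_rating({"Incident reporting": 67, "Incident learning": 50}))  # -> 58.5
```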

The maturity metrics scoring tables given in the book are illustrations or examples to demonstrate the approach and get you started, but it's up to you to take them forward, adapting and developing them henceforth. The scoring tables, and hence the metrics, are themselves intended to continue evolving and maturing over time. 

ACME gave this metric an overall PRAGMATIC score of 86%, putting it firmly in contention as our "security metric of the quarter" ...

The next post on the Security Metametrics blog will list the quarter's metrics in order of their PRAGMATIC scores.

05 February 2014

Just how dynamic is information security?

"Information security is not the easiest of things to manage.  The lack of suitable metrics makes it even harder in many organizations.  Security management decisions are generally made on the strength of someone’s gut feel (an important but fallible and potentially biased approach), or for external compliance purposes (seldom aligned with the organization’s risk appetite).  Metrics are the only way to tell whether best practices are truly good enough, and provide the data to make informed choices, identify improvement opportunities, and drive things in the right direction." 

That's the executive summary of a new management paper on security metrics for our Information Security 101 security awareness module, which we are currently revising and updating.  The current module was released at the end of 2010 and, despite being a relatively superficial overview of a selection of general-interest information security topics for new hires, it's surprising how much has changed over the past three years.  BYOD, cloud computing, ransomware and SIEM, for example, were barely on the radar back then, while the whole Big Brother NSA thing was still under wraps.  

That set me thinking about the rate of change of information security.  Infosec pros like me often spout off about ours being a 'highly dynamic field'.  Are we justified in saying so?  On what basis do we assert that?  What do we even mean?  Is infosec any more or less dynamic than other fields, in fact?  The questions keep coming!

Being a self-confessed metrics freak, I can't help but wonder whether and how we might actually measure this, ideally in such a way as to be able to compare different areas on a common basis.  Let's simplify things down to a comparison of infosec against, say, risk management or perhaps management as a whole.  That train of thought suggests the idea of inviting managers and subject matter experts to rate a bunch of activities or concerns on the basis of their perceived changeability or dynamism.  A straightforward survey would suffice, asking respondents to rank maybe 5 to 10 areas, perhaps allowing them to add further areas as they see fit.
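
To illustrate the kind of analysis such a survey might feed, averaging the ranks gives a crude comparative measure of perceived dynamism. The fields and responses below are entirely invented:

```python
# Hedged sketch of the survey idea above: respondents rank a handful of
# fields by perceived dynamism (1 = most dynamic), and we average the
# ranks to compare fields. All data here is made up for illustration.

from statistics import mean

FIELDS = ["Information security", "Risk management", "HR", "Health & safety"]

responses = [  # each dict is one respondent's ranking, 1 = most dynamic
    {"Information security": 1, "Risk management": 2, "HR": 4, "Health & safety": 3},
    {"Information security": 2, "Risk management": 1, "HR": 3, "Health & safety": 4},
    {"Information security": 1, "Risk management": 3, "HR": 2, "Health & safety": 4},
]

# List the fields from most to least dynamic by mean rank.
for field in sorted(FIELDS, key=lambda f: mean(r[f] for r in responses)):
    print(f"{mean(r[f] for r in responses):.2f}  {field}")
```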

Meanwhile, an even more pragmatic metric is staring us in the face: a little earlier I mentioned that our Information Security 101 awareness module needs revision after just three years. How does that review period compare with those for the equivalent awareness/training materials covering things such as HR, compliance, health & safety etc.?  You might argue that there are several factors driving the review and update process aside from changes in those fields, and indeed there are, but we could potentially address that issue by surveying numerous organizations, somehow avoiding self-selection bias by, for instance, polling the readers of a general management website or magazine, or members of groups such as the Institute of Directors.  Supplemental survey questions could help us identify and sift out biased responses.

OK, well it's all starting to look a bit difficult and expensive at this point, and things are just as awkward on the other side of the cost-benefit equation.  What would we gain by measuring the dynamics of information security?  A facile reason would be to put some meat on the bones of the bland assertions by us infosec pros, but that's hardly a valid business driver.  A more useful purpose for the metric would be to help drive the strategy, for instance emphasizing the need for more rapid and tactical responses to emerging information security issues.  I can imagine a number of governance, strategy and policy decisions in various organizations being guided by the numbers ... or not.

At the end of the day, gut feel, presumptions, assertions and perceptions appear to be sufficient to drive strategy right now, so I'm not entirely convinced there is a clear value case for a metric concerning the rate of change in information security.  However, if you are an infosec manager or CISO pondering how to argue your next budget or investment proposal, this rambling piece might just spark a novel approach.